

Cost Function
We can measure the accuracy of our hypothesis function by using a cost function. This
takes an average difference (actually a fancier version of an average) of all the results of the
hypothesis with inputs from the x's and the actual outputs y's.

J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( \hat{y}_i - y_i \right)^2 = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x_i) - y_i \right)^2

To break it apart, it is \frac{1}{2} \bar{x}, where \bar{x} is the mean of the squares of h_\theta(x_i) - y_i, or the
difference between the predicted value and the actual value.
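
As a concrete illustration, here is a minimal sketch of that computation for a linear hypothesis h_\theta(x) = \theta_0 + \theta_1 x. It assumes Python with NumPy, and the dataset and variable names are made up for this example rather than taken from the course:

import numpy as np

def compute_cost(theta0, theta1, x, y):
    # J(theta0, theta1): halved mean of the squared prediction errors
    m = len(y)                           # number of training examples
    predictions = theta0 + theta1 * x    # h_theta(x_i) for every example
    squared_errors = (predictions - y) ** 2
    return squared_errors.sum() / (2 * m)

# Hypothetical training set of three points lying on the line y = x
x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 3.0])

print(compute_cost(0.0, 1.0, x, y))   # 0.0    -- the hypothesis passes through every point
print(compute_cost(0.0, 0.5, x, y))   # ~0.583 -- a poorer fit yields a larger cost

A hypothesis that fits the training data exactly drives the cost to zero, and the cost grows as the predictions move away from the actual values.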

This function is otherwise called the "Squared error function", or "Mean squared error".
The mean is halved (\frac{1}{2}) as a convenience for the computation of gradient descent, as
the derivative term of the square function will cancel out the \frac{1}{2} term.
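
To make that convenience concrete, here is a sketch (not part of the original page) of the partial derivative of J with respect to \theta_1 for the linear hypothesis h_\theta(x) = \theta_0 + \theta_1 x; the factor of 2 produced by differentiating the square cancels the \frac{1}{2}:

\frac{\partial}{\partial \theta_1} J(\theta_0, \theta_1)
  = \frac{1}{2m} \sum_{i=1}^{m} 2 \left( h_\theta(x_i) - y_i \right) x_i
  = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x_i) - y_i \right) x_i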

