Chapter 11: Least Squares Estimation


Probability and Statistics for Computer Scientists

Third Edition, By Michael Baron

Section 11.1: Least squares estimation
CIS 2033: Computational Probability and Statistics
Pei Wang
Regression models
A regression model relates a response (or dependent) variable Y to one or several predictor (or independent) variables X(1), …, X(k).
The regression of Y on X(1), …, X(k) is the conditional expectation
G(x(1), …, x(k)) = E[Y | X(1) = x(1), …, X(k) = x(k)]
We only consider the case k = 1, that is,
G(x) = E[Y | X = x]
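As a rough illustration of the regression function (my own sketch, not from the textbook), the Python code below simulates data where G(x) = 2 + 3x and approximates E[Y | X = x] by averaging Y over a narrow window around x; the constants 2 and 3, the noise level, and the window width are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)

# Simulated data: Y = 2 + 3X + noise, so the true regression function is G(x) = 2 + 3x
x = rng.uniform(0, 10, size=100_000)
y = 2 + 3 * x + rng.normal(0, 1, size=x.size)

# Approximate G(x0) = E[Y | X = x0] by averaging Y over a narrow window around x0
def regression_function(x0, width=0.1):
    near_x0 = np.abs(x - x0) < width / 2
    return y[near_x0].mean()

print(regression_function(4.0))   # close to 2 + 3*4 = 14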
Regression example: linear
Regression example: non-linear
Overfitting a model
Overfitting a model: fitting a regression line (or curve) too closely to the observed data often leads to poor predictions.
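A minimal sketch of the idea (my own illustration, not from the slides), using a higher-degree polynomial fit since a straight line has too few parameters to overfit here; the sample sizes, noise level, and degree 9 are arbitrary:

import numpy as np

rng = np.random.default_rng(1)

# The true relationship is linear: Y = 1 + 2X + noise
def sample(n):
    x = rng.uniform(0, 1, size=n)
    y = 1 + 2 * x + rng.normal(0, 0.5, size=n)
    return x, y

x_train, y_train = sample(12)
x_test, y_test = sample(1000)

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)   # least squares polynomial fit
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(degree, round(train_mse, 3), round(test_mse, 3))
# The degree-9 fit tracks the 12 training points almost exactly (tiny training error)
# but typically predicts the fresh test data worse than the straight-line fit.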
Linear regression
The simple linear regression model for a bivariate dataset (x1, y1), …, (xn, yn) is
Yi = α + β xi + Zi, for i = 1, …, n,
where Z1, …, Zn are independent random variables with zero expectation.
The ith residual ri is the vertical distance between the ith point and the estimated regression line:
ri = yi − (α̂ + β̂ xi),
where α̂ and β̂ are the estimated intercept and slope.
Method of least squares
Choose α and β to minimize the sum of squared residuals
S(α, β) = (y1 − α − β x1)² + … + (yn − α − β xn)²
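A small numerical sketch (my own, with made-up data): minimizing S(α, β) directly with scipy.optimize.minimize returns the same line as the closed-form least-squares fit from np.polyfit.

import numpy as np
from scipy.optimize import minimize

# Made-up bivariate dataset for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Sum of squared residuals S(alpha, beta)
def S(params):
    alpha, beta = params
    return np.sum((y - alpha - beta * x) ** 2)

result = minimize(S, x0=[0.0, 0.0])   # numerical minimization of S
print(result.x)                       # estimated (alpha, beta)

print(np.polyfit(x, y, 1))            # closed-form fit: returns (slope, intercept)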
Parameter estimation (1)
To estimate α and β from (x1, y1), …, (xn, yn), set the partial derivatives of S(α, β) to zero, which gives the normal equations:
Σ (yi − α − β xi) = 0
Σ xi (yi − α − β xi) = 0
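A symbolic sketch of this step (my own, using sympy on a tiny made-up dataset): differentiate S(α, β) and solve the resulting equations.

import sympy as sp

alpha, beta = sp.symbols('alpha beta')

# Tiny made-up dataset, just to keep the symbolic output readable
data = [(1, 2), (2, 3), (3, 5)]

# Sum of squared residuals S(alpha, beta)
S = sum((y - alpha - beta * x) ** 2 for x, y in data)

# Normal equations: partial derivatives of S set to zero
normal_eqs = [sp.diff(S, alpha), sp.diff(S, beta)]
print(sp.solve(normal_eqs, [alpha, beta]))   # {alpha: 1/3, beta: 3/2}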
Parameter estimation (2)
Solving the previous equations gives
β̂ = Σ (xi − x̄)(yi − ȳ) / Σ (xi − x̄)²
α̂ = ȳ − β̂ x̄
where x̄ and ȳ are the sample means of the xi and yi.
Both estimators are unbiased: E[α̂] = α and E[β̂] = β.
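A quick simulation sketch (my own, with arbitrary true values α = 1 and β = 2): averaging the estimates over many simulated datasets lands close to the true parameters, consistent with unbiasedness.

import numpy as np

rng = np.random.default_rng(2)
alpha_true, beta_true = 1.0, 2.0      # arbitrary true parameters for the simulation
x = np.linspace(0, 10, 20)            # fixed design points

alpha_hats, beta_hats = [], []
for _ in range(10_000):
    y = alpha_true + beta_true * x + rng.normal(0, 1, size=x.size)
    beta_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    alpha_hat = y.mean() - beta_hat * x.mean()
    alpha_hats.append(alpha_hat)
    beta_hats.append(beta_hat)

print(np.mean(alpha_hats), np.mean(beta_hats))   # close to 1.0 and 2.0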


Parameter estimation (3)
Another equivalent method to estimate the parameters in y = b0 + b1x is to let
b1 = Sxy / Sxx
b0 = ȳ − b1 x̄
where Sxx = Σ (xi − x̄)² and Sxy = Σ (xi − x̄)(yi − ȳ).
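A short check of these formulas (my own sketch, with made-up data): the Sxx/Sxy computation reproduces the line returned by np.polyfit.

import numpy as np

# Made-up data for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

Sxx = np.sum((x - x.mean()) ** 2)
Sxy = np.sum((x - x.mean()) * (y - y.mean()))

b1 = Sxy / Sxx                  # estimated slope
b0 = y.mean() - b1 * x.mean()   # estimated intercept
print(b0, b1)

print(np.polyfit(x, y, 1))      # same line, reported as (slope, intercept)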
Regression and correlation
Regression and correlation (2)
The estimated slope (β̂ or b1) is proportional to the sample correlation coefficient r:
b1 = r (sy / sx),
where sx and sy are the sample standard deviations of the xi and yi.
β > 0: X and Y are positively correlated
β < 0: X and Y are negatively correlated
β = 0: the regression function E[Y | X = x] is constant, so X and Y are uncorrelated
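A quick numerical check of the relationship between the slope and r (my own sketch, with simulated data):

import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(0, 2, size=500)
y = 1 + 0.5 * x + rng.normal(0, 1, size=x.size)

b1 = np.polyfit(x, y, 1)[0]                      # estimated slope
r = np.corrcoef(x, y)[0, 1]                      # sample correlation coefficient
print(b1, r * y.std(ddof=1) / x.std(ddof=1))     # the two values agree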
Summary
1. Linear regression
2. The method of least squares
3. Regression and correlation
