Econometrics Lecture 3: Simple Linear Regression (SLR) for Cross-Sectional Data, Part 2

This document provides an overview of simple linear regression (SLR), including the sample regression function, model, error term, and residuals; methods of estimation (method of moments and method of least squares); properties of ordinary least squares (OLS) estimation, including unbiasedness of the estimators; how to estimate the variance of the error term; the R-squared measure of goodness of fit; and the procedure to prove the Gauss-Markov theorem for SLR. Key aspects covered include the normal equations, first-order conditions, conditional and unconditional expectations and variances, and degrees of freedom.

Lecture 3

Simple Linear Regression (SLR)

for cross sectional data (part 2)


Sample Regression Model and Function

Sample Regression Function (SRF), or sample regression line (SRL):

ŷ = β̂₀ + β̂₁x, where ŷ is the fitted or predicted y

Population Regression Function (PRF):

E(y|x) = β₀ + β₁x

Sample Regression Model (SRM):

y = β̂₀ + β̂₁x + û, where û is the residual

Population Regression Model (PRM):

y = β₀ + β₁x + u, where u is the error term


Error term and residual

[Figure: scatter of y = salary against x = roe (%), showing the sample regression function ŷ = β̂₀ + β̂₁x and the population regression function E(y|x) = β₀ + β₁x. At xᵢ = 8%, the vertical distance from yᵢ to the sample regression line is the residual ûᵢ = yᵢ − ŷᵢ, and the vertical distance from yᵢ to the population regression function is the error term uᵢ = yᵢ − E(y|xᵢ).]
Methods of estimation

Method of moments

Choose the estimates by requiring the sample to exhibit some desirable properties of the population.

Method of least squares

Choose the estimates to minimize the sum of squared residuals.
Example: Return to education

Let innate ability be an omitted variable captured in the error term u.

Is β̂₁ unbiased?
Method of moments

Obtain the "sample counterpart" of the zero conditional mean property of the error term, E(u|x) = 0, i.e.

• E(u|x) = 0 implies E(u) = 0 and E(xu) = 0, by the law of iterated expectations (LIE)

1. Method of moments

Zero conditional mean (population): E(u|x) = 0, which implies E(u) = 0 and E(xu) = 0.

Sample counterparts:

(1/n) Σᵢ (yᵢ − β̂₀ − β̂₁xᵢ) = 0   — 1st normal equation

(1/n) Σᵢ xᵢ(yᵢ − β̂₀ − β̂₁xᵢ) = 0   — 2nd normal equation
OLS estimators for β̂₀ and β̂₁ in SLR

Solving the two normal equations gives

β̂₁ = Σᵢ (xᵢ − x̄)(yᵢ − ȳ) / Σᵢ (xᵢ − x̄)²

β̂₀ = ȳ − β̂₁x̄
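The closed-form estimators above can be sketched in a few lines of code. The four-point sample below is made up purely for illustration; only the formulas themselves come from the slides.

```python
# Hypothetical toy sample (not from the lecture), used to illustrate
# the closed-form OLS estimators for SLR.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 5.0, 7.0]
n = len(x)

x_bar = sum(x) / n
y_bar = sum(y) / n

# beta1_hat = sum((x_i - x_bar)(y_i - y_bar)) / sum((x_i - x_bar)^2)
s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
s_xx = sum((xi - x_bar) ** 2 for xi in x)
beta1_hat = s_xy / s_xx

# beta0_hat = y_bar - beta1_hat * x_bar
beta0_hat = y_bar - beta1_hat * x_bar
```

For this toy sample the slope works out to 1.6 and the intercept to 0.5.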
Data input to an econometric package

y = β₀·1 + β₁x + u

The intercept is estimated by including a constant regressor that equals 1 for every observation:

obs i | intercept/constant regressor | regressor x
  1   |              1               |     x₁
  2   |              1               |     x₂
  3   |              1               |     x₃
  4   |              1               |     x₄
2. Method of least squares

Given a set of sample data {(xᵢ, yᵢ): i = 1, …, n}, the least squares procedure searches for the pair (β̂₀, β̂₁) that minimizes

SSR(β̂₀, β̂₁) = Σᵢ (yᵢ − β̂₀ − β̂₁xᵢ)²
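A minimal numeric check of the minimization, again using a made-up four-point sample: SSR evaluated at the OLS pair is lower than at any nearby perturbed pair.

```python
# SSR as a function of the candidate intercept b0 and slope b1.
def ssr(b0, b1, x, y):
    return sum((yi - b0 - b1 * xi) ** 2 for xi, yi in zip(x, y))

# Hypothetical toy sample; the OLS pair below was computed from the
# closed-form formulas for this sample.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 5.0, 7.0]
b0_ols, b1_ols = 0.5, 1.6
ssr_min = ssr(b0_ols, b1_ols, x, y)

# Perturbing the pair in any direction raises SSR.
worse = min(ssr(b0_ols + d0, b1_ols + d1, x, y)
            for d0 in (-0.1, 0.1) for d1 in (-0.1, 0.1))
```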
Method of least squares

Plot SSR as a function of (β̂₀, β̂₁). (Review: quadratic function.)

[Figure: the surface SSR(β̂₀, β̂₁) over the (β̂₀, β̂₁) "floor", with its global minimum at point A = (β̂₀, β̂₁) and a candidate point B.]

Consider any point (b₀, b₁) on the "floor":

• If b₀ is fixed, SSR becomes a quadratic function of b₁ and attains its global minimum at some value of b₁.

• If b₁ is fixed, SSR becomes a quadratic function of b₀ and attains its global minimum at some value of b₀.

• If (b₀, b₁) happens to be the global minimum of the SSR function, then both conditions hold at once, so b₀ = β̂₀ and b₁ = β̂₁!
Method of least squares

Consider any point (b₀, b₁) on the "floor". First-order conditions (FOCs):

∂SSR/∂b₀ = −2 Σᵢ (yᵢ − b₀ − b₁xᵢ) = 0

∂SSR/∂b₁ = −2 Σᵢ xᵢ(yᵢ − b₀ − b₁xᵢ) = 0

Method of least squares

If the point (b₀, b₁) is the global minimum (β̂₀, β̂₁), then it satisfies both FOCs:

−2 Σᵢ (yᵢ − β̂₀ − β̂₁xᵢ) = 0

−2 Σᵢ xᵢ(yᵢ − β̂₀ − β̂₁xᵢ) = 0

These two equations are the same as the two normal equations in the method of moments.
Notation convention adopted in this course

Explicitly state whether you are taking a conditional or unconditional expectation or variance.

If it is conditional, state what it is conditional on.

• e.g. E(β̂₁ | x₁, …, xₙ)

If it is unconditional, state "unconditional".

• e.g. unconditional E(β̂₁)

Do not just write E(β̂₁) or Var(β̂₁) without saying which is meant.
Theorem 2.1: Unbiasedness of OLS estimators

Under SLR.1 to SLR.4, the OLS estimators are unbiased conditional on a given set of sample values of the independent variable:

E(β̂₀ | x₁, …, xₙ) = β₀ and E(β̂₁ | x₁, …, xₙ) = β₁

See the proof in the handout.
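Theorem 2.1 can be illustrated by simulation. The true parameters and normal errors below are assumptions chosen for the sketch, not values from the lecture: holding the xᵢ fixed across replications, the average OLS slope over many samples comes out close to the true β₁.

```python
import random

# Assumed "true" model for the simulation: y = 1 + 2x + u, u ~ N(0, 1).
random.seed(0)
beta0, beta1 = 1.0, 2.0
x = [1.0, 2.0, 3.0, 4.0, 5.0]
x_bar = sum(x) / len(x)
s_xx = sum((xi - x_bar) ** 2 for xi in x)

# Redraw the errors many times with x held fixed, re-estimating the
# slope each time (this is "conditional on the sample values of x").
slopes = []
for _ in range(20000):
    y = [beta0 + beta1 * xi + random.gauss(0.0, 1.0) for xi in x]
    y_bar = sum(y) / len(y)
    s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    slopes.append(s_xy / s_xx)

avg_slope = sum(slopes) / len(slopes)  # should be close to beta1 = 2
```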
4 properties of OLS estimation for SLR

1. The residuals have a zero mean (or sum to zero): Σᵢ ûᵢ = 0

2. The residuals are uncorrelated with the independent variable: Σᵢ xᵢûᵢ = 0

3. The sample regression line passes through the "middle of the sample data points", i.e. the point (x̄, ȳ)

4. The fitted values are uncorrelated with the residuals: Σᵢ ŷᵢûᵢ = 0

[Figure: scatter of y = GPA against x = GRE with the sample regression line passing through (x̄, ȳ).]
4 properties of OLS estimation for SLR

Properties 1 and 2 are the two normal equations (or FOCs); properties 3 and 4 follow from them.
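All four properties can be verified numerically on a made-up sample (hypothetical values, for illustration only):

```python
# Toy sample and OLS fit via the closed-form formulas.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 5.0, 7.0]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar

y_hat = [b0 + b1 * xi for xi in x]                  # fitted values
u_hat = [yi - fi for yi, fi in zip(y, y_hat)]       # residuals

prop1 = sum(u_hat)                                   # residuals sum to zero
prop2 = sum(xi * ui for xi, ui in zip(x, u_hat))     # uncorrelated with x
prop3 = b0 + b1 * x_bar - y_bar                      # line passes through (x_bar, y_bar)
prop4 = sum(fi * ui for fi, ui in zip(y_hat, u_hat)) # fitted values vs residuals
```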

R-squared: a measure of goodness of fit

R² is the fraction of the total variation of y from its own mean, as measured by SST = Σᵢ (yᵢ − ȳ)², that can be explained by the independent variable(s):

R² = SSE/SST = 1 − SSR/SST

0 ≤ R² ≤ 1 as long as the model has an intercept.

• R² is also known as the coefficient of determination.

R-squared: a measure of goodness of fit

It can be shown that R² is equal to the "square" of the sample correlation between x and y in SLR:

R² = rₓᵧ², where rₓᵧ = Σᵢ (xᵢ − x̄)(yᵢ − ȳ) / √( Σᵢ (xᵢ − x̄)² · Σᵢ (yᵢ − ȳ)² )
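A quick numeric confirmation on a made-up sample that 1 − SSR/SST coincides with the squared sample correlation:

```python
import math

# Hypothetical toy sample.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 5.0, 7.0]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
s_xx = sum((xi - x_bar) ** 2 for xi in x)
s_yy = sum((yi - y_bar) ** 2 for yi in y)   # this is SST
s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))

# OLS fit and residual sum of squares.
b1 = s_xy / s_xx
b0 = y_bar - b1 * x_bar
ssr = sum((yi - b0 - b1 * xi) ** 2 for xi, yi in zip(x, y))

r_squared = 1.0 - ssr / s_yy                 # R^2 = 1 - SSR/SST
corr_xy = s_xy / math.sqrt(s_xx * s_yy)      # sample correlation of x and y
```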
Estimate the variance of the error term σ²

A natural estimator for σ² is based on the variance of the residuals:

σ̂² = SSR / (n − 2)

The degrees of freedom, n − 2, reflects the 2 restrictions on the sample data (Σᵢ ûᵢ = 0 and Σᵢ xᵢûᵢ = 0) imposed in order to get β̂₀ and β̂₁, which are needed to obtain the ûᵢ.

σ̂ = √σ̂² is the standard error of the regression, or standard error of the estimate; σ̂ must be the positive square root.
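In code, the estimator divides SSR by the degrees of freedom n − 2, not by n. The sample below is hypothetical:

```python
import math

# Hypothetical toy sample and OLS fit.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 5.0, 7.0]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar
ssr = sum((yi - b0 - b1 * xi) ** 2 for xi, yi in zip(x, y))

sigma2_hat = ssr / (n - 2)   # divide by df = n - 2, not n
ser = math.sqrt(sigma2_hat)  # standard error of the regression
```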
Estimate the variance of the error term σ²

Theorem 2.3 (Unbiasedness of σ̂²)

Under SLR.1 – SLR.5, conditional on the sample values of the independent variable, E(σ̂² | x₁, …, xₙ) = σ²
Conditional variance of β̂₁

Unconditional variance of β̂₁: Var(β̂₁)

Conditional variance of β̂₁ on a given set of sample values: Var(β̂₁ | x₁, …, xₙ)
Theorem 2.2 (Variances of OLS estimators)

Under SLR.1 – SLR.5, conditional on the sample values {x₁, …, xₙ}:

Var(β̂₁ | x₁, …, xₙ) = σ² / SSTₓ

Var(β̂₀ | x₁, …, xₙ) = σ² (n⁻¹ Σᵢ xᵢ²) / SSTₓ

where SSTₓ = Σᵢ (xᵢ − x̄)².

• See the proof in the handout.
Estimator for Var(β̂₁ | x₁, …, xₙ)

Recall: Var(β̂₁ | x₁, …, xₙ) = σ² / SSTₓ

A natural estimator for it is σ̂² / SSTₓ, which is an unbiased estimator: given Theorem 2.3, E(σ̂² / SSTₓ | x₁, …, xₙ) = σ² / SSTₓ.

Standard error of β̂₁

se(β̂₁) = σ̂ / √SSTₓ

This is the estimated standard deviation of the sampling distribution of β̂₁.

Don't confuse it with the standard error of the regression, which is σ̂.
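The distinction shows up clearly in code: the standard error of the regression σ̂ and the standard error of the slope se(β̂₁) are different numbers. The sample is hypothetical:

```python
import math

# Hypothetical toy sample and OLS fit.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 5.0, 7.0]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
sst_x = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sst_x
b0 = y_bar - b1 * x_bar
ssr = sum((yi - b0 - b1 * xi) ** 2 for xi, yi in zip(x, y))

sigma_hat = math.sqrt(ssr / (n - 2))   # standard error of the regression
se_b1 = sigma_hat / math.sqrt(sst_x)   # standard error of beta1_hat
```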
Procedure to prove the Gauss–Markov theorem for SLR

Consider any linear estimator β̃₁ = Σᵢ wᵢyᵢ, where the weights wᵢ may depend on the sample values of x.

If Σᵢ wᵢ = 0 and Σᵢ wᵢxᵢ = 1 (conditions the OLS estimator satisfies), then β̃₁ is unbiased under SLR.1 – SLR.4.

Procedure to prove the Gauss–Markov theorem for SLR

Under SLR.1 to SLR.5:

Minimize Var(β̃₁ | x₁, …, xₙ) = σ² Σᵢ wᵢ² subject to Σᵢ wᵢ = 0 and Σᵢ wᵢxᵢ = 1.

It can be shown that the solution is wᵢ = (xᵢ − x̄) / SSTₓ, which gives exactly the OLS estimator.

This β̃₁ is BLUE, and it is OLS.
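The "best" part of BLUE can be illustrated by simulation. The comparison estimator below, the endpoint slope (yₙ − y₁)/(xₙ − x₁), is a hypothetical choice for the sketch: it is linear and unbiased (its weights satisfy Σwᵢ = 0 and Σwᵢxᵢ = 1), yet its sampling variance exceeds that of OLS. The true parameters and error distribution are also assumptions.

```python
import random

# Assumed "true" model: y = 1 + 2x + u, u ~ N(0, 1), x held fixed.
random.seed(1)
beta0, beta1 = 1.0, 2.0
x = [1.0, 2.0, 3.0, 4.0, 5.0]
x_bar = sum(x) / len(x)
s_xx = sum((xi - x_bar) ** 2 for xi in x)

ols, endpoint = [], []
for _ in range(20000):
    y = [beta0 + beta1 * xi + random.gauss(0.0, 1.0) for xi in x]
    y_bar = sum(y) / len(y)
    # OLS slope
    ols.append(sum((xi - x_bar) * (yi - y_bar)
                   for xi, yi in zip(x, y)) / s_xx)
    # Endpoint slope: another linear unbiased estimator of beta1
    endpoint.append((y[-1] - y[0]) / (x[-1] - x[0]))

def var(s):
    m = sum(s) / len(s)
    return sum((v - m) ** 2 for v in s) / len(s)

var_ols, var_endpoint = var(ols), var(endpoint)  # OLS variance is smaller
```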
Appendix
Quick review: Quadratic function

A variable y is a quadratic function of x if

• y = ax² + bx + c, where a, b, c are constants, a ≠ 0

• Examples:

If a > 0, y is a parabola and has a global minimum value.

If a < 0, y is an inverted parabola and has a global maximum value.

y attains its global max/min at x* = −b/(2a), and the max/min value is y* = c − b²/(4a).
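The vertex formulas can be checked numerically (the coefficients below are arbitrary example values):

```python
# Check the vertex formulas for y = a x^2 + b x + c with a > 0:
# minimum at x* = -b/(2a), minimum value y* = c - b^2/(4a).
a, b, c = 2.0, -8.0, 3.0   # arbitrary example coefficients

x_star = -b / (2 * a)            # location of the minimum
y_star = c - b * b / (4 * a)     # minimum value

def f(x):
    return a * x * x + b * x + c

# Nearby points never fall below the vertex value.
neighbors = min(f(x_star - 0.5), f(x_star + 0.5))
```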
Quick review: Partial derivative

Consider a multivariate function f(x₁, x₂, …, xₙ).

The partial derivative of f with respect to xᵢ, written ∂f/∂xᵢ, is the derivative of f with respect to xᵢ while holding the other variables at some fixed values (the other variables are held as constants in evaluating ∂f/∂xᵢ).

Example: if f(x, y) = x² + xy, then ∂f/∂x = 2x + y.
Warnings about notations

Some authors use different notations for SSE and SSR from those in our textbook.

• They refer to Σᵢ (ŷᵢ − ȳ)² as the Sum of Squares due to Regression (SSR)

• Their SSR is our SSE

• They refer to Σᵢ ûᵢ² as the Sum of Squared Errors (SSE)

• Their SSE is our SSR