Basic Econometrics - Lecture Notes

Summary

1. Multicollinearity occurs when there is a strong linear relationship between two or more predictor variables in a regression model. This can result in unstable and unreliable estimates of the regression coefficients.
2. Under multicollinearity the OLS estimates remain best linear unbiased estimates (BLUE), but they have larger standard errors and wider confidence intervals, individually insignificant coefficients, and values that are highly sensitive to small changes in the data.
3. Multicollinearity can be detected by a high R-squared combined with insignificant individual coefficients, high pairwise correlations among predictors, and variance inflation factors above 10. Remedies include doing nothing, pooling panel data, dropping variables, or adding new data.


Basic Econometrics

PGDMB15(2014-16)
Lecture 10 Notes
Multicollinearity

1 Multicollinearity

Consider the following multiple regression equation:

$$Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + u_i$$

Recall that, writing $y_i$, $x_{1i}$, $x_{2i}$ for deviations from the sample means,

$$\hat{\beta}_1 = \frac{\left(\sum y_i x_{1i}\right)\left(\sum x_{2i}^2\right) - \left(\sum y_i x_{2i}\right)\left(\sum x_{1i} x_{2i}\right)}{\left(\sum x_{1i}^2\right)\left(\sum x_{2i}^2\right) - \left(\sum x_{1i} x_{2i}\right)^2}$$

Dividing the numerator and the denominator by $\left(\sum x_{1i}^2\right)\left(\sum x_{2i}^2\right)$, this becomes

$$\hat{\beta}_1 = \frac{\left(r_{Y1} - r_{Y2}\, r_{12}\right) S_Y}{\left(1 - r_{12}^2\right) S_1}$$

Thus:

If $r_{12} = 0$, then $\hat{\beta}_1 = \dfrac{r_{Y1}\, S_Y}{S_1} = \dfrac{\sum y_i x_{1i}}{\sum x_{1i}^2} = \dfrac{\operatorname{Cov}(Y, X_1)}{\operatorname{Var}(X_1)}$, i.e. the simple regression slope of $Y$ on $X_1$.

If $r_{12} = \pm 1$, then $\hat{\beta}_1 = 0/0$ is indeterminate. [Problem of perfect/exact collinearity.]

Let us now look at $\operatorname{Var}(\hat{\beta}_1)$. Recall,

$$\operatorname{Var}(\hat{\beta}_1) = \frac{\sigma^2}{\left(1 - r_{12}^2\right) \sum x_{1i}^2}$$

In general,

$$\operatorname{Var}(\hat{\beta}_j) = \frac{\sigma^2}{\left(1 - R_j^2\right) \sum x_{ji}^2}, \qquad \text{where } \operatorname{VIF}_j = \frac{1}{1 - R_j^2}$$

and $R_j^2$ is the $R^2$ in the regression (auxiliary regression) of $X_j$ on all the other remaining regressors.
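To make the variance-inflation effect concrete, here is a minimal simulation sketch (not part of the original notes; all names and parameter values are illustrative). It compares the empirical variance of $\hat{\beta}_1$ across repeated draws of the error term with the formula $\sigma^2 / \big((1 - r_{12}^2)\sum x_{1i}^2\big)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma, beta1, beta2 = 200, 1.0, 2.0, -1.0

for r12 in [0.0, 0.5, 0.9, 0.99]:
    # Fixed regressors with (approximately) the desired correlation.
    cov = [[1.0, r12], [r12, 1.0]]
    X = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    x1 = X[:, 0] - X[:, 0].mean()  # deviations from the sample mean
    x2 = X[:, 1] - X[:, 1].mean()

    # Theoretical Var(beta1_hat) = sigma^2 / ((1 - r12^2) * sum(x1^2)),
    # using the realized sample correlation r_hat.
    r_hat = np.corrcoef(x1, x2)[0, 1]
    var_theory = sigma**2 / ((1.0 - r_hat**2) * np.sum(x1**2))

    # Empirical variance of beta1_hat over repeated draws of the error u.
    Xmat = np.column_stack([np.ones(n), x1, x2])
    draws = []
    for _ in range(2000):
        y = beta1 * x1 + beta2 * x2 + rng.normal(0.0, sigma, n)
        coef = np.linalg.lstsq(Xmat, y, rcond=None)[0]
        draws.append(coef[1])  # slope on x1
    print(f"r12={r12:4.2f}  theoretical={var_theory:.5f}  empirical={np.var(draws):.5f}")
```

As $r_{12}$ approaches 1 the two columns blow up together, which is exactly the $1/(1 - r_{12}^2)$ inflation factor at work.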

1.1 Consequences

- Estimates remain BLUE.
- Larger standard errors, hence wider confidence intervals.
- Insignificant coefficients (low t-statistics).
- Highly sensitive estimates.

1.2 Detection

- High $R^2$ but individually insignificant coefficients.
- High pairwise correlations among the regressors.
- High VIFs (a common rule of thumb flags VIF above 10); see the sketch below.
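As a sketch of how VIFs can be computed in practice (this example is not from the notes; it uses synthetic data and assumes statsmodels is installed):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Synthetic regressors: x2 is almost a linear function of x1.
rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = 0.95 * x1 + 0.1 * rng.normal(size=100)
x3 = rng.normal(size=100)

# variance_inflation_factor expects the full design matrix (with constant)
# and the column index of the regressor being checked.
X = sm.add_constant(np.column_stack([x1, x2, x3]))
for j, name in enumerate(["const", "x1", "x2", "x3"]):
    print(name, round(variance_inflation_factor(X, j), 2))

# Manual check for x1: VIF_j = 1 / (1 - R_j^2) from the auxiliary regression.
aux = sm.OLS(x1, sm.add_constant(np.column_stack([x2, x3]))).fit()
print("manual VIF(x1):", round(1.0 / (1.0 - aux.rsquared), 2))
```

Here x1 and x2 should show VIFs well above the rule-of-thumb cutoff of 10, while x3 stays near 1.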

1.3 Remedies

- The "do nothing" school.
- Pooling data, in the case of a panel dataset.
- Dropping variables (the omitted-variable bias problem might creep in; see the sketch after this list).
- Adding new data (may not always work).
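To illustrate the trade-off behind the "dropping variables" remedy, a small sketch (synthetic data, not from the notes): dropping a collinear regressor shrinks the standard error on the regressor kept, but biases its coefficient when the dropped variable truly belongs in the model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.2 * rng.normal(size=n)  # collinear with x1
y = 1.0 + 2.0 * x1 + 1.5 * x2 + rng.normal(size=n)

full = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
drop = sm.OLS(y, sm.add_constant(x1)).fit()

# Full model: unbiased for beta1 = 2, but with a larger standard error.
print("full :", full.params[1].round(2), full.bse[1].round(3))
# Dropped model: smaller standard error, but beta1 absorbs part of x2's
# effect (omitted-variable bias): expect roughly 2 + 1.5 * 0.9 = 3.35.
print("drop :", drop.params[1].round(2), drop.bse[1].round(3))
```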


Now work on the sample dataset cars.xls. The variables are as follows:
MPG: miles per gallon
CYL: No. of cylinders
ENG: Engine displacement
WGT: Vehicle weight
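A possible starting point for the exercise is sketched below (assuming cars.xls sits in the working directory with the column names listed above, and that pandas and statsmodels are available):

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

cars = pd.read_excel("cars.xls")  # columns assumed: MPG, CYL, ENG, WGT

# Detection 1: high pairwise correlations among the regressors.
print(cars[["CYL", "ENG", "WGT"]].corr())

# Detection 2: high overall R^2 but individually insignificant coefficients.
X = sm.add_constant(cars[["CYL", "ENG", "WGT"]])
print(sm.OLS(cars["MPG"], X).fit().summary())

# Detection 3: variance inflation factors (skipping the constant).
for j, name in enumerate(X.columns):
    if name != "const":
        print(name, round(variance_inflation_factor(X.values, j), 2))
```

Cylinders, engine displacement, and vehicle weight are typically highly correlated, so this dataset is a natural illustration of the detection checks above.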
