
Assignment no. 2
Name: Roqia
Roll no. BECF15M002
Subject: Applied Economics
Submitted to: Sir Falak Sher

Department of Economics
University of Sargodha
Multicollinearity:
Correlation Matrix:
           Y           X2           X3
Y     1.000000     0.857369     0.857438
X2    0.857369     1.000000     0.999995
X3    0.857438     0.999995     1.000000

Interpretation:
The matrix is symmetric, and the diagonal elements equal 1 because each is the correlation of a series with itself. Y is highly positively correlated with both X2 and X3, while X2 and X3 are nearly the same variable (their correlation coefficient is 0.999995, i.e. very close to 1). There is therefore a very high risk of the damaging effects of multicollinearity.
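As a rough sketch, the correlation matrix above could be reproduced with pandas, assuming the 25 observations are stored in a hypothetical file data.csv with columns Y, X2 and X3 (the file name and layout are assumptions, not part of the assignment):

# Minimal sketch: correlation matrix used to screen for multicollinearity.
# Assumes a hypothetical file "data.csv" with columns Y, X2, X3 (25 observations).
import pandas as pd

df = pd.read_csv("data.csv")

# Pairwise correlations; the diagonal is 1 by construction, and off-diagonal
# values close to 1 signal severe multicollinearity.
print(df[["Y", "X2", "X3"]].corr())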

Regression Results (full model)

Dependent Variable: X2
Method: Least Squares
Date: 03/12/19 Time: 09:21
Sample: 1 25
Included observations: 25

Variable             Coefficient    Std. Error    t-Statistic    Prob.

C                     -0.117288      0.117251      -1.000310    0.3276
X3                     0.250016      0.000164       1521.542    0.0000

R-squared              0.999990    Mean dependent var        159.4320
Adjusted R-squared     0.999990    S.D. dependent var        81.46795
S.E. of regression     0.262305    Akaike info criterion     0.237999
Sum squared resid      1.582488    Schwarz criterion         0.335509
Log likelihood        -0.974992    Hannan-Quinn criter.      0.265045
F-statistic            2315090.    Durbin-Watson stat        2.082420
Prob(F-statistic)      0.000000

Interpretation:
In the full model, the coefficient on X2 is negative and the coefficient on X3 is positive, yet both variables appear individually insignificant even though each is highly correlated with Y, which is a typical symptom of multicollinearity.
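For reference, a minimal sketch of estimating the full model of Y on a constant, X2 and X3 with statsmodels, again assuming the hypothetical data.csv; this illustrates the estimation step, not the assignment's EViews output:

# Sketch: full model Y = b1 + b2*X2 + b3*X3 + u.
# With near-perfect collinearity between X2 and X3, individual t-statistics
# tend to be small even when R-squared and the overall F-statistic are large.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("data.csv")               # hypothetical data file
X = sm.add_constant(df[["X2", "X3"]])      # add the intercept term
full_model = sm.OLS(df["Y"], X).fit()
print(full_model.summary())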

Regression Results (omitting X3)

Dependent Variable: Y
Method: Least Squares
Date: 03/12/19 Time: 09:39
Sample: 1 25
Included observations: 25

Variable             Coefficient    Std. Error    t-Statistic    Prob.

C                      36.71861      18.56953       1.977358    0.0601
X2                     0.832012      0.104149       7.988678    0.0000

R-squared              0.735081    Mean dependent var        169.3680
Adjusted R-squared     0.723563    S.D. dependent var        79.05857
S.E. of regression     41.56686    Akaike info criterion     10.36910
Sum squared resid      39739.49    Schwarz criterion         10.46661
Log likelihood        -127.6138    Hannan-Quinn criter.      10.39615
F-statistic            63.81897    Durbin-Watson stat        2.921548
Prob(F-statistic)      0.000000

Interpretation:
Re-specifying the equation by excluding X3 and estimating the model with X2 only, we find that X2 is positive and statistically significant (with a t-statistic of 7.99).
Regression Results (omitting X2)
Dependent Variable: Y
Method: Least Squares
Date: 03/12/19 Time: 09:42
Sample: 1 25
Included observations: 25

Variable             Coefficient    Std. Error    t-Statistic    Prob.

C                      36.60968      18.57637       1.970766    0.0609
X3                     0.208034      0.026033       7.991106    0.0000

R-squared              0.735199    Mean dependent var        169.3680
Adjusted R-squared     0.723686    S.D. dependent var        79.05857
S.E. of regression     41.55758    Akaike info criterion     10.36866
Sum squared resid      39721.74    Schwarz criterion         10.46617
Log likelihood        -127.6082    Hannan-Quinn criter.      10.39570
F-statistic            63.85778    Durbin-Watson stat        2.916396
Prob(F-statistic)      0.000000

Interpretation:
Re-estimating the model with X3 only and omitting X2, the result shows that X3 is positive and highly significant (with a t-statistic of 7.99).
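The two re-specified models above (Y on X2 only, then Y on X3 only) can be sketched in the same way, again under the assumption that the data are in the hypothetical data.csv:

# Sketch: re-estimate the model with one regressor at a time,
# mirroring the two single-regressor tables above.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("data.csv")  # hypothetical data file

for kept in ["X2", "X3"]:
    fit = sm.OLS(df["Y"], sm.add_constant(df[[kept]])).fit()
    # Each regressor is expected to be positive and significant on its own.
    print(f"Y on {kept}: coef = {fit.params[kept]:.6f}, t = {fit.tvalues[kept]:.2f}")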

Auxiliary Regression Results (Regression of X2 on X3)

Dependent Variable: X2
Method: Least Squares
Date: 03/16/19 Time: 22:22
Sample: 1 25
Included observations: 25

Variable             Coefficient    Std. Error    t-Statistic    Prob.

C                     -0.117288      0.117251      -1.000310    0.3276
X3                     0.250016      0.000164       1521.542    0.0000

R-squared              0.999990    Mean dependent var        159.4320
Adjusted R-squared     0.999990    S.D. dependent var        81.46795
S.E. of regression     0.262305    Akaike info criterion     0.237999
Sum squared resid      1.582488    Schwarz criterion         0.335509
Log likelihood        -0.974992    Hannan-Quinn criter.      0.265045
F-statistic            2315090.    Durbin-Watson stat        2.082420
Prob(F-statistic)      0.000000

Interpretation:
Finally, running an auxiliary regression of X2 on a constant and X3 yields the results shown in the table above: the t-statistic on X3 is extremely high (1521.542) and the R-squared is nearly 1, confirming that X2 and X3 are almost perfectly collinear.
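As a final sketch (same assumed data file), the auxiliary regression can be run directly and its R-squared converted into a variance inflation factor, VIF = 1/(1 - R^2); with R^2 = 0.999990 this works out to roughly 100,000, far above any conventional threshold:

# Sketch: auxiliary regression of X2 on X3 and the implied variance inflation factor.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("data.csv")  # hypothetical data file

aux = sm.OLS(df["X2"], sm.add_constant(df[["X3"]])).fit()
r2 = aux.rsquared
vif = 1.0 / (1.0 - r2)  # R-squared near 1 implies an enormous VIF
print(f"Auxiliary R-squared = {r2:.6f}, VIF = {vif:.1f}")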
