TestExercise 3.ipynb - Colab
import pandas as pd
import numpy as np
import statsmodels.api as sm  # sm is used in every regression cell below

df = pd.read_excel("TestExercise-3.xlsx")
df.head()
X.head()
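The cells that build `y` and the initial regressor matrix `X` did not survive this export. A minimal sketch of how they are presumably constructed, consistent with the later cells (the synthetic `df` here is a stand-in for the real TestExercise-3.xlsx data; only the column-building pattern is taken from the notebook):

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the Excel data; the real df comes from TestExercise-3.xlsx
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "OBS": np.arange(8),
    "INTRATE": rng.normal(size=8),
    "INFL": rng.normal(size=8),
    "PROD": rng.normal(size=8),
})

# Dependent variable, and a regressor matrix with an explicit constant column,
# the same pattern the later cells in this notebook use
y = df["INTRATE"]
X = pd.DataFrame(np.ones(len(df)), columns=["const"])
X = pd.concat([X, df.drop(["INTRATE", "OBS"], axis=1)], axis=1)
print(list(X.columns))
```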
https://colab.research.google.com/drive/13b68CWylarwD28XhRqBUi5TJbyIUZQ-d?authuser=0#scrollTo=bT8oUQ0zCyBZ&printMode=true 1/8
17/07/2024, 10:30 TestExercise 3.ipynb - Colab
Kurtosis: 3.980 Cond. No. 102.
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
# Let's drop UNEMPL, which has the highest p-value among the regressors, and refit

X = X.drop(["UNEMPL"], axis=1)
est = sm.OLS(y, X).fit()
print(est.summary())
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
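The drop-the-highest-p-value step above can be folded into a loop. A plain-numpy sketch (the helper name and the |t| ≥ 1.96 keep-rule are my assumptions, standing in for reading p-values off the statsmodels summary):

```python
import numpy as np

def backward_eliminate(X, names, y, t_crit=1.96):
    """Repeatedly drop the regressor with the smallest |t|-statistic until
    every remaining one clears t_crit. Column 0 (the constant) is never
    dropped. Plain-numpy sketch of the manual steps above."""
    X, names = X.copy(), list(names)
    while X.shape[1] > 1:
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        n, k = X.shape
        sigma2 = resid @ resid / (n - k)
        se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
        t = np.abs(beta / se)
        worst = 1 + np.argmin(t[1:])          # skip the constant
        if t[worst] >= t_crit:
            break
        X = np.delete(X, worst, axis=1)
        names.pop(worst)
    return names, X

# Demo on synthetic data: x1 has a real effect, x2 is pure noise
rng = np.random.default_rng(42)
n = 500
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 0.1 * rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])
kept, _ = backward_eliminate(X, ["const", "x1", "x2"], y)
print(kept)
```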
OLS Regression Results
==============================================================================
Dep. Variable: INTRATE R-squared: 0.633
Model: OLS Adj. R-squared: 0.631
Method: Least Squares F-statistic: 282.3
Date: Wed, 17 Jul 2024 Prob (F-statistic): 6.18e-141
Time: 04:37:10 Log-Likelihood: -1454.3
No. Observations: 660 AIC: 2919.
Df Residuals: 655 BIC: 2941.
Df Model: 4
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [0.025 0.975]
------------------------------------------------------------------------------
const -0.2136 0.231 -0.923 0.356 -0.668 0.241
INFL 0.7448 0.057 13.149 0.000 0.634 0.856
PCE 0.3110 0.059 5.311 0.000 0.196 0.426
PERSINC 0.2569 0.059 4.327 0.000 0.140 0.373
HOUST -0.0215 0.004 -4.893 0.000 -0.030 -0.013
==============================================================================
Omnibus: 27.399 Durbin-Watson: 0.100
Prob(Omnibus): 0.000 Jarque-Bera (JB): 33.853
Skew: 0.416 Prob(JB): 4.46e-08
Kurtosis: 3.733 Cond. No. 62.7
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
# Start with just a constant
X = pd.DataFrame(np.ones(len(df)), columns=['const'])
est = sm.OLS(y, X).fit()
print(est.summary())

# Add INFL
X['INFL'] = df['INFL']
est = sm.OLS(y, X).fit()
print(est.summary())
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
OLS Regression Results
==============================================================================
Dep. Variable: INTRATE R-squared: 0.560
Model: OLS Adj. R-squared: 0.559
Method: Least Squares F-statistic: 836.6
Date: Wed, 17 Jul 2024 Prob (F-statistic): 2.47e-119
Time: 04:46:43 Log-Likelihood: -1514.2
No. Observations: 660 AIC: 3032.
Df Residuals: 658 BIC: 3041.
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [0.025 0.975]
------------------------------------------------------------------------------
const 1.6421 0.159 10.352 0.000 1.331 1.954
INFL 0.9453 0.033 28.925 0.000 0.881 1.010
==============================================================================
Omnibus: 5.019 Durbin-Watson: 0.063
Prob(Omnibus): 0.081 Jarque-Bera (JB): 4.841
Skew: 0.193 Prob(JB): 0.0889
Kurtosis: 3.166 Cond. No. 8.46
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
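The add-a-column-then-inspect steps that follow can likewise be written as a single loop. A plain-numpy sketch (the helper name and the |t| ≥ 1.96 keep-rule are my assumptions, standing in for eyeballing the summary's p-values):

```python
import numpy as np

def forward_select(y, candidates, t_crit=1.96):
    """Try candidate regressors in order; keep each only if its
    |t|-statistic in the enlarged regression clears t_crit.
    `candidates` maps name -> column vector."""
    n = len(y)
    X, kept = np.ones((n, 1)), ["const"]
    for name, col in candidates.items():
        trial = np.column_stack([X, col])
        beta, *_ = np.linalg.lstsq(trial, y, rcond=None)
        resid = y - trial @ beta
        k = trial.shape[1]
        sigma2 = resid @ resid / (n - k)
        se = np.sqrt(np.diag(sigma2 * np.linalg.inv(trial.T @ trial)))
        if abs(beta[-1]) / se[-1] >= t_crit:
            X, kept = trial, kept + [name]
    return kept, X

# Demo: 'a' has a real effect, 'b' is pure noise
rng = np.random.default_rng(7)
n = 500
a, b = rng.normal(size=n), rng.normal(size=n)
y = 0.5 + 1.5 * a + 0.1 * rng.normal(size=n)
kept, _ = forward_select(y, {"a": a, "b": b})
print(kept)
```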
# INFL has a good p-value, so we keep it; now add the PROD column

X['PROD'] = df['PROD']
est = sm.OLS(y, X).fit()
print(est.summary())
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
# PROD also has a good p-value, so we keep it too; now let's add UNEMPL

X['UNEMPL'] = df['UNEMPL']
est = sm.OLS(y, X).fit()
print(est.summary())
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
# UNEMPL also has a good p-value, and PROD's p-value is below our 0.05 threshold, so we keep both
# Let's add COMMPRI

X['COMMPRI'] = df['COMMPRI']
est = sm.OLS(y, X).fit()
print(est.summary())
Model: OLS Adj. R-squared: 0.595
Method: Least Squares F-statistic: 243.3
Date: Wed, 17 Jul 2024 Prob (F-statistic): 6.16e-128
Time: 04:46:46 Log-Likelihood: -1484.5
No. Observations: 660 AIC: 2979.
Df Residuals: 655 BIC: 3001.
Df Model: 4
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [0.025 0.975]
------------------------------------------------------------------------------
const 1.1943 0.172 6.949 0.000 0.857 1.532
INFL 0.9009 0.035 25.487 0.000 0.831 0.970
PROD -0.0415 0.041 -1.018 0.309 -0.121 0.039
UNEMPL 0.4349 0.092 4.742 0.000 0.255 0.615
COMMPRI -0.0061 0.003 -1.965 0.050 -0.012 -4.01e-06
==============================================================================
Omnibus: 42.015 Durbin-Watson: 0.066
Prob(Omnibus): 0.000 Jarque-Bera (JB): 56.210
Skew: 0.536 Prob(JB): 6.22e-13
Kurtosis: 3.946 Cond. No. 65.7
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
# Let's remove PROD, given its high p-value, and add PCE

X['PCE'] = df['PCE']
X = X.drop(["PROD"], axis=1)
est = sm.OLS(y, X).fit()
print(est.summary())
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
PCE 0.1855 0.061 3.049 0.002 0.066 0.305
PERSINC 0.2658 0.061 4.349 0.000 0.146 0.386
==============================================================================
Omnibus: 12.857 Durbin-Watson: 0.083
Prob(Omnibus): 0.002 Jarque-Bera (JB): 13.800
Skew: 0.286 Prob(JB): 0.00101
Kurtosis: 3.418 Cond. No. 94.5
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
c.) Solving part c: computing the fit measures for our model from part a and for the Taylor rule
# Refit the model from part a, keeping only the INFL, PCE, PERSINC, and HOUST regressors from the dataset

y = df["INTRATE"]
X = pd.DataFrame(np.ones(len(df)), columns=['const'])
X = pd.concat([X, df.drop(["INTRATE"], axis=1)], axis=1)
X = X.drop(["OBS", "PROD", "UNEMPL", "COMMPRI"], axis=1)

est = sm.OLS(y, X).fit()
print(est.summary())
INFL 0.7448 0.057 13.149 0.000 0.634 0.856
PCE 0.3110 0.059 5.311 0.000 0.196 0.426
PERSINC 0.2569 0.059 4.327 0.000 0.140 0.373
HOUST -0.0215 0.004 -4.893 0.000 -0.030 -0.013
==============================================================================
Omnibus: 27.399 Durbin-Watson: 0.100
Prob(Omnibus): 0.000 Jarque-Bera (JB): 33.853
Skew: 0.416 Prob(JB): 4.46e-08
Kurtosis: 3.733 Cond. No. 62.7
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
# Now let's regress the Taylor rule, where X is only a constant + INFL + PROD

X = pd.DataFrame(np.ones(len(df)), columns=['const'])
X['INFL'] = df['INFL']
X['PROD'] = df['PROD']
est = sm.OLS(y, X).fit()
print(est.summary())
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
c.) Hence R-squared, AIC, and BIC are all better for our model than for the Taylor rule
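This comparison can be sanity-checked by recomputing the information criteria from the reported log-likelihood; statsmodels defines AIC = 2k − 2·llf and BIC = k·ln(n) − 2·llf. A small check against the four-regressor summary above (llf = −1454.3, n = 660, k = 5 parameters):

```python
import math

def aic_bic(llf, n, k):
    """Information criteria from the log-likelihood, statsmodels' convention."""
    return 2 * k - 2 * llf, k * math.log(n) - 2 * llf

# Four-regressor model from part a: llf = -1454.3, n = 660, k = 5
aic, bic = aic_bic(-1454.3, 660, 5)
print(round(aic), round(bic))  # 2919 2941, as in the summary table
```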
# RESET test
from statsmodels.stats.diagnostic import linear_reset
print("______________________________________________")
reset_test = linear_reset(est, power=2)
print(reset_test)
print("______________________________________________")

# Chow break test via an F-test on break-dummy interactions
from statsmodels.formula.api import ols
from statsmodels.stats.api import anova_lm
df['dummy'] = (df.index >= 20).astype(int)
model1 = ols('INTRATE ~ INFL + PROD', data=df).fit()
model2 = ols('INTRATE ~ INFL + PROD + dummy + INFL:dummy + PROD:dummy', data=df).fit()
chow_test = anova_lm(model1, model2)
print(chow_test)
print("______________________________________________")

# Unit-root (ADF) checks on the pre- and post-break samples
from statsmodels.tsa.stattools import adfuller
pre_break = df['INTRATE'][:20]
post_break = df['INTRATE'][20:]
adf_pre = adfuller(pre_break)
adf_post = adfuller(post_break)
print('Pre-break ADF statistic:', adf_pre[0])
print('Post-break ADF statistic:', adf_post[0])
print("______________________________________________")

# Jarque-Bera test on the residuals
from statsmodels.stats.stattools import jarque_bera
jb_test = jarque_bera(est.resid)
print(jb_test)
print("______________________________________________")
______________________________________________
<Wald test (chi2): statistic=2.5371195394336548, p-value=0.11119747865833446, df_denom=1>
______________________________________________
df_resid ssr df_diff ss_diff F Pr(>F)
0 657.0 3671.394806 0.0 NaN NaN NaN
1 654.0 3669.745555 3.0 1.649251 0.097973 0.961138
______________________________________________
Pre-break ADF statistic: -0.89272514653163
Post-break ADF statistic: -2.603062206099372
______________________________________________
(12.444043308438582, 0.0019852277136483674, 0.32571817051428936, 3.1677538653763313)
______________________________________________
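The printed statistics can be cross-checked by hand from the numbers above: the ANOVA (Chow) F is (ΔSSR/q)/(SSR_u/df_u), and Jarque-Bera is n/6·(S² + (K−3)²/4), where S and K are the residual skewness and kurtosis:

```python
# Chow F from the ANOVA table above: F = (delta SSR / q) / (SSR_u / df_u)
ssr_r, ssr_u = 3671.394806, 3669.745555   # restricted and unrestricted SSR
q, df_u = 3, 654                          # restrictions and residual df
F = ((ssr_r - ssr_u) / q) / (ssr_u / df_u)
print(round(F, 6))  # 0.097973

# Jarque-Bera from the reported skewness and kurtosis: JB = n/6 * (S^2 + (K-3)^2/4)
n, S, K = 660, 0.32571817051428936, 3.1677538653763313
JB = n / 6 * (S**2 + (K - 3)**2 / 4)
print(round(JB, 3))  # 12.444
```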