PPP Models - GARCH & NARNN.ipynb - Colaboratory
NB: The residuals from the base model will be used to train the other model.
The existing models serve as benchmarks (GARCH, ARIMA, SARIMA, and Exponential Smoothing).
# Load Libraries
import numpy as np               # Mathematical computations
import pandas as pd              # Data manipulation
import matplotlib.pyplot as plt  # Data visualization
import seaborn as sns            # Advanced data visualizations
import plotly.express as px      # Advanced and interactive data visualizations
# Rename columns for convenience
df = df.rename(columns={'GASOLINE': 'Gasoline', 'DIESEL': 'Diesel'})
df.columns
Collecting pmdarima
  Downloading pmdarima-2.0.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manyl
     2.1/2.1 MB 9.5 MB/s
Requirement already satisfied: joblib, Cython, numpy, pandas, scikit-learn, scipy, statsmodels,
urllib3, setuptools, packaging, python-dateutil, pytz, threadpoolctl, patsy, six
Installing collected packages: pmdarima
Successfully installed pmdarima-2.0.4
from statsmodels.tsa.stattools import adfuller

# Perform the ADF test
def perform_adf_test(ds):
    result = adfuller(ds)
    # Extract and print test statistics
    adf_statistic = result[0]
    p_value = result[1]
    critical_values = result[4]

    print(f'ADF Statistic: {adf_statistic}')
    print(f'p-value: {p_value}')
    print(f'Critical Values: {critical_values}')

    # Check the p-value against the significance level (e.g., 0.05)
    if p_value <= 0.05:
        print("Reject the null hypothesis. The time series is likely stationary.")
    else:
        print("Fail to reject the null hypothesis. The time series may not be stationary.")

perform_adf_test(df[product])
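The cell that fits the ARIMA benchmark falls on a page that did not survive this printout (pmdarima is installed above, so the order was most likely selected with auto_arima). A minimal statsmodels sketch consistent with the ARIMA(0, 1, 2) summary below, with variable names assumed rather than taken from the notebook:

# Fit the ARIMA benchmark on the selected product series
# (order (0, 1, 2) read off the summary below; variable names are assumptions)
from statsmodels.tsa.arima.model import ARIMA

arima_model = ARIMA(df[product], order=(0, 1, 2))
arima_results = arima_model.fit()
print(arima_results.summary())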
SARIMAX Results
==============================================================================
Dep. Variable: Gasoline No. Observations: 60
Model: ARIMA(0, 1, 2) Log Likelihood -71.805
Date: Wed, 03 Jan 2024 AIC 149.611
Time: 17:32:54 BIC 155.844
Sample: 01-01-2019 HQIC 152.044
- 12-01-2023
Covariance Type: opg
==============================================================================
coef std err z P>|z| [0.025 0.975]
------------------------------------------------------------------------------
ma.L1 0.2512 0.068 3.721 0.000 0.119 0.384
ma.L2 -0.2579 0.151 -1.705 0.088 -0.554 0.039
sigma2 0.6648 0.042 15.793 0.000 0.582 0.747
===================================================================================
Ljung-Box (L1) (Q): 0.05 Jarque-Bera (JB): 1182.36
Prob(Q): 0.82 Prob(JB): 0.00
Heteroskedasticity (H): 23.10 Skew: 3.51
Prob(H) (two-sided): 0.00 Kurtosis: 23.77
===================================================================================
Warnings:
[1] Covariance matrix calculated using the outer product of gradients (complex-step).
/usr/local/lib/python3.10/dist-packages/statsmodels/tsa/base/tsa_model.py:473: ValueWarning
  self._init_dates(dates, freq)
arima_resid = pd.DataFrame(arima_results.resid)
arima_resid.plot(kind="kde", figsize=(20, 6))
<Axes: ylabel='Density'>
Since the data has an AR component, we fit the ARCH model on the residuals of the ARIMA model.
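The fitting cell itself is not shown in this printout. A minimal sketch using the arch package, assuming the ARIMA residuals are held in arima_resid (as above) and a zero-mean ARCH(3) specification to match the summary below:

# Fit a zero-mean ARCH(3) model on the ARIMA residuals
# (sketch; specification inferred from the summary below, names assumed)
from arch import arch_model

arch_mod = arch_model(arima_resid, mean='Zero', vol='ARCH', p=3, dist='normal')
arch_result = arch_mod.fit()  # prints the optimizer iterations shown below
print(arch_result.summary())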
Iteration: 7, Func. Count: 46, Neg. LLF: 1772.517900132466
Iteration: 8, Func. Count: 52, Neg. LLF: 57.32423354514197
Iteration: 9, Func. Count: 57, Neg. LLF: 57.46116408610714
Iteration: 10, Func. Count: 63, Neg. LLF: 57.28839336266442
Iteration: 11, Func. Count: 68, Neg. LLF: 1772.5140329794292
Iteration: 12, Func. Count: 75, Neg. LLF: 57.288380645096915
Optimization terminated successfully (Exit mode 0)
Current function value: 57.28837974861303
Iterations: 13
Function evaluations: 75
Gradient evaluations: 12
Zero Mean - ARCH Model Results
==============================================================================
Dep. Variable: 0 R-squared: 0.000
Mean Model: Zero Mean Adj. R-squared: 0.017
Vol Model: ARCH Log-Likelihood: -57.2884
Distribution: Normal AIC: 122.577
Method: Maximum Likelihood BIC: 130.954
No. Observations: 60
Date: Wed, Jan 03 2024 Df Residuals: 60
Time: 17:59:52 Df Model: 0
Volatility Model
=============================================================================
coef std err t P>|t| 95.0% Conf. Int.
-----------------------------------------------------------------------------
omega 0.1130 4.473e-02 2.527 1.149e-02 [2.538e-02, 0.201]
alpha[1] 1.0000 0.396 2.522 1.166e-02 [ 0.223, 1.777]
alpha[2] 6.5873e-14 2.325e-02 2.833e-12 1.000 [-4.557e-02,4.557e-02]
alpha[3] 1.0795e-13 8.136e-03 1.327e-11 1.000 [-1.595e-02,1.595e-02]
=============================================================================
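The GARCH fitting cell is likewise not shown. A sketch consistent with the Constant Mean - GARCH(1,1) summary below, again assuming the same residual series as input (names assumed):

# Fit a constant-mean GARCH(1,1) model on the same residuals
# (sketch; specification inferred from the summary below, names assumed)
garch_mod = arch_model(arima_resid, mean='Constant', vol='GARCH', p=1, q=1, dist='normal')
garch_result = garch_mod.fit()
print(garch_result.summary())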
Iteration: 12, Func. Count: 72, Neg. LLF: 56.860447795643275
Iteration: 13, Func. Count: 77, Neg. LLF: 56.81596564847961
Iteration: 14, Func. Count: 82, Neg. LLF: 56.8117858529232
Iteration: 15, Func. Count: 87, Neg. LLF: 56.8116901249659
Iteration: 16, Func. Count: 92, Neg. LLF: 59.88033886994014
Iteration: 17, Func. Count: 100, Neg. LLF: 56.91129629217719
Optimization terminated successfully (Exit mode 0)
Current function value: 56.81168551834978
Iterations: 18
Function evaluations: 103
Gradient evaluations: 17
Constant Mean - GARCH Model Results
==============================================================================
Dep. Variable: resid R-squared: 0.000
Mean Model: Constant Mean Adj. R-squared: 0.000
Vol Model: GARCH Log-Likelihood: -56.8117
Distribution: Normal AIC: 121.623
Method: Maximum Likelihood BIC: 130.001
No. Observations: 60
Date: Wed, Jan 03 2024 Df Residuals: 59
Time: 18:04:45 Df Model: 1
Mean Model
===========================================================================
coef std err t P>|t| 95.0% Conf. Int.
---------------------------------------------------------------------------
mu 0.0678 4.784e-02 1.417 0.156 [-2.597e-02, 0.162]
Volatility Model
=============================================================================
coef std err t P>|t| 95.0% Conf. Int.
-----------------------------------------------------------------------------
omega 0.1003 3.882e-02 2.584 9.771e-03 [2.422e-02, 0.176]
alpha[1] 1.0000 0.366 2.730 6.327e-03 [ 0.282, 1.718]
beta[1] 1.5508e-13 8.794e-03 1.763e-11 1.000 [-1.724e-02,1.724e-02]
=============================================================================
# prompt: Write code to evaluate the GARCH Model

# Evaluate the GARCH Model
# Calculate Residuals
residuals = garch_result.resid

# Calculate the Root Mean Squared Error (RMSE)
rmse = np.sqrt(np.mean(residuals**2))
print(f"RMSE: {rmse}")

# Calculate the Mean Absolute Error (MAE)
mae = np.mean(np.abs(residuals))
print(f"MAE: {mae}")

# Calculate the Mean Absolute Percentage Error (MAPE)
# (arch_residuals is assumed to hold the residual series the GARCH was fit on)
mape = np.mean(np.abs(residuals / arch_residuals)) * 100
print(f"MAPE: {mape}%")

# Plot the Residuals
plt.figure(figsize=(12, 6))
plt.plot(residuals, label='Residuals', alpha=0.6)
plt.title(f'Residuals of GARCH Model for {product}')
plt.xlabel('Year')
plt.ylabel('Residuals')
plt.legend()
plt.show()
RMSE: 1.0142621472049587
MAE: 0.4755606334037595
MAPE: 145.18138560695348%
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_ljungbox

# Residual Analysis
squared_residuals = residuals**2

# Calculate standardized squared residuals
standardized_squared_residuals = (residuals ** 2) / (garch_result.conditional_volatility ** 2)

# Diagnostic Tests
# Autocorrelation in squared residuals (Ljung-Box test)
print(acorr_ljungbox(standardized_squared_residuals, lags=10, boxpierce=True))
# Extract residuals
garch_residuals = garch_result.resid

# Plot residuals
plt.figure(figsize=(10, 4))
plt.plot(garch_residuals, label='GARCH Residuals')
plt.title('Residuals of GARCH Model')
plt.xlabel('Time')
plt.ylabel('Residuals')
plt.legend()
plt.show()
NARNN MODEL
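The cell that builds and trains the network does not appear in this printout. A minimal sketch of a nonlinear autoregressive network trained on the scaled residuals, assuming Keras, a one-step lag, and a StandardScaler; the scaler choice, layer sizes, and variable names are assumptions, not taken from the notebook:

# Sketch: nonlinear autoregressive neural network (NARNN) on the model residuals
# (scaler choice, layer sizes, and variable names are assumptions)
from sklearn.preprocessing import StandardScaler
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Scale the residual series before training
scaler = StandardScaler()
residuals_scaled = scaler.fit_transform(np.asarray(arima_resid).reshape(-1, 1))

# One-step autoregression: predict each residual from the previous one
X_train = residuals_scaled[:-1]
y_train = residuals_scaled[1:]

# Small feed-forward network as the nonlinear autoregressive map
model = Sequential([
    Dense(10, activation='tanh', input_shape=(1,)),
    Dense(1)
])
model.compile(optimizer='adam', loss='mse')
model.fit(X_train, y_train, epochs=100, batch_size=32, verbose=1)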
Epoch 1/100
2/2 [==============================] - 10s 42ms/step - loss: 1.0298
Epoch 2/100
2/2 [==============================] - 0s 24ms/step - loss: 1.0240
Epoch 3/100
2/2 [==============================] - 0s 26ms/step - loss: 1.0166
Epoch 4/100
2/2 [==============================] - 0s 20ms/step - loss: 1.0096
Epoch 5/100
2/2 [==============================] - 0s 16ms/step - loss: 1.0060
Epoch 6/100
2/2 [==============================] - 0s 25ms/step - loss: 0.9984
Epoch 7/100
2/2 [==============================] - 0s 27ms/step - loss: 0.9935
Epoch 8/100
2/2 [==============================] - 0s 16ms/step - loss: 0.9872
Epoch 9/100
2/2 [==============================] - 0s 17ms/step - loss: 0.9807
Epoch 10/100
2/2 [==============================] - 0s 23ms/step - loss: 0.9744
Epoch 11/100
2/2 [==============================] - 0s 19ms/step - loss: 0.9664
Epoch 12/100
2/2 [==============================] - 0s 29ms/step - loss: 0.9567
Epoch 13/100
2/2 [==============================] - 0s 19ms/step - loss: 0.9490
Epoch 14/100
2/2 [==============================] - 0s 21ms/step - loss: 0.9355
Epoch 15/100
2/2 [==============================] - 0s 20ms/step - loss: 0.9262
Epoch 16/100
2/2 [==============================] - 0s 23ms/step - loss: 0.9164
Epoch 17/100
2/2 [==============================] - 0s 20ms/step - loss: 0.8984
Epoch 18/100
2/2 [==============================] - 0s 11ms/step - loss: 0.8840
Epoch 19/100
2/2 [==============================] - 0s 10ms/step - loss: 0.8729
Epoch 20/100
2/2 [==============================] - 0s 10ms/step - loss: 0.8499
Epoch 21/100
2/2 [==============================] - 0s 12ms/step - loss: 0.8302
Epoch 22/100
2/2 [==============================] - 0s 12ms/step - loss: 0.8135
Epoch 23/100
2/2 [==============================] - 0s 10ms/step - loss: 0.7892
Epoch 24/100
2/2 [==============================] - 0s 11ms/step - loss: 0.7639
Epoch 25/100
2/2 [==============================] - 0s 10ms/step - loss: 0.7336
Epoch 26/100
2/2 [==============================] - 0s 10ms/step - loss: 0.7076
Epoch 27/100
2/2 [==============================] - 0s 18ms/step - loss: 0.6580
Epoch 28/100
2/2 [==============================] - 0s 11ms/step - loss: 0.6259
Epoch 29/100
2/2 [==============================] - 0s 10ms/step - loss: 0.5784
# Make predictions
predictions = model.predict(residuals_scaled)

# Rescale the predictions back to the original range
#predictions = scaler.inverse_transform(predictions)

df2 = df