
Asm4 2013345148

The document discusses analyzing stock return data for a Vietnamese company called ACB from 2015 to 2023. It loads the stock data, calculates daily returns, and analyzes volatility using standard deviation. It then fits autoregressive, GARCH, EGARCH, and GJR-GARCH models to the return data and compares their results.

Uploaded by

K59 Le Gia Han

1

import pandas as pd
import numpy as np
from vnstock import *
from matplotlib import pyplot as plt
import seaborn as sns
import math
import warnings
warnings.filterwarnings('ignore')

start_date = "2015-09-01"
end_date = "2023-05-29"
print(start_date)
print(end_date)

2015-09-01
2023-05-29

df = stock_historical_data("ACB", start_date, end_date)


df

      Open     High     Low      Close    Volume    TradingDate
0     4137.0   4182.0   4091.0   4115.0   131193    2015-09-03
1     4046.0   4111.0   4021.0   4069.0   129860    2015-09-04
2     4069.0   4159.0   4068.0   4115.0   140700    2015-09-07
3     4115.0   4421.0   4188.0   4319.0   301747    2015-09-08
4     4342.0   4473.0   4381.0   4433.0   702756    2015-09-09
...   ...      ...      ...      ...      ...       ...
1925  21050.0  21384.0  20883.0  21175.0  21726379  2023-05-23
1926  21300.0  21300.0  20841.0  21008.0  11051426  2023-05-24
1927  21050.0  21050.0  20799.0  20966.0  8792333   2023-05-25
1928  21050.0  21050.0  20841.0  20883.0  8563241   2023-05-26
1929  20966.0  21133.0  20883.0  21008.0  16577082  2023-05-29

1930 rows × 6 columns

df = df.set_index("TradingDate")

df["Return"] = df["Close"].pct_change()
df = df.fillna(0)
df

            Open     High     Low      Close    Volume    Return
TradingDate
2015-09-03  4137.0   4182.0   4091.0   4115.0   131193    0.000000
2015-09-04  4046.0   4111.0   4021.0   4069.0   129860    -0.011179
2015-09-07  4069.0   4159.0   4068.0   4115.0   140700    0.011305
2015-09-08  4115.0   4421.0   4188.0   4319.0   301747    0.049575
2015-09-09  4342.0   4473.0   4381.0   4433.0   702756    0.026395
...         ...      ...      ...      ...      ...       ...
2023-05-23  21050.0  21384.0  20883.0  21175.0  21726379  0.011996
2023-05-24  21300.0  21300.0  20841.0  21008.0  11051426  -0.007887
2023-05-25  21050.0  21050.0  20799.0  20966.0  8792333   -0.001999
2023-05-26  21050.0  21050.0  20841.0  20883.0  8563241   -0.003959
2023-05-29  20966.0  21133.0  20883.0  21008.0  16577082  0.005986

1930 rows × 6 columns
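`pct_change` computes simple returns, R_t = P_t / P_{t-1} - 1. As a minimal sketch, the calculation can be compared with the common log-return alternative; the four-day price series here is made up for illustration and is not the ACB data:

```python
import numpy as np
import pandas as pd

# Hypothetical close prices, standing in for df["Close"]
close = pd.Series([100.0, 102.0, 101.0, 105.0])

simple_ret = close.pct_change()            # P_t / P_{t-1} - 1
log_ret = np.log(close / close.shift(1))   # log(P_t / P_{t-1})

# For small moves the two are close, since log(1 + r) ≈ r
print(simple_ret.round(4).tolist())
print(log_ret.round(4).tolist())
```

Log returns are additive over time, which is convenient for multi-day aggregation; simple returns, as used in this notebook, are what `pct_change` produces.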

# Plot the return and see how volatile it is


plt.plot(df['Return'], color = 'blue')
plt.show()

# Calculate daily std of returns


std_daily = df['Return'].std()
print('Daily volatility: ', '{:.2f}'.format(std_daily))
# Convert daily volatility to monthly volatility, assuming an average of 21 trading days per month
std_monthly = math.sqrt(21) * std_daily
print ('Monthly volatility: ', '{:.2f}'.format(std_monthly))

# Convert daily volatility to annual volatility, assuming an average of 252 trading days per year
std_annual = math.sqrt(252) * std_daily
print ('Annual volatility: ', '{:.2f}'.format(std_annual))

Daily volatility: 0.02


Monthly volatility: 0.09
Annual volatility: 0.30
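The conversions above rest on the square-root-of-time rule: for i.i.d. daily returns, the standard deviation of a T-day sum is √T times the daily standard deviation. A quick simulation check with hypothetical i.i.d. returns (not the ACB series):

```python
import numpy as np

rng = np.random.default_rng(0)
daily = rng.normal(0.0, 0.02, size=252 * 200)   # i.i.d. daily returns, sigma = 2%

# Variance is additive over independent days, so std scales with sqrt(T)
monthly = daily.reshape(-1, 21).sum(axis=1)     # non-overlapping 21-day sums

print(round(daily.std(), 4))                    # close to 0.02
print(round(monthly.std(), 4))                  # close to sqrt(21) * 0.02 ≈ 0.0917
```

Note the rule assumes independence across days; under volatility clustering of the kind the GARCH models below capture, it is only an approximation.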

import arch

3a

from statsmodels.tsa.ar_model import AutoReg


from statsmodels.tsa.stattools import arma_order_select_ic

maxlag = 10
order = arma_order_select_ic(df["Return"], ic='aic', trend='c', max_ar=maxlag, max_ma=0)
optimal_lag = order['aic_min_order'][0]
print("Optimal lag order:", optimal_lag)

Optimal lag order: 2

model = AutoReg(df["Return"], lags=optimal_lag)


result_ar = model.fit()

ar_resid = result_ar.resid

# Standardize the AR residuals by the daily return volatility
standardize_ar_resid = ar_resid / std_daily

sns.histplot(x=standardize_ar_resid, kde=True, color="orange")


plt.show()
The histogram of the standardized residuals looks close to a normal distribution. Since the Student's t distribution nests the normal as its degrees of freedom grow large, we adopt the Student's t assumption for the error distribution in the volatility models below.
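The eyeball check can be backed by a formal normality test such as Jarque-Bera; a sketch using `scipy.stats.jarque_bera` on a stand-in residual series (in the notebook, the call would take `standardize_ar_resid`):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
resid = rng.standard_t(df=5, size=2000)   # stand-in residuals with fat tails

# Jarque-Bera tests skewness and kurtosis against the normal benchmark
stat, pvalue = stats.jarque_bera(resid)

# A low p-value rejects normality, supporting a Student's t error distribution
print(pvalue < 0.05)
```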

3b

from arch import arch_model


basic_gm = arch_model(df["Return"], p = 1, q = 1, mean = 'constant' , vol = 'GARCH' , dist = 't')
gm_result = basic_gm.fit(disp="off")
print(gm_result.summary())

Constant Mean - GARCH Model Results


====================================================================================
Dep. Variable: Return R-squared: 0.000
Mean Model: Constant Mean Adj. R-squared: 0.000
Vol Model: GARCH Log-Likelihood: 4366.97
Distribution: Standardized Student's t AIC: -8723.94
Method: Maximum Likelihood BIC: -8696.11
No. Observations: 1930
Date: Sun, Jun 11 2023 Df Residuals: 1929
Time: 22:41:49 Df Model: 1
Mean Model
============================================================================
coef std err t P>|t| 95.0% Conf. Int.
----------------------------------------------------------------------------
mu 9.8384e-04 2.582e-04 3.810 1.391e-04 [4.777e-04,1.490e-03]
Volatility Model
============================================================================
coef std err t P>|t| 95.0% Conf. Int.
----------------------------------------------------------------------------
omega 9.6542e-04 3.824e-05 25.249 1.176e-140 [8.905e-04,1.040e-03]
alpha[1] 0.8153 0.157 5.195 2.052e-07 [ 0.508, 1.123]
beta[1] 0.1806 1.455e-02 12.412 2.262e-35 [ 0.152, 0.209]
Distribution
========================================================================
coef std err t P>|t| 95.0% Conf. Int.
------------------------------------------------------------------------
nu 18.7094 0.170 109.787 0.000 [ 18.375, 19.043]
========================================================================

Covariance estimator: robust
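The fitted parameters above drive the GARCH(1,1) conditional-variance recursion sigma2[t] = omega + alpha * resid[t-1]**2 + beta * sigma2[t-1]. A numpy-only sketch of that recursion, using illustrative parameter values rather than the fitted ACB estimates:

```python
import numpy as np

# Illustrative GARCH(1,1) parameters (not the fitted ACB estimates)
omega, alpha, beta = 1e-5, 0.10, 0.85

rng = np.random.default_rng(0)
n = 1000
resid = np.empty(n)
sigma2 = np.empty(n)

# Start at the unconditional variance omega / (1 - alpha - beta)
sigma2[0] = omega / (1 - alpha - beta)
resid[0] = np.sqrt(sigma2[0]) * rng.standard_normal()

for t in range(1, n):
    # Today's variance responds to yesterday's shock and yesterday's variance
    sigma2[t] = omega + alpha * resid[t - 1] ** 2 + beta * sigma2[t - 1]
    resid[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# Sample variance should hover near the unconditional variance
ratio = resid.var() / (omega / (1 - alpha - beta))
print(round(ratio, 2))
```

Persistence is alpha + beta; the closer it is to 1, the longer volatility shocks take to die out.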

egarch_gm = arch_model(df["Return"], p = 1, q = 1, mean = 'constant' , vol = 'EGARCH' , dist = 't')


egarch_result = egarch_gm.fit(disp="off")
print(egarch_result.summary())

Constant Mean - EGARCH Model Results


====================================================================================
Dep. Variable: Return R-squared: 0.000
Mean Model: Constant Mean Adj. R-squared: 0.000
Vol Model: EGARCH Log-Likelihood: 5233.26
Distribution: Standardized Student's t AIC: -10456.5
Method: Maximum Likelihood BIC: -10428.7
No. Observations: 1930
Date: Sun, Jun 11 2023 Df Residuals: 1929
Time: 22:41:49 Df Model: 1
Mean Model
=============================================================================
coef std err t P>|t| 95.0% Conf. Int.
-----------------------------------------------------------------------------
mu 2.9937e-04 2.848e-04 1.051 0.293 [-2.588e-04,8.575e-04]
Volatility Model
===========================================================================
coef std err t P>|t| 95.0% Conf. Int.
---------------------------------------------------------------------------
omega -0.2016 7.017e-02 -2.873 4.072e-03 [ -0.339,-6.404e-02]
alpha[1] 0.2583 3.385e-02 7.630 2.350e-14 [ 0.192, 0.325]
beta[1] 0.9713 8.760e-03 110.875 0.000 [ 0.954, 0.988]
Distribution
========================================================================
coef std err t P>|t| 95.0% Conf. Int.
------------------------------------------------------------------------
nu 3.6896 0.324 11.377 5.464e-30 [ 3.054, 4.325]
========================================================================

Covariance estimator: robust

gjr_gm = arch_model(df["Return"], p = 1, q = 1, o = 1, mean = 'constant' , vol = 'GARCH' , dist = 't')


gjr_result = gjr_gm.fit(disp="off")
print(gjr_result.summary())
Constant Mean - GJR-GARCH Model Results
====================================================================================
Dep. Variable: Return R-squared: 0.000
Mean Model: Constant Mean Adj. R-squared: 0.000
Vol Model: GJR-GARCH Log-Likelihood: -154339.
Distribution: Standardized Student's t AIC: 308691.
Method: Maximum Likelihood BIC: 308724.
No. Observations: 1930
Date: Sun, Jun 11 2023 Df Residuals: 1929
Time: 22:41:49 Df Model: 1
Mean Model
============================================================================
coef std err t P>|t| 95.0% Conf. Int.
----------------------------------------------------------------------------
mu 546.5800 32.820 16.654 2.834e-62 [4.823e+02,6.109e+02]
Volatility Model
=============================================================================
coef std err t P>|t| 95.0% Conf. Int.
-----------------------------------------------------------------------------
omega 1.8869e-05 1.134e-04 0.166 0.868 [-2.034e-04,2.412e-04]
alpha[1] 0.8318 2.001e-02 41.574 0.000 [ 0.793, 0.871]
gamma[1] -0.8311 1.999e-02 -41.583 0.000 [ -0.870, -0.792]
beta[1] 6.7646e-04 9.025e-02 7.496e-03 0.994 [ -0.176, 0.178]
Distribution
========================================================================
coef std err t P>|t| 95.0% Conf. Int.
------------------------------------------------------------------------
nu 38.8808 3.066 12.681 7.550e-37 [ 32.871, 44.890]
========================================================================

Covariance estimator: robust

In-sample, the log-likelihood of the EGARCH model is much higher than that of the GJR-GARCH model (5233.26 > -154339), so the EGARCH model fits the return series better in-sample.
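The AIC values in the summaries follow directly from the log-likelihoods via AIC = 2k - 2·logL, where k is the number of estimated parameters (here mu, omega, alpha[1], beta[1], and nu, so k = 5). A quick check against the figures printed above:

```python
# AIC = 2k - 2*logL with k = 5 estimated parameters (mu, omega, alpha[1], beta[1], nu)
k = 5
aic_garch = 2 * k - 2 * 4366.97    # GARCH log-likelihood from its summary
aic_egarch = 2 * k - 2 * 5233.26   # EGARCH log-likelihood from its summary
print(round(aic_garch, 2), round(aic_egarch, 2))
```

These reproduce the AIC figures in the summaries (-8723.94 and -10456.5); lower AIC again favors the EGARCH specification.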

training_df = df["Return"][0:-101]
testing_df = df["Return"][-101:-1]

window = 30
actual_std = testing_df.rolling(window).std()

gjr_gm_training = arch_model(df["Return"], p = 1, q = 1, o = 1, mean = 'constant' , vol = 'GARCH' , dist = 't')


forecasts = {}
for i in range(1, 101):
    gjr_gm_training_result = gjr_gm_training.fit(first_obs=i,
        last_obs=i + len(training_df) - 1, update_freq=5, disp="off")

    temp_result = gjr_gm_training_result.forecast(horizon=1).variance

    fcast = temp_result.iloc[i + len(training_df) - 1]
    forecasts[fcast.name] = fcast

forecast_var = pd.DataFrame(forecasts).T

C:\Users\ASUS\anaconda3\envs\Student Resources\lib\site-packages\arch\univariate\base.py:766: ConvergenceWarning: The optimizer returned code 8. The message is:
Positive directional derivative for linesearch
See scipy.optimize.fmin_slsqp for code meaning.
  warnings.warn(
(similar ConvergenceWarning messages, with optimizer codes 4 and 8, repeat for many of the rolling refits)

egarch_gm_training = arch_model(df["Return"], p = 1, q = 1, mean = 'constant' , vol = 'EGARCH' , dist = 't')


forecasts_egarch = {}
for i in range(1, 101):
    egarch_gm_training_result = egarch_gm_training.fit(first_obs=i,
        last_obs=i + len(training_df) - 1, update_freq=5, disp="off")

    temp_result_egarch = egarch_gm_training_result.forecast(horizon=1).variance

    fcast_egarch = temp_result_egarch.iloc[i + len(training_df) - 1]
    forecasts_egarch[fcast_egarch.name] = fcast_egarch

forecast_var_egarch = pd.DataFrame(forecasts_egarch).T

C:\Users\ASUS\anaconda3\envs\Student Resources\lib\site-packages\arch\univariate\base.py:766: ConvergenceWarning: The optimizer returned code 4. The message is:
Inequality constraints incompatible
See scipy.optimize.fmin_slsqp for code meaning.
  warnings.warn(
(similar ConvergenceWarning messages, with optimizer codes 4 and 9, repeat for most of the rolling refits)

plt.plot(forecast_var_egarch["h.1"], color="brown", label="predicted_var")


plt.plot(actual_std**2, color="orange", label="actual_var")
plt.title('Volatility of Time Series by EGARCH')
plt.xlabel('Time')
plt.ylabel('Volatility')
plt.grid(True)
plt.legend()
plt.show()

plt.plot(forecast_var["h.1"], color="brown", label="predicted_var")


plt.plot(actual_std**2, color="orange", label="actual_var")
plt.title('Volatility of Time Series by GJR-GARCH')
plt.xlabel('Time')
plt.ylabel('Volatility')
plt.grid(True)
plt.legend()
plt.show()
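Beyond the visual comparison, the two forecast series can be scored numerically against realized variance, for example with mean squared error. A sketch using made-up series in place of `forecast_var["h.1"]`, `forecast_var_egarch["h.1"]`, and `actual_std**2`:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in realized variance over 100 test days (hypothetical values)
actual_var = np.full(100, 4e-4) + rng.normal(0, 5e-5, 100)
pred_a = actual_var + rng.normal(0, 1e-5, 100)   # forecast that tracks closely
pred_b = actual_var + rng.normal(0, 1e-4, 100)   # noisier forecast

mse_a = np.mean((pred_a - actual_var) ** 2)
mse_b = np.mean((pred_b - actual_var) ** 2)

# The model with the lower MSE tracks realized variance more closely
print(mse_a < mse_b)
```

With the actual notebook series, the same two lines of MSE arithmetic would give a single number per model to complement the plots above.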

