EII3002 Exercise 7 & 8
Partial autocorrelation measures the linear association between two series after controlling for the effects of one or more additional series. Hence the two types of correlation, although related, are nevertheless very different, and they may well be of different sign.
a) $\gamma(t, \tau) = \alpha$
b) $\gamma(t, \tau) = e^{-\alpha\tau}$
c) $\gamma(t, \tau) = \alpha\tau$, and
d) $\gamma(t, \tau) = \dfrac{\alpha}{\tau}$
3. Consider the following sample of time series data with 36 values.
[Figure: time series plot of $Y_t$, $t = 1, \dots, 36$]
$$\sum_{t=1}^{T}(y_t - \bar{y})^2 = 29478.97222$$
$$\sum_{t=6+1}^{T}(y_t - \bar{y})(y_{t-6} - \bar{y}) = 743.30$$
Sample autocorrelation:
$$\hat{\rho}(1) = \frac{3032.00}{29478.97} = 0.1029$$
$$\hat{\rho}(2) = \frac{2911.22}{29478.97} = 0.0988$$
$$\hat{\rho}(3) = \frac{-1260.72}{29478.97} = -0.0428$$
$$\hat{\rho}(4) = \frac{-915.98}{29478.97} = -0.0311$$
$$\hat{\rho}(5) = \frac{-5409.00}{29478.97} = -0.1835$$
$$\hat{\rho}(6) = \frac{743.30}{29478.97} = 0.0252$$
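For reference, a minimal Python sketch of the same computation, assuming the 36 observations (not reproduced here) are stored in a NumPy array `y`:

```python
import numpy as np

def sample_acf(y, max_lag):
    """rho_hat(tau) = sum_{t=tau+1}^{T}(y_t - ybar)(y_{t-tau} - ybar) / sum_{t=1}^{T}(y_t - ybar)^2"""
    dev = np.asarray(y, dtype=float) - np.mean(y)
    denom = np.sum(dev ** 2)  # e.g. 29478.97 for this series
    return [float(np.sum(dev[tau:] * dev[:-tau]) / denom) for tau in range(1, max_lag + 1)]

# sample_acf(y, 6) should reproduce 0.1029, 0.0988, -0.0428, -0.0311, -0.1835, 0.0252
```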
c) Use the Ljung-Box Q-statistic to test the null hypothesis that the series is a
serially uncorrelated process up to displacement 6 (m = 6).
Joint hypothesis testing:
$$H_0: \rho(1) = \rho(2) = \rho(3) = \rho(4) = \rho(5) = \rho(6) = 0$$
$$H_1: \text{at least one } \rho(\tau) \neq 0$$
Test statistic:
$$Q_{LB} = T(T+2)\sum_{\tau=1}^{m}\frac{\hat{\rho}(\tau)^2}{T-\tau}$$
$$Q_{LB}(6) = 36(36+2)\left(\frac{(0.1029)^2}{36-1} + \frac{(0.0988)^2}{36-2} + \frac{(-0.0428)^2}{36-3} + \frac{(-0.0311)^2}{36-4} + \frac{(-0.1835)^2}{36-5} + \frac{(0.0252)^2}{36-6}\right) = 2.43877793$$
Critical value:
$$\chi^2_{\alpha,m} = \chi^2_{0.05,6} = 12.592$$
Decision:
Since the test statistic is smaller than the critical value,
$$Q_{LB} = 2.4388 < \chi^2_{0.05,6} = 12.592,$$
do not reject the null hypothesis $H_0: \rho(1) = \rho(2) = \cdots = \rho(6) = 0$.
Conclusion:
At the 5% significance level, the series is a serially uncorrelated process up to displacement 6.
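As a cross-check, a minimal Python sketch that reproduces the Ljung-Box statistic and critical value from the sample autocorrelations above:

```python
from scipy import stats

T = 36
rho = [0.1029, 0.0988, -0.0428, -0.0311, -0.1835, 0.0252]

# Q_LB = T(T+2) * sum_{tau=1}^{m} rho_hat(tau)^2 / (T - tau)
q_lb = T * (T + 2) * sum(r ** 2 / (T - tau) for tau, r in enumerate(rho, start=1))
crit = stats.chi2.ppf(0.95, df=len(rho))  # chi-square critical value, alpha = 0.05, m = 6
print(f"Q_LB = {q_lb:.4f}, critical value = {crit:.3f}")  # 2.4388 < 12.592: do not reject H0
```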
$$\hat{p}(2) = \frac{\hat{\rho}(2) - \hat{\rho}(1)^2}{1 - \hat{\rho}(1)^2} = \frac{0.0988 - 0.1029^2}{1 - 0.1029^2} = 0.0892$$
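This is the lag-2 step of the Durbin-Levinson recursion; a one-line check in Python:

```python
rho1, rho2 = 0.1029, 0.0988

# lag-2 partial autocorrelation from the first two sample autocorrelations
p2 = (rho2 - rho1 ** 2) / (1 - rho1 ** 2)
print(round(p2, 4))  # 0.0892
```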
e) Given the partial autocorrelations $\hat{p}(3) = -0.062$, $\hat{p}(4) = -0.030$, $\hat{p}(5) = -0.171$, and $\hat{p}(6) = 0.065$, draw a correlogram for the Autocorrelation Function (ACF) and the Partial Autocorrelation Function (PACF) up to displacement 6 along with their two-standard-error bands.
[Correlogram: ACF and PACF up to displacement 6 with two-standard-error bands]
Two-standard-error bands: $\pm\frac{2}{\sqrt{T}} = \pm\frac{2}{\sqrt{36}} = \pm 0.3333$
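If the raw series were at hand, statsmodels could draw the same correlogram; a sketch assuming the data are in a NumPy array `y` (hypothetical name), noting that statsmodels draws roughly $\pm 1.96/\sqrt{T}$ bands rather than the $\pm 2/\sqrt{T}$ rule of thumb used here:

```python
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

fig, axes = plt.subplots(2, 1, figsize=(8, 6))
plot_acf(y, lags=6, ax=axes[0])   # ACF up to displacement 6 with confidence bands
plot_pacf(y, lags=6, ax=axes[1])  # PACF up to displacement 6
plt.show()
```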
f) Based on your findings in parts (c) and (e), does the series have a strong cyclical component?
Since the series is a serially uncorrelated process, it does not have a strong cyclical component.
4. Consider another sample of time series data with 36 values.
[Figure: time series plot of $y$, $t = 1, \dots, 36$]
Sample autocorrelation:
$$\hat{\rho}(1) = \frac{42185.3528}{65787.088} = 0.641240616$$
$$\hat{\rho}(2) = \frac{25540.1764}{65787.088} = 0.38822476$$
$$\hat{\rho}(3) = \frac{9533.19543}{65787.088} = 0.144909825$$
$$\hat{\rho}(4) = \frac{4303.27449}{65787.088} = 0.065412144$$
$$\hat{\rho}(5) = \frac{-181.37358}{65787.088} = -0.002756978$$
$$\hat{\rho}(6) = \frac{686.61258}{65787.088} = 0.104330695$$
c) Use the Ljung-Box Q-statistic to test the null hypothesis that the series is a
serially uncorrelated process up to displacement 6 (m = 6).
Critical value:
$$\chi^2_{\alpha,m} = \chi^2_{0.05,6} = 12.592$$
Decision:
Since the test statistic is greater than the critical value,
$$Q_{LB} = 23.5080681 > \chi^2_{0.05,6} = 12.592,$$
reject the null hypothesis $H_0: \rho(1) = \rho(2) = \cdots = \rho(6) = 0$: at least one $\rho(\tau) \neq 0$.
Conclusion:
The series is not a serially uncorrelated process.
e) Given the partial autocorrelations $\hat{p}(3) = -0.151$, $\hat{p}(4) = 0.074$, $\hat{p}(5) = -0.044$, and $\hat{p}(6) = 0.209$, draw a correlogram for the Autocorrelation Function (ACF) and the Partial Autocorrelation Function (PACF) up to displacement 6 along with their two-standard-error bands.
[Figure: time series plot of the Q4 series, 36 observations]
Two-standard-error bands: $\pm\frac{2}{\sqrt{T}} = \pm\frac{2}{\sqrt{36}} = \pm 0.3333$
[EViews correlogram: Sample: 1 36, 36 included observations; Autocorrelation, Partial Correlation, AC, PAC, Q-Stat, and Prob columns up to displacement 6]
f) Based on your findings in parts (c) and (e), does the series have a strong cyclical component?
Since the series is not a serially uncorrelated process, it has a strong cyclical component: the ACF decays only gradually, and $\hat{\rho}(1)$ and $\hat{\rho}(2)$ exceed the two-standard-error bands.
Exercise 8: Based on lecture 8.
5. Explain the theoretical pattern of population ACF and PACF for the following models:
a) MA(1)
ACF: non-zero value at lag 1, then cuts off to zero.
PACF: decays gradually (exponentially).
b) MA(2)
ACF: non-zero values at lags 1 and 2, then cuts off to zero.
PACF: decays gradually (exponentially).
c) AR(1)
ACF: decays exponentially.
PACF: non-zero value at lag 1, then cuts off to zero.
d) AR(2)
ACF: decays exponentially (possibly with damped oscillation).
PACF: non-zero values at lags 1 and 2, then cuts off to zero.
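These cut-off/decay patterns can be verified against the theoretical ACF/PACF utilities in statsmodels; a sketch with illustrative parameter values (0.7 is an arbitrary choice, not from the exercise):

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

# Lag-polynomial convention: ar=[1, -phi1], ma=[1, theta1]
ma1 = ArmaProcess(ar=[1], ma=[1, 0.7])   # MA(1) with theta1 = 0.7
ar1 = ArmaProcess(ar=[1, -0.7], ma=[1])  # AR(1) with phi1 = 0.7

print(np.round(ma1.acf(lags=6), 3))   # non-zero at lag 1, zero beyond: ACF cuts off
print(np.round(ma1.pacf(lags=6), 3))  # decays gradually
print(np.round(ar1.acf(lags=6), 3))   # 1, 0.7, 0.49, ...: exponential decay
print(np.round(ar1.pacf(lags=6), 3))  # non-zero at lag 1, zero beyond: PACF cuts off
```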
6. Given below are sample ACF and PACF for four different series. Identify an ARMA
model that might be useful in describing each series.
Series 1:
[Figure: plot of Series 1 (S1), 150 observations]
[EViews output: ARMA estimation for S1 — ARMA Maximum Likelihood (OPG - BHHH), Sample: 1 150, 150 included observations]
[Figure: inverse roots of the AR polynomial, all inside the unit circle]
The roots lying inside the unit circle indicate that the estimated model is covariance stationary.
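The same stability check can be done by hand from the estimated AR coefficients; a sketch with hypothetical values (the actual estimates are in the EViews output above, not reproduced here):

```python
import numpy as np

phi1, phi2 = 1.2, -0.5  # hypothetical AR(2) estimates
# Roots of 1 - phi1*z - phi2*z^2 must lie outside the unit circle;
# equivalently, the inverse roots that EViews plots must lie inside it.
roots = np.roots([-phi2, -phi1, 1.0])  # polynomial coefficients, highest power first
print(np.abs(1.0 / roots))  # all moduli < 1 -> covariance stationary
```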
Step 4: Residual diagnostics.
In EViews: View → Residual Diagnostics → Correlogram - Q-statistics.
Series 2:
If unsure, estimate the candidate models and compare their AIC and SIC; choose the model with the smallest AIC and SIC, as in the sketch below.
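A sketch of that comparison in Python, with simulated data standing in for Series 2 (the actual observations are not reproduced here); note that statsmodels reports the SIC as `bic`:

```python
import numpy as np
import statsmodels.api as sm

np.random.seed(0)
e = np.random.standard_normal(151)
s2 = e[1:] + 0.6 * e[:-1]  # illustrative MA(1) data standing in for Series 2

ma1_fit = sm.tsa.ARIMA(s2, order=(0, 0, 1)).fit()
ar2_fit = sm.tsa.ARIMA(s2, order=(2, 0, 0)).fit()
print("MA(1): AIC =", round(ma1_fit.aic, 2), " SIC =", round(ma1_fit.bic, 2))
print("AR(2): AIC =", round(ar2_fit.aic, 2), " SIC =", round(ar2_fit.bic, 2))
```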
[EViews output: ARMA estimation for S2 — ARMA Maximum Likelihood (OPG - BHHH), Sample: 1 150, 150 included observations]
The AIC and SIC produced by MA(1) are smaller than those produced by AR(2). Therefore, use MA(1).
[Figure: inverse roots of the MA polynomial, all inside the unit circle]
The inverse roots lie inside the unit circle, indicating that the MA(1) model is invertible.
Step 4: Residual diagnostics.
[EViews residual correlogram: Sample: 1 150; Q-statistic probabilities adjusted for 1 ARMA term]
All the p-values are greater than 0.05, so we do not reject H0. Therefore, the residuals are a white noise process and the MA(1) model is adequate for this series.
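The corresponding residual check in Python, continuing the sketch above (`ma1_fit`), with the degrees of freedom adjusted for the one estimated ARMA term:

```python
from statsmodels.stats.diagnostic import acorr_ljungbox

lb = acorr_ljungbox(ma1_fit.resid, lags=12, model_df=1)  # adjust dof for 1 ARMA term
print(lb)  # residuals look like white noise when every lb_pvalue > 0.05
```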
Series 3:
The sample ACF has non-zero values at lags 1, 2, 3, and 4, and the sample PACF decays. Use MA(4).
Step 2: Estimate the model – compare AIC and SIC.
[EViews outputs: MA(4) and AR(2) estimations for S3 — ARMA Maximum Likelihood (OPG - BHHH), Sample: 1 150, 150 included observations]
The AIC and SIC produced by AR(2) are smaller than those produced by MA(4). Therefore, choose AR(2).
Step 3: Check the stability.
[Figure: S3 — Inverse Roots of AR/MA Polynomial(s), all AR roots inside the unit circle]
All the roots lie inside the unit circle, indicating that the estimated model is covariance stationary.
Series 4:
The sample ACF decays and the sample PACF spikes at lags 1 and 2, which suggests AR(2). However, the sample ACF could also be read as spiking at lags 1 and 2 with a decaying PACF, which suggests MA(2). Since we are not sure whether to choose AR(2) or MA(2), we estimate both models and choose the one that produces the lower AIC and SIC.
[EViews outputs: AR(2) and MA(2) estimations for Series 4 — ARMA Maximum Likelihood (OPG - BHHH), Sample: 1 150, 150 included observations]
The AIC and SIC produced by AR(2) are smaller than those produced by MA(2). Therefore, we should use AR(2). However, we still need to test the residuals.
[Figure: inverse roots of the AR polynomial, all inside the unit circle]
All the inverse roots lie inside the unit circle, indicating that the estimated model is covariance stationary.
Step 4: Residual diagnostics.
All p-values are greater than α = 0.05, so we do not reject H0, indicating that the AR(2) model is adequate for Series 4.
7. Write the following models (with zero mean) in terms of the backshift operator, and
then without the backshift operator.
a) MA(1):
Backshift operator: $y_t = (1 + \theta_1 L)\varepsilon_t$
Without backshift operator: $y_t = \varepsilon_t + \theta_1 \varepsilon_{t-1}$
b) MA(2):
Backshift operator: $y_t = (1 + \theta_1 L + \theta_2 L^2)\varepsilon_t$
Without backshift operator: $y_t = \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2}$
c) AR(1):
Backshift operator: $(1 - \phi_1 L)y_t = \varepsilon_t$
Without backshift operator: $y_t = \phi_1 y_{t-1} + \varepsilon_t$
d) AR(2):
Backshift operator: $(1 - \phi_1 L - \phi_2 L^2)y_t = \varepsilon_t$
Without backshift operator: $y_t = \phi_1 y_{t-1} + \phi_2 y_{t-2} + \varepsilon_t$
e) ARMA(1,1):
Backshift operator: $(1 - \phi_1 L)y_t = (1 + \theta_1 L)\varepsilon_t$
Without backshift operator: $y_t = \phi_1 y_{t-1} + \varepsilon_t + \theta_1 \varepsilon_{t-1}$
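The "without backshift operator" forms translate directly into simulation code; a minimal sketch of the ARMA(1,1) recursion with illustrative parameters:

```python
import numpy as np

phi1, theta1 = 0.5, 0.3  # illustrative values, not estimates
rng = np.random.default_rng(1)
eps = rng.standard_normal(200)

y = np.zeros(200)
for t in range(1, 200):
    # y_t = phi1 * y_{t-1} + eps_t + theta1 * eps_{t-1}
    y[t] = phi1 * y[t - 1] + eps[t] + theta1 * eps[t - 1]
```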
8. Derive the theoretical covariance and autocorrelation functions at displacements 1 and 2 for an MA(2) process with zero mean. What would you expect the autocorrelations beyond displacement 2 to be?
For $y_t = \varepsilon_t + \theta_1\varepsilon_{t-1} + \theta_2\varepsilon_{t-2}$ with $\varepsilon_t \sim WN(0, \sigma^2)$:
$$\gamma(1) = E[y_t y_{t-1}] = \sigma^2(\theta_1 + \theta_1\theta_2), \qquad \gamma(2) = E[y_t y_{t-2}] = \sigma^2\theta_2$$
Dividing by $\gamma(0) = \sigma^2(1 + \theta_1^2 + \theta_2^2)$:
$$\rho(1) = \frac{\theta_1 + \theta_1\theta_2}{1 + \theta_1^2 + \theta_2^2}, \qquad \rho(2) = \frac{\theta_2}{1 + \theta_1^2 + \theta_2^2}$$
$$\rho(\tau) = 0, \quad \tau > 2$$
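A numerical check of these formulas against the theoretical ACF from statsmodels, with illustrative parameters:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

theta1, theta2 = 0.5, 0.3  # illustrative MA(2) parameters
denom = 1 + theta1 ** 2 + theta2 ** 2
rho1 = (theta1 + theta1 * theta2) / denom  # 0.4851
rho2 = theta2 / denom                      # 0.2239

acf = ArmaProcess(ar=[1], ma=[1, theta1, theta2]).acf(lags=6)
print(np.round([rho1, rho2], 4), np.round(acf[1:], 4))  # lags 1-2 match; lags > 2 are zero
```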