
Lent Term 2001 Richard Weber

Time Series — Examples Sheet


This is the examples sheet for the M.Phil. course in Time Series. A copy can be
found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/
Throughout, unless otherwise stated, the sequence $\{\epsilon_t\}$ is white noise with variance $\sigma^2$.

1. Find the Yule-Walker equations for the AR(2) process

$$X_t = \tfrac{1}{3}X_{t-1} + \tfrac{2}{9}X_{t-2} + \epsilon_t.$$

Hence show that it has autocorrelation function

$$\rho_k = \tfrac{16}{21}\left(\tfrac{2}{3}\right)^{|k|} + \tfrac{5}{21}\left(-\tfrac{1}{3}\right)^{|k|}, \qquad k \in \mathbb{Z}.$$

[ The Yule-Walker equations are
$$\rho_k - \tfrac{1}{3}\rho_{k-1} - \tfrac{2}{9}\rho_{k-2} = 0, \qquad k \geq 2.$$
On trying $\rho_k = A\lambda^k$, we require $\lambda^2 - \tfrac{1}{3}\lambda - \tfrac{2}{9} = 0$. This has roots $\tfrac{2}{3}$ and $-\tfrac{1}{3}$, so
$$\rho_k = A\left(\tfrac{2}{3}\right)^{|k|} + B\left(-\tfrac{1}{3}\right)^{|k|},$$
where $\rho_0 = A + B = 1$. We also require $\rho_1 = \tfrac{1}{3} + \tfrac{2}{9}\rho_1$. Hence $\rho_1 = \tfrac{3}{7}$, and thus we require $\tfrac{2}{3}A - \tfrac{1}{3}B = \tfrac{3}{7}$. These give $A = \tfrac{16}{21}$, $B = \tfrac{5}{21}$. ]
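As a quick numerical check (a sketch, not part of the original sheet; the horizon of 10 lags is an arbitrary choice), one can iterate the Yule-Walker recursion and compare it with the closed form:

```python
# Sketch: verify the AR(2) autocorrelations against the closed form.
phi1, phi2 = 1/3, 2/9
rho = [1.0, 3/7]                      # rho_0 = 1, rho_1 = 3/7 from above
for k in range(2, 10):
    rho.append(phi1 * rho[k-1] + phi2 * rho[k-2])
closed = [16/21 * (2/3)**k + 5/21 * (-1/3)**k for k in range(10)]
assert all(abs(a - b) < 1e-12 for a, b in zip(rho, closed))
```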

2. Let $X_t = A\cos(\Omega t + U)$, where $A$ is an arbitrary constant, $\Omega$ and $U$ are independent
random variables, $\Omega$ has distribution function $F$ over $[0, \pi]$, and $U$ is uniform over
$[0, 2\pi]$. Find the autocorrelation function and spectral density function of $\{X_t\}$.
Hence show that, for any positive definite set of covariances $\{\gamma_k\}$, there exists a
process with autocovariances $\{\gamma_k\}$ such that every realization is a sine wave.
[Use the following definition: $\{\gamma_k\}$ are positive definite if there exists a nondecreasing
function $F$ such that $\gamma_k = \int_{-\pi}^{\pi} e^{ik\omega}\,dF(\omega)$.]

[
$$E[X_t \mid \Omega] = \frac{1}{2\pi}\int_0^{2\pi} A\cos(\Omega t + u)\,du = \frac{1}{2\pi}\,A\sin(\Omega t + u)\Big|_0^{2\pi} = 0$$

$$\begin{aligned}
E[X_{t+s}X_t] &= \frac{1}{2\pi}\int_0^{\pi}\!\!\int_0^{2\pi} A\cos(\Omega(t+s)+u)\,A\cos(\Omega t+u)\,du\,dF(\Omega) \\
&= \frac{1}{4\pi}\int_0^{\pi}\!\!\int_0^{2\pi} A^2\left[\cos(\Omega(2t+s)+2u) + \cos(\Omega s)\right] du\,dF(\Omega) \\
&= \tfrac{1}{2}A^2\int_0^{\pi} \cos(\Omega s)\,dF(\Omega) \\
&= \tfrac{1}{4}A^2\int_0^{\pi} \left(e^{i\Omega s} + e^{-i\Omega s}\right) dF(\Omega) \\
&= \tfrac{1}{4}A^2\int_{-\pi}^{\pi} e^{i\Omega s}\,d\bar{F}(\Omega)
\end{aligned}$$

where we define over the range $[-\pi, \pi]$ the nondecreasing function $\bar{F}$ by $\bar{F}(-\Omega) =
F(\pi) - F(\Omega)$ and $\bar{F}(\Omega) = F(\Omega) + F(\pi) - 2F(0)$, $\Omega \in [0, \pi]$. ]
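A small Monte Carlo check of the autocovariance formula (a sketch, not from the sheet; taking $F$ to be a point mass at $\pi/4$ with $A = 2$ is an assumption for illustration):

```python
# Sketch: empirical E[X_{t+s} X_t] vs (A^2/2) cos(Omega s) for fixed Omega.
import numpy as np
rng = np.random.default_rng(0)
A, omega, t, s = 2.0, np.pi/4, 5, 3
U = rng.uniform(0, 2*np.pi, size=200_000)
emp = np.mean(A*np.cos(omega*(t+s) + U) * A*np.cos(omega*t + U))
print(emp, 0.5 * A**2 * np.cos(omega*s))   # agree to Monte Carlo error
```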

3. Find the spectral density function of the AR(2) process

$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \epsilon_t.$$

What conditions on $(\phi_1, \phi_2)$ are required for this process to be indeterministic and
second-order stationary? Sketch in the $(\phi_1, \phi_2)$ plane the stationary region.

[ We have
$$f_X(\omega)\left|1 - \phi_1 e^{i\omega} - \phi_2 e^{2i\omega}\right|^2 = \sigma^2/\pi.$$
Hence
$$f_X(\omega) = \frac{\sigma^2}{\pi\left[1 + \phi_1^2 + \phi_2^2 + 2(\phi_1\phi_2 - \phi_1)\cos\omega - 2\phi_2\cos 2\omega\right]}.$$
The Yule-Walker equations have solutions of the form $\rho_k = A\lambda_1^k + B\lambda_2^k$, where $\lambda_1, \lambda_2$
are roots of
$$g(\lambda) = \lambda^2 - \phi_1\lambda - \phi_2 = 0,$$
that is, $\lambda = \left[\phi_1 \pm \sqrt{\phi_1^2 + 4\phi_2}\,\right]/2$. To be indeterministic second-order stationary
these roots must have modulus less than 1. If $\phi_1^2 + 4\phi_2 > 0$ the roots are real
and lie in $(-1, 1)$ if and only if $g(-1) > 0$ and $g(1) > 0$, i.e., $\phi_1 + \phi_2 < 1$ and
$\phi_1 - \phi_2 > -1$. If $\phi_1^2 + 4\phi_2 < 0$ the roots are complex conjugates and their product must be
less than 1, i.e., $\phi_2 > -1$. The union of these two regions, corresponding to possible
$(\phi_1, \phi_2)$ for real and complex roots, is simply the triangular region
$$\phi_1 + \phi_2 < 1, \qquad \phi_1 - \phi_2 > -1, \qquad \phi_2 > -1. \;]$$
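The triangle can be confirmed numerically. The sketch below (not from the sheet; the sampling range and count are arbitrary) compares the root-modulus condition with the three inequalities at randomly drawn points:

```python
# Sketch: stationarity via root moduli of lambda^2 - phi1 lambda - phi2
# versus the triangular region (random points avoid the boundary a.s.).
import numpy as np
rng = np.random.default_rng(6)
phi1 = rng.uniform(-2, 2, 100_000)
phi2 = rng.uniform(-1.5, 1.5, 100_000)
lam1 = (phi1 + np.sqrt(phi1**2 + 4*phi2 + 0j)) / 2
lam2 = (phi1 - np.sqrt(phi1**2 + 4*phi2 + 0j)) / 2
by_roots = (abs(lam1) < 1) & (abs(lam2) < 1)
triangle = (phi1 + phi2 < 1) & (phi1 - phi2 > -1) & (phi2 > -1)
print(np.array_equal(by_roots, triangle))   # True
```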

4. For a stationary process define the covariance generating function

$$g(z) = \sum_{k=-\infty}^{\infty} \gamma_k z^k, \qquad |z| < 1.$$

Suppose $\{X_t\}$ satisfies $X = C(B)\epsilon$, that is, it has the Wold representation

$$X_t = \sum_{r=0}^{\infty} c_r \epsilon_{t-r},$$

where $\{c_r\}$ are constants satisfying $\sum_{r=0}^{\infty} c_r^2 < \infty$ and $C(z) = \sum_{r=0}^{\infty} c_r z^r$. Show that

$$g(z) = C(z)C(z^{-1})\sigma^2.$$

Explain how this can be used to derive autocovariances for the ARMA($p, q$) model.
Hence show that for ARMA(1, 1), $\rho_2^2 = \rho_1\rho_3$. How might this fact be useful?

[ We have
$$\gamma_k = E X_t X_{t+k} = E\left[\sum_{r=0}^{\infty} c_r\epsilon_{t-r}\sum_{s=0}^{\infty} c_s\epsilon_{t+k-s}\right] = \sigma^2\sum_{r=0}^{\infty} c_r c_{k+r}.$$
Now
$$C(z)C(z^{-1}) = \sum_{r=0}^{\infty} c_r z^r \sum_{s=0}^{\infty} c_s z^{-s}.$$
The coefficients of $z^k$ and $z^{-k}$ are clearly
$$c_k c_0 + c_{k+1}c_1 + c_{k+2}c_2 + \cdots,$$
from which the result follows.


For the ARMA($p, q$) model $\phi(B)X = \theta(B)\epsilon$, or
$$X = \frac{\theta(B)}{\phi(B)}\,\epsilon,$$
where $\phi$ and $\theta$ are polynomials of degrees $p$ and $q$ in $z$. Hence
$$C(z) = \frac{\theta(z)}{\phi(z)},$$
and $\gamma_k$ can be found as the coefficient of $z^k$ in the power series expansion of
$\sigma^2\theta(z)\theta(1/z)/\phi(z)\phi(1/z)$. For ARMA(1, 1) this is
$$\sigma^2(1 + \theta z)(1 + \theta z^{-1})(1 + \phi z + \phi^2 z^2 + \cdots)(1 + \phi z^{-1} + \phi^2 z^{-2} + \cdots),$$
from which we have
$$\gamma_1 = \left[\theta(1 + \phi^2 + \phi^4 + \cdots) + (\phi + \phi^3 + \phi^5 + \cdots)(1 + \theta^2) + \theta(\phi^2 + \phi^4 + \phi^6 + \cdots)\right]\sigma^2 = \frac{\theta + \phi(1 + \theta^2) + \phi^2\theta}{1 - \phi^2}\,\sigma^2,$$
and similarly
$$\gamma_2 = \phi\,\frac{\theta + \phi(1 + \theta^2) + \phi^2\theta}{1 - \phi^2}\,\sigma^2, \qquad \gamma_3 = \phi^2\,\frac{\theta + \phi(1 + \theta^2) + \phi^2\theta}{1 - \phi^2}\,\sigma^2.$$
Hence $\rho_2^2 = \rho_1\rho_3$. This might be used as a diagnostic to test the appropriateness of
an ARMA(1, 1) model, by reference to the correlogram, where we would expect to
see $r_2^2 = r_1 r_3$. ]
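The coefficient-extraction route is easy to mechanize. Below is a sketch (parameter values assumed, series truncated at 200 terms) that builds the Wold coefficients of an ARMA(1, 1), forms the autocovariances, and checks $\rho_2^2 = \rho_1\rho_3$:

```python
# Sketch: gamma_k = sigma^2 sum_r c_r c_{k+r} for ARMA(1,1), then the check.
phi, theta, sigma2, N = 0.6, 0.4, 1.0, 200
c = [1.0, phi + theta]                  # c_0 = 1, c_1 = phi + theta
for r in range(2, N):
    c.append(phi * c[r-1])              # c_r = phi c_{r-1}, r >= 2
gamma = [sigma2 * sum(c[r] * c[k+r] for r in range(N - k)) for k in range(4)]
rho = [g / gamma[0] for g in gamma]
print(abs(rho[2]**2 - rho[1] * rho[3]) < 1e-12)   # True
```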

5. Consider the ARMA(2, 1) process defined as

$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \epsilon_t + \theta_1\epsilon_{t-1}.$$

Show that the coefficients of the Wold representation satisfy the difference equation

$$c_k = \phi_1 c_{k-1} + \phi_2 c_{k-2}, \qquad k \geq 2,$$

and hence that

$$c_k = A z_1^{-k} + B z_2^{-k},$$

where $z_1$ and $z_2$ are zeros of $\phi(z) = 1 - \phi_1 z - \phi_2 z^2$, and $A$ and $B$ are constants.
Explain how in principle one could find $A$ and $B$.
P∞
[ The recurrence is produced by substituting $X_t = \sum_{r=0}^{\infty} c_r\epsilon_{t-r}$ into the defining
equation, and similarly for $X_{t-1}$ and $X_{t-2}$, multiplying by $\epsilon_{t-k}$, $k \geq 2$, and taking
the expected value.
The general solution to such a second-order linear recurrence relation is of the form
given, and we find $A$ and $B$ by noting that
$$X_t = \phi_1(\phi_1 X_{t-2} + \phi_2 X_{t-3} + \epsilon_{t-1} + \theta_1\epsilon_{t-2}) + \phi_2 X_{t-2} + \epsilon_t + \theta_1\epsilon_{t-1},$$
so that $c_0 = 1$ and $c_1 = \theta_1 + \phi_1$. Hence $A + B = 1$ and $A z_1^{-1} + B z_2^{-1} = \theta_1 + \phi_1$. These
can be solved for $A$ and $B$. ]
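A sketch of that procedure (illustrative parameter values assumed), solving the two linear equations for $A$ and $B$ and comparing the closed form with the recursion:

```python
# Sketch: c_k by recursion vs c_k = A z1^{-k} + B z2^{-k}.
import numpy as np
phi1, phi2, theta1 = 0.5, 0.3, 0.4
z1, z2 = np.roots([-phi2, -phi1, 1])    # zeros of 1 - phi1 z - phi2 z^2
A, B = np.linalg.solve([[1, 1], [1/z1, 1/z2]], [1, theta1 + phi1])
c = [1.0, theta1 + phi1]
for k in range(2, 10):
    c.append(phi1 * c[k-1] + phi2 * c[k-2])
closed = [A * z1**(-k) + B * z2**(-k) for k in range(10)]
print(np.allclose(c, closed))           # True
```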

6. Suppose
$$Y_t = X_t + \epsilon_t, \qquad X_t = \alpha X_{t-1} + \eta_t,$$
where $\{\epsilon_t\}$ and $\{\eta_t\}$ are independent white noise sequences with common variance
$\sigma^2$. Show that the spectral density function of $\{Y_t\}$ is
$$f_Y(\omega) = \frac{\sigma^2}{\pi}\left[\frac{2 - 2\alpha\cos\omega + \alpha^2}{1 - 2\alpha\cos\omega + \alpha^2}\right].$$
For what values of $p, d, q$ is the autocovariance function of $\{Y_t\}$ identical to that of
an ARIMA($p, d, q$) process?

[
$$f_Y(\omega) = f_X(\omega) + f_\epsilon(\omega) = \frac{1}{|1 - \alpha e^{i\omega}|^2}\,f_\eta(\omega) + f_\epsilon(\omega) = \frac{\sigma^2}{\pi}\left[\frac{1}{1 - 2\alpha\cos\omega + \alpha^2} + 1\right] = \frac{\sigma^2}{\pi}\left[\frac{2 - 2\alpha\cos\omega + \alpha^2}{1 - 2\alpha\cos\omega + \alpha^2}\right].$$
We recognise this as the spectral density of an ARMA(1, 1) model, i.e., ARIMA(1, 0, 1). E.g., $Z_t - \alpha Z_{t-1} =
\xi_t - \theta\xi_{t-1}$, choosing $\theta$ and $\sigma_\xi^2$ such that
$$(1 - 2\theta\cos\omega + \theta^2)\,\sigma_\xi^2 = \sigma^2(2 - 2\alpha\cos\omega + \alpha^2),$$
i.e., choosing $\theta$ such that $(1 + \theta^2)/\theta = (2 + \alpha^2)/\alpha$, and then $\sigma_\xi^2 = \alpha\sigma^2/\theta$. ]
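A numerical sketch of the matching (with $\alpha$ and $\sigma^2$ assumed), solving the quadratic for the invertible $\theta$ and comparing the two spectral densities on a grid:

```python
# Sketch: fY and the matched ARMA(1,1) spectral density agree pointwise.
import numpy as np
alpha, sigma2 = 0.7, 1.0
kappa = (2 + alpha**2) / alpha
theta = (kappa - np.sqrt(kappa**2 - 4)) / 2        # root with |theta| < 1
sigma_xi2 = alpha * sigma2 / theta                 # match the cos(omega) terms
w = np.linspace(0, np.pi, 400)
den = 1 - 2*alpha*np.cos(w) + alpha**2
fY = (sigma2/np.pi) * (2 - 2*alpha*np.cos(w) + alpha**2) / den
fZ = (sigma_xi2/np.pi) * (1 - 2*theta*np.cos(w) + theta**2) / den
print(np.allclose(fY, fZ))                         # True
```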

7. Suppose $X_1, \ldots, X_T$ are values of a time series. Prove that
$$\hat\gamma_0 + 2\sum_{k=1}^{T-1}\hat\gamma_k = 0,$$
where $\hat\gamma_k$ is the usual estimator of the $k$th order autocovariance,
$$\hat\gamma_k = \frac{1}{T}\sum_{t=k+1}^{T}(X_t - \bar X)(X_{t-k} - \bar X).$$
Hint: Consider $0 = \sum_{t=1}^{T}(X_t - \bar X)$.
Hence deduce that not all ordinates of the correlogram can have the same sign.
Suppose $f(\cdot)$ is the spectral density and $I(\cdot)$ the periodogram. Suppose $f$ is contin-
uous and $f(0) \neq 0$. Does $EI(2\pi/T) \to f(0)$ as $T \to \infty$?

[ The results follow directly from
$$\frac{1}{T}\left[\sum_{t=1}^{T}(X_t - \bar X)\right]^2 = 0.$$
Note that formally,
$$I(0) = \hat\gamma_0 + 2\sum_{k=1}^{T-1}\hat\gamma_k = 0,$$
so it might appear that $EI(2\pi/T) \to I(0) \neq f(0)$ as $T \to \infty$. However, this would
be mistaken. It is a theorem that as $T \to \infty$, $I(\omega_j) \sim f(\omega_j)\chi_2^2/2$. So for large $T$,
$EI(2\pi/T) \approx f(0)$. ]
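A quick check of the identity on arbitrary data (a sketch, not from the sheet):

```python
# Sketch: gamma_hat_0 + 2 * sum_{k>=1} gamma_hat_k vanishes identically.
import numpy as np
X = np.random.default_rng(1).normal(size=50)
T, Xbar = len(X), X.mean()
gamma_hat = [np.sum((X[k:] - Xbar) * (X[:T-k] - Xbar)) / T for k in range(T)]
print(abs(gamma_hat[0] + 2 * sum(gamma_hat[1:])) < 1e-10)   # True
```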

8. Suppose $I(\cdot)$ is the periodogram of $\epsilon_1, \ldots, \epsilon_T$, where these are i.i.d. $N(0, 1)$ and
$T = 2m + 1$. Let $\omega_j, \omega_k$ be two distinct Fourier frequencies. Show that $I(\omega_j)$ and
$I(\omega_k)$ are independent random variables. What are their distributions?
If it is suspected that $\{\epsilon_t\}$ departs from white noise because of the presence of a
single harmonic component at some unknown frequency $\omega$, a natural test statistic is
the maximum periodogram ordinate

$$\mathcal{T} = \max_{j=1,\ldots,m} I(\omega_j).$$

Show that under the hypothesis that $\{\epsilon_t\}$ is white noise,

$$P(\mathcal{T} > t) = 1 - \left[1 - \exp\left(-\pi t/\sigma^2\right)\right]^m.$$

[ The independence of $I(\omega_j)$ and $I(\omega_k)$ was proved in lectures. Their distributions
are $(\sigma^2/2\pi)\chi_2^2$, which is equivalent to the exponential distribution with mean $\sigma^2/\pi$.
Hence the probability that the maximum is less than $t$ is the probability that all the ordinates are,
i.e.,
$$P(\mathcal{T} < t) = \left[1 - \exp\left(-\pi t/\sigma^2\right)\right]^m. \;]$$
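A Monte Carlo sketch of this null distribution (the values of $m$, the threshold, and the replication count are arbitrary choices):

```python
# Sketch: empirical P(max_j I(w_j) > t0) vs 1 - (1 - exp(-pi t0))^m, sigma^2 = 1.
import numpy as np
rng = np.random.default_rng(2)
m, t0, reps = 20, 1.5, 20_000
T = 2*m + 1
eps = rng.normal(size=(reps, T))
freqs = 2*np.pi*np.arange(1, m+1)/T
e = np.exp(-1j * np.arange(T)[:, None] * freqs[None, :])
I = np.abs(eps @ e)**2 / (np.pi * T)            # periodogram ordinates
print(np.mean(I.max(axis=1) > t0), 1 - (1 - np.exp(-np.pi*t0))**m)
```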

9. Complete this sketch of the fast Fourier transform. From data $X_0, \ldots, X_T$, with
$T = 2^M - 1$, we want to compute the $2^{M-1}$ ordinates of the periodogram
$$I(\omega_j) = \frac{1}{\pi T}\left|\sum_{t=0}^{T} X_t e^{it2\pi j/2^M}\right|^2, \qquad j = 1, \ldots, 2^{M-1}.$$
This requires order $T$ multiplications for each $j$, and so order $T^2$ multiplications in
all. However,
$$\begin{aligned}
\sum_{t=0,1,\ldots,2^M-1} X_t e^{it2\pi j/2^M} &= \sum_{t=0,2,\ldots,2^M-2} X_t e^{it2\pi j/2^M} + \sum_{t=1,3,\ldots,2^M-1} X_t e^{it2\pi j/2^M} \\
&= \sum_{t=0,1,\ldots,2^{M-1}-1} X_{2t} e^{i2t2\pi j/2^M} + \sum_{t=0,1,\ldots,2^{M-1}-1} X_{2t+1} e^{i(2t+1)2\pi j/2^M} \\
&= \sum_{t=0,1,\ldots,2^{M-1}-1} X_{2t} e^{it2\pi j/2^{M-1}} + e^{i2\pi j/2^M}\sum_{t=0,1,\ldots,2^{M-1}-1} X_{2t+1} e^{it2\pi j/2^{M-1}}.
\end{aligned}$$
Note that the value of either sum on the right-hand side at $j = k$ is the complex
conjugate of its value at $j = 2^{M-1} - k$; so these sums need only be computed for
$j = 1, \ldots, 2^{M-2}$. Thus we have two sums, each of which is similar to the sum on
the left-hand side, but for a problem half as large. Suppose the computational effort
required to work out each right-hand-side sum (for all $2^{M-2}$ values of $j$) is $\Theta(M-1)$.
The sum on the left-hand side is obtained (for all $2^{M-1}$ values of $j$) by combining
the right-hand sums, with further computational effort of order $2^{M-1}$. Explain

$$\Theta(M) = a2^{M-1} + 2\Theta(M-1).$$

Hence deduce that $I(\cdot)$ can be computed (by the FFT) in time $T\log_2 T$.

[ The derivation of the recurrence for $\Theta(M)$ should be obvious: combining the two half-size
sums costs $a2^{M-1}$, and each half-size problem costs $\Theta(M-1)$. With $\Theta(1) = 1$, unrolling gives
$\Theta(M) = 2^{M-1}\left[1 + a(M-1)\right] = O(M2^M) = O(T\log_2 T)$. ]
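Below is a minimal sketch of this even/odd split (not from the sheet). It computes the sums $S_j = \sum_t X_t e^{it2\pi j/2^M}$ for all $j$ and checks them against numpy's FFT, which uses the opposite sign convention:

```python
# Sketch: radix-2 splitting into even- and odd-indexed subsequences.
import numpy as np

def fft_sums(x):
    """S_j = sum_t x_t e^{+ i t 2 pi j / n} for j = 0, ..., n-1; n a power of 2."""
    n = len(x)
    if n == 1:
        return np.asarray(x, dtype=complex)
    even, odd = fft_sums(x[0::2]), fft_sums(x[1::2])
    j = np.arange(n)
    # half-size sums are periodic in j with period n/2; combine with twiddle factor
    return even[j % (n//2)] + np.exp(2j * np.pi * j / n) * odd[j % (n//2)]

x = np.random.default_rng(3).normal(size=16)
print(np.allclose(fft_sums(x), np.fft.fft(x).conj()))   # True for real data
```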

10. Suppose we have the ARMA(1, 1) process

$$X_t = \phi X_{t-1} + \epsilon_t + \theta\epsilon_{t-1},$$

with $|\phi| < 1$, $|\theta| < 1$, $\phi + \theta \neq 0$, observed up to time $T$, and we want to calculate
$k$-step ahead forecasts $\hat X_{T,k}$, $k \geq 1$.
Derive a recursive formula to calculate $\hat X_{T,k}$ for $k = 1$ and $k = 2$.

[ With $\hat\epsilon_{T+1} = \hat\epsilon_{T+2} = 0$ and $\hat\epsilon_T = X_T - \hat X_{T-1,1}$,
$$\hat X_{T,1} = \phi X_T + \hat\epsilon_{T+1} + \theta\hat\epsilon_T = \phi X_T + \theta(X_T - \hat X_{T-1,1}),$$
$$\hat X_{T,2} = \phi\hat X_{T,1} + \hat\epsilon_{T+2} + \theta\hat\epsilon_{T+1} = \phi\hat X_{T,1}. \;]$$
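A sketch of these recursions on made-up data (the observations, parameter values, and the start-up convention $\hat X_{0,1} = 0$ are all assumptions):

```python
# Sketch: run Xhat_{t,1} = phi X_t + theta (X_t - Xhat_{t-1,1}) through the data.
phi, theta = 0.7, 0.3
X = [1.2, -0.4, 0.8, 0.1, 0.9]          # assumed observations X_1, ..., X_T
xhat = 0.0                              # start-up convention Xhat_{0,1} = 0
for x in X:
    xhat = phi * x + theta * (x - xhat)
print(xhat, phi * xhat)                 # Xhat_{T,1} and Xhat_{T,2}
```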

11. Consider the stationary scalar-valued process $\{X_t\}$ generated by the moving
average $X_t = \epsilon_t - \theta\epsilon_{t-1}$.
Determine the linear least-squares predictor of $X_t$ in terms of $X_{t-1}, X_{t-2}, \ldots$.

[ We can directly apply our results to give
$$\begin{aligned}
\hat X_{t-1,1} &= -\theta\hat\epsilon_{t-1} \\
&= -\theta\left[X_{t-1} - \hat X_{t-2,1}\right] \\
&= -\theta X_{t-1} + \theta\hat X_{t-2,1} \\
&= -\theta X_{t-1} + \theta\left[-\theta X_{t-2} + \theta\hat X_{t-3,1}\right] \\
&= -\theta X_{t-1} - \theta^2 X_{t-2} - \theta^3 X_{t-3} - \cdots
\end{aligned}$$
Alternatively, take the linear predictor as $\hat X_{t-1,1} = \sum_{r=1}^{\infty} a_r X_{t-r}$ and seek to minimize
$E[X_t - \hat X_{t-1,1}]^2$. We have
$$E[X_t - \hat X_{t-1,1}]^2 = E\left[\epsilon_t - \theta\epsilon_{t-1} - \sum_{r=1}^{\infty} a_r(\epsilon_{t-r} - \theta\epsilon_{t-r-1})\right]^2 = \sigma^2\left[1 + (\theta + a_1)^2 + (\theta a_1 - a_2)^2 + (\theta a_2 - a_3)^2 + \cdots\right].$$
Note that all terms but the first are minimized to 0 by taking $a_r = -\theta^r$. ]
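A simulation sketch (with $\theta$, the sample size, and the truncation lag assumed) showing that the truncated predictor $-\sum_{r=1}^{p}\theta^r X_{t-r}$ attains mean-square error close to the innovation variance $\sigma^2$:

```python
# Sketch: MSE of the truncated linear predictor for the MA(1) process.
import numpy as np
rng = np.random.default_rng(4)
theta, n, p = 0.6, 200_000, 30
eps = rng.normal(size=n)
X = eps.copy()
X[1:] -= theta * eps[:-1]                        # X_t = eps_t - theta eps_{t-1}
pred = sum(-(theta**r) * X[p-r:n-r] for r in range(1, p+1))
print(np.mean((X[p:] - pred)**2))                # close to sigma^2 = 1
```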

12. Consider the ARIMA(0, 2, 2) model

$$(I - B)^2 X = (I - 0.81B + 0.38B^2)\epsilon,$$

where $\{\epsilon_t\}$ is white noise with variance 1.

(a) With data up to time $T$, calculate the $k$-step ahead optimal forecast $\hat X_{T,k}$ for
all $k \geq 1$. By giving a general formula relating $\hat X_{T,k}$, $k \geq 3$, to $\hat X_{T,1}$ and $\hat X_{T,2}$,
determine the curve on which all these forecasts lie.
[ The model is
$$X_t = 2X_{t-1} - X_{t-2} + \epsilon_t - 0.81\epsilon_{t-1} + 0.38\epsilon_{t-2}.$$
Hence
$$\hat X_{T,1} = 2X_T - X_{T-1} + \hat\epsilon_{T+1} - 0.81\hat\epsilon_T + 0.38\hat\epsilon_{T-1} = 2X_T - X_{T-1} - 0.81\left[X_T - \hat X_{T-1,1}\right] + 0.38\left[X_{T-1} - \hat X_{T-2,1}\right],$$
and similarly
$$\hat X_{T,2} = 2\hat X_{T,1} - X_T + 0.38\left[X_T - \hat X_{T-1,1}\right],$$
$$\hat X_{T,k} = 2\hat X_{T,k-1} - \hat X_{T,k-2}, \qquad k \geq 3.$$
This implies that the forecasts lie on a straight line. ]

(b) Suppose now that $T = 95$. Calculate numerically the forecasts $\hat X_{95,k}$, $k = 1, 2, 3$,
and their mean squared prediction errors when the last five observations are $X_{91} =
15.1$, $X_{92} = 15.8$, $X_{93} = 15.9$, $X_{94} = 15.2$, $X_{95} = 15.9$.
[You will need estimates for $\epsilon_{94}$ and $\epsilon_{95}$. Start by assuming $\epsilon_{91} = \epsilon_{92} = 0$, then
calculate $\hat\epsilon_{93} = \epsilon_{93} = X_{93} - \hat X_{92,1}$, and so on, until $\epsilon_{94}$ and $\epsilon_{95}$ are obtained.]

[ Using the above formulae we obtain

  t     X_t     X̂_{t,1}    X̂_{t,2}    X̂_{t,3}    ε̂_t
  91    15.1                                        0.000
  92    15.8    16.500      17.200      17.900      0.000
  93    15.9    16.486      16.844      17.202     −0.600
  94    15.2    15.314      14.939      14.564     −1.286
  95    15.9    15.636      15.596      15.555      0.586

Now
$$X_{T+k} = \sum_{r=0}^{\infty} c_r\epsilon_{T+k-r} \qquad\text{and}\qquad \hat X_{T,k} = \sum_{r=k}^{\infty} c_r\epsilon_{T+k-r}.$$
Thus
$$E\left[X_{T+k} - \hat X_{T,k}\right]^2 = \sigma_\epsilon^2\sum_{r=0}^{k-1} c_r^2,$$
where $\sigma_\epsilon^2 = 1$. Now
$$X_T = \epsilon_T + (2 - 0.81)\epsilon_{T-1} + (2(2 - 0.81) - 1 + 0.38)\epsilon_{T-2} + \cdots = \epsilon_T + 1.19\epsilon_{T-1} + 1.76\epsilon_{T-2} + \cdots$$
Hence the mean squared errors of $\hat X_{95,1}$, $\hat X_{95,2}$, $\hat X_{95,3}$ are respectively $1$, $1 + 1.19^2 = 2.416$, and $1 + 1.19^2 + 1.76^2 = 5.514$. ]
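The sketch below reproduces the forecasts in the table (the start-up convention $\epsilon_{91} = \epsilon_{92} = 0$ is from the hint):

```python
# Sketch: recompute Xhat_{95,1}, Xhat_{95,2}, Xhat_{95,3} from the data.
X = {91: 15.1, 92: 15.8, 93: 15.9, 94: 15.2, 95: 15.9}
eps = {91: 0.0, 92: 0.0}                   # start-up convention from the hint
f1 = {}                                    # one-step forecasts Xhat_{t,1}
for t in range(92, 96):
    if t >= 93:
        eps[t] = X[t] - f1[t-1]            # eps_t = X_t - Xhat_{t-1,1}
    f1[t] = 2*X[t] - X[t-1] - 0.81*eps[t] + 0.38*eps[t-1]
f2 = 2*f1[95] - X[95] + 0.38*eps[95]       # two-step forecast
f3 = 2*f2 - f1[95]                         # three-step forecast
print(round(f1[95], 3), round(f2, 3), round(f3, 3))   # 15.636 15.596 15.555
```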

13. Consider the state space model

$$X_t = S_t + v_t, \qquad S_t = S_{t-1} + w_t,$$

where $X_t$ and $S_t$ are both scalars, $X_t$ is observed, $S_t$ is unobserved, and $\{v_t\}$, $\{w_t\}$ are
Gaussian white noise sequences with variances $V$ and $W$ respectively. Write down
the Kalman filtering equations for $\hat S_t$ and $P_t$.
Show that $P_t \equiv P$ (independently of $t$) if and only if $P^2 + PW = WV$, and show
that in this case the Kalman filter for $\hat S_t$ is equivalent to exponential smoothing.

[ This is the same as Section 8.4 of the notes. Here $F_t = 1$, $G_t = 1$, $V_t = V$, $W_t = W$, and
$R_t = P_{t-1} + W$. So if $(S_{t-1} \mid X_1, \ldots, X_{t-1}) \sim N(\hat S_{t-1}, P_{t-1})$ then $(S_t \mid X_1, \ldots, X_t) \sim
N(\hat S_t, P_t)$, where
$$\hat S_t = \hat S_{t-1} + R_t(V + R_t)^{-1}(X_t - \hat S_{t-1}),$$
$$P_t = R_t - \frac{R_t^2}{V + R_t} = \frac{R_t V}{V + R_t} = \frac{V(P_{t-1} + W)}{V + P_{t-1} + W}.$$
$P_t$ is constant if $P_t = P$, where $P$ is the positive root of $P^2 + WP - WV = 0$.
In this case $\hat S_t$ behaves like $\hat S_t = (1 - \alpha)\sum_{r=0}^{\infty} \alpha^r X_{t-r}$, where $\alpha = V/(V + W + P)$.
This is simple exponential smoothing. ]
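A sketch of this filter (with $V$ and $W$ assumed), checking that $P_t$ converges to the positive root of $P^2 + WP - WV = 0$ and that the steady-state filter matches exponential smoothing with $\alpha = V/(V + W + P)$:

```python
# Sketch: Kalman filter for the local-level model vs exponential smoothing.
import numpy as np
rng = np.random.default_rng(5)
V, W, n = 1.0, 0.5, 500
S = np.cumsum(rng.normal(0, np.sqrt(W), n))        # S_t = S_{t-1} + w_t
X = S + rng.normal(0, np.sqrt(V), n)               # X_t = S_t + v_t
S_hat, P = 0.0, 1.0
for x in X:
    R = P + W
    S_hat += (R / (V + R)) * (x - S_hat)           # Kalman update
    P = R * V / (V + R)
P_star = (-W + np.sqrt(W**2 + 4*W*V)) / 2          # positive root
alpha = V / (V + W + P_star)
ses = 0.0
for x in X:
    ses = alpha * ses + (1 - alpha) * x            # exponential smoothing
print(abs(P - P_star) < 1e-12, abs(S_hat - ses) < 1e-9)   # True True
```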
