
Prediction and Time Series Analysis          P. Pere / J. Pere
Department of Mathematics and Systems Analysis          Fall 2023
Aalto University          Exercise 5

5. Theoretical exercises

Demo exercises
Throughout these exercises, assume that E[x_{t−v} ε_t] = 0 for all v ≥ 1. In addition, assume that
(ε_t)_{t∈T} ∼ i.i.d.(0, σ^2) with σ^2 < +∞.
5.1 Consider the following ARMA processes:

x_t + (3/4) x_{t−1} = ε_t − ε_{t−1}    (1)

x_t + x_{t−2} = ε_t − (5/6) ε_{t−1} + (1/6) ε_{t−2}    (2)

x_t − ε_t − (1/16) x_{t−4} − (4/9) ε_{t−2} = 0    (3)
Which of the processes are (weakly) stationary? Which of the processes are invertible?
Solution. An ARMA process is stationary if the zeros of the autoregressive polynomial
lie outside the closed unit disk. An ARMA process is invertible if the zeros of the moving
average polynomial lie outside the closed unit disk.
(1) AR polynomial:

1 + (3/4)L = 0
L = −4/3
|−4/3| = 4/3 > 1
The process is stationary. MA polynomial:
1−L=0
L=1
The process is not invertible.
(2) AR polynomial:
1 + L^2 = 0
L = ±i
|L| = 1
The process is not stationary. MA polynomial:
1 − (5/6)L + (1/6)L^2 = 0
L = (5/6 ± √(1/36)) / (2/6) = 3 or 2
The process is invertible.


(3) AR polynomial:
1 − (1/16)L^4 = 0
L = ±2 or L = ±2i,
which gives |L| = 2,
and thus the process is stationary. MA polynomial:
1 + (4/9)L^2 = 0
L = ±(3/2)i
|±(3/2)i| = 3/2 > 1

The process is invertible.
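The root conditions above can also be checked numerically. The following Python sketch (added here as an illustration, not part of the original solution) passes each polynomial's coefficients, highest power of L first, to numpy.roots and tests whether every zero has modulus greater than one.

```python
# Numerical check of the root conditions for processes (1)-(3) (illustration only).
# numpy.roots expects coefficients from the highest power of L down to the constant.
import numpy as np

polynomials = {
    "(1) AR: 1 + (3/4)L":            [3/4, 1],
    "(1) MA: 1 - L":                 [-1, 1],
    "(2) AR: 1 + L^2":               [1, 0, 1],
    "(2) MA: 1 - (5/6)L + (1/6)L^2": [1/6, -5/6, 1],
    "(3) AR: 1 - (1/16)L^4":         [-1/16, 0, 0, 0, 1],
    "(3) MA: 1 + (4/9)L^2":          [4/9, 0, 1],
}

for name, coeffs in polynomials.items():
    moduli = np.abs(np.roots(coeffs))
    outside = bool(np.all(moduli > 1))  # True if all zeros lie outside the closed unit disk
    print(f"{name}: |zeros| = {np.round(moduli, 4)}, all outside unit disk: {outside}")
```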

5.2 Let γ(·) be the autocovariance function of a weakly stationary process x_t. Show that
the following properties hold.
(i) γ(0) ≥ 0,
(ii) |γ(τ)| ≤ γ(0) for every τ ∈ T,
(iii) γ(τ) = γ(−τ) for every τ ∈ T.
Solution. Recall that a weakly stationary process x_t satisfies Var(x_t) < ∞ for every
t ∈ T. The autocovariance function of the process is

γ(τ) = Cov(x_t, x_{t−τ}) = E[(x_t − E(x_t))(x_{t−τ} − E(x_{t−τ}))],   t, τ ∈ T.

The first property holds, since

γ(0) = Var(x_t) = E[(x_t − E(x_t))^2] ≥ 0.




The second property is obtained by using the Cauchy-Schwarz inequality:

|E[(x_t − E(x_t))(x_{t−τ} − E(x_{t−τ}))]|^2 ≤ E[(x_t − E(x_t))^2] E[(x_{t−τ} − E(x_{t−τ}))^2]

⇒ |Cov(x_t, x_{t−τ})|^2 ≤ Var(x_t) Var(x_{t−τ}) = Var(x_t)^2

⇒ |γ(τ)| ≤ γ(0),


where Var(x_t) Var(x_{t−τ}) = Var(x_t)^2 holds because the variance of a stationary process is time invariant.

The third property follows from

γ(−τ) = Cov(x_t, x_{t+τ}) = Cov(x_{t+τ}, x_t) = Cov(x_t, x_{t−τ}) = γ(τ),

since the autocovariance of a stationary process depends only on the time difference
between the two random variables.
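For a concrete illustration of properties (i)-(iii), one can evaluate the autocovariance function of a specific stationary process. The short Python sketch below is an added example, not part of the original solution; the AR(1) process with ϕ = 0.6 and σ^2 = 1 is an assumption chosen purely for illustration. It uses the known AR(1) autocovariance γ(τ) = σ^2 ϕ^{|τ|} / (1 − ϕ^2) and checks the three properties over a range of lags.

```python
# Illustration of properties (i)-(iii) for an assumed stationary AR(1) process.
import numpy as np

phi, sigma2 = 0.6, 1.0  # assumed AR(1) coefficient and innovation variance

def gamma(tau):
    # Autocovariance of a stationary AR(1) process: sigma^2 * phi^|tau| / (1 - phi^2).
    return sigma2 * phi ** abs(tau) / (1.0 - phi ** 2)

taus = np.arange(-10, 11)
values = np.array([gamma(tau) for tau in taus])

print("(i)   gamma(0) >= 0:", gamma(0) >= 0)
print("(ii)  |gamma(tau)| <= gamma(0) for all tau:", bool(np.all(np.abs(values) <= gamma(0))))
print("(iii) gamma(tau) == gamma(-tau) for all tau:", bool(np.allclose(values, values[::-1])))
```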

5.3 Derive the optimal 3-step prediction for the stationary AR(2) process

x_t = ϕ_1 x_{t−1} + ϕ_2 x_{t−2} + ε_t,   ε_t ∼ iid(0, σ^2),

in the sense of mean squared error, when the process has been observed up to time t.
Assume that x_t and ε_s are independent when s > t. What is the recursive formula for
the s-step prediction?
Solution. The 1-step prediction is

x̂_{t+1|t} = E(x_{t+1} | x_t, x_{t−1}, ...)
          = E(ϕ_1 x_t + ϕ_2 x_{t−1} + ε_{t+1} | x_t, x_{t−1}, ...) = ϕ_1 x_t + ϕ_2 x_{t−1}.

The 2-step prediction can be obtained by using the 1-step prediction:

x̂_{t+2|t} = E(ϕ_1 x_{t+1} | x_t, ...) + E(ϕ_2 x_t | x_t, ...) + 0
          = ϕ_1 (ϕ_1 x_t + ϕ_2 x_{t−1}) + ϕ_2 x_t = (ϕ_1^2 + ϕ_2) x_t + ϕ_1 ϕ_2 x_{t−1}.


Similarly, the 3-step prediction is

x̂_{t+3|t} = E(ϕ_1 x_{t+2} | x_t, ...) + E(ϕ_2 x_{t+1} | x_t, ...)
          = ϕ_1 ((ϕ_1^2 + ϕ_2) x_t + ϕ_1 ϕ_2 x_{t−1}) + ϕ_1 ϕ_2 x_t + ϕ_2^2 x_{t−1}
          = (ϕ_1^3 + 2 ϕ_1 ϕ_2) x_t + (ϕ_1^2 ϕ_2 + ϕ_2^2) x_{t−1}.

The s-step prediction does not admit a simple closed-form representation, but it can be
computed recursively as

x̂_{t+s|t} = ϕ_1 x̂_{t+s−1|t} + ϕ_2 x̂_{t+s−2|t},

with x̂_{t|t} = x_t and x̂_{t−1|t} = x_{t−1}.
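The recursion translates directly into a few lines of code. The sketch below is an added illustration, not part of the original solution; the coefficient values ϕ_1 = 0.5, ϕ_2 = 0.3 and the last two observations are arbitrary assumptions. It iterates the recursion from x̂_{t|t} = x_t and x̂_{t−1|t} = x_{t−1} and checks the 3-step prediction against the closed form derived above.

```python
# Recursive s-step prediction for an AR(2) process (illustration only; values are assumed).

def ar2_predict(x_t, x_tm1, phi1, phi2, s):
    """Return the MSE-optimal s-step prediction x_hat_{t+s|t} for an AR(2) process,
    computed via x_hat_{t+s|t} = phi1 * x_hat_{t+s-1|t} + phi2 * x_hat_{t+s-2|t},
    starting from x_hat_{t|t} = x_t and x_hat_{t-1|t} = x_{t-1}."""
    prev, curr = x_tm1, x_t
    for _ in range(s):
        prev, curr = curr, phi1 * curr + phi2 * prev
    return curr

phi1, phi2 = 0.5, 0.3   # assumed (stationary) AR(2) coefficients
x_t, x_tm1 = 1.2, -0.4  # assumed last two observations

# 3-step prediction via the recursion and via the closed form derived above.
recursive = ar2_predict(x_t, x_tm1, phi1, phi2, 3)
closed_form = (phi1**3 + 2*phi1*phi2) * x_t + (phi1**2 * phi2 + phi2**2) * x_tm1
print(recursive, closed_form)  # both equal 0.444 (up to floating-point rounding)
```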


Homework
5.4 Consider the following ARMA processes:

x_t − x_{t−1} + x_{t−2} = ε_t + ε_{t−1} − 6 ε_{t−2},    (4)

x_t + (1/2) x_{t−1} = ε_t + (4/3) ε_{t−1} + (1/3) ε_{t−2},    (5)

x_t − (1/2) x_{t−1} = ε_t + ε_{t−12}.    (6)
Which of the processes are stationary? Which of the processes are invertible?

5.5 Derive the optimal s-step prediction for the invertible MA(q) process

x_t = Σ_{i=0}^{q} θ_i L^i ε_t,    θ_0 = 1,    ε_t ∼ iid(0, σ^2),

in the sense of mean squared error, when the process ε_t has been observed up to time t.

