Handout 2: ARMA
Laura Mayoral
where
1. ψ_0 = 1 and ∑_{j=0}^∞ ψ_j² < ∞,

Φ_p(L) Z_t = Θ_q(L) a_t
  (AR(p) part)   (MA(q) part)
MA(q) processes
Moving Average of order 1, MA(1)
Z_t = µ + a_t + θ a_{t−1}   →   MA(1)
- Expectation
E(Z_t) = µ + E(a_t) + θ E(a_{t−1}) = µ

- Variance
Var(Z_t) = E(Z_t − µ)² = E(a_t + θ a_{t−1})²
         = E(a_t² + θ² a_{t−1}² + 2θ a_t a_{t−1}) = σ_a²(1 + θ²)
- Autocovariance, 1st order
E(Z_t − µ)(Z_{t−1} − µ) = E(a_t + θ a_{t−1})(a_{t−1} + θ a_{t−2})
  = E(a_t a_{t−1} + θ a_{t−1}² + θ a_t a_{t−2} + θ² a_{t−1} a_{t−2}) = θ σ_a²

- Autocovariance of higher order
γ_j = 0 for j > 1, since Z_t and Z_{t−j} then share no common a's.

- Autocorrelation
ρ_1 = γ_1/γ_0 = θσ_a² / ((1 + θ²)σ_a²) = θ/(1 + θ²)
ρ_j = 0,   j > 1
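The MA(1) moments above are easy to check by simulation. A minimal sketch (the values θ = 0.6, σ_a = 1 and the seed are arbitrary illustrations, not from the text):

```python
# Numerical check of the MA(1) moments: Var = (1+theta^2)*sigma_a^2,
# rho_1 = theta/(1+theta^2), rho_j = 0 for j > 1.
import numpy as np

rng = np.random.default_rng(0)
theta, T = 0.6, 200_000
a = rng.standard_normal(T + 1)       # white noise a_t with sigma_a^2 = 1
z = a[1:] + theta * a[:-1]           # Z_t = a_t + theta * a_{t-1}  (mu = 0)

zc = z - z.mean()
gamma0 = np.mean(zc * zc)            # sample variance, ~ (1 + theta^2)
rho1 = np.mean(zc[1:] * zc[:-1]) / gamma0
rho2 = np.mean(zc[2:] * zc[:-2]) / gamma0

print(rho1, theta / (1 + theta**2))  # sample vs. theoretical rho_1
print(rho2)                          # should be near 0
```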
Partial autocorrelation: unlike the ACF, the PACF of an MA(1) does not cut off; it decays geometrically.
MA(1): Stationarity and Ergodicity
Stationarity
The MA(1) process is always covariance-stationary because
E(Z_t) = µ,   Var(Z_t) = (1 + θ²)σ²,
ρ_1 = γ_1/γ_0 = θσ²/((1 + θ²)σ²) = θ/(1 + θ²),   ρ_j = 0 for j > 1,
and none of these moments depend on t.
Ergodicity
The MA(1) process is ergodic for first and second moments because
∑_{j=0}^∞ |γ_j| = σ²(1 + θ²) + |θ|σ² < ∞
Z_t = µ + a_t + θ_1 a_{t−1} + θ_2 a_{t−2} + … + θ_q a_{t−q}
First and Second moments of a MA(q)
E(Z_t) = µ
γ_0 = var(Z_t) = (1 + θ_1² + θ_2² + … + θ_q²) σ_a²
γ_j = E(a_t + θ_1 a_{t−1} + … + θ_q a_{t−q})(a_{t−j} + θ_1 a_{t−j−1} + … + θ_q a_{t−j−q})

γ_j = (θ_j + θ_{j+1}θ_1 + θ_{j+2}θ_2 + … + θ_q θ_{q−j}) σ²   for j ≤ q
γ_j = 0                                                      for j > q

ρ_j = γ_j/γ_0 = (θ_j + θ_{j+1}θ_1 + θ_{j+2}θ_2 + … + θ_q θ_{q−j}) / (1 + ∑_{i=1}^q θ_i²)
Example: MA(2)
ρ_1 = (θ_1 + θ_1θ_2)/(1 + θ_1² + θ_2²)
ρ_2 = θ_2/(1 + θ_1² + θ_2²)
ρ_3 = ρ_4 = … = ρ_k = 0 for k ≥ 3
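The MA(2) formulas can be cross-checked against the general MA(q) autocovariance γ_j = σ² ∑_i θ_i θ_{i+j} with the convention θ_0 = 1. A sketch (θ_1 = 0.5, θ_2 = −0.3 are illustrative values):

```python
# Verify the MA(2) closed forms against the general MA(q) formula,
# using theta_0 = 1 and setting sigma^2 = 1 (it cancels in rho_j).
theta = [1.0, 0.5, -0.3]   # theta_0, theta_1, theta_2

def gamma(j, th):
    # gamma_j / sigma^2 = sum_i theta_i * theta_{i+j}; empty sum for j > q
    return sum(th[i] * th[i + j] for i in range(len(th) - j))

rho1 = gamma(1, theta) / gamma(0, theta)
rho2 = gamma(2, theta) / gamma(0, theta)

t1, t2 = theta[1], theta[2]
rho1_formula = (t1 + t1 * t2) / (1 + t1**2 + t2**2)   # slide's closed form
rho2_formula = t2 / (1 + t1**2 + t2**2)
print(rho1, rho1_formula)
print(rho2, rho2_formula)
```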
Invertibility: definition
A MA(q) process is invertible if there exists a sequence {π_j} with
∑_{j=0}^∞ |π_j| < ∞
and
a_t = ∑_{j=0}^∞ π_j Z_{t−j},   t = 0, ±1, …
Theorem:
Let {Z_t} be a MA(q). Then {Z_t} is invertible if and
only if θ(x) ≠ 0 for all x ∈ ℂ such that |x| ≤ 1.
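The theorem suggests a simple numerical check: compute the roots of θ(x) and verify they all lie outside the unit circle. A sketch (the helper `is_invertible` and the example polynomials are illustrations, not from the text):

```python
# Root-based invertibility check for an MA(q) polynomial
# theta(x) = 1 + t1*x + ... + tq*x^q.
import numpy as np

def is_invertible(ma_coeffs):
    """ma_coeffs = [t1, ..., tq]. Invertible iff every root of theta(x)
    lies strictly outside the unit circle (theta(x) != 0 for |x| <= 1)."""
    roots = np.roots([1.0, *ma_coeffs][::-1])  # np.roots wants highest degree first
    return bool(np.all(np.abs(roots) > 1.0))

print(is_invertible([0.5]))   # theta(x) = 1 + 0.5x, root at -2   -> invertible
print(is_invertible([2.0]))   # theta(x) = 1 + 2x,  root at -0.5  -> not invertible
```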
MA(∞): 1) Is it stationary? Is it ergodic?

ρ_j = ∑_{i=0}^∞ ψ_i ψ_{i+j} / ∑_{i=0}^∞ ψ_i²
MA(∞): stationarity condition
∑_{i=0}^∞ ψ_i² < ∞
Proposition 1.
∑_{i=0}^∞ |ψ_i| < ∞  ⇒  ∑_{i=0}^∞ ψ_i² < ∞
(absolutely summable)    (square summable)

Proposition 2.
∑_{i=0}^∞ |ψ_i| < ∞  ⇒  ∑_{i=0}^∞ |γ_i| < ∞
Proof 1.  ∑_{i=0}^∞ |ψ_i| < ∞ ⇒ ∑_{i=0}^∞ ψ_i² < ∞

If ∑_{i=0}^∞ |ψ_i| < ∞ ⇒ ∃ N < ∞ such that |ψ_i| < 1 ∀ i ≥ N.
Then ψ_i² < |ψ_i| ∀ i ≥ N ⇒ ∑_{i=N}^∞ ψ_i² < ∑_{i=N}^∞ |ψ_i|.
Now,
∑_{i=0}^∞ ψ_i² = ∑_{i=0}^{N−1} ψ_i² + ∑_{i=N}^∞ ψ_i² < ∑_{i=0}^{N−1} ψ_i² + ∑_{i=N}^∞ |ψ_i|
                      (1)                                  (2)
(1) is finite because N is finite.
(2) is finite because {ψ_i} is absolutely summable.
Proof 2.
γ_j = σ² ∑_{i=0}^∞ ψ_i ψ_{i+j}

|γ_j| = |σ² ∑_{i=0}^∞ ψ_i ψ_{i+j}| ≤ σ² ∑_{i=0}^∞ |ψ_i ψ_{i+j}|

∑_{j=0}^∞ |γ_j| ≤ σ² ∑_{j=0}^∞ ∑_{i=0}^∞ |ψ_i ψ_{i+j}| = σ² ∑_{j=0}^∞ ∑_{i=0}^∞ |ψ_i||ψ_{i+j}| =
  = σ² ∑_{i=0}^∞ |ψ_i| ∑_{j=0}^∞ |ψ_{i+j}| < σ² ∑_{i=0}^∞ |ψ_i| M < σ² M² < ∞

because by assumption ∑_{j=0}^∞ |ψ_{i+j}| < M.
AR(p) processes
AR(1) process
Z_t = c + φ Z_{t−1} + a_t
AR(1): Stationarity
Z_t = c + φc + φ² Z_{t−2} + φ a_{t−1} + a_t = …
    = c(1 + φ + φ² + …) + a_t + φ a_{t−1} + φ² a_{t−2} + …
      (geometric progression)     (MA(∞))

If |φ| < 1 ⇒
(1) 1 + φ + φ² + … = 1/(1 − φ)   (bounded sequence)
(2) ∑_{j=0}^∞ ψ_j² = ∑_{j=0}^∞ φ^{2j} = 1/(1 − φ²) < ∞ if |φ| < 1

∑_{j=0}^∞ ψ_j² < ∞ is a sufficient condition for stationarity.
AR(1): First and second order moments
E(Z_t) = µ = c/(1 − φ),   γ_0 = var(Z_t) = σ_a²/(1 − φ²)
Autocovariance of a stationary AR(1)
γ_j = E(Z_t − µ)(Z_{t−j} − µ) = φ E[(Z_{t−1} − µ)(Z_{t−j} − µ)] + E[a_t (Z_{t−j} − µ)] = φ γ_{j−1}

γ_j = φ γ_{j−1},   j ≥ 1
Autocorrelation of a stationary AR(1)
ρ_j = γ_j/γ_0 = φ γ_{j−1}/γ_0 = φ ρ_{j−1},   j ≥ 1
ρ_j = φ² ρ_{j−2} = φ³ ρ_{j−3} = … = φ^j ρ_0 = φ^j
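The geometric decay ρ_j = φ^j can be verified on simulated data. A sketch (φ = 0.7, σ_a = 1 and the seed are arbitrary choices):

```python
# Simulate a stationary AR(1) and compare the sample ACF with phi^j.
import numpy as np

rng = np.random.default_rng(1)
phi, T = 0.7, 200_000
a = rng.standard_normal(T)
z = np.zeros(T)
for t in range(1, T):
    z[t] = phi * z[t - 1] + a[t]     # Z_t = phi * Z_{t-1} + a_t  (c = 0)

zc = z - z.mean()
g0 = np.mean(zc * zc)
acf = [np.mean(zc[j:] * zc[:-j]) / g0 for j in range(1, 4)]
print(acf)                            # sample rho_1, rho_2, rho_3
print([phi**j for j in range(1, 4)])  # theoretical phi^j
```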
AR(1): Partial autocorrelation function
φ_11 = ρ_1 = φ

φ_22 = det[[1, ρ_1], [ρ_1, ρ_2]] / det[[1, ρ_1], [ρ_1, 1]] = (ρ_2 − ρ_1²)/(1 − ρ_1²) = (φ² − φ²)/(1 − ρ_1²) = 0

φ_kk = 0,   k ≥ 2
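The cutoff φ_kk = 0 for k ≥ 2 can be reproduced by running the Durbin-Levinson recursion on the theoretical ACF ρ_j = φ^j. A sketch (the recursion is a standard way to compute partial autocorrelations; φ = 0.6 is an arbitrary illustration):

```python
# Durbin-Levinson: partial autocorrelations phi_kk from an ACF rho_0..rho_K.
phi = 0.6
rho = [phi**j for j in range(6)]        # theoretical AR(1) ACF

def pacf(rho, kmax):
    out = []
    a = []                              # AR(k-1) coefficients phi_{k-1,1..k-1}
    for k in range(1, kmax + 1):
        num = rho[k] - sum(a[j] * rho[k - 1 - j] for j in range(k - 1))
        den = 1.0 - sum(a[j] * rho[j + 1] for j in range(k - 1))
        phikk = num / den
        # update coefficients: phi_{k,j} = phi_{k-1,j} - phi_kk * phi_{k-1,k-j}
        a = [a[j] - phikk * a[k - 2 - j] for j in range(k - 1)] + [phikk]
        out.append(phikk)
    return out

p = pacf(rho, 4)
print(p)   # first value ~ phi, the rest ~ 0
```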
AR(1): Ergodicity
Iterating we obtain
Z_t = a_t + φ_1 a_{t−1} + … + φ_1^k a_{t−k} + φ_1^{k+1} Z_{t−k−1}.
If |φ_1| < 1, we showed that
Z_t = ∑_{j=0}^∞ φ_1^j a_{t−j}.
It is customary to restrict attention to AR(1) processes with |φ_1| < 1,
which have such MA(∞) representations.
Causality and Stationarity, III

Let 1/α_1 and 1/α_2 be the roots of the AR polynomial,
so that 1 − φ_1 L − φ_2 L² = (1 − α_1 L)(1 − α_2 L).
Then φ_1 = α_1 + α_2 and φ_2 = −α_1 α_2.
Z_t is stationary iff |α_i| < 1, i = 1, 2.
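This condition is easy to check numerically: compute the roots of 1 − φ_1 x − φ_2 x², invert them to get the α_i, and test |α_i| < 1. A sketch (φ_1 = 0.5, φ_2 = 0.3 are illustrative values):

```python
# AR(2) stationarity via the inverse roots alpha_i of the AR polynomial.
import numpy as np

def ar2_inverse_roots(phi1, phi2):
    # roots x_i of 1 - phi1*x - phi2*x^2 = 0 (highest degree first for np.roots)
    roots = np.roots([-phi2, -phi1, 1.0])
    return 1.0 / roots                  # alpha_i = 1/x_i

alphas = ar2_inverse_roots(0.5, 0.3)
print(np.abs(alphas), "stationary:", bool(np.all(np.abs(alphas) < 1)))

# the factorization identities phi1 = alpha1 + alpha2, phi2 = -alpha1*alpha2
a1, a2 = alphas
print(a1 + a2, -(a1 * a2))
```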
First and Second order moments

Mean of AR(2)
E(Z_t) = c + φ_1 E(Z_t) + φ_2 E(Z_t) ⇒ E(Z_t) = µ = c/(1 − φ_1 − φ_2)

Variance
γ_0 = E(Z_t − µ)² = φ_1 E(Z_{t−1} − µ)(Z_t − µ) + φ_2 E(Z_{t−2} − µ)(Z_t − µ) + E(Z_t − µ)a_t
γ_0 = φ_1 γ_1 + φ_2 γ_2 + σ_a²     (using γ_{−j} = γ_j)
γ_0 = φ_1 ρ_1 γ_0 + φ_2 ρ_2 γ_0 + σ_a²
γ_0 = σ_a² / (1 − φ_1 ρ_1 − φ_2 ρ_2)
Autocorrelation function
γ_j = E(Z_t − µ)(Z_{t−j} − µ) = φ_1 γ_{j−1} + φ_2 γ_{j−2},   j ≥ 1

Difference equation:
ρ_j = φ_1 ρ_{j−1} + φ_2 ρ_{j−2},   j ≥ 1

j = 1:  ρ_1 = φ_1 ρ_0 + φ_2 ρ_1  →  ρ_1 = φ_1/(1 − φ_2)
j = 2:  ρ_2 = φ_1 ρ_1 + φ_2 ρ_0  →  ρ_2 = φ_1²/(1 − φ_2) + φ_2
j = 3:  ρ_3 = φ_1 ρ_2 + φ_2 ρ_1,  etc.
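The closed forms for ρ_1 and ρ_2 agree with running the difference equation forward from ρ_0 = 1. A sketch (φ_1 = 0.5, φ_2 = 0.3 are illustrative):

```python
# Check the AR(2) closed forms against the recursion
# rho_j = phi1*rho_{j-1} + phi2*rho_{j-2}.
phi1, phi2 = 0.5, 0.3

rho1 = phi1 / (1 - phi2)                # closed form for rho_1
rho2 = phi1**2 / (1 - phi2) + phi2      # closed form for rho_2

rho = [1.0, rho1]                       # rho_0 = 1, then iterate
for j in range(2, 6):
    rho.append(phi1 * rho[j - 1] + phi2 * rho[j - 2])

print(rho2, rho[2])                     # the two rho_2 values should agree
```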
Partial Autocorrelations
φ_11 = ρ_1 = φ_1/(1 − φ_2);   φ_22 = (ρ_2 − ρ_1²)/(1 − ρ_1²);   φ_kk = 0, k ≥ 3

(With complex roots, the ACF oscillates as a damped sine wave.)
AR(p) process
Z_t = c + φ_1 Z_{t−1} + φ_2 Z_{t−2} + … + φ_p Z_{t−p} + a_t
Causality
Autocorrelation Function
ρ_k = φ_1 ρ_{k−1} + φ_2 ρ_{k−2} + … + φ_p ρ_{k−p}

The first p autocorrelations give a system of equations:
ρ_1 = φ_1 ρ_0 + φ_2 ρ_1 + … + φ_p ρ_{p−1}
ρ_2 = φ_1 ρ_1 + φ_2 ρ_0 + … + φ_p ρ_{p−2}
…
ρ_p = φ_1 ρ_{p−1} + φ_2 ρ_{p−2} + … + φ_p ρ_0
(p equations and p unknowns)

The ACF decays as a mixture of exponentials and/or damped sine waves,
depending on whether the roots are real or complex.
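The p-equation system is linear in φ_1, …, φ_p, so given the first p autocorrelations the coefficients can be recovered with a linear solve. A sketch for p = 2, with illustrative values φ_1 = 0.5, φ_2 = 0.3:

```python
# Solve the Yule-Walker system R * phi = r for an AR(2).
import numpy as np

phi_true = np.array([0.5, 0.3])
# build rho_0..rho_2: rho_1 from its closed form, rho_2 from the recursion
rho = [1.0, phi_true[0] / (1 - phi_true[1])]
rho.append(phi_true[0] * rho[1] + phi_true[1] * rho[0])

p = 2
R = np.array([[rho[abs(i - j)] for j in range(p)] for i in range(p)])  # Toeplitz
r = np.array(rho[1:p + 1])
phi_hat = np.linalg.solve(R, r)          # p equations, p unknowns
print(phi_hat)                           # recovers phi_1, phi_2
```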
PACF
φ_kk = 0 for k > p
Relationship between AR(p) and MA(q)
A stationary AR(p) has an MA(∞) representation.
An invertible MA(q) has an AR(∞) representation.
ARMA(p,q) Processes

ARMA(p,q):  Φ_p(L) Z_t = Θ_q(L) a_t

Invertibility → roots of Θ_q(x) = 0 satisfy |x| > 1
Stationarity  → roots of Φ_p(x) = 0 satisfy |x| > 1

Pure AR representation → Π(L) Z_t = (Φ_p(L)/Θ_q(L)) Z_t = a_t
Pure MA representation → Z_t = (Θ_q(L)/Φ_p(L)) a_t = Ψ(L) a_t
ARMA(1,1)
(1 − φL) Z_t = (1 − θL) a_t
stationarity → |φ| < 1
invertibility → |θ| < 1
pure AR form → Π(L) Z_t = a_t,   π_j = (φ − θ) θ^{j−1},   j ≥ 1
pure MA form → Z_t = Ψ(L) a_t,   ψ_j = (φ − θ) φ^{j−1},   j ≥ 1
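The ψ weights follow from matching coefficients in (1 − θL) = (1 − φL)Ψ(L), which gives the recursion ψ_1 = φ − θ and ψ_j = φψ_{j−1} for j ≥ 2. A sketch verifying this against the closed form (φ = 0.7, θ = 0.4 are illustrative):

```python
# psi weights of the ARMA(1,1) pure MA form, by recursion vs. closed form.
phi, theta = 0.7, 0.4

psi = [1.0]                              # psi_0 = 1
for j in range(1, 6):
    # matching coefficients: psi_j - phi*psi_{j-1} = -theta if j == 1, else 0
    psi.append(phi * psi[j - 1] - (theta if j == 1 else 0.0))

psi_formula = [(phi - theta) * phi**(j - 1) for j in range(1, 6)]
print(psi[1:])
print(psi_formula)
```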
ACF of ARMA(1,1)
Z_t Z_{t−k} = φ Z_{t−1} Z_{t−k} + a_t Z_{t−k} − θ a_{t−1} Z_{t−k}
taking expectations
γ_k = φ γ_{k−1} + E(a_t Z_{t−k}) − θ E(a_{t−1} Z_{t−k})

ρ_k = 1                                      k = 0
ρ_k = (φ − θ)(1 − φθ)/(1 + θ² − 2φθ)         k = 1
ρ_k = φ ρ_{k−1}                              k ≥ 2
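The ρ_1 formula and the recursion ρ_k = φρ_{k−1} can be checked on a simulated ARMA(1,1). A sketch (φ = 0.7, θ = 0.4, σ = 1 and the seed are arbitrary choices):

```python
# Simulate an ARMA(1,1) and compare the sample ACF with the formulas above.
import numpy as np

rng = np.random.default_rng(2)
phi, theta, T = 0.7, 0.4, 300_000
a = rng.standard_normal(T)
z = np.zeros(T)
for t in range(1, T):
    z[t] = phi * z[t - 1] + a[t] - theta * a[t - 1]   # (1-phi L)Z = (1-theta L)a

zc = z - z.mean()
g0 = np.mean(zc * zc)
rho = [np.mean(zc[k:] * zc[:-k]) / g0 for k in (1, 2)]

rho1_theory = (phi - theta) * (1 - phi * theta) / (1 + theta**2 - 2 * phi * theta)
print(rho[0], rho1_theory)            # rho_1 vs. theory
print(rho[1], phi * rho1_theory)      # rho_2 = phi * rho_1
```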
PACF
Since MA(1) ⊂ ARMA(1,1), the PACF does not cut off: it decays exponentially, like that of a MA process.