
Introduction to Time Series Analysis

Handout 2: Stationary Processes. Wold


Decomposition and ARMA processes

Laura Mayoral

IAE and BGSE


IDEA, Winter 2019
• This lecture introduces the basic linear models for stationary processes.

• Most economic variables are non-stationary.

• However, stationary linear models are used as building blocks in more complicated nonlinear and/or non-stationary models.
Roadmap

§ The Wold decomposition

§ From the Wold decomposition to the ARMA representation

§ MA processes and invertibility

§ AR processes, stationarity and causality

§ ARMA, invertibility and causality


The Wold Decomposition

Wold theorem in words:

Any stationary process {Z_t} can be expressed as a sum of two components:
- a stochastic component: a linear combination of lags of a white noise process;
- a deterministic component, uncorrelated with the stochastic component.
The Wold Decomposition

If {Z_t} is a nondeterministic stationary time series, then:

Z_t = ∑_{j=0}^∞ ψ_j a_{t−j} + V_t = Ψ(L)a_t + V_t,

where

1. ψ_0 = 1 and ∑_{j=0}^∞ ψ_j² < ∞,
2. a_t = Z_t − P(Z_t | Z_{t−1}, Z_{t−2}, ...), where P(·|·) denotes linear projection,
3. {a_t} is WN(0, σ²), with σ² > 0,
4. Cov(a_s, V_t) = 0 for all s and t,
5. the ψ_j's and the a_t's are unique,
6. {V_t} is deterministic.
Importance of the Wold Decomposition

• This theorem implies that any stationary process can be written as a linear combination of lagged values of a white noise process (this is the MA(∞) representation).
• By inverting the corresponding polynomial, we can obtain a representation of Z_t that depends on past values of the variable and the contemporaneous value of the white noise.
• This is the AR(∞) representation of Z_t.
• We will see that the AR representation can be estimated using standard methods: OLS! (See the sketch right after this list.)
• Problem: we might need to estimate a lot of parameters.
• ARMA models: they are an approximation to the former representations that tries to be more parsimonious (= fewer parameters).
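As a quick illustration of the OLS point, here is a minimal sketch (not from the handout; the simulated AR(1) data, the order p = 3, and all names are illustrative) of estimating a finite-order AR approximation by least squares:

```python
# Minimal sketch: estimating an AR(p) approximation by OLS with numpy.
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary AR(1), z_t = 0.7 z_{t-1} + a_t, as example data.
T, phi = 500, 0.7
a = rng.standard_normal(T)
z = np.zeros(T)
for t in range(1, T):
    z[t] = phi * z[t - 1] + a[t]

p = 3  # order of the AR approximation (an arbitrary choice)
# Build the regressor matrix [1, z_{t-1}, ..., z_{t-p}].
X = np.column_stack([np.ones(T - p)] + [z[p - j:T - j] for j in range(1, p + 1)])
y = z[p:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # intercept and AR coefficients; beta[1] should be near 0.7
```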
Birth of ARMA(p,q) models

Under general conditions the infinite lag polynomial of the Wold Decomposition can be approximated by the ratio of two finite lag polynomials:

Ψ(L) ≈ Θ_q(L) / Φ_p(L)

Therefore

Z_t = Ψ(L)a_t ≈ (Θ_q(L) / Φ_p(L)) a_t,

Φ_p(L)Z_t = Θ_q(L)a_t

(1 − φ_1L − ... − φ_pL^p)Z_t = (1 + θ_1L + ... + θ_qL^q)a_t

Z_t − φ_1Z_{t−1} − ... − φ_pZ_{t−p} = a_t + θ_1a_{t−1} + ... + θ_qa_{t−q}

The left-hand side is the AR(p) part, the right-hand side the MA(q) part.
MA(q) processes

Moving Average of order 1, MA(1)

Let {a_t} be a zero-mean white noise process, a_t ~ WN(0, σ_a²).

Z_t = μ + a_t + θa_{t−1} → MA(1)

- Expectation

E(Z_t) = μ + E(a_t) + θE(a_{t−1}) = μ

- Variance

Var(Z_t) = E(Z_t − μ)² = E(a_t + θa_{t−1})² = E(a_t² + θ²a_{t−1}² + 2θa_t a_{t−1}) = σ_a²(1 + θ²)

- Autocovariance

1st order:

γ_1 = E(Z_t − μ)(Z_{t−1} − μ) = E(a_t + θa_{t−1})(a_{t−1} + θa_{t−2}) = E(a_t a_{t−1} + θa_{t−1}² + θa_t a_{t−2} + θ²a_{t−1}a_{t−2}) = θσ_a²

Higher order (j > 1):

γ_j = E(Z_t − μ)(Z_{t−j} − μ) = E(a_t + θa_{t−1})(a_{t−j} + θa_{t−j−1}) = E(a_t a_{t−j} + θa_{t−1}a_{t−j} + θa_t a_{t−j−1} + θ²a_{t−1}a_{t−j−1}) = 0

- Autocorrelation

ρ_1 = γ_1/γ_0 = θσ²/((1 + θ²)σ²) = θ/(1 + θ²)
ρ_j = 0 for j > 1

- Partial autocorrelation

Unlike the ACF, the PACF of an MA(1) does not cut off: it decays gradually with the lag.
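A quick simulation sketch (illustrative, not part of the handout; the sample size and θ = 0.5 are arbitrary choices) checking the MA(1) moments above: the sample first-order autocorrelation should be close to θ/(1 + θ²) and higher-order ones close to zero.

```python
# Numerical check of the MA(1) ACF: rho_1 = theta/(1+theta^2), rho_j ≈ 0 for j > 1.
import numpy as np

rng = np.random.default_rng(1)
T, mu, theta = 100_000, 0.0, 0.5
a = rng.standard_normal(T + 1)
z = mu + a[1:] + theta * a[:-1]          # Z_t = mu + a_t + theta * a_{t-1}

def sample_acf(x, lag):
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

print(sample_acf(z, 1), theta / (1 + theta**2))  # both ≈ 0.4
print(sample_acf(z, 2))                          # ≈ 0
```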
MA(1): Stationarity and Ergodicity

Stationarity

An MA(1) process is always covariance-stationary because

E(Z_t) = μ,  Var(Z_t) = (1 + θ²)σ²

ρ_1 = γ_1/γ_0 = θσ²/((1 + θ²)σ²) = θ/(1 + θ²),  ρ_j = 0 for j > 1

Ergodicity

An MA(1) process is ergodic for first and second moments because

∑_{j=0}^∞ |γ_j| = (1 + θ²)σ² + |θ|σ² < ∞

If a_t were Gaussian, then Z_t would be ergodic for all moments.


MA(q) processes

A process is MA(q) if it can be written as a linear combination of q lags of a white noise process:

Z_t = μ + a_t + θ_1a_{t−1} + θ_2a_{t−2} + ... + θ_qa_{t−q}

First and second moments of an MA(q)

E(Z_t) = μ

γ_0 = Var(Z_t) = (1 + θ_1² + θ_2² + ... + θ_q²)σ_a²

γ_j = E(a_t + θ_1a_{t−1} + ... + θ_qa_{t−q})(a_{t−j} + θ_1a_{t−j−1} + ... + θ_qa_{t−j−q})

γ_j = (θ_j + θ_{j+1}θ_1 + θ_{j+2}θ_2 + ... + θ_qθ_{q−j})σ²  for j ≤ q
γ_j = 0  for j > q

ρ_j = γ_j/γ_0 = (θ_j + θ_{j+1}θ_1 + θ_{j+2}θ_2 + ... + θ_qθ_{q−j}) / (1 + ∑_{i=1}^q θ_i²)

Example: MA(2)

ρ_1 = (θ_1 + θ_1θ_2)/(1 + θ_1² + θ_2²),  ρ_2 = θ_2/(1 + θ_1² + θ_2²),  ρ_3 = ρ_4 = ... = ρ_k = 0
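The MA(q) autocorrelation formula above can be evaluated directly. Below is a small helper (an illustrative sketch, not from the handout; `ma_acf` is a made-up name) using the convention θ_0 = 1:

```python
# Theoretical ACF of an MA(q) from its coefficients, with theta_0 = 1.
import numpy as np

def ma_acf(thetas, max_lag):
    """ACF of Z_t = mu + a_t + theta_1 a_{t-1} + ... + theta_q a_{t-q}."""
    th = np.concatenate(([1.0], np.asarray(thetas, dtype=float)))  # theta_0 = 1
    q = len(th) - 1
    gamma0 = np.sum(th**2)
    rho = [1.0]
    for j in range(1, max_lag + 1):
        # gamma_j = sum_{i=0}^{q-j} theta_i theta_{i+j} for j <= q, else 0
        rho.append(np.sum(th[: q - j + 1] * th[j:]) / gamma0 if j <= q else 0.0)
    return np.array(rho)

print(ma_acf([0.4, 0.2], 4))  # MA(2): rho_1, rho_2 nonzero, rho_3 = rho_4 = 0
```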
Invertibility: definition

- An MA(q) process is said to be invertible if there exists a sequence of constants {π_j} such that ∑_{j=0}^∞ |π_j| < ∞ and

a_t = ∑_{j=0}^∞ π_j Z_{t−j},  t = 0, ±1, ...

- In other words, Z_t is invertible if it admits an autoregressive representation.

Necessary and Sufficient Conditions for Invertibility

Theorem:
Let {Z_t} be an MA(q). Then {Z_t} is invertible if and only if θ(x) ≠ 0 for all x ∈ C such that |x| ≤ 1.

The coefficients {π_j} are determined by the relation:

π(x) = ∑_{j=0}^∞ π_j x^j = 1/θ(x),  |x| ≤ 1.
MA processes are not uniquely identified

Consider the autocorrelation functions of these two MA(1) processes:

Z_t = μ + a_t + θa_{t−1}    Z*_t = μ + a*_t + (1/θ)a*_{t−1}

The autocorrelation functions are:

1) ρ_1 = θ/(1 + θ²)

2) ρ*_1 = (1/θ)/(1 + (1/θ)²) = θ/(1 + θ²)
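A two-line numerical check of this equivalence (illustrative, not from the handout; θ = 1.3 is an arbitrary value):

```python
# theta and 1/theta imply the same first-order autocorrelation.
theta = 1.3
rho = lambda th: th / (1 + th**2)
print(rho(theta), rho(1 / theta))  # identical values, ≈ 0.4833
```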
MA processes are not uniquely identified, II

§ Thus, these two processes show an identical correlation pattern: the MA coefficient is not uniquely identified.

§ In other words: any MA(1) process has two representations (one with MA parameter larger than 1, and the other with MA parameter smaller than 1).
MA processes are not uniquely identified, III

• This means that each MA(1) has two representations: one that is invertible, another one that is not.

• We prefer representations that are invertible, so we will choose the representation with |θ| < 1.
MA processes are not uniquely identified, IV

• The same problem is present for MA(q) processes.

• In this case, one needs to look at the roots of the MA(q) polynomial: roots smaller than 1 in absolute value imply non-invertibility.

• There is always an invertible representation, obtained by inverting the roots that are smaller than 1 in absolute value.
Exercise

Consider an MA(1) process with an MA coefficient equal to 1.3.

1) Is it stationary? Is it ergodic?

2) Is it invertible? If it is not, suggest an alternative representation that has an identical autocorrelation structure and is invertible.
MA(∞)

This is the most general MA process. It contains an infinite number of lags of a white noise process:

Z_t = μ + ∑_{j=0}^∞ ψ_j a_{t−j},  ψ_0 = 1
MA(∞): moments

E(Z_t) = μ,  Var(Z_t) = σ_a² ∑_{i=0}^∞ ψ_i²

γ_j = E[(Z_t − μ)(Z_{t−j} − μ)] = σ² ∑_{i=0}^∞ ψ_i ψ_{i+j}

ρ_j = (∑_{i=0}^∞ ψ_i ψ_{i+j}) / (∑_{i=0}^∞ ψ_i²)

MA(∞): stationarity condition

Notice that in order to define the second-order moments we need

∑_{i=0}^∞ ψ_i² < ∞

The process is covariance-stationary provided the former condition holds.

Some interesting results

Proposition 1.

∑_{i=0}^∞ |ψ_i| < ∞ (absolutely summable) ⇒ ∑_{i=0}^∞ ψ_i² < ∞ (square summable)

Proposition 2.

∑_{i=0}^∞ |ψ_i| < ∞ ⇒ ∑_{i=0}^∞ |γ_i| < ∞ (ergodic for second moments)


Proof 1. ∑_{i=0}^∞ |ψ_i| < ∞ ⇒ ∑_{i=0}^∞ ψ_i² < ∞

If ∑_{i=0}^∞ |ψ_i| < ∞, then ∃ N < ∞ such that |ψ_i| < 1 ∀ i ≥ N.

Then ψ_i² < |ψ_i| ∀ i ≥ N, so ∑_{i=N}^∞ ψ_i² < ∑_{i=N}^∞ |ψ_i|.

Now,

∑_{i=0}^∞ ψ_i² = ∑_{i=0}^{N−1} ψ_i² + ∑_{i=N}^∞ ψ_i² < ∑_{i=0}^{N−1} ψ_i² (1) + ∑_{i=N}^∞ |ψ_i| (2)

(1) is finite because N is finite.
(2) is finite because {ψ_i} is absolutely summable.
Proof 2.

γ_j = σ² ∑_{i=0}^∞ ψ_i ψ_{i+j}

|γ_j| = |σ² ∑_{i=0}^∞ ψ_i ψ_{i+j}| ≤ σ² ∑_{i=0}^∞ |ψ_i||ψ_{i+j}|

∑_{j=0}^∞ |γ_j| ≤ σ² ∑_{j=0}^∞ ∑_{i=0}^∞ |ψ_i||ψ_{i+j}| = σ² ∑_{i=0}^∞ |ψ_i| ∑_{j=0}^∞ |ψ_{i+j}| < σ² ∑_{i=0}^∞ |ψ_i| M < σ² M² < ∞

because by assumption ∑_{j=0}^∞ |ψ_{i+j}| < M.
AR(p) processes

AR(1) process

An autoregressive process Z_t is a function of its own past and a contemporaneous value of a white noise sequence:

Z_t = c + φZ_{t−1} + a_t
AR(1): Stationarity

An AR(1) process is stationary if |φ| < 1.

Iterating backwards:

Z_t = c + φc + φ²Z_{t−2} + φa_{t−1} + a_t = c(1 + φ + φ² + ...) + a_t + φa_{t−1} + φ²a_{t−2} + ...
        (geometric progression)            (MA(∞))

If |φ| < 1, then:

(1) 1 + φ + φ² + ... = 1/(1 − φ) (bounded sequence)

(2) ∑_{j=0}^∞ ψ_j² = ∑_{j=0}^∞ φ^{2j} = 1/(1 − φ²) < ∞ if |φ| < 1

∑_{j=0}^∞ ψ_j² < ∞ is a sufficient condition for stationarity.
AR(1): First and second order moments

Mean of a stationary AR(1):

Z_t = c/(1 − φ) + a_t + φa_{t−1} + φ²a_{t−2} + ...

μ = E(Z_t) = c/(1 − φ)

Variance of a stationary AR(1):

γ_0 = (1 + φ² + φ⁴ + ...)σ² = σ_a²/(1 − φ²)
Autocovariance of a stationary AR(1)

- You need to solve a system of equations:

γ_j = E[(Z_t − μ)(Z_{t−j} − μ)] = E[(φ(Z_{t−1} − μ) + a_t)(Z_{t−j} − μ)]
    = φE[(Z_{t−1} − μ)(Z_{t−j} − μ)] + E[a_t(Z_{t−j} − μ)] = φγ_{j−1},  j ≥ 1

Autocorrelation of a stationary AR(1)

ρ_j = γ_j/γ_0 = φγ_{j−1}/γ_0 = φρ_{j−1},  j ≥ 1

ρ_j = φ²ρ_{j−2} = φ³ρ_{j−3} = ... = φ^j ρ_0 = φ^j
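A simulation sketch (illustrative, not from the handout; c, φ and the sample size are arbitrary choices) checking the stationary AR(1) moments derived above: mean c/(1 − φ), variance σ²/(1 − φ²), and ACF φ^j.

```python
# Numerical check of the stationary AR(1) mean, variance and ACF.
import numpy as np

rng = np.random.default_rng(2)
T, c, phi = 200_000, 1.0, 0.8
z = np.zeros(T)
for t in range(1, T):
    z[t] = c + phi * z[t - 1] + rng.standard_normal()

z = z[1000:]                      # drop burn-in so the sample is near the stationary law
print(z.mean(), c / (1 - phi))    # mean: both ≈ 5
print(z.var(), 1 / (1 - phi**2))  # variance: both ≈ 2.78
zc = z - z.mean()
for j in (1, 2, 3):
    print(np.dot(zc[:-j], zc[j:]) / np.dot(zc, zc), phi**j)  # sample ACF vs phi^j
```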
AR(1): Partial autocorrelation function

PACF: from the Yule-Walker equations

φ_11 = ρ_1 = φ

φ_22 = (ρ_2 − ρ_1²)/(1 − ρ_1²) = (φ² − φ²)/(1 − φ²) = 0

φ_kk = 0,  k ≥ 2


AR(1): Ergodicity

A stationary AR(1) process is ergodic for first and second moments.

Show this as an exercise.


Causality and Stationarity

Consider the AR(1) process Z_t = φ_1Z_{t−1} + a_t.

Iterating, we obtain

Z_t = a_t + φ_1a_{t−1} + ... + φ_1^k a_{t−k} + φ_1^{k+1} Z_{t−k−1}.

If |φ_1| < 1, we showed that

Z_t = ∑_{j=0}^∞ φ_1^j a_{t−j}

This cannot be done if |φ_1| ≥ 1 (no mean-square convergence).

However, in this case one could write

Z_t = φ_1^{−1}Z_{t+1} − φ_1^{−1}a_{t+1}

Then, iterating forwards,

Z_t = −∑_{j=1}^∞ φ_1^{−j} a_{t+j}

and this is a stationary representation of Z_t.


Causality and Stationarity, II

However, this stationary representation depends on future values of a_t.

It is customary to restrict attention to AR(1) processes with |φ_1| < 1.

Such processes are called stationary but also CAUSAL, or future-independent, AR representations.
Causality and Stationarity, III

Definition: An AR(p) process defined by the equation

Φ_p(L)Z_t = a_t

is said to be causal, or a causal function of {a_t}, if there exists a sequence of constants {ψ_j} such that ∑_{j=0}^∞ |ψ_j| < ∞ and

Z_t = ∑_{j=0}^∞ ψ_j a_{t−j},  t = 0, ±1, ...

- A necessary and sufficient condition for causality is

φ(x) ≠ 0 for all x ∈ C such that |x| ≤ 1.
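This condition is easy to check numerically. Below is a sketch (illustrative, not from the handout; `is_causal` is a made-up helper) that tests whether all roots of φ(x) = 1 − φ_1x − ... − φ_px^p lie outside the unit circle:

```python
# Root check for causality of an AR(p).
import numpy as np

def is_causal(phis):
    """phis = [phi_1, ..., phi_p] from Z_t = phi_1 Z_{t-1} + ... + phi_p Z_{t-p} + a_t."""
    # np.roots expects coefficients from the highest power down:
    # -phi_p x^p - ... - phi_1 x + 1
    coeffs = np.concatenate((-np.asarray(phis, dtype=float)[::-1], [1.0]))
    return np.all(np.abs(np.roots(coeffs)) > 1)

print(is_causal([0.5]))        # True:  root at x = 2
print(is_causal([1.2]))        # False: root at x = 1/1.2, inside the unit circle
print(is_causal([0.5, -0.3]))  # an AR(2) example with complex roots
```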



AR(2)

Z_t = c + φ_1Z_{t−1} + φ_2Z_{t−2} + a_t

Stationarity: study the roots of the characteristic equation.

Let 1/α_1 and 1/α_2 be the roots of the AR polynomial, such that 1 − φ_1L − φ_2L² = (1 − α_1L)(1 − α_2L).

Then φ_1 = α_1 + α_2 and φ_2 = −α_1α_2.

Z_t is stationary iff |α_i| < 1, i = 1, 2.
First and second order moments

Mean of an AR(2):

E(Z_t) = c + φ_1E(Z_t) + φ_2E(Z_t) ⇒ E(Z_t) = μ = c/(1 − φ_1 − φ_2)

Variance:

γ_0 = E(Z_t − μ)² = φ_1E(Z_{t−1} − μ)(Z_t − μ) + φ_2E(Z_{t−2} − μ)(Z_t − μ) + E(Z_t − μ)a_t
γ_0 = φ_1γ_1 + φ_2γ_2 + σ_a²
γ_0 = φ_1ρ_1γ_0 + φ_2ρ_2γ_0 + σ_a²
γ_0 = σ_a²/(1 − φ_1ρ_1 − φ_2ρ_2)
Autocorrelation function

γ_j = E(Z_t − μ)(Z_{t−j} − μ) = φ_1γ_{j−1} + φ_2γ_{j−2},  j ≥ 1

Difference equation:

ρ_j = φ_1ρ_{j−1} + φ_2ρ_{j−2},  j ≥ 1

j = 1: ρ_1 = φ_1ρ_0 + φ_2ρ_1 → ρ_1 = φ_1/(1 − φ_2)
j = 2: ρ_2 = φ_1ρ_1 + φ_2ρ_0 → ρ_2 = φ_1²/(1 − φ_2) + φ_2
j = 3: ρ_3 = φ_1ρ_2 + φ_2ρ_1
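The recursion is immediate to evaluate numerically; here is a minimal sketch (illustrative, not from the handout; φ_1 = 0.5, φ_2 = 0.3 are arbitrary values):

```python
# AR(2) ACF via the difference equation: rho_j = phi1*rho_{j-1} + phi2*rho_{j-2}.
phi1, phi2 = 0.5, 0.3
rho = [1.0, phi1 / (1 - phi2)]           # rho_0 = 1, rho_1 = phi1/(1 - phi2)
for j in range(2, 10):
    rho.append(phi1 * rho[j - 1] + phi2 * rho[j - 2])
print(rho)  # rho[2] equals phi1**2/(1 - phi2) + phi2, as derived above
```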
Partial Autocorrelations

Partial autocorrelations: from the Yule-Walker equations

φ_11 = ρ_1 = φ_1/(1 − φ_2);  φ_22 = (ρ_2 − ρ_1²)/(1 − ρ_1²);  φ_kk = 0, k ≥ 3

(With complex roots, the ACF decays as a damped sine wave.)
AR(p) process

Z_t = c + φ_1Z_{t−1} + φ_2Z_{t−2} + ... + φ_pZ_{t−p} + a_t

Causality

All p roots of the characteristic equation must lie outside the unit circle.

Second order moments

Autocorrelation function:

ρ_k = φ_1ρ_{k−1} + φ_2ρ_{k−2} + ... + φ_pρ_{k−p}

ρ_1 = φ_1ρ_0 + φ_2ρ_1 + ... + φ_pρ_{p−1}
ρ_2 = φ_1ρ_1 + φ_2ρ_0 + ... + φ_pρ_{p−2}
...
ρ_p = φ_1ρ_{p−1} + φ_2ρ_{p−2} + ... + φ_pρ_0

A system of equations for the first p autocorrelations: p unknowns and p equations (a numerical sketch follows below).

The ACF decays as a mixture of exponentials and/or damped sine waves, depending on whether the roots are real or complex.

PACF:

φ_kk = 0 for k > p
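The Yule-Walker system above can be solved as a p×p linear system. A sketch (illustrative, not from the handout; `yule_walker_acf` is a made-up name) using numpy:

```python
# Solve the Yule-Walker equations for rho_1, ..., rho_p of a causal AR(p).
import numpy as np

def yule_walker_acf(phis):
    """First p autocorrelations of a causal AR(p) with coefficients phis."""
    p = len(phis)
    A = np.eye(p)
    for k in range(1, p + 1):           # equation for rho_k
        for i, phi in enumerate(phis, start=1):
            j = abs(k - i)              # rho_{k-i}, using rho_{-j} = rho_j
            if j == 0:
                continue                # phi_k * rho_0 = phi_k goes to the RHS
            A[k - 1, j - 1] -= phi
    b = np.array(phis, dtype=float)     # right-hand side: phi_1, ..., phi_p
    return np.linalg.solve(A, b)

print(yule_walker_acf([0.5, 0.3]))  # matches rho_1 = phi1/(1 - phi2), etc.
```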
Relationship between AR(p) and MA(q)

Stationary AR(p):

Φ_p(L)Z_t = a_t,  Φ_p(L) = 1 − φ_1L − φ_2L² − ... − φ_pL^p

1/Φ_p(L) = Ψ(L) ⇒ Φ_p(L)Ψ(L) = 1

Z_t = (1/Φ_p(L)) a_t = Ψ(L)a_t,  Ψ(L) = 1 + ψ_1L + ψ_2L² + ...
Relationship between AR(p) and MA(q), II

Invertible MA(q):

Z_t = Θ_q(L)a_t,  Θ_q(L) = 1 − θ_1L − θ_2L² − ... − θ_qL^q

1/Θ_q(L) = Π(L) ⇒ Θ_q(L)Π(L) = 1

Π(L)Z_t = (1/Θ_q(L)) Z_t = a_t,  Π(L) = 1 + π_1L + π_2L² + ...
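Matching coefficients in Φ_p(L)Ψ(L) = 1 (and analogously in Θ_q(L)Π(L) = 1) gives a simple recursion for the weights. A sketch (illustrative, not from the handout; `ar_to_ma_weights` is a made-up name):

```python
# MA(infinity) weights of a causal AR(p): psi_j = phi_1 psi_{j-1} + ... + phi_p psi_{j-p},
# with psi_0 = 1, obtained by matching coefficients in Phi_p(L) Psi(L) = 1.
import numpy as np

def ar_to_ma_weights(phis, n):
    """psi_0, ..., psi_{n-1} for Z_t = Psi(L) a_t given Phi_p(L) Z_t = a_t."""
    psi = np.zeros(n)
    psi[0] = 1.0
    for j in range(1, n):
        for i, phi in enumerate(phis, start=1):
            if j - i >= 0:
                psi[j] += phi * psi[j - i]
    return psi

print(ar_to_ma_weights([0.7], 5))       # AR(1): psi_j = 0.7**j
print(ar_to_ma_weights([0.5, 0.3], 5))  # an AR(2) example
```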


ARMA(p,q) Processes

ARMA(p,q):

Φ_p(L)Z_t = Θ_q(L)a_t

Invertibility → roots of Θ_q(x) = 0 satisfy |x| > 1
Stationarity → roots of Φ_p(x) = 0 satisfy |x| > 1

Pure AR representation → Π(L)Z_t = (Φ_p(L)/Θ_q(L)) Z_t = a_t

Pure MA representation → Z_t = (Θ_q(L)/Φ_p(L)) a_t = Ψ(L)a_t
ARMA(1,1)

(1 − φL)Z_t = (1 − θL)a_t

stationarity → |φ| < 1
invertibility → |θ| < 1

pure AR form → Π(L)Z_t = a_t,  π_j = (φ − θ)θ^{j−1},  j ≥ 1
pure MA form → Z_t = Ψ(L)a_t,  ψ_j = (φ − θ)φ^{j−1},  j ≥ 1
ACF of ARMA(1,1)

Z_tZ_{t−k} = φZ_{t−1}Z_{t−k} + a_tZ_{t−k} − θa_{t−1}Z_{t−k}

Taking expectations:

γ_k = φγ_{k−1} + E(a_tZ_{t−k}) − θE(a_{t−1}Z_{t−k})

This gives the following system of equations:

k = 0: E(a_tZ_t) = σ_a², E(a_{t−1}Z_t) = (φ − θ)σ_a², so γ_0 = φγ_1 + σ_a² − θ(φ − θ)σ_a²
k = 1: γ_1 = φγ_0 − θσ_a²
k ≥ 2: γ_k = φγ_{k−1}

A system of 2 equations and 2 unknowns: solve for γ_0 and γ_1.
ACF

ρ_k = 1,  k = 0
ρ_k = (φ − θ)(1 − φθ)/(1 + θ² − 2φθ),  k = 1
ρ_k = φρ_{k−1},  k ≥ 2

PACF

Since the MA(1) is nested in the ARMA(1,1), the PACF does not cut off: it decays exponentially.