HSTS203 Time Series

The document is a unit on time series analysis from a BSc honours in statistics course at the University of Zimbabwe. It covers topics such as stationary time series models including the general linear process, moving average processes, autoregressive processes, and mixed autoregressive–moving average processes. It provides definitions, examples, and theorems regarding these time series models and their properties such as autocovariance functions, autocorrelation functions, stationarity, and invertibility conditions.

BSC HONOURS IN STATISTICS

HSTS 203 : Time Series Analysis


University of Zimbabwe

L Dhliwayo
Department of Statistics
University of Zimbabwe
Contents

1 Models for Stationary Time Series
  1.1 What this Unit is all About
  1.2 The General Linear Process
  1.3 Autocovariance Generating Function
  1.4 Moving average process of order q, MA(q)
    1.4.1 Invertibility Condition of a Moving Average Process
  1.5 Autoregressive Process of Order p, AR(p)
    1.5.1 Autoregressive Process of Order 1, AR(1)
    1.5.2 Stationarity of an Autoregressive Process
    1.5.3 Autocorrelation Function of an AR(p)
    1.5.4 Autocorrelation Function of an AR(2)
  1.6 Partial Autocorrelation Function (PACF)
  1.7 The Mixed Autoregressive-Moving Average Process
Unit 1
Models for Stationary Time Series

1.1 What this Unit is all About

Unit Objectives

At the end of this unit students are expected to be able to:

1. define a white noise process and the general linear process;

2. show that a general linear process is stationary and derive its autocovariance function;

3. use the autocovariance generating function to find autocovariances of a stationary time series;

4. define a moving average process of order q, MA(q), and derive its mean, variance, autocovariance and autocorrelation functions;

5. state and apply the invertibility condition of a moving average process;

6. define an autoregressive process of order p, AR(p), and check its stationarity using the characteristic equation;

7. derive and solve the Yule-Walker equations for the autocorrelation function of an autoregressive process;

8. define the partial autocorrelation function (PACF) and the mixed autoregressive-moving average process, ARMA(p, q).

Definition 1.1 Consider a sequence of random variables {a_t}. This process is said to be white noise if the random variables are independent and identically distributed with E(a_t) = 0 and Var(a_t) = σ_a² for all t.

White noise is useful in that many useful processes can be constructed from it.
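As a quick illustration (a Python sketch of our own, not part of the original notes; the function names are ours), one can simulate Gaussian white noise and check that the sample mean and the sample autocorrelations are close to zero:

```python
import random
import statistics

def white_noise(n, sigma=1.0, seed=42):
    """Simulate n i.i.d. N(0, sigma^2) draws -- one realisation of {a_t}."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, sigma) for _ in range(n)]

def sample_autocorr(x, k):
    """Sample autocorrelation of the series x at lag k."""
    xbar = statistics.fmean(x)
    num = sum((x[t] - xbar) * (x[t - k] - xbar) for t in range(k, len(x)))
    den = sum((v - xbar) ** 2 for v in x)
    return num / den

a = white_noise(10_000)
# For white noise the mean is 0 and autocorrelations at lags k >= 1 vanish,
# so the sample versions should be close to 0.
print(round(statistics.fmean(a), 3), round(sample_autocorr(a, 1), 3))
```

With a larger sample size the sample quantities shrink toward their theoretical values of zero.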


1.2 The General Linear Process


Definition 1.2 A process {Z_t} is called a general linear process if it can be expressed in the form

Z_t = Σ_{j=0}^∞ ψ_j a_{t-j} = a_t + ψ_1 a_{t-1} + ψ_2 a_{t-2} + ...   (1.1)

where ψ_0 = 1, the ψ_j are constants and {a_t} is a white noise process.

The right hand side of the equation is kept meaningful by imposing the restriction

Σ_{j=0}^∞ ψ_j² < ∞   (1.2)

A general linear process can thus be expressed as a linear combination of present and past values of a purely random process. The condition Σ_{j=0}^∞ ψ_j² < ∞ ensures that the linear representation Σ_{j=0}^∞ ψ_j a_{t-j} is mathematically meaningful; more precisely, it implies that Σ_{j=0}^∞ ψ_j a_{t-j} converges in mean square to Z_t.
Note:

Convergence in probability:

P(|X̄_n - µ| < ε) → 1 as n → ∞   (1.3)

Convergence in mean square:

lim_{n→∞} E[(X̄_n - µ)²] = 0   (1.4)

Thus for each t there exists a random variable Z_t such that

lim_{n→∞} E[(Σ_{j=0}^n ψ_j a_{t-j} - Z_t)²] = 0   (1.5)

Theorem 1.1 Let {Z_t} be a general linear process. Then {Z_t} is stationary; that is, a general linear process is stationary.

Proof 1.1 Since Σ_{j=0}^∞ ψ_j² < ∞, the series converges in mean square, so we can interchange summation and expectation.

(i) E(Z_t) = E[Σ_j ψ_j a_{t-j}] = Σ_j ψ_j E(a_{t-j}) = Σ_j ψ_j × 0 = 0

(ii)

Cov(Z_t, Z_{t-k}) = E(Z_t Z_{t-k}) - E(Z_t)E(Z_{t-k})
= E(Z_t Z_{t-k})
= E[(a_t + ψ_1 a_{t-1} + ... + ψ_k a_{t-k} + ψ_{k+1} a_{t-k-1} + ...)(a_{t-k} + ψ_1 a_{t-k-1} + ...)]
= ψ_k E(a_{t-k}²) + ψ_{k+1} ψ_1 E(a_{t-k-1}²) + ...   (cross terms vanish since {a_t} is white noise)
= ψ_k σ_a² + ψ_{k+1} ψ_1 σ_a² + ...
= σ_a² [ψ_k + ψ_{k+1} ψ_1 + ψ_{k+2} ψ_2 + ...]
= σ_a² Σ_{j=0}^∞ ψ_{k+j} ψ_j   (with ψ_0 = 1), which is independent of t.

Since both the expectation and the autocovariance are independent of time t, a general linear process {Z_t} is stationary.

Note:

1. For a general linear process

γ(k) = σ_a² Σ_{j=0}^∞ ψ_j ψ_{j+k},   k = 0, 1, 2, ...
     = σ_a² Σ_{j=0}^∞ ψ_j ψ_{j+|k|},   k = 0, ±1, ±2, ...

2. A general linear process with non-zero mean can be obtained by adding a constant µ to the right hand side of the equation:

Z_t = µ + Σ_{j=0}^∞ ψ_j a_{t-j}   (1.7)

in which case the mean is µ rather than 0.
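The autocovariance formula above can be checked numerically. In this sketch (our own illustration, not from the notes) we take the hypothetical geometric weights ψ_j = φ^j with |φ| < 1, for which the closed form is γ(k) = σ_a² φ^k / (1 - φ²):

```python
def acov(k, phi=0.5, sigma2=1.0, n_terms=200):
    """Truncated gamma(k) = sigma_a^2 * sum_{j>=0} psi_j psi_{j+k} for psi_j = phi^j.
    The terms decay geometrically, so 200 terms are ample for phi = 0.5."""
    return sigma2 * sum(phi ** j * phi ** (j + k) for j in range(n_terms))

phi, sigma2 = 0.5, 1.0
for k in range(5):
    closed_form = sigma2 * phi ** k / (1 - phi ** 2)
    assert abs(acov(k) - closed_form) < 1e-12
```

The truncation error is of order φ^(2·200), which is negligible here.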

1.3 Autocovariance Generating Function

This is a function that is sometimes useful for finding the autocovariance function of a stationary time series: its coefficients generate the autocovariances.

Definition 1.3 Let {γ(k)} be the autocovariance function of a stationary time series {Z_t}. Then the autocovariance generating function of Z_t is defined by:

Γ(s) = Σ_k γ(k) s^k   (1.8)

where the sum runs over all integers k.

Example 1.1 Suppose

γ(k) = 1,   k = 0
       1/2, k = ±1
       0,   otherwise

Find the autocovariance generating function.

Solution 1.1

Γ(s) = Σ_k γ(k) s^k = 1·s⁰ + (1/2)s¹ + (1/2)s⁻¹ = 1 + s/2 + 1/(2s)   (1.9)

Theorem 1.2 Let {Z_t} be a general linear process, that is, Z_t = Σ_{j=0}^∞ ψ_j a_{t-j}, where {ψ_j} is a set of weights such that Σ ψ_j² < ∞ and {a_t} is a white noise process with mean zero and variance σ_a². Let γ(k) be the autocovariance function of Z_t and Γ(s) = Σ_k γ(k)s^k be the autocovariance generating function. Then

Γ(s) = σ_a² ψ(s) ψ(1/s),   where ψ(s) = Σ_{j=0}^∞ ψ_j s^j   (1.10)

Proof 1.2 The condition Σ_{j=0}^∞ ψ_j² < ∞ implies that Z_t is stationary and γ(k) = σ_a² Σ_j ψ_j ψ_{j+k}. Thus:

Γ(s) = Σ_k γ(k) s^k
= Σ_k [σ_a² Σ_{j=0}^∞ ψ_j ψ_{j+k}] s^k
= σ_a² Σ_k Σ_j ψ_j ψ_{j+k} s^{k+j} s^{-j}
= σ_a² Σ_{j=0}^∞ ψ_j s^{-j} Σ_{k=-j}^∞ ψ_{j+k} s^{j+k}
= σ_a² Σ_{j=0}^∞ ψ_j s^{-j} Σ_{h=0}^∞ ψ_h s^h   (putting h = j + k)
= σ_a² ψ(s) ψ(1/s)

Example 1.2 Find the autocovariance generating function of the general linear process

Z_t = Σ_{j=0}^∞ (1/3)^j a_{t-j}   (1.11)

Solution 1.2 Here ψ_j = (1/3)^j, so

ψ(s) = Σ_{j=0}^∞ (1/3)^j s^j = Σ_{j=0}^∞ (s/3)^j = 1/(1 - s/3) = 3/(3 - s)

ψ(1/s) = Σ_{j=0}^∞ (1/3)^j s^{-j} = Σ_{j=0}^∞ (1/(3s))^j = 1/(1 - 1/(3s)) = 3/(3 - 1/s)

Γ(s) = σ_a² ψ(s) ψ(1/s) = σ_a² · (3/(3 - s)) · (3/(3 - 1/s)) = 9σ_a² / [(3 - s)(3 - 1/s)]
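The coefficients generated by Γ(s) can be recovered numerically by truncating γ(k) = σ_a² Σ_j ψ_j ψ_{j+k} with ψ_j = (1/3)^j; for these weights the closed form is γ(k) = (9/8)(1/3)^k. A small sketch (our own check, not part of the notes):

```python
# psi-weights of Example 1.2, truncated: psi_j = (1/3)^j.
psi = [(1 / 3) ** j for j in range(100)]

def acov(k, sigma2=1.0):
    """Truncated gamma(k) = sigma_a^2 * sum_j psi_j psi_{j+k}."""
    return sigma2 * sum(psi[j] * psi[j + k] for j in range(len(psi) - k))

# For these weights gamma(k) = (9/8) * (1/3)^k; these are the coefficients
# produced by expanding Gamma(s) = 9 sigma_a^2 / ((3 - s)(3 - 1/s)) in powers of s.
for k in range(5):
    assert abs(acov(k) - (9 / 8) * (1 / 3) ** k) < 1e-12
```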

Definition 1.4 (Backward Shift Operator) The backward shift operator B is defined by

B X_t = X_{t-1}
B² X_t = X_{t-2}
...
B^s X_t = X_{t-s}

Thus ∆X_t = X_t - X_{t-1} = X_t - B X_t = (1 - B)X_t   (1.12)

The general linear process can be expressed in terms of the backward shift operator. Let

ψ(B) = Σ_j ψ_j B^j = 1 + ψ_1 B + ψ_2 B² + ψ_3 B³ + ...

Thus

Z_t = a_t + ψ_1 a_{t-1} + ψ_2 a_{t-2} + ψ_3 a_{t-3} + ...
= a_t + ψ_1 B a_t + ψ_2 B² a_t + ψ_3 B³ a_t + ...
= (1 + ψ_1 B + ψ_2 B² + ψ_3 B³ + ...) a_t
= ψ(B) a_t

Examples of some common time series models are presented below.

1. Moving average process of order q, MA(q): A process {Z_t}_{t=-∞}^∞ is called a moving average process of order q, MA(q), if it can be expressed in the form:

Z_t = a_t - θ_1 a_{t-1} - ... - θ_q a_{t-q}

where {a_t} is a white noise process, i.e. a sequence of independent and identically distributed random variables each with mean 0 and variance σ_a².

2. Autoregressive process of order p, AR(p): A process {Z_t}_{t=-∞}^∞ is called an autoregressive process of order p if it satisfies the model

Z_t = φ_1 Z_{t-1} + ... + φ_p Z_{t-p} + a_t

where {a_t} is a white noise process such that a_t is independent of Z_{t-1}, Z_{t-2}, ....

3. Autoregressive moving average process, ARMA(p, q): A process {Z_t}_{t=-∞}^∞ is called an autoregressive moving average process of orders p and q if it satisfies the model

Z_t - φ_1 Z_{t-1} - ... - φ_p Z_{t-p} = a_t - θ_1 a_{t-1} - ... - θ_q a_{t-q}

where {a_t} is a white noise process such that a_t is independent of Z_{t-1}, Z_{t-2}, ....

1.4 Moving average process of order q, MA(q)


Theorem 1.3 Let {Z_t} be a moving average process of order q expressed in the form

Z_t = a_t - θ_1 a_{t-1} - θ_2 a_{t-2} - ... - θ_q a_{t-q}
    = a_t + Σ_{j=1}^q (-θ_j) a_{t-j}
    = Σ_{j=0}^q ψ_j a_{t-j},   where ψ_0 = 1 and ψ_j = -θ_j

Then

(i) E(Z_t) = 0

(ii) Var(Z_t) = σ_a² Σ_{i=0}^q θ_i² = σ_a² Σ_{i=0}^q ψ_i²

(iii) Cov(Z_t, Z_{t-k}) = σ_a² Σ_{i=0}^{q-k} ψ_i ψ_{i+k}

(iv) Corr(Z_t, Z_{t-k}) = (Σ_{i=0}^{q-k} ψ_i ψ_{i+k}) / (Σ_{i=0}^q ψ_i²)

Proof 1.3 (i)

E(Z_t) = E(a_t - θ_1 a_{t-1} - θ_2 a_{t-2} - ... - θ_q a_{t-q})
= E(a_t) - θ_1 E(a_{t-1}) - θ_2 E(a_{t-2}) - ... - θ_q E(a_{t-q}) = 0

(ii)

Var(Z_t) = Var(a_t - θ_1 a_{t-1} - θ_2 a_{t-2} - ... - θ_q a_{t-q})
= Var(a_t) + θ_1² Var(a_{t-1}) + θ_2² Var(a_{t-2}) + ... + θ_q² Var(a_{t-q})   (by independence)
= σ_a² + θ_1² σ_a² + θ_2² σ_a² + ... + θ_q² σ_a²
= σ_a² Σ_{i=0}^q θ_i² = σ_a² Σ_{i=0}^q ψ_i²,   with ψ_0 = θ_0 = 1

(iii)

γ(k) = Cov(Z_t, Z_{t-k})
= Cov(a_t + ψ_1 a_{t-1} + ... + ψ_q a_{t-q}, a_{t-k} + ψ_1 a_{t-k-1} + ... + ψ_q a_{t-k-q})
= σ_a² Σ_{i=0}^{q-k} ψ_i ψ_{i+k}   for k = 0, 1, ..., q, and 0 for k > q

(iv)

ρ_k = Corr(Z_t, Z_{t-k}) = Cov(Z_t, Z_{t-k}) / Var(Z_t) = (Σ_{i=0}^{q-k} ψ_i ψ_{i+k}) / (Σ_{i=0}^q ψ_i²)

Note that a moving average process {Z_t} is stationary since both E(Z_t) and the autocovariance function do not depend on time t. This implies that all moving average processes are stationary.

1.4.1 Invertibility Condition of a Moving Average Process

Example 1.3 Consider the moving average process, MA(1),

Z_t = a_t - (1/2)a_{t-1}   (1.13)
Find

a. E(Zt )

b. V ar(Zt )

c. Cov(Zt , Zt−k )

d. Corr(Zt , Zt−k )

Solution 1.3 a. E(Z_t) = E(a_t - (1/2)a_{t-1}) = 0

b. Var(Z_t) = Var(a_t - (1/2)a_{t-1}) = Var(a_t) + (1/4)Var(a_{t-1}) = σ_a² + (1/4)σ_a² = (5/4)σ_a²

c. Cov(Z_t, Z_{t-k}) = Cov(a_t - (1/2)a_{t-1}, a_{t-k} - (1/2)a_{t-k-1})

k = 0: Cov(Z_t, Z_t) = Var(Z_t) = (5/4)σ_a²
k = 1: Cov(Z_t, Z_{t-1}) = Cov(a_t - (1/2)a_{t-1}, a_{t-1} - (1/2)a_{t-2}) = -(1/2)σ_a²
k ≥ 2: Cov(Z_t, Z_{t-k}) = 0

γ(k) = (5/4)σ_a²,  k = 0
       -(1/2)σ_a², k = ±1
       0,          otherwise

d. Corr(Z_t, Z_{t-k}) = ρ(k) = γ(k)/Var(Z_t)

ρ(k) = 1,    k = 0
       -2/5, k = ±1
       0,    otherwise

Example 1.4 Consider the moving average process, MA(1),

Z_t = a_t - 2a_{t-1}   (1.14)

Find

a. E(Z_t)

b. Var(Z_t)

c. Cov(Z_t, Z_{t-k})

d. Corr(Z_t, Z_{t-k})

Solution 1.4 a. E(Z_t) = E(a_t - 2a_{t-1}) = 0

b. Var(Z_t) = Var(a_t - 2a_{t-1}) = Var(a_t) + 4Var(a_{t-1}) = σ_a² + 4σ_a² = 5σ_a²

c. Cov(Z_t, Z_{t-k}) = Cov(a_t - 2a_{t-1}, a_{t-k} - 2a_{t-k-1})

k = 0: Cov(Z_t, Z_t) = Var(Z_t) = 5σ_a²
k = 1: Cov(Z_t, Z_{t-1}) = Cov(a_t - 2a_{t-1}, a_{t-1} - 2a_{t-2}) = -2σ_a²
k ≥ 2: Cov(Z_t, Z_{t-k}) = 0

γ(k) = 5σ_a²,  k = 0
       -2σ_a², k = ±1
       0,      otherwise

d. Corr(Z_t, Z_{t-k}) = ρ(k) = γ(k)/Var(Z_t)

ρ(k) = 1,    k = 0
       -2/5, k = ±1
       0,    otherwise

Note that the two MA(1) processes

a. Z_t = a_t - (1/2)a_{t-1}

b. Z_t = a_t - 2a_{t-1}

have the same acf:

ρ(k) = 1,    k = 0
       -2/5, k = ±1
       0,    otherwise

This means that we cannot specify or identify an MA process uniquely from a given acf. But if we express the two models by writing a_t in terms of Z_t, Z_{t-1}, ..., we find by successive substitution that

Z_t = a_t - θa_{t-1} = (1 - θB)a_t
⇒ a_t = (1 - θB)^{-1} Z_t = Z_t + θZ_{t-1} + θ²Z_{t-2} + ...   [binomial expansion of (1 - θB)^{-1}]   (1.15)

Z_t = a_t - (1/θ)a_{t-1} = (1 - (1/θ)B)a_t
⇒ a_t = (1 - (1/θ)B)^{-1} Z_t = Z_t + (1/θ)Z_{t-1} + (1/θ)²Z_{t-2} + ...   [binomial expansion of (1 - (1/θ)B)^{-1}]   (1.16)

If |θ| < 1 the series in model (1.15) converges whereas that in model (1.16) does not. Model (1.15) is said to be invertible whereas model (1.16) is not. The imposition of the invertibility condition ensures that there is a unique moving average process for a given acf.
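This non-uniqueness is easy to verify numerically: for an MA(1) process Z_t = a_t - θa_{t-1}, the lag-1 autocorrelation is ρ(1) = -θ/(1 + θ²), and replacing θ by 1/θ leaves it unchanged. A quick sketch (our own illustration, with hypothetical function names):

```python
def ma1_rho1(theta):
    """Lag-1 autocorrelation of Z_t = a_t - theta * a_{t-1}:
    rho(1) = gamma(1)/gamma(0) = -theta / (1 + theta**2)."""
    return -theta / (1 + theta ** 2)

# theta = 1/2 and theta = 2 give the same acf, which is why the invertibility
# condition |theta| < 1 is needed to pin down a unique MA(1) model.
assert ma1_rho1(0.5) == ma1_rho1(2.0)
print(ma1_rho1(0.5))  # -0.4, i.e. -2/5 as in Examples 1.3 and 1.4
```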
Thus the general M A(q) may be expressed as

Zt = (1 − θ1 B − θ2 B 2 − . . . − θq B q )at = θ(B)at (1.17)

where θ(B) is a polynomial of order q in B.

Lemma 1.1 An M A(q) is invertible if the roots of

θ(B) = 1 − θ1 B − θ2 B 2 − . . . − θq B q = 0 (1.18)

lie outside the unit circle.

Example 1.5 Consider the moving average process, MA(1),

Z_t = a_t - (1/2)a_{t-1}   (1.19)

a. Show that it is stationary.

b. Is Z_t an invertible process?

c. Find the acf of Z_t.

Solution 1.5 a. E(Z_t) = E(a_t - (1/2)a_{t-1}) = 0

Var(Z_t) = Var(a_t - (1/2)a_{t-1}) = Var(a_t) + (1/4)Var(a_{t-1}) = σ_a² + (1/4)σ_a² = (5/4)σ_a²

Cov(Z_t, Z_{t-k}) = Cov(a_t - (1/2)a_{t-1}, a_{t-k} - (1/2)a_{t-k-1})

k = 0: Cov(Z_t, Z_t) = Var(Z_t) = (5/4)σ_a²
k = 1: Cov(Z_t, Z_{t-1}) = Cov(a_t - (1/2)a_{t-1}, a_{t-1} - (1/2)a_{t-2}) = -(1/2)σ_a²
k ≥ 2: Cov(Z_t, Z_{t-k}) = 0

γ(k) = (5/4)σ_a²,  k = 0
       -(1/2)σ_a², k = ±1
       0,          otherwise

Since E(Z_t) and γ(k) do not depend on t, Z_t is stationary.

b. Z_t = a_t - (1/2)a_{t-1} = (1 - (1/2)B)a_t

θ(B) = 1 - (1/2)B = 0 ⇒ B = 2

Since |B| = 2 > 1, the root lies outside the unit circle, so Z_t is an invertible process.

c. ρ(k) = γ(k)/γ(0)

ρ(k) = 1,    k = 0
       -2/5, k = ±1
       0,    otherwise

Note: All MA(q) processes are stationary.

Note: An MA(2) process Z_t = a_t - θ_1 a_{t-1} - θ_2 a_{t-2} is invertible if the parameters satisfy the following conditions:

1. |θ_2| < 1

2. θ_1 + θ_2 < 1

3. θ_2 - θ_1 < 1
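These parameter conditions can be cross-checked against the root criterion of Lemma 1.1. The sketch below (our own helper names, restricted to q = 2 so the quadratic formula suffices) tests that the two criteria agree on a few parameter pairs:

```python
import cmath

def ma2_invertible_roots(theta1, theta2):
    """Invertibility of Z_t = a_t - theta1 a_{t-1} - theta2 a_{t-2} via the
    roots of theta(B) = 1 - theta1 B - theta2 B^2 = 0 (Lemma 1.1)."""
    a, b, c = -theta2, -theta1, 1.0
    if a == 0:                       # degenerates to an MA(1)
        return abs(-c / b) > 1
    disc = cmath.sqrt(b * b - 4 * a * c)
    roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
    return all(abs(r) > 1 for r in roots)

def ma2_invertible_conditions(theta1, theta2):
    """The equivalent parameter conditions quoted in the notes."""
    return abs(theta2) < 1 and theta1 + theta2 < 1 and theta2 - theta1 < 1

# The two criteria agree, including for complex roots (theta2 = -0.9).
for t1, t2 in [(0.5, 0.3), (1.2, 0.5), (0.2, -0.9), (2.0, 0.0)]:
    assert ma2_invertible_roots(t1, t2) == ma2_invertible_conditions(t1, t2)
```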

1.5 Autoregressive Process of Order p, AR(p)

Autoregressive processes are regressions on themselves.

Definition 1.5 A process {Z_t}_{t=-∞}^∞ is called an autoregressive process of order p, AR(p), if it can be expressed in the form:

Z_t = φ_1 Z_{t-1} + ... + φ_p Z_{t-p} + a_t

where {a_t} is a white noise process such that a_t is independent of Z_{t-1}, Z_{t-2}, ....

The current value of the series, Z_t, is a linear combination of the p most recent past values of itself plus an innovation term, a_t, which incorporates everything new in the series at time t that is not explained by the past values. The term a_t is assumed to be independent of Z_{t-1}, Z_{t-2}, ....

1.5.1 Autoregressive Process of Order 1, AR(1)

A process {Z_t}_{t=-∞}^∞ is called an autoregressive process of order 1, AR(1), if it satisfies the model

Z_t = φZ_{t-1} + a_t

Note that E(Z_t) = 0.

For k = 0:

Var(Z_t) = γ(0) = Cov(Z_t, Z_t)
= Var(φZ_{t-1}) + σ_a²   (by independence of a_t and Z_{t-1})
= φ²Var(Z_{t-1}) + σ_a² = φ²γ(0) + σ_a²
⇒ γ(0)(1 - φ²) = σ_a² ⇒ γ(0) = σ_a²/(1 - φ²)

For k = 1:

γ(1) = Cov(Z_t, Z_{t-1}) = Cov(φZ_{t-1} + a_t, φZ_{t-2} + a_{t-1})
= φ²Cov(Z_{t-1}, Z_{t-2}) + φCov(Z_{t-1}, a_{t-1})   (the other cross terms vanish)
= φ²γ(1) + φσ_a²   (since Cov(Z_{t-1}, a_{t-1}) = σ_a²)
⇒ γ(1)(1 - φ²) = φσ_a² ⇒ γ(1) = φσ_a²/(1 - φ²) = φγ(0)
Continuing for k ≥ 2 we have the Yule-Walker equations:

γ(0) = σ_a²/(1 - φ²)
γ(1) = φγ(0)
γ(2) = φγ(1)
...
γ(k) = φγ(k - 1)

Dividing by γ(0):

ρ(k) = 1,  k = 0
       φ,  k = 1
       φ², k = 2
       ...
       φ^k for all k ≥ 0

The Yule-Walker equations can be solved recursively, or we can solve them explicitly, depending on the nature of the roots of the characteristic equation.
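The result ρ(k) = φ^k can be checked by simulation; the sketch below (our own code, assuming Gaussian white noise) compares the sample ACF of a simulated AR(1) series with the theoretical values:

```python
import random

def simulate_ar1(phi, n, burn=500, seed=1):
    """Simulate Z_t = phi Z_{t-1} + a_t with standard normal white noise,
    discarding a burn-in so the retained series is effectively stationary."""
    rng = random.Random(seed)
    z, out = 0.0, []
    for t in range(n + burn):
        z = phi * z + rng.gauss(0.0, 1.0)
        if t >= burn:
            out.append(z)
    return out

def acf(x, k):
    """Sample autocorrelation of the series x at lag k."""
    m = sum(x) / len(x)
    num = sum((x[t] - m) * (x[t - k] - m) for t in range(k, len(x)))
    return num / sum((v - m) ** 2 for v in x)

phi = 0.6
z = simulate_ar1(phi, 50_000)
# Theory: rho(k) = phi^k = 0.6, 0.36, 0.216, ... for k = 1, 2, 3.
for k in (1, 2, 3):
    assert abs(acf(z, k) - phi ** k) < 0.03
```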

1.5.2 Stationarity of an Autoregressive Process

An AR(p) process Z_t = φ_1 Z_{t-1} + φ_2 Z_{t-2} + ... + φ_p Z_{t-p} + a_t can be expressed as follows in terms of the backward shift operator:

Z_t - φ_1 Z_{t-1} - φ_2 Z_{t-2} - ... - φ_p Z_{t-p} = a_t
(1 - φ_1 B - φ_2 B² - ... - φ_p B^p)Z_t = a_t
φ(B)Z_t = a_t

In order to check whether an AR(p) is stationary or not, we use the characteristic equation φ(B) = 0. If the roots of φ(B) = 0 are all greater than unity in absolute value, then the process {Z_t} is stationary.

Example 1.6 Determine whether each of the following AR(p) processes is stationary or not.

1. Z_t = (1/2)Z_{t-1} + a_t

2. Z_t = (1/12)Z_{t-1} + (1/12)Z_{t-2} + a_t

Solution 1.6

1. φ(B) = 1 - (1/2)B = 0 ⇒ B = 2. Since |2| > 1, the process is stationary.

2. φ(B) = 1 - (1/12)B - (1/12)B² = 0 ⇒ B² + B - 12 = 0 ⇒ (B + 4)(B - 3) = 0 ⇒ B = 3 or B = -4. Both roots exceed unity in absolute value, so the process is stationary.

Note: All AR(p) processes are invertible.

Note: An AR(2) process Z_t = φ_1 Z_{t-1} + φ_2 Z_{t-2} + a_t is stationary if the parameters satisfy the following conditions:

1. |φ_2| < 1

2. φ_1 + φ_2 < 1

3. φ_2 - φ_1 < 1

Example 1.7 Find the values of φ_1 for which Z_t = φ_1 Z_{t-1} + (2/9)Z_{t-2} + a_t is stationary.
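For Example 1.7 the three conditions give |2/9| < 1, φ_1 + 2/9 < 1 and 2/9 - φ_1 < 1, i.e. -7/9 < φ_1 < 7/9. A small sketch verifying the boundary (our own helper name, not part of the notes):

```python
def ar2_stationary(phi1, phi2):
    """AR(2) stationarity conditions from the notes:
    |phi2| < 1, phi1 + phi2 < 1 and phi2 - phi1 < 1."""
    return abs(phi2) < 1 and phi1 + phi2 < 1 and phi2 - phi1 < 1

phi2 = 2 / 9
# Interior points are stationary; points just beyond +/- 7/9 are not.
assert ar2_stationary(0.0, phi2)
assert ar2_stationary(7/9 - 1e-9, phi2) and not ar2_stationary(7/9 + 1e-9, phi2)
assert ar2_stationary(-7/9 + 1e-9, phi2) and not ar2_stationary(-7/9 - 1e-9, phi2)
```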

1.5.3 Autocorrelation Function of an AR(p)

A process {Z_t}_{t=-∞}^∞ is called an autoregressive process of order p if it satisfies the model

Z_t = φ_1 Z_{t-1} + ... + φ_p Z_{t-p} + a_t   (1.20)

where {a_t} is a white noise process such that a_t is independent of Z_{t-1}, Z_{t-2}, .... Multiply both sides of the AR(p) by Z_{t-k}, k = 1, 2, ..., and take expectations, assuming stationarity and that a_t and Z_{t-k} are independent:

E(Z_t Z_{t-k}) = γ(k) = E[Z_{t-k}(φ_1 Z_{t-1} + ... + φ_p Z_{t-p} + a_t)]
= φ_1 E[Z_{t-1} Z_{t-k}] + φ_2 E[Z_{t-2} Z_{t-k}] + ... + φ_p E[Z_{t-p} Z_{t-k}] + E[a_t Z_{t-k}]

⇒ γ(k) = φ_1 γ(k - 1) + φ_2 γ(k - 2) + ... + φ_p γ(k - p),   k = 1, 2, ...

Note: E(a_t Z_t) = E[a_t(φ_1 Z_{t-1} + ... + φ_p Z_{t-p} + a_t)] = E(a_t²) = σ_a²

When k = 0, multiply Z_t by Z_t and take expectations:

γ(0) = φ_1 γ(1) + φ_2 γ(2) + ... + φ_p γ(p) + σ_a²
⇒ γ(0) - φ_1 γ(1) - φ_2 γ(2) - ... - φ_p γ(p) = σ_a²
⇒ γ(0)[1 - φ_1 γ(1)/γ(0) - φ_2 γ(2)/γ(0) - ... - φ_p γ(p)/γ(0)] = σ_a²   (1.21)
⇒ γ(0)[1 - φ_1 ρ(1) - φ_2 ρ(2) - ... - φ_p ρ(p)] = σ_a²
⇒ γ(0) = σ_a² / [1 - φ_1 ρ(1) - φ_2 ρ(2) - ... - φ_p ρ(p)]

Dividing the recursion γ(k) = φ_1 γ(k - 1) + ... + φ_p γ(k - p) by the variance γ(0), we get the ACF, ρ(k), given by:

ρ_k = φ_1 ρ_{k-1} + φ_2 ρ_{k-2} + ... + φ_p ρ_{k-p},   k = 1, 2, 3, ...   (1.22)



We call equation (1.22) the Yule-Walker equations. These equations can be solved recursively, or explicitly, depending on the nature of the roots of the characteristic equation. The Yule-Walker equations are a set of difference equations and they have a general solution, which depends on the nature of the roots obtained.

1. General Solution for Distinct Roots

ρ_k = A_1 α_1^{|k|} + A_2 α_2^{|k|} + ... + A_p α_p^{|k|}   for α_i ≠ α_j   (1.23)

where

• the α_i are the roots of the auxiliary equation obtained by setting ρ_k = α^k in the Yule-Walker equations;

• the A_i are constants determined from the initial conditions: Σ A_i = 1 (since ρ_0 = 1), and the first (p - 1) Yule-Walker equations provide (p - 1) further restrictions on the {A_i}, using ρ_0 = 1 and ρ_k = ρ_{-k}.

For an AR(2), if α_1 and α_2 are distinct real roots, the general solution is given by:

ρ_k = A_1 α_1^{|k|} + A_2 α_2^{|k|}

2. General Solution for Identical Roots

If α is a repeated root of multiplicity m then:

ρ_k = [A_1 + A_2 k + ... + A_m k^{m-1}] α^{|k|}   (1.24)

For an AR(2), if the roots α_1 and α_2 are identical real roots (α_1 = α_2 = α), the general solution is given by:

ρ_k = [A_1 + A_2 k] α^{|k|}

3. General Solution for Complex Roots

If complex roots are obtained then the general solution will be in terms of sines and cosines. For an AR(2), if α_1 = a + bi and α_2 = a - bi, the general solution is given by:

ρ_k = R^{|k|} [A_1 cos kθ + A_2 sin kθ]   for -π < θ < π   (1.25)

where R = √(a² + b²) and θ = tan⁻¹(b/a).

1.5.4 Autocorrelation Function of an AR(2)

General Steps For an AR(2) the general solution is of the form

ρ_k = A_1 α_1^{|k|} + A_2 α_2^{|k|}   (1.26)

and the ACF satisfies

ρ_k = φ_1 ρ_{k-1} + φ_2 ρ_{k-2}   (1.27)

S1. Set ρ_k = α^k, with ρ_0 = 1 and ρ_k = ρ_{-k}:

⇒ α^k = φ_1 α^{k-1} + φ_2 α^{k-2}

and solve for α to obtain α_1 and α_2.

S2. Substitute α_1 and α_2 into

ρ_k = A_1 α_1^k + A_2 α_2^k

and solve for A_1 and A_2 by setting k = 0 and k = 1.

S3. Substitute α_1, α_2, A_1 and A_2 into

ρ_k = A_1 α_1^{|k|} + A_2 α_2^{|k|}   (1.28)

Example 1.8 Consider the following AR(2) process

Z_t = (1/3)Z_{t-1} + (2/9)Z_{t-2} + a_t

1. Show that the model is stationary.

2. Showing all your working, deduce that the autocorrelation function of Z_t is given by:

ρ_k = (16/21)(2/3)^{|k|} + (5/21)(-1/3)^{|k|}   for k = 0, ±1, ±2, ...

Solution 1.7 1. In terms of the backward shift operator, [1 - (1/3)B - (2/9)B²]Z_t = a_t. The characteristic equation is

1 - (1/3)B - (2/9)B² = 0 ⇒ 2B² + 3B - 9 = 0 ⇒ (2B - 3)(B + 3) = 0 ⇒ B = 3/2 or B = -3

Since both roots exceed unity in absolute value, the model is stationary.

2. The Yule-Walker equations of Z_t = (1/3)Z_{t-1} + (2/9)Z_{t-2} + a_t are obtained by multiplying Z_t by Z_{t-k} and taking expectations:

Z_t Z_{t-k} = (1/3)Z_{t-1}Z_{t-k} + (2/9)Z_{t-2}Z_{t-k} + a_t Z_{t-k}
E[Z_t Z_{t-k}] = (1/3)E[Z_{t-1}Z_{t-k}] + (2/9)E[Z_{t-2}Z_{t-k}] + E[a_t Z_{t-k}]
ρ_k = (1/3)ρ_{k-1} + (2/9)ρ_{k-2}

S1. Set ρ_k = α^k, ρ_0 = 1 and ρ_k = ρ_{-k}, and solve for α:

α^k = (1/3)α^{k-1} + (2/9)α^{k-2}
⇒ α^k - (1/3)α^{k-1} - (2/9)α^{k-2} = 0
⇒ α^{k-2}[α² - (1/3)α - 2/9] = 0   (1.29)
⇒ α_1 = 2/3 and α_2 = -1/3

S2. Substitute α_1 = 2/3 and α_2 = -1/3 into

ρ_k = A_1 α_1^k + A_2 α_2^k = A_1 (2/3)^k + A_2 (-1/3)^k

and solve for A_1 and A_2 by setting k = 0 and k = 1:

when k = 0: ρ_0 = A_1 + A_2
when k = 1: ρ_1 = (2/3)A_1 - (1/3)A_2

From the initial conditions ρ_0 = 1 and ρ_1 = ρ_{-1}, together with the recursion ρ_k = φ_1 ρ_{k-1} + φ_2 ρ_{k-2} = (1/3)ρ_{k-1} + (2/9)ρ_{k-2}:

when k = 1: ρ_1 = (1/3)ρ_0 + (2/9)ρ_{-1} = 1/3 + (2/9)ρ_1 ⇒ ρ_1 = 3/7

⇒ ρ_0 = 1 = A_1 + A_2
⇒ ρ_1 = 3/7 = (2/3)A_1 - (1/3)A_2   (1.30)
⇒ A_1 = 16/21 and A_2 = 5/21

S3. Substitute α_1 = 2/3, α_2 = -1/3, A_1 = 16/21 and A_2 = 5/21 into

ρ_k = A_1 α_1^{|k|} + A_2 α_2^{|k|}   (1.31)

to get

ρ_k = (16/21)(2/3)^{|k|} + (5/21)(-1/3)^{|k|},   k = 0, ±1, ±2, ...   (1.32)
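The closed form can be verified against the Yule-Walker recursion numerically; a sketch of our own (not part of the notes):

```python
def rho_closed(k):
    """Claimed acf of Example 1.8."""
    k = abs(k)
    return (16 / 21) * (2 / 3) ** k + (5 / 21) * (-1 / 3) ** k

def rho_recursive(kmax):
    """Yule-Walker recursion rho_k = (1/3) rho_{k-1} + (2/9) rho_{k-2},
    started from rho_0 = 1 and rho_1 = 3/7."""
    r = [1.0, 3 / 7]
    for k in range(2, kmax + 1):
        r.append((1 / 3) * r[k - 1] + (2 / 9) * r[k - 2])
    return r

# The two computations agree to floating-point accuracy.
r = rho_recursive(10)
for k in range(11):
    assert abs(r[k] - rho_closed(k)) < 1e-12
```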

Example 1.9 Consider the following AR(2) process

Z_t = (1/2)Z_{t-1} - (1/16)Z_{t-2} + a_t

1. Show that {Z_t} is a stationary process.

2. Showing all your working, deduce that the autocorrelation function of Z_t is given by:

ρ_k = [1 + (15/17)|k|] (1/4)^{|k|}   for k = 0, ±1, ±2, ...

Solution 1.8 1. Required to show that {Z_t} is a stationary process.

Z_t = (1/2)Z_{t-1} - (1/16)Z_{t-2} + a_t
⇒ Z_t - (1/2)BZ_t + (1/16)B²Z_t = a_t
⇒ [1 - (1/2)B + (1/16)B²]Z_t = a_t
⇒ 1 - (1/2)B + (1/16)B² = 0
⇒ (B - 4)² = 0 ⇒ B = 4 (twice)

Thus Z_t is a stationary process since the roots of the characteristic polynomial lie outside the unit circle.

2. Required to show that the autocorrelation function of Z_t is given by:

ρ_k = [1 + (15/17)|k|] (1/4)^{|k|}   for k = 0, ±1, ±2, ...

The Yule-Walker equations of Z_t = (1/2)Z_{t-1} - (1/16)Z_{t-2} + a_t are obtained by multiplying Z_t by Z_{t-k} and taking expectations:

Z_t Z_{t-k} = (1/2)Z_{t-1}Z_{t-k} - (1/16)Z_{t-2}Z_{t-k} + a_t Z_{t-k}
E[Z_t Z_{t-k}] = (1/2)E[Z_{t-1}Z_{t-k}] - (1/16)E[Z_{t-2}Z_{t-k}] + E[a_t Z_{t-k}]
ρ_k = (1/2)ρ_{k-1} - (1/16)ρ_{k-2}

S1. Set ρ_k = α^k, ρ_0 = 1 and ρ_k = ρ_{-k}, and solve for α:

α^k = (1/2)α^{k-1} - (1/16)α^{k-2}
⇒ α^k - (1/2)α^{k-1} + (1/16)α^{k-2} = 0
⇒ α^{k-2}[α² - (1/2)α + 1/16] = 0   (1.33)
⇒ (α - 1/4)² = 0 ⇒ α = 1/4 (twice)

S2. Substitute α = 1/4 into

ρ_k = [A_1 + A_2 k] α^k = [A_1 + A_2 k] (1/4)^k

and solve for A_1 and A_2 by setting k = 0 and k = 1:

when k = 0: ρ_0 = [A_1 + A_2(0)] (1/4)⁰ = A_1
when k = 1: ρ_1 = [A_1 + A_2(1)] (1/4) = (1/4)[1 + A_2]

From the initial conditions ρ_0 = 1 and ρ_1 = ρ_{-1}, together with the recursion ρ_k = (1/2)ρ_{k-1} - (1/16)ρ_{k-2}:

when k = 1: ρ_1 = (1/2)ρ_0 - (1/16)ρ_{-1} = 1/2 - (1/16)ρ_1 ⇒ ρ_1 = 8/17

⇒ ρ_0 = 1 = A_1
⇒ ρ_1 = 8/17 = (1/4)[1 + A_2]   (1.34)
⇒ A_1 = 1 and A_2 = 15/17

S3. Substitute α = 1/4, A_1 = 1 and A_2 = 15/17 into

ρ_k = [A_1 + A_2 k] α^{|k|}   (1.35)

to get

ρ_k = [1 + (15/17)|k|] (1/4)^{|k|},   k = 0, ±1, ±2, ...   (1.36)

Example 1.10 Consider the following AR(2) process

Z_t = Z_{t-1} - (1/2)Z_{t-2} + a_t

1. Show that the model is stationary.

2. Showing all your working, deduce that the autocorrelation function of Z_t is given by:

ρ_k = (1/2)^{k/2} [cos(kπ/4) + (1/3)sin(kπ/4)]   for k = 0, 1, 2, ..., with ρ_{-k} = ρ_k

Solution 1.9 1. The characteristic equation is

1 - B + (1/2)B² = 0 ⇒ B² - 2B + 2 = 0 ⇒ B = 1 ± i

Since |B| = √2 > 1, both roots lie outside the unit circle and the model is stationary.

2. The Yule-Walker recursion is ρ_k = ρ_{k-1} - (1/2)ρ_{k-2}. Setting ρ_k = α^k gives the auxiliary equation α² - α + 1/2 = 0, with complex roots α = (1 ± i)/2, so that

R = √(1/4 + 1/4) = (1/2)^{1/2}   and   θ = tan⁻¹(1) = π/4

and the general solution is ρ_k = R^k [A_1 cos kθ + A_2 sin kθ]. Setting k = 0 gives A_1 = ρ_0 = 1. For an AR(2), ρ_1 = φ_1/(1 - φ_2) = 1/(1 + 1/2) = 2/3, so

ρ_1 = (1/2)^{1/2} [cos(π/4) + A_2 sin(π/4)] = (1/2)(1 + A_2) = 2/3 ⇒ A_2 = 1/3

Hence

ρ_k = (1/2)^{k/2} [cos(kπ/4) + (1/3)sin(kπ/4)]   for k = 0, 1, 2, ..., with ρ_{-k} = ρ_k
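As with the previous examples, the complex-root solution can be checked numerically against the recursion ρ_k = ρ_{k-1} - (1/2)ρ_{k-2}; a sketch of our own (not part of the notes):

```python
import math

def rho_closed(k):
    """Claimed acf of Example 1.10 (using |k| so that rho_k = rho_{-k})."""
    k = abs(k)
    return 0.5 ** (k / 2) * (math.cos(k * math.pi / 4) + (1 / 3) * math.sin(k * math.pi / 4))

assert abs(rho_closed(0) - 1.0) < 1e-12
assert abs(rho_closed(1) - 2 / 3) < 1e-12      # rho_1 = phi_1 / (1 - phi_2) = 2/3
# The closed form satisfies the Yule-Walker recursion at every lag.
for k in range(2, 20):
    assert abs(rho_closed(k) - (rho_closed(k - 1) - 0.5 * rho_closed(k - 2))) < 1e-12
```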

1.6 Partial Autocorrelation Function (PACF)


Definition 1.6 Let {Z_t} be a stationary time series. The PACF at lag k, k = 1, 2, 3, ..., is defined by

φ_kk = Corr(Z_t, Z_{t-k} | Z_{t-1}, Z_{t-2}, ..., Z_{t-(k-1)})

Note

i. The PACF φ_kk is a conditional autocorrelation: the correlation between Z_t and Z_{t-k} after removing the effect of the intervening variables Z_{t-1}, Z_{t-2}, ..., Z_{t-(k-1)}.

ii. φ_11 = Corr(Z_t, Z_{t-1}) = ρ(1)


Lemma 1.2 Let {Z_t} be a stationary time series. Then the PACF at lag k is the value of the coefficient φ_k in the autoregression

Z_t = φ_1 Z_{t-1} + ... + φ_k Z_{t-k} + a_t

where the coefficients φ_1, ..., φ_k minimize E[Z_t - φ_1 Z_{t-1} - ... - φ_k Z_{t-k}]² = E(a_t²).

Example 1.11 Let Z_t = φ_1 Z_{t-1} + a_t. Find φ_11.

Solution 1.10

E(a_t²) = E[Z_t - φ_1 Z_{t-1}]² = F(φ_1)
= E[Z_t² - 2φ_1 Z_t Z_{t-1} + φ_1² Z_{t-1}²]
= γ(0) - 2φ_1 γ(1) + φ_1² γ(0)

dF(φ_1)/dφ_1 = -2γ(1) + 2φ_1 γ(0) = 0
⇒ φ_1 = γ(1)/γ(0) = ρ_1,   so φ_11 = ρ(1)
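More generally, the PACF can be computed from a known ACF with the Durbin-Levinson recursion (a standard algorithm, though not derived in these notes). The sketch below (our own function names) recovers φ_11 = ρ(1) for an AR(1) and shows the PACF cutting off after lag 1:

```python
def pacf_from_acf(rho, kmax):
    """Durbin-Levinson recursion: PACF values phi_kk from a theoretical ACF rho(k)."""
    phi = {}            # phi[(k, j)]: j-th coefficient of the order-k autoregression
    pacf = []
    for k in range(1, kmax + 1):
        if k == 1:
            phi[(1, 1)] = rho(1)
        else:
            num = rho(k) - sum(phi[(k - 1, j)] * rho(k - j) for j in range(1, k))
            den = 1 - sum(phi[(k - 1, j)] * rho(j) for j in range(1, k))
            phi[(k, k)] = num / den
            for j in range(1, k):
                phi[(k, j)] = phi[(k - 1, j)] - phi[(k, k)] * phi[(k - 1, k - j)]
        pacf.append(phi[(k, k)])
    return pacf

phi1 = 0.6
pacf = pacf_from_acf(lambda k: phi1 ** k, 5)
# For an AR(1): phi_11 = rho(1) = phi and phi_kk = 0 for k >= 2 (cut-off at lag 1).
assert abs(pacf[0] - phi1) < 1e-9
assert all(abs(v) < 1e-9 for v in pacf[1:])
```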

Note: The Behaviour of ACF and PACF

1. For an MA(q) process
   i. the ACF cuts off after lag q;
   ii. the PACF tails off.

2. For an AR(p) process
   i. the ACF tails off;
   ii. the PACF cuts off after lag p.

3. For an ARMA(p, q) process
   i. the ACF tails off;
   ii. the PACF tails off.

1.7 The Mixed Autoregressive-Moving Average Process


A general mixed autoregressive-moving average process is a process obtained by combining an MA(q) and an AR(p) process. We denote this process ARMA(p, q) and it is given by

Z_t = φ_1 Z_{t-1} + φ_2 Z_{t-2} + ... + φ_p Z_{t-p} + a_t - θ_1 a_{t-1} - ... - θ_q a_{t-q}

We say that the process {Z_t} is a mixed autoregressive-moving average process of orders p and q respectively. Since the ARMA(p, q) is a hybrid of two processes, the features of both processes still prevail. For this general ARMA(p, q) we state the following facts without proof.

i. Subject to a_t being independent of Z_{t-1}, Z_{t-2}, ..., a stationary solution for an ARMA process exists if and only if the roots of the AR(p) characteristic equation Φ(x) = 1 - φ_1 x - φ_2 x² - ... - φ_p x^p = 0 all exceed unity in modulus.

ii. An ARMA(p, q) process is invertible if and only if the roots of the MA(q) characteristic equation Θ(x) = 1 - θ_1 x - θ_2 x² - ... - θ_q x^q = 0 all exceed unity in modulus.
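For the ARMA(1,1) special case Z_t - φZ_{t-1} = a_t - θa_{t-1}, facts (i) and (ii) reduce to |φ| < 1 and |θ| < 1, since the single roots of Φ and Θ are 1/φ and 1/θ. A quick sketch (our own helper, not part of the notes):

```python
def arma11_check(phi, theta):
    """Stationarity and invertibility for Z_t - phi Z_{t-1} = a_t - theta a_{t-1}.
    Phi(x) = 1 - phi x has root 1/phi; Theta(x) = 1 - theta x has root 1/theta.
    Both roots must exceed 1 in modulus, i.e. |phi| < 1 and |theta| < 1."""
    stationary = phi == 0 or abs(1 / phi) > 1
    invertible = theta == 0 or abs(1 / theta) > 1
    return stationary, invertible

assert arma11_check(0.5, 0.3) == (True, True)    # stationary and invertible
assert arma11_check(1.2, 0.3) == (False, True)   # AR root inside the unit circle
assert arma11_check(0.5, 2.0) == (True, False)   # MA root inside the unit circle
```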
