HSTS203 Time Series
L Dhliwayo
Department of Statistics
University of Zimbabwe
Contents

1 Models for Stationary Time Series
  1.1 What this Unit is all About
  1.2 The General Linear Process
  1.3 Autocovariance Generating Function
  1.4 Moving Average Process of Order q, MA(q)
      1.4.1 Invertibility Condition of a Moving Average Process
  1.5 Autoregressive Process of Order p, AR(p)
      1.5.1 Autoregressive Process of Order 1, AR(1)
      1.5.2 Stationarity of an Autoregressive Process
      1.5.3 Autocorrelation Function of an AR(p)
      1.5.4 Autocorrelation Function of an AR(2)
  1.6 Partial Autocorrelation Function (PACF)
  1.7 The Mixed Autoregressive-Moving Average Process
Unit 1
Models for Stationary Time Series
Unit Objectives
By the end of this unit, you should be able to:
1. define a white noise process;
2. define the general linear process and state the condition for it to be well defined;
3. derive the autocovariance generating function of a stationary process;
4. define the moving average process MA(q) and derive its acf;
5. state and apply the invertibility condition of a moving average process;
6. define the autoregressive process AR(p) and state its stationarity condition;
7. derive and solve the Yule-Walker equations for the acf of an AR(p);
8. define the partial autocorrelation function and the mixed ARMA(p, q) process.
1.1 What this Unit is all About

Definition 1.1 Consider a sequence of random variables {a_t}. This process is said to be white noise if the random variables are independent, identically distributed with E(a_t) = µ for all t and constant variance σ_a².

White noise is important because many useful processes can be constructed from it.
1.2 The General Linear Process

A general linear process {Z_t} is one that can be expressed as a linear combination of present and past values of a purely random (white noise) process:

Z_t = Σ_{j=0}^{∞} ψ_j a_{t−j},  ψ_0 = 1    (1.1)

The right-hand side of equation (1.1) is well defined provided we impose the restriction

Σ_{j=0}^{∞} ψ_j² < ∞    (1.2)

The condition Σ_{j=0}^{∞} ψ_j² < ∞ ensures that the linear representation Σ_{j=0}^{∞} ψ_j a_{t−j} is mathematically meaningful. More precisely, it implies that Σ_{j=0}^{∞} ψ_j a_{t−j} converges in mean square to Z_t.
Note: A sequence X̄_n converges in probability to µ if

P(|X̄_n − µ| < ξ) → 1 as n → ∞    (1.3)

Convergence in mean square is the stronger statement used here:

lim_{n→∞} E[Σ_{j=0}^{n} ψ_j a_{t−j} − Z_t]² = 0    (1.5)
Theorem 1.1 Let {Z_t} be a general linear process. Then {Z_t} is stationary; that is, every general linear process is stationary.
Proof 1.1 Since Σ_{j=0}^{∞} ψ_j² < ∞, the series converges in mean square, so we can interchange summation and expectation.

(i) E(Z_t) = E[Σ_j ψ_j a_{t−j}] = Σ_j ψ_j E(a_{t−j}) = Σ_j ψ_j × 0 = 0
(ii)

Cov(Z_t, Z_{t−k}) = E(Z_t Z_{t−k}) − E(Z_t)E(Z_{t−k})
                  = E(Z_t Z_{t−k})
                  = E[(Σ_i ψ_i a_{t−i})(Σ_j ψ_j a_{t−k−j})]
                  = σ_a² Σ_{j=0}^{∞} ψ_j ψ_{j+k},  k ∈ N

since E(a_s a_u) = σ_a² when s = u and 0 otherwise. Neither E(Z_t) nor Cov(Z_t, Z_{t−k}) depends on t, so {Z_t} is stationary.
Note:
1. The autocovariance function of a general linear process is γ(k) = σ_a² Σ_{j=0}^{∞} ψ_j ψ_{j+k}, k ∈ N.
2. A general linear process with non-zero mean can be obtained by adding a constant µ to the right-hand side of the equation:

Z_t = µ + Σ_{j=0}^{∞} ψ_j a_{t−j}    (1.7)
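To make the definition concrete, the following sketch simulates a (truncated) general linear process with non-zero mean. It is a minimal sketch assuming NumPy is available; the weights ψ_j = 0.5^j, the truncation point J = 200 and the sample size are illustrative choices, not taken from the text.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (not from the text): psi_j = 0.5**j is square-summable.
J, n, mu, sigma_a = 200, 5000, 10.0, 1.0
psi = 0.5 ** np.arange(J)

a = rng.normal(0.0, sigma_a, size=n + J)          # white noise a_t
# Z_t = mu + sum_j psi_j a_{t-j}, computed as a convolution of a with psi.
Z = mu + np.convolve(a, psi, mode="valid")[:n]

print(Z.mean())   # close to mu = 10
print(Z.var())    # close to sigma_a^2 * sum(psi_j^2) = 1/(1 - 0.25) = 4/3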
1.3 Autocovariance Generating Function

Definition 1.3 Let {γ(k)} be the autocovariance function of a stationary time series {Z_t}. Then the autocovariance generating function of Z_t is defined by

Γ(s) = Σ_{k=−∞}^{∞} γ(k) s^k    (1.8)
Example 1.1 Find the autocovariance generating function of a stationary time series with γ(0) = 1, γ(±1) = 1/2 and γ(k) = 0 for |k| ≥ 2.

Solution 1.1

Γ(s) = Σ_k γ(k)s^k = (1/2)s^{−1} + 1 + (1/2)s = 1 + s/2 + 1/(2s)
Theorem 1.2 Let {Z_t} be a general linear process with ψ(s) = Σ_{j=0}^{∞} ψ_j s^j. Then its autocovariance generating function is Γ(s) = σ_a² ψ(s)ψ(1/s).

Proof 1.2 The condition Σ_{j=0}^{∞} ψ_j² < ∞ implies that Z_t is stationary with γ(k) = σ_a² Σ_{j=0}^{∞} ψ_j ψ_{j+k} (taking ψ_m = 0 for m < 0). Thus

Γ(s) = Σ_k γ(k)s^k
     = Σ_k [σ_a² Σ_{j=0}^{∞} ψ_j ψ_{j+k}] s^k
     = σ_a² Σ_{j=0}^{∞} ψ_j s^{−j} Σ_{k=−j}^{∞} ψ_{j+k} s^{j+k}
     = σ_a² Σ_{j=0}^{∞} ψ_j s^{−j} Σ_{h=0}^{∞} ψ_h s^h    (putting h = j + k)
     = σ_a² ψ(1/s) ψ(s)
Example 1.2 Find the autocovariance generating function of the general linear process

Z_t = Σ_{j=0}^{∞} (1/3)^j a_{t−j}    (1.11)

Solution 1.2 Here ψ_j = (1/3)^j, so
ψ(s) = Σ_{j=0}^{∞} (1/3)^j s^j = Σ_{j=0}^{∞} (s/3)^j = 1/(1 − s/3) = 3/(3 − s)

ψ(1/s) = Σ_{j=0}^{∞} (1/3)^j s^{−j} = Σ_{j=0}^{∞} (1/(3s))^j = 1/(1 − 1/(3s)) = 3/(3 − 1/s)

Γ(s) = σ_a² ψ(s)ψ(1/s) = σ_a² · [3/(3 − s)] · [3/(3 − 1/s)] = 9σ_a² / [(3 − s)(3 − 1/s)]
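As a quick numerical check of Example 1.2 (a sketch assuming NumPy; the truncation points and the test value of s are arbitrary choices), one can compute the autocovariances γ(k) = σ_a² Σ_j ψ_j ψ_{j+|k|} directly and compare Σ_k γ(k)s^k with the closed form 9σ_a²/[(3 − s)(3 − 1/s)].

import numpy as np

sigma2 = 1.0
J = 60                                  # truncation point (illustrative)
psi = (1.0 / 3.0) ** np.arange(J)

def gamma(k):
    # gamma(k) = sigma_a^2 * sum_j psi_j * psi_{j+|k|}
    k = abs(k)
    return sigma2 * np.sum(psi[: J - k] * psi[k:])

s = 0.7                                 # any s with 1/3 < |s| < 3
acgf = sum(gamma(k) * s ** k for k in range(-30, 31))
closed = 9 * sigma2 / ((3 - s) * (3 - 1 / s))
print(acgf, closed)                     # the two values agree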
Definition 1.4 (Backward Shift Operator) The backward shift operator B is defined by

BX_t = X_{t−1},  B²X_t = X_{t−2},  ...,  B^s X_t = X_{t−s}
The general linear process can be expressed in terms of the backward shift operator. Let

ψ(B) = Σ ψ_j B^j = 1 + ψ_1B + ψ_2B² + ψ_3B³ + ...

Thus

Z_t = a_t + ψ_1Ba_t + ψ_2B²a_t + ψ_3B³a_t + ...
    = (1 + ψ_1B + ψ_2B² + ψ_3B³ + ...)a_t
    = ψ(B)a_t
1.4 Moving Average Process of Order q, MA(q)

A process {Z_t} is called a moving average process of order q, MA(q), if it satisfies the model

Z_t = a_t − θ_1a_{t−1} − ... − θ_qa_{t−q}

where {a_t} is a white noise process, i.e. a sequence of independent and identically distributed random variables each with mean 0 and variance σ_a².
The MA(q) process can be written as a general linear process:

Z_t = Σ_{j=0}^{q} ψ_j a_{t−j},  where ψ_0 = 1 and ψ_j = −θ_j for j = 1, ..., q.

Then
(i) E(Z_t) = 0
(ii) Var(Z_t) = σ_a²(1 + θ_1² + ... + θ_q²) = σ_a² Σ_{i=0}^{q} ψ_i²
(iii) Cov(Z_t, Z_{t−k}) = σ_a² Σ_{i=0}^{q−k} ψ_i ψ_{i+k} for k = 0, 1, ..., q, and 0 for k > q
(iv) ρ_k = Corr(Z_t, Z_{t−k}) = Cov(Z_t, Z_{t−k}) / Var(Z_t) = [Σ_{i=0}^{q−k} ψ_i ψ_{i+k}] / [Σ_{i=0}^{q} ψ_i²]
Note that a moving average process {Z_t} is stationary, since neither E(Z_t) nor the autocovariance function depends on time t. This implies that all moving average processes are stationary.
Example 1.3 Let Z_t = a_t − (1/2)a_{t−1}. Find
a. E(Z_t)
b. Var(Z_t)
c. Cov(Z_t, Z_{t−k})
d. Corr(Z_t, Z_{t−k})

Solution 1.3
a. E(Z_t) = 0
b. Var(Z_t) = σ_a²(1 + 1/4) = (5/4)σ_a²
c. For k ≥ 2, Cov(Z_t, Z_{t−k}) = 0, so

γ(k) = (5/4)σ_a²   for k = 0
     = −(1/2)σ_a²  for k = ±1
     = 0           otherwise
d.

ρ(k) = Corr(Z_t, Z_{t−k}) = γ(k)/γ(0)
     = (5/4)σ_a² / (5/4)σ_a² = 1       for k = 0
     = −(1/2)σ_a² / (5/4)σ_a² = −2/5   for k = ±1
     = 0                               otherwise
Example 1.4 Let

Z_t = a_t − 2a_{t−1}    (1.14)

Find
a. E(Z_t)
b. Var(Z_t)
c. Cov(Z_t, Z_{t−k})
d. Corr(Z_t, Z_{t−k})

Solution 1.4
a. E(Z_t) = 0
b. Var(Z_t) = σ_a²(1 + 4) = 5σ_a²
c. For k ≥ 2, Cov(Z_t, Z_{t−k}) = 0, so

γ(k) = 5σ_a²   for k = 0
     = −2σ_a²  for k = ±1
     = 0       otherwise

d.

ρ(k) = γ(k)/γ(0)
     = 5σ_a² / 5σ_a² = 1       for k = 0
     = −2σ_a² / 5σ_a² = −2/5   for k = ±1
     = 0                       otherwise
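The next subsection shows that these two processes cannot be told apart from their acfs. A quick simulation illustrates the point (a minimal sketch assuming NumPy; the sample size is arbitrary): both series give a sample lag-1 autocorrelation near −2/5.

import numpy as np

rng = np.random.default_rng(1)
n = 100_000
a = rng.normal(size=n + 1)

Za = a[1:] - 0.5 * a[:-1]    # Z_t = a_t - (1/2)a_{t-1}  (Example 1.3)
Zb = a[1:] - 2.0 * a[:-1]    # Z_t = a_t - 2a_{t-1}      (Example 1.4)

def rho1(z):
    # sample lag-1 autocorrelation
    z = z - z.mean()
    return np.sum(z[1:] * z[:-1]) / np.sum(z * z)

print(rho1(Za), rho1(Zb))    # both approximately -0.4 = -2/5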
1.4.1 Invertibility Condition of a Moving Average Process

Consider the two MA(1) processes
a. Z_t = a_t − (1/2)a_{t−1}
b. Z_t = a_t − 2a_{t−1}
From Examples 1.3 and 1.4, both processes have exactly the same autocorrelation function. This means that we cannot specify or identify an MA process uniquely from a given acf. But if we express the two models by writing a_t in terms of Z_t, Z_{t−1}, ..., we find by successive substitution that:
Z_t = a_t − θa_{t−1} = (1 − θB)a_t
⇒ a_t = (1 − θB)^{−1}Z_t = Z_t + θZ_{t−1} + θ²Z_{t−2} + ...    (1.15)

Z_t = a_t − (1/θ)a_{t−1} = (1 − (1/θ)B)a_t
⇒ a_t = (1 − (1/θ)B)^{−1}Z_t = Z_t + (1/θ)Z_{t−1} + (1/θ)²Z_{t−2} + ...  [binomial expansion of (1 − (1/θ)B)^{−1}]    (1.16)
If |θ| < 1, the series for model (1.15) converges whereas that for model (1.16) does not. Model (1.15) is said to be invertible whereas model (1.16) is not. The imposition of the invertibility condition ensures that there is a unique moving average process for a given acf.
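The convergence claim can be illustrated numerically. The sketch below (assuming NumPy; θ = 1/2 and the truncation point J are illustrative choices) reconstructs a_t from the invertible model via expansion (1.15); the analogous expansion (1.16) with weights (1/θ)^j applied to the non-invertible model blows up instead of converging.

import numpy as np

rng = np.random.default_rng(3)
theta, n, J = 0.5, 2000, 50
a = rng.normal(size=n)

Za = a.copy()
Za[1:] -= theta * a[:-1]          # invertible:     Z_t = a_t - 0.5 a_{t-1}
Zb = a.copy()
Zb[1:] -= (1 / theta) * a[:-1]    # non-invertible: Z_t = a_t - 2 a_{t-1}

t = n - 1
a_hat = sum(theta ** j * Za[t - j] for j in range(J))
print(a[t], a_hat)                # expansion (1.15) converges to a_t

bad = sum((1 / theta) ** j * Zb[t - j] for j in range(J))
print(bad)                        # expansion (1.16) diverges: weights 2^j grow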
Thus the general MA(q) may be expressed as Z_t = θ(B)a_t, and it is invertible if the roots of the characteristic equation

θ(B) = 1 − θ_1B − θ_2B² − ... − θ_qB^q = 0    (1.18)

all lie outside the unit circle.
Example 1.5 Let Z_t = a_t − (1/2)a_{t−1}. Find
a. Cov(Z_t, Z_{t−k})
b. Is Z_t an invertible process?
c. Corr(Z_t, Z_{t−k})

Solution 1.5
a. For k ≥ 2, Cov(Z_t, Z_{t−k}) = 0, so

γ(k) = (5/4)σ_a²   for k = 0
     = −(1/2)σ_a²  for k = ±1
     = 0           otherwise
b.

Z_t = a_t − (1/2)a_{t−1} = (1 − (1/2)B)a_t
⇒ θ(B) = 1 − (1/2)B = 0 ⇒ B = 2

Since the root B = 2 lies outside the unit circle, Z_t is invertible.
c.

ρ(k) = γ(k)/γ(0)
     = (5/4)σ_a² / (5/4)σ_a² = 1       for k = 0
     = −(1/2)σ_a² / (5/4)σ_a² = −2/5   for k = ±1
     = 0                               otherwise
For an MA(2) process Z_t = a_t − θ_1a_{t−1} − θ_2a_{t−2}, the invertibility conditions are:
1. |θ_2| < 1
2. θ_1 + θ_2 < 1
3. θ_2 − θ_1 < 1
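Equivalently, invertibility can be checked by computing the roots of θ(B) = 0 directly. A minimal sketch (assuming NumPy; the coefficient values are illustrative):

import numpy as np

def is_invertible_ma2(theta1, theta2):
    # Roots of theta(B) = 1 - theta1*B - theta2*B^2 = 0.
    # np.roots takes coefficients in decreasing powers of B.
    roots = np.roots([-theta2, -theta1, 1.0])
    return bool(np.all(np.abs(roots) > 1.0))

print(is_invertible_ma2(0.5, 0.3))   # True:  conditions 1-3 all hold
print(is_invertible_ma2(0.5, 0.8))   # False: theta1 + theta2 = 1.3 > 1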
1.5 Autoregressive Process of Order p, AR(p)

A process {Z_t} is called an autoregressive process of order p, AR(p), if it satisfies the model

Z_t = φ_1Z_{t−1} + ... + φ_pZ_{t−p} + a_t

where {a_t} is a white noise process such that a_t is independent of Z_{t−1}, Z_{t−2}, ....
The current value of the series, Z_t, is a linear combination of the p most recent past values of itself plus an innovation term a_t, which incorporates everything in the series at time t that is not explained by the past values.
1.5.1 Autoregressive Process of Order 1, AR(1)

A process {Z_t}_{t=−∞}^{∞} is called an autoregressive process of order 1, AR(1), if it satisfies the model

Z_t = φZ_{t−1} + a_t
Note that E(Z_t) = 0. Multiplying the model by Z_{t−k} and taking expectations gives γ(k) = φγ(k−1) for k ≥ 1, together with γ(0) = φγ(1) + σ_a².
For k = 0:

γ(0) = σ_a² / (1 − φ²)

For k = 1:

γ(1) = φσ_a² / (1 − φ²) = φγ(0)
Hence the autocorrelation function is

ρ(k) = 1   for k = 0
     = φ   for k = 1
     = φ²  for k = 2
     ...

that is, ρ(k) = φ^{|k|} for all k.
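A simulation check of this result (a minimal sketch assuming NumPy; φ = 0.7 and the sample size are illustrative): the sample autocorrelations of a simulated AR(1) should decay geometrically as φ^k.

import numpy as np

rng = np.random.default_rng(2)
n, phi = 100_000, 0.7

# simulate Z_t = phi * Z_{t-1} + a_t
a = rng.normal(size=n)
Z = np.zeros(n)
for t in range(1, n):
    Z[t] = phi * Z[t - 1] + a[t]

def rho(z, k):
    # sample lag-k autocorrelation
    z = z - z.mean()
    return np.sum(z[k:] * z[:-k]) / np.sum(z * z)

for k in (1, 2, 3):
    print(rho(Z, k), phi ** k)   # sample value vs theoretical phi^k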
1.5.2 Stationarity of an Autoregressive Process

In terms of the backward shift operator, the AR(p) model can be written as

(1 − φ_1B − φ_2B² − ... − φ_pB^p)Z_t = a_t

i.e. φ(B)Z_t = a_t. The AR(p) process is stationary if the roots of the characteristic equation φ(B) = 0 all lie outside the unit circle.
Example 1.6 Determine whether each of the following AR(p) processes is stationary or not.
1. Z_t = (1/2)Z_{t−1} + a_t
2. Z_t = (1/12)Z_{t−1} + (1/12)Z_{t−2} + a_t

Solution 1.6
1. φ(B) = 1 − (1/2)B = 0 ⇒ B = 2. The root lies outside the unit circle, so Z_t is stationary.
2. φ(B) = 1 − (1/12)B − (1/12)B² = 0 ⇒ B² + B − 12 = 0 ⇒ (B − 3)(B + 4) = 0 ⇒ B = 3 or B = −4. Both roots lie outside the unit circle, so Z_t is stationary.
For an AR(2) process Z_t = φ_1Z_{t−1} + φ_2Z_{t−2} + a_t, the stationarity conditions are:
1. |φ_2| < 1
2. φ_1 + φ_2 < 1
3. φ_2 − φ_1 < 1
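The root criterion can also be automated. A minimal sketch (assuming NumPy) that re-checks the two processes of Example 1.6:

import numpy as np

def is_stationary_ar(phi):
    # phi = [phi_1, ..., phi_p]; roots of 1 - phi_1*B - ... - phi_p*B^p = 0,
    # coefficients passed to np.roots in decreasing powers of B.
    coeffs = np.concatenate((-np.asarray(phi)[::-1], [1.0]))
    return bool(np.all(np.abs(np.roots(coeffs)) > 1.0))

print(is_stationary_ar([0.5]))          # True: root B = 2
print(is_stationary_ar([1/12, 1/12]))   # True: roots B = 3 and B = -4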
Example 1.7
1. Find the values of φ_1 for which Z_t = φ_1Z_{t−1} + (2/9)Z_{t−2} + a_t is stationary.
2. For φ_1 = 1/3, find the autocorrelation function of the process.
1.5.3 Autocorrelation Function of an AR(p)

A process {Z_t}_{t=−∞}^{∞} is called an autoregressive process of order p if it satisfies the model

Z_t = φ_1Z_{t−1} + ... + φ_pZ_{t−p} + a_t    (1.20)

where {a_t} is a white noise process such that a_t is independent of Z_{t−1}, Z_{t−2}, ....
To derive the autocovariance function, multiply both sides of the AR(p) model by Z_{t−k}, k = 1, 2, ..., and take expectations, assuming stationarity and that a_t and Z_{t−k} are independent:
E[Z_tZ_{t−k}] = E[φ_1Z_{t−1}Z_{t−k}] + E[φ_2Z_{t−2}Z_{t−k}] + ... + E[φ_pZ_{t−p}Z_{t−k}] + E[a_tZ_{t−k}]
⇒ γ(k) = φ_1γ(k−1) + φ_2γ(k−2) + ... + φ_pγ(k−p),  k ≥ 1    (1.21)

For k = 0, E[a_tZ_t] = σ_a², so that

γ(0) = σ_a² / [1 − φ_1ρ(1) − φ_2ρ(2) − ... − φ_pρ(p)]
Dividing equation (1.21) by the variance γ(0), we get the ACF ρ(k):

ρ(k) = φ_1ρ(k−1) + φ_2ρ(k−2) + ... + φ_pρ(k−p),  k ≥ 1    (1.22)
We call equation (1.22) the Yule-Walker equations. These equations can be solved recursively, or explicitly depending on the nature of the roots of the characteristic equation.
The Yule-Walker equations are a set of difference equations and they have a general solution. The general solution depends on the nature of the roots that are obtained, where the constants A_1, A_2, ... are determined from the initial conditions ρ_0 = 1 and ρ_{−1} = ρ_1.

1.5.4 Autocorrelation Function of an AR(2)

For an AR(2), if the roots α_1 and α_2 are distinct real roots, then the general solution is given by

ρ_k = A_1α_1^{|k|} + A_2α_2^{|k|}

More generally, a root α repeated m times contributes

ρ_k = [A_1 + A_2k + ... + A_mk^{m−1}]α^k    (1.24)

In particular, for an AR(2), if the roots are identical real roots (α_1 = α_2 = α), then the general solution is given by

ρ_k = [A_1 + A_2k]α^k
For an AR(2) process, the ACF satisfies

ρ_k = φ_1ρ_{k−1} + φ_2ρ_{k−2}    (1.27)
Solution 1.7
1. By the AR(2) stationarity conditions: |2/9| < 1, φ_1 + 2/9 < 1 and 2/9 − φ_1 < 1, so Z_t is stationary for −7/9 < φ_1 < 7/9.
2. The Yule-Walker equations of Z_t = (1/3)Z_{t−1} + (2/9)Z_{t−2} + a_t are obtained by taking expectations after multiplying Z_t by Z_{t−k}:

Z_tZ_{t−k} = (1/3)Z_{t−1}Z_{t−k} + (2/9)Z_{t−2}Z_{t−k} + a_tZ_{t−k}
⇒ ρ_k = (1/3)ρ_{k−1} + (2/9)ρ_{k−2}
S1. Try a solution of the form ρ_k = α^k:

α^k − (1/3)α^{k−1} − (2/9)α^{k−2} = 0
⇒ α^{k−2}[α² − (1/3)α − (2/9)] = 0    (1.29)
⇒ α_1 = 2/3 and α_2 = −1/3
S2. Substitute α_1 = 2/3 and α_2 = −1/3 into

ρ_k = A_1α_1^k + A_2α_2^k

and solve for A_1 and A_2 by setting k = 0 and k = 1:

ρ_k = A_1(2/3)^k + A_2(−1/3)^k
When k = 0: ρ_0 = A_1 + A_2
When k = 1: ρ_1 = (2/3)A_1 − (1/3)A_2
From the initial conditions ρ_0 = 1 and ρ_1 = ρ_{−1}, the recursion

ρ_k = φ_1ρ_{k−1} + φ_2ρ_{k−2} = (1/3)ρ_{k−1} + (2/9)ρ_{k−2}

with k = 1 gives ρ_1 = (1/3)ρ_0 + (2/9)ρ_{−1}, so ρ_1 = 3/7. Hence

ρ_0 = 1 = A_1 + A_2
ρ_1 = 3/7 = (2/3)A_1 − (1/3)A_2    (1.30)
⇒ A_1 = 16/21 and A_2 = 5/21
S3. Substitute α_1 = 2/3, α_2 = −1/3, A_1 = 16/21 and A_2 = 5/21 into

ρ_k = A_1α_1^{|k|} + A_2α_2^{|k|}    (1.31)

and we get

ρ_k = (16/21)(2/3)^{|k|} + (5/21)(−1/3)^{|k|},  k = 0, ±1, ±2, ...    (1.32)
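As a check on (1.32), the closed form can be compared with values generated directly from the Yule-Walker recursion ρ_k = (1/3)ρ_{k−1} + (2/9)ρ_{k−2} (a minimal sketch; nothing beyond the recursion itself is assumed):

# verify (1.32) against the Yule-Walker recursion of Solution 1.7
rho = [1.0, 3.0 / 7.0]                    # initial values rho_0, rho_1
for k in range(2, 8):
    rho.append(rho[k - 1] / 3.0 + 2.0 * rho[k - 2] / 9.0)

closed = [16.0 / 21.0 * (2.0 / 3.0) ** k + 5.0 / 21.0 * (-1.0 / 3.0) ** k
          for k in range(8)]

print(rho)
print(closed)                             # the two sequences agree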
Example 1.8 Find the autocorrelation function of the AR(2) process

Z_t = (1/2)Z_{t−1} − (1/16)Z_{t−2} + a_t

Solution 1.8 First check stationarity:

Z_t − (1/2)BZ_t + (1/16)B²Z_t = a_t
⇒ [1 − (1/2)B + (1/16)B²]Z_t = a_t
⇒ 1 − (1/2)B + (1/16)B² = 0
⇒ (B − 4)² = 0 ⇒ B = 4 (twice)

Thus Z_t is a stationary process since the roots of the characteristic polynomial lie outside the unit circle.
S1. The Yule-Walker equations give

ρ_k = (1/2)ρ_{k−1} − (1/16)ρ_{k−2}

Try ρ_k = α^k:

⇒ α^k − (1/2)α^{k−1} + (1/16)α^{k−2} = 0
⇒ α^{k−2}[α² − (1/2)α + (1/16)] = 0    (1.33)
⇒ (α − 1/4)² = 0 ⇒ α = 1/4 (twice)
S2. Substitute α = 1/4 into

ρ_k = [A_1 + A_2k]α^k

and solve for A_1 and A_2 by setting k = 0 and k = 1:

ρ_k = [A_1 + A_2k](1/4)^k
When k = 0: ρ_0 = [A_1 + A_2(0)](1/4)^0 = A_1
When k = 1: ρ_1 = [A_1 + A_2(1)](1/4) = (1/4)[1 + A_2]  (using A_1 = 1)
From the initial conditions ρ_0 = 1 and ρ_1 = ρ_{−1}, the recursion

ρ_k = φ_1ρ_{k−1} + φ_2ρ_{k−2} = (1/2)ρ_{k−1} − (1/16)ρ_{k−2}

with k = 1 gives ρ_1 = (1/2)ρ_0 − (1/16)ρ_{−1}, so ρ_1 = 8/17. Hence

ρ_0 = 1 = A_1
ρ_1 = 8/17 = (1/4)[1 + A_2]    (1.34)
⇒ A_1 = 1 and A_2 = 15/17
S3. Substituting α = 1/4, A_1 = 1 and A_2 = 15/17 into ρ_k = [A_1 + A_2|k|]α^{|k|}, we get

ρ_k = [1 + (15/17)|k|](1/4)^{|k|},  k = 0, ±1, ±2, ...    (1.36)
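The same check works in the repeated-root case (a minimal sketch; only the recursion ρ_k = (1/2)ρ_{k−1} − (1/16)ρ_{k−2} is assumed):

# verify (1.36) against the recursion rho_k = rho_{k-1}/2 - rho_{k-2}/16
rho = [1.0, 8.0 / 17.0]                   # initial values rho_0, rho_1
for k in range(2, 8):
    rho.append(rho[k - 1] / 2.0 - rho[k - 2] / 16.0)

closed = [(1.0 + 15.0 * k / 17.0) * 0.25 ** k for k in range(8)]

print(rho)
print(closed)                             # the two sequences agree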
Exercise Consider the AR(2) process

Z_t = Z_{t−1} − (1/2)Z_{t−2} + a_t

1. Show that Z_t is stationary.
2. Showing all your working, deduce that the autocorrelation function of Z_t is given by

ρ_k = (1/√2)^{|k|} [cos(kπ/4) + (1/3)sin(kπ/4)]  for k = 0, ±1, ±2, ...
1.6 Partial Autocorrelation Function (PACF)

Note
i. The PACF, φ_kk, is a conditional ACF: the correlation between Z_t and Z_{t−k} given, or after removing the effect of, the intervening variables Z_{t−1}, ..., Z_{t−k+1}.

Lemma 1.2 Let {Z_t} be a stationary time series. Then the PACF at lag k is the value of φ_k in the autoregressive fit

Z_t = φ_1Z_{t−1} + ... + φ_kZ_{t−k} + a_t

such that φ_1, ..., φ_k minimize E[Z_t − φ_1Z_{t−1} − ... − φ_kZ_{t−k}]² = E[a_t²].
Example 1.10 Show that the lag-1 PACF of a stationary process is φ_11 = ρ_1.

Solution 1.10 For k = 1,

E[a_t²] = E[Z_t − φ_1Z_{t−1}]² = F(φ_1)
dF(φ_1)/dφ_1 = −2γ(1) + 2φ_1γ(0) = 0
⇒ φ_11 = φ_1 = γ(1)/γ(0) = ρ_1
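Higher-order values φ_kk can be computed recursively from the acf by the Durbin-Levinson algorithm. The sketch below (assuming NumPy; the recursion is standard but is not derived in the text) feeds in the AR(1) acf ρ_k = φ^k from Section 1.5.1 and recovers φ_11 = ρ_1 = φ and φ_kk = 0 for k ≥ 2, as expected for an AR(1).

import numpy as np

def pacf(rho, kmax):
    # Durbin-Levinson recursion; rho[k] holds rho_k with rho[0] = 1.
    phi = np.zeros((kmax + 1, kmax + 1))
    phi[1, 1] = rho[1]
    out = [phi[1, 1]]
    for k in range(2, kmax + 1):
        num = rho[k] - sum(phi[k - 1, j] * rho[k - j] for j in range(1, k))
        den = 1.0 - sum(phi[k - 1, j] * rho[j] for j in range(1, k))
        phi[k, k] = num / den
        for j in range(1, k):
            phi[k, j] = phi[k - 1, j] - phi[k, k] * phi[k - 1, k - j]
        out.append(phi[k, k])
    return out

phi1 = 0.7
rho = [phi1 ** k for k in range(5)]   # AR(1) acf: rho_k = phi^k
print(pacf(rho, 4))                   # [0.7, 0.0, 0.0, 0.0]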
1.7 The Mixed Autoregressive-Moving Average Process

A process {Z_t} satisfying

Z_t = φ_1Z_{t−1} + ... + φ_pZ_{t−p} + a_t − θ_1a_{t−1} − ... − θ_qa_{t−q}

is said to be a mixed autoregressive-moving average process of order p and q respectively, written ARMA(p, q). Since the ARMA(p, q) is a hybrid of two processes, the features of these processes still prevail. For this general ARMA(p, q) we state the following facts without proof.
i. An ARMA(p, q) process is stationary if and only if the roots of the AR(p) characteristic equation Φ(x) = 1 − φ_1x − φ_2x² − ... − φ_px^p = 0 all exceed unity in modulus.
ii. An ARMA(p, q) process is invertible if and only if the roots of the MA(q) characteristic equation Θ(x) = 1 − θ_1x − θ_2x² − ... − θ_qx^q = 0 all exceed unity in modulus.
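Both conditions reduce to root checks that are easy to carry out numerically. A minimal sketch (assuming NumPy; the ARMA(1,1) coefficients are illustrative):

import numpy as np

def roots_outside_unit_circle(coeffs):
    # coeffs = [c_1, ..., c_m] for the polynomial 1 - c_1*x - ... - c_m*x^m,
    # passed to np.roots in decreasing powers of x.
    poly = np.concatenate((-np.asarray(coeffs)[::-1], [1.0]))
    return bool(np.all(np.abs(np.roots(poly)) > 1.0))

# ARMA(1,1): Z_t = 0.5 Z_{t-1} + a_t - 0.8 a_{t-1}
print(roots_outside_unit_circle([0.5]))   # True -> stationary  (root x = 2)
print(roots_outside_unit_circle([0.8]))   # True -> invertible  (root x = 1.25)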