
MA(q) & AR(p) ⇒ ARMA(p, q)

5. MODELS FOR STATIONARY TIME SERIES

5.3 Autoregressive Moving Average Processes (ARMA)


An ARMA model of order (p, q) is defined by

Yt = α1Yt−1 + ... + αpYt−p + Zt + β1Zt−1 + ... + βqZt−q        (5.3.1)

i.e.

Yt = Σ_{i=1..p} αi Yt−i + Σ_{i=0..q} βi Zt−i     (with β0 = 1)

which is an autoregressive moving average process of order ( p, q ) or an ARMA ( p, q )

In other words, it is a combination of AR(p) and MA(q) such that Yt is expressed as a linear combination of previous Y's plus current and past noise.

Notation: φ(B)Yt = θ(B)Zt, where B is the backshift operator, BYt = Yt−1.

where

θ(B) = 1 + β1B + ... + βqB^q     (a polynomial of degree q)

φ(B) = 1 − α1B − ... − αpB^p     (a polynomial of degree p)
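A short simulation sketch can make the recursion in (5.3.1) concrete. This is an illustrative addition (not part of the original notes), assuming Python with NumPy; the helper name simulate_arma and the coefficient values are assumptions chosen for the example.

```python
import numpy as np

def simulate_arma(alphas, betas, n, sigma=1.0, seed=0):
    """Simulate Yt = sum_i alphas[i]*Y_{t-1-i} + Zt + sum_i betas[i]*Z_{t-1-i},
    a direct transcription of equation (5.3.1) with a0 = 0 and Gaussian noise."""
    rng = np.random.default_rng(seed)
    p, q = len(alphas), len(betas)
    z = rng.normal(0.0, sigma, size=n)     # white noise Zt
    y = np.zeros(n)
    for t in range(max(p, q), n):
        ar_part = sum(alphas[i] * y[t - 1 - i] for i in range(p))
        ma_part = sum(betas[i] * z[t - 1 - i] for i in range(q))
        y[t] = ar_part + z[t] + ma_part
    return y

# e.g. the ARMA(1,1) used later in Illustration 5.1: Yt = 0.9Y_{t-1} + Zt + 0.5Z_{t-1}
y = simulate_arma([0.9], [0.5], n=500)
```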

For example, for an ARMA(1,1) we could write the process in backshift operator form:
(1 − αB)Yt = (1 + βB)Zt

More generally, moving the AR terms in (5.3.1) to the left-hand side,

Yt − α1Yt−1 − ... − αpYt−p = Zt + β1Zt−1 + ... + βqZt−q

(1 − α1B − ... − αpB^p)Yt = (1 + β1B + ... + βqB^q)Zt

i.e. φ(B)Yt = θ(B)Zt, the same operator notation as for an AR(p) model.
If Yt has a nonzero mean, µ, we set a0 = µ(1 − α1 − α2 − ... − αp) and write the model as

Yt = a0 + α1Yt−1 + ... + αpYt−p + Zt + β1Zt−1 + ... + βqZt−q

The problems with the general definition of ARMA(p, q) models are summarised as follows:
(i) parameter redundancy
(ii) stationary AR models that depend on the future
(iii) MA models that are not unique

To overcome these problems we can do the following:

(i) Require the ARMA(p, q) model to be in its simplest form; that is, we require that φ(B) and θ(B) have no common factors.

(ii) Introduce the concept of causality. An ARMA(p, q) model is causal if the time series can be written as a general linear process; equivalently, if the zeros of φ(B) lie outside the unit circle.

(iii) An ARMA(p, q) model is said to be invertible if the zeros of θ(B) lie outside the unit circle.

Representations: Yt = ψ(B)Zt or π(B)Yt = Zt.

(a) MA(∞) or general linear process:

Yt = φ −1 ( B )θ ( B ) Z t
= ψ ( B ) Zt

where
ψ ( B ) = 1 + ψ 1B + ψ 2 B 2 + ..... and ψ 0 = 1

So:
Yt = Z t + ψ 1Z t −1 + ψ 2 Z t −2 + ......

Yt+k = [Zt+k + ψ1Zt+k−1 + ... + ψk−1Zt+1] + [ψkZt + ψk+1Zt−1 + ...]
        (future noise, each E(·) = 0)        (known at time t, used for forecasting)

(b) AR ( ∞ ) or inverted form:


θ −1 ( B ) φ ( B ) Yt = Z t
π ( B) Yt = Zt

where
π ( B ) = 1 − π 1B − π 2 B 2 − ....

(c) Linear Difference Equation


Use the AR characteristic equation to find the roots, then find the general
solution.

5.3.1 Properties

(i). Assumption: we assume that there are no common factors in the AR and MA polynomials. If there were, we would cancel them and the model would reduce to an ARMA process of lower order.

(ii). Stationarity

ψ(B) converges for |B| ≤ 1,

i.e. the zeros of φ(B) lie outside the unit circle;

or, using p(λ) = λ^p − α1λ^(p−1) − ... − αp = 0,

the roots are such that |λi| < 1, i.e. they lie inside the unit circle.

(iii). Invertibility

π(B) converges for |B| ≤ 1,

i.e. the zeros of θ(B) lie outside the unit circle;

or, using p(x) = x^q + β1x^(q−1) + ... + βq = 0,

the roots are such that |xi| < 1, i.e. they lie inside the unit circle.
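A numerical way to check these two conditions is to find the zeros of φ(B) and θ(B) directly. The sketch below is an illustrative addition assuming NumPy; the helper name is an assumption.

```python
import numpy as np

def is_stationary_and_invertible(alphas, betas):
    """For an ARMA(p, q) with phi(B) = 1 - a1*B - ... - ap*B^p and
    theta(B) = 1 + b1*B + ... + bq*B^q, check whether all zeros of phi
    (stationarity/causality) and of theta (invertibility) lie outside the unit circle."""
    phi = np.r_[1.0, -np.asarray(alphas, dtype=float)]    # ascending powers of B
    theta = np.r_[1.0, np.asarray(betas, dtype=float)]
    phi_roots = np.roots(phi[::-1])        # np.roots wants the highest power first
    theta_roots = np.roots(theta[::-1])
    return np.all(np.abs(phi_roots) > 1), np.all(np.abs(theta_roots) > 1)

# The ARMA(1,1) of Illustration 5.1 after cancellation: (1 - 0.9B)Yt = (1 + 0.5B)Zt
print(is_stationary_and_invertible([0.9], [0.5]))   # (True, True)
```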

(iv). Other forms

Moving average:-    Yt = [θ(B)/φ(B)] Zt

Autoregressive:-    Zt = [φ(B)/θ(B)] Yt

(v). Mean
As before E (Yt ) = 0

But we can introduce a non-zero mean:-

φ ( B ) (Yt − µ ) = θ ( B ) Zt
i.e.
φ ( B ) Yt = v + θ ( B ) Zt

where
v = φ ( B ) ⋅ µ = (1 − ∑α i ) µ

ARMA models have a fixed unconditional mean but a time-varying conditional mean.


(vi). Auto-covariance and Variance


Similar approach to AR case, but needs adapting.
Multiply equation (5.3.1) by Yt − k and take expectation for k = 0,1, 2,....

γk = α1γk−1 + α2γk−2 + ... + αpγk−p + E(ZtYt−k) + β1E(Zt−1Yt−k) + ... + βqE(Zt−qYt−k)        (5.3.2)

• For k > q, all E(·) terms are zero. Thus the correlogram behaves like that of an AR model, i.e. it tails off.

• For k ≤ q (or p), the autocovariances γ1, ..., γp (or γq) will depend directly on the MA βi's as well as on the AR αi's, through the terms E(Zt−iYt−k).

To find these:- multiply equation (5.3.1) by Zt−k and take E(·) for k = 0, 1, 2, ..., q.

Then the solution of Eqn (5.3.2) gives γ0, γ1, ..., γp (or γq).
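As a numerical cross-check of this procedure, statsmodels can evaluate the theoretical ACF of an ARMA model directly. This is an illustrative sketch (assuming statsmodels is available); the coefficients are arbitrary.

```python
from statsmodels.tsa.arima_process import arma_acf

# Theoretical ACF of Yt = 0.9*Y_{t-1} + Zt + 0.5*Z_{t-1}.
# Lag-polynomial convention: ar = [1, -alpha1, ...], ma = [1, beta1, ...]
rho = arma_acf(ar=[1, -0.9], ma=[1, 0.5], lags=6)
print(rho)   # rho_0 = 1, then the ACF tails off geometrically, as (5.3.2) predicts
```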

(vii). Identification
Both the ACF and the PACF "tail off"; that is, both decrease exponentially in magnitude.
Much of fitting ARMA models is guesswork and trial-and-error.

            AR(p)                   MA(q)                   ARMA(p, q)
ACF         tails off               cuts off after lag q    tails off
PACF        cuts off after lag p    tails off               tails off
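The "tails off / cuts off" behaviour summarised in the table can be seen in sample ACFs and PACFs of simulated series. The sketch below is an illustrative addition assuming statsmodels; the orders and coefficients are arbitrary choices.

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

n = 2000
# lag-polynomial convention: ar = [1, -alpha1, ...], ma = [1, beta1, ...]
models = {
    "AR(1)":     ArmaProcess(ar=[1, -0.8], ma=[1]),
    "MA(1)":     ArmaProcess(ar=[1],       ma=[1, 0.8]),
    "ARMA(1,1)": ArmaProcess(ar=[1, -0.8], ma=[1, 0.5]),
}
for name, proc in models.items():
    y = proc.generate_sample(nsample=n)
    print(name, np.round(acf(y, nlags=5), 2), np.round(pacf(y, nlags=5), 2))
# AR(1): ACF tails off, PACF cuts off after lag 1; MA(1): the reverse; ARMA(1,1): both tail off
```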
Illustration 5.1: Parameter Redundancy, Causality and Invertibility

Consider the process


Yt = 0.4Yt −1 + 0.45Yt −2 + Zt + Z t −1 + 0.25Z t −2
Or, in operator form

(1 − 0.4B − 0.45B²)Yt = (1 + B + 0.25B²)Zt
This appears to be an ARMA ( 2, 2 ) model. However, the associated polynomials give

Whoops! Common factor!!!


φ(B) = 1 − 0.4B − 0.45B² ⇒ (1 + 0.5B)(1 − 0.9B)
θ(B) = 1 + B + 0.25B² ⇒ (1 + 0.5B)(1 + 0.5B) = (1 + 0.5B)²

i.e. (1 + 0.5B)(1 − 0.9B)Yt = (1 + 0.5B)²Zt

After cancellation, the process becomes an ARMA(1,1) model which can be written as

(1 − 0.9B)Yt = (1 + 0.5B)Zt    or    Yt = 0.9Yt−1 + Zt + 0.5Zt−1

"ARMAC,
The model is casual because
φ ( B ) = (1 − 0.9 B ) = 0 ⇒ B = 109 which is the outside the unit circle

The model is invertible because

θ(B) = (1 + 0.5B) = 0 ⇒ B = −2, which is outside the unit circle.

To write the model as a linear process, we can obtain the ψ-weights.

(a) ψ(B) = θ(B)/φ(B) = (1 + 0.5B)/(1 − 0.9B) = (1 + 0.5B)(1 − 0.9B)⁻¹,    |B| ≤ 1

ψ(B) = (1 + 0.5B)(1 − 0.9B)⁻¹
     = (1 + 0.5B)(1 + 0.9B + 0.9²B² + 0.9³B³ + ...)
     = 1 + (0.9 + 0.5)B + 0.9(0.9 + 0.5)B² + 0.9²(0.9 + 0.5)B³ + ...

So

ψj = (0.5 + 0.9)(0.9)^(j−1) = 1.4(0.9)^(j−1)    for j ≥ 1

(e.g. ψ1 = 1.4, ψ2 = 1.4(0.9), ψ3 = 1.4(0.9)²)

And

Yt = Zt + 1.4 Σ_{j≥1} (0.9)^(j−1) Zt−j

To find the invertible AR(∞) representation, we can obtain the π-weights.

(b) π(B) = φ(B)/θ(B) = (1 − 0.9B)/(1 + 0.5B) = (1 − 0.9B)(1 + 0.5B)⁻¹,    |B| ≤ 1

π(B) = (1 − 0.9B)(1 + 0.5B)⁻¹
     = (1 − 0.9B)(1 − 0.5B + 0.5²B² − 0.5³B³ + ...)

So

πj = (−1)^j (0.5 + 0.9)(0.5)^(j−1)    for j ≥ 1

And

Yt = 1.4 Σ_{j≥1} (−0.5)^(j−1) Yt−j + Zt
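The ψ- and π-weights above can be checked numerically by dividing one lag polynomial by the other as a power series. A minimal sketch, assuming NumPy; the helper name expand_ratio is an assumption.

```python
import numpy as np

def expand_ratio(num, den, nterms=6):
    """Power-series coefficients of num(B)/den(B), where num and den are
    coefficient lists in ascending powers of B (constant term first)."""
    num = np.asarray(num, dtype=float)
    den = np.asarray(den, dtype=float)
    out = np.zeros(nterms)
    for j in range(nterms):
        acc = num[j] if j < len(num) else 0.0
        for i in range(1, min(j, len(den) - 1) + 1):
            acc -= den[i] * out[j - i]       # long-division recursion
        out[j] = acc / den[0]
    return out

# theta(B)/phi(B) and phi(B)/theta(B) for (1 - 0.9B)Yt = (1 + 0.5B)Zt
print(expand_ratio([1, 0.5], [1, -0.9]))   # 1, 1.4, 1.26, 1.134, ... = 1.4*0.9**(j-1) for j >= 1
print(expand_ratio([1, -0.9], [1, 0.5]))   # 1, -1.4, 0.7, -0.35, ... = magnitude 1.4*0.5**(j-1), alternating
```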

Example 5.3.1 ARMA (1,1) model
A process is defined by
Yt = α Yt −1 + Zt − β Zt −1

Show that:

(i) γ0 = (1 − 2αβ + β²)σz² / (1 − α²)

(ii) ρk = [(1 − αβ)(α − β) / (1 − 2αβ + β²)] α^(k−1)    for k ≥ 1

Solution 5.3.1

(i) Multiply the process by Yt − k :

YtYt −k = α Yt −1Yt − k + Z tYt −k − β Z t −1Yt − k

Take expectation:
E (YtYt − k ) = E (α Yt −1Yt − k + Z tYt −k − β Z t −1Yt − k )

γ k = αγ k −1 + E ( Z tYt −k ) − β E ( Z t −1Yt −k )

when k = 0: γ0 = αγ1 + E(ZtYt) − βE(Zt−1Yt)        (1)

when k = 1: γ1 = αγ0 + E(ZtYt−1) − βE(Zt−1Yt−1)    (2)

Now
(1): E(ZtYt) = E[Zt(αYt−1 + Zt − βZt−1)] = σz²

(2): E(YtZt−1) = E[Zt−1(αYt−1 + Zt − βZt−1)]
              = ασz² − βσz²
              = (α − β)σz²

Hence,
γ 0 = αγ 1 + 1 − β (α − β )σ z2  (3)

γ 1 = αγ 0 − βσ z2 (4)

γ k = αγ k −1 for k ≥ 2

Solving (3) and (4):

γ0 = (1 − 2αβ + β²)σz² / (1 − α²)    QED

(ii) Now γ1 = αγ0 − βσz²

         = α(1 − 2αβ + β²)σz²/(1 − α²) − βσz²

         = (1 − αβ)(α − β)σz² / (1 − α²)

And

ρ1 = (α − β)(1 − αβ) / (1 − 2αβ + β²)

In general γ k = αγ k −1 for k ≥ 2
So γ 2 = αγ 1
γ 3 = αγ 2 = α 2γ 1
γ 4 = αγ 3 = α 3γ 1

Hence, ρk = γk/γ0,    k ≥ 1
          = α^(k−1) ρ1,    k ≥ 1

Therefore,

ρk = [(1 − αβ)(α − β) / (1 − 2αβ + β²)] α^(k−1),    k ≥ 1
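These two formulas can be checked empirically by simulating a long ARMA(1,1) series. A sketch assuming NumPy; α = 0.7, β = 0.4, σz² = 1 and the sample size are arbitrary illustrative choices.

```python
import numpy as np

alpha, beta, n = 0.7, 0.4, 100_000       # illustrative values, sigma_z^2 = 1
rng = np.random.default_rng(0)
z = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = alpha * y[t - 1] + z[t] - beta * z[t - 1]
y = y[1000:]                             # discard burn-in

sample_rho1 = np.corrcoef(y[:-1], y[1:])[0, 1]
theory_rho1 = (1 - alpha * beta) * (alpha - beta) / (1 - 2 * alpha * beta + beta**2)
sample_gamma0 = y.var()
theory_gamma0 = (1 - 2 * alpha * beta + beta**2) / (1 - alpha**2)
print(sample_rho1, theory_rho1)          # both approx 0.36
print(sample_gamma0, theory_gamma0)      # both approx 1.18
```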

Example 5.3.2
Consider the process
Yt = −0.5Yt −1 + 0.14Yt − 2 + Z t − 0.5Z t −1

(i) Find the variance and covariance


(ii) Find the ACF and the first six correlation coefficients
(iii) Write the process as a MA ( ∞ ) and as an AR ( ∞ )

Solution 5.3.2
(i) Multiply the process by Yt − k and take expectation

γ k = −0.5γ k −1 + 0.14γ k −2 + E ( ZtYt −k ) − 0.5E ( Zt −1Yt −k )

Wold eqn: k = 0 : γ 0 = −0.5γ 1 + 0.14γ 2 + E ( ZtYt ) − 0.5E ( Zt −1Yt )

Y-W eqns:
k = 1: γ 1 = −0.5γ 0 + 0.14γ 1 − 0.5E ( Zt −1Yt −1 )
k = 2 : γ 2 = −0.5γ 1 + 0.14γ 0

Three equations to solve for γ 0 , γ 1 and γ 2


but first we need expectations E ( Z tYt ) = σ z2 E ( Zt −1Yt −1 ) = σ z2

E(Zt−1Yt) = E[Zt−1(−0.5Yt−1 + 0.14Yt−2 + Zt − 0.5Zt−1)]

= −0.5E ( Zt −1Yt −1 ) − 0.5E ( Zt −1Zt −1 )

= −0.5σ z2 − 0.5σ z2
= −σ z2

So:

γ0 + 0.5γ1 − 0.14γ2 = 1.5σz²
0.5γ0 + 0.86γ1 = −0.5σz²           i.e.   γ0 = 2.8219σz²,   γ1 = −2.222σz²,   γ2 = 1.5061σz²
γ2 + 0.5γ1 − 0.14γ0 = 0
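The stated values can be reproduced by solving the three equations as a linear system. An illustrative sketch assuming NumPy; all values are in units of σz².

```python
import numpy as np

# Rows: the three equations above in gamma_0, gamma_1, gamma_2 (units of sigma_z^2)
A = np.array([[1.0,   0.5,  -0.14],
              [0.5,   0.86,  0.0 ],
              [-0.14, 0.5,   1.0 ]])
b = np.array([1.5, -0.5, 0.0])
print(np.linalg.solve(A, b))   # approx [ 2.8219, -2.2220, 1.5061]
```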

(ii) Using LDE solution:


p ( λ ) = λ 2 + 0.5λ − 0.14

λ 2 + 0.5λ − 0.14 = 0

( λ − 0.2)( λ + 0.7) = 0
λ1 = 0.2 or λ2 = −0.7

Hence, ρk = A(0.2)^k + B(−0.7)^k

k = 0: ρ0 = A + B = 1
k = 1: ρ1 = 0.2A − 0.7B = −0.787        ⇒   A = −0.0971,   B = 1.0971

∴ ρk = 1.0971(−0.7)^k − 0.0971(0.2)^k,    k ≥ 1

ρ1 = −0.787 ρ 2 = 0.534 ρ3 = −0.377

ρ 4 = 0.263 ρ5 = −0.184 ρ 6 = 0.129
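A quick consistency check (illustrative, assuming NumPy): the closed form should agree with the recursion ρk = −0.5ρk−1 + 0.14ρk−2 implied by the model.

```python
import numpy as np

rho = [1.0, -0.787]                          # rho_0, rho_1 from part (i)
for k in range(2, 7):
    rho.append(-0.5 * rho[-1] + 0.14 * rho[-2])
closed = [1.0971 * (-0.7)**k - 0.0971 * 0.2**k for k in range(7)]
print(np.round(rho, 3))
print(np.round(closed, 3))                   # both match the six values quoted above
```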

(iii) Now φ ( B) = (1 − 0.2B)(1+ 0.7B) and θ ( B) = (1 − 0.5B)

So
MA(∞):

Yt = [(1 − 0.2B)(1 + 0.7B)]⁻¹ (1 − 0.5B) Zt

   = Zt − 1·Zt−1 + 0.64Zt−2 − 0.46Zt−3 + 0.3196Zt−4 − ...
        (ψ1 = −1, ψ2 = 0.64, ψ3 = −0.46, ψ4 = 0.3196)

AR(∞):

Zt = [(1 − 0.2B)(1 + 0.7B)] (1 − 0.5B)⁻¹ Yt

Zt = (1 + 0.5B − 0.14B²)(1 − 0.5B)⁻¹ Yt
   = (1 + 0.5B − 0.14B²)(1 + 0.5B + 0.25B² + ...) Yt
   = Yt + Yt−1 + 0.36Yt−2 + ...

i.e., in the form π(B)Yt = Zt with π(B) = 1 − π1B − π2B² − ..., we have π1 = −1 and π2 = −0.36, so

Yt = −Yt−1 − 0.36Yt−2 − ... + Zt

Example 5.3.3
Find the autocorrelation coefficients and plot the correlogram for ARMA(1,2) process
Yt = 0.6Yt −1 + Z t − 0.3Z t −1 − 0.1Z t −2

Solution 5.3.3

Multiply by Yt − k : YtYt − k = 0.6Yt −1Yt −k + Z tYt −k − 0.3Zt −1Yt −k − 0.1Z t − 2Yt − k

Take E ( ⋅) : γ k = 0.6γ k −1 + E ( ZtYt −k ) − 0.3E ( Zt −1Yt −k ) − 0.1E ( Zt −2Yt −k )

k = 0 : γ 0 = 0.6γ 1 + E ( ZtYt ) − 0.3E ( Zt −1Yt ) − 0.1E ( Zt −2Yt )

k = 1: γ 1 = 0.6γ 0 + E ( ZtYt −1 ) − 0.3E ( Zt −1Yt −1 ) − 0.1E ( Zt −2Yt −1 )

E(ZtYt) = 0.6E(ZtYt−1) + E(ZtZt) − 0.3E(ZtZt−1) − 0.1E(ZtZt−2) = σz²
          (the first, third and fourth expectations are 0)

E(Zt−1Yt) = 0.6E(Zt−1Yt−1) + E(Zt−1Zt) − 0.3E(Zt−1Zt−1) − 0.1E(Zt−1Zt−2) = 0.6σz² − 0.3σz² = 0.3σz²
          (the second and fourth expectations are 0)

E(Zt−2Yt) = 0.6E(Zt−2Yt−1) + E(Zt−2Zt) − 0.3E(Zt−2Zt−1) − 0.1E(Zt−2Zt−2) = 0.6(0.3σz²) − 0.1σz² = 0.08σz²
          (E(Zt−2Yt−1) = E(Zt−1Yt) = 0.3σz²; the second and third expectations are 0)

Hence,
γ0 = 0.6γ1 + 0.902σz²
γ1 = 0.6γ0 − 0.33σz²       ⇒    σY² = γ0 = 1.1σz²,   γ1 = 0.33σz²,   and ρ1 = 0.3

For k = 2: γ2 = 0.6γ1 − 0.1σz² = 0.098σz²  and  ρ2 = 0.089

In general:- ρk = 0.6ρk−1 for k ≥ 3, i.e. ρk = 0.089(0.6)^(k−2) for k ≥ 2
[Correlogram: ρk plotted against lag k = 1, ..., 5]
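The bar heights of the correlogram can be regenerated from ρ1, ρ2 and the geometric decay. A small illustrative sketch:

```python
# Correlogram heights for Example 5.3.3: rho_1 = 0.3, rho_2 = 0.089, then rho_k = 0.6*rho_{k-1}
rho = {1: 0.3, 2: 0.089}
for k in range(3, 7):
    rho[k] = 0.6 * rho[k - 1]
print({k: round(v, 3) for k, v in rho.items()})   # lags 3-6 are roughly 0.053, 0.032, 0.019, 0.012
```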

Example 5.3.4
For the ARMA(1,2) model, Yt = 0.8Yt −1 + Z t + 0.7 Z t −1 + 0.6 Zt −2
Show that
(a) ρ k = 0.8 ρ k −1 for k > 2

(b) ρ2 = 0.8ρ1 + 0.6(σz²/σY²)

Solution 5.3.4
(a) Without loss of generality, we assume that the mean of the series is zero.

YtYt −k = 0.8Yt −1Yt −k + Z tYt −k + 0.7 Z t −1Yt −k + 0.6Z t −2Yt − k

E (YtYt −k ) = 0.8E (Yt −1Yt −k ) + E ( ZtYt −k ) + 0.7 E ( Zt −1Yt −k ) + 0.6E ( Zt −2Yt −k )

γ k = 0.8γ k −1 for k > 2


And
ρ k = 0.8ρ k −1 for k > 2 QED

(b) Cov(Yt, Yt−2) = E[(0.8Yt−1 + Zt + 0.7Zt−1 + 0.6Zt−2)Yt−2]

                  = E[(0.8Yt−1 + 0.6Zt−2)Yt−2]

= 0.8Cov (Yt −1, Yt −2 ) + 0.6E ( Zt −2Yt −2 )

γ 2 = 0.8γ1 + 0.6E ( Zt −2Yt −2 )

Now
E ( Z t − 2Yt − 2 ) = E ( Z t Yt )

= E  Z t ( 0.8Yt −1 + Z t + 0.7 Z t −1 + 0.6 Z t − 2 ) 

= σ z2

Hence, γ 2 = 0.8γ 1 + 0.6σ z2

Thus ρ2 = 0.8ρ1 + 0.6(σz²/σY²)    QED
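The identity in (b) can also be checked by simulation. An illustrative sketch assuming NumPy and σz² = 1; the sample size is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
z = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.8 * y[t - 1] + z[t] + 0.7 * z[t - 1] + 0.6 * z[t - 2]
y = y[1000:]                                  # discard burn-in

rho1 = np.corrcoef(y[:-1], y[1:])[0, 1]
rho2 = np.corrcoef(y[:-2], y[2:])[0, 1]
print(rho2, 0.8 * rho1 + 0.6 / y.var())       # the two numbers should be close
```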

Example 5.3.5
Consider the edited version of the Minitab output below.
Final Estimates of Parameters

Type Coef SE Coef T P


AR 1 0.5796 0.1287 4.50 0.000
MA 1 -0.3871 0.1446 -2.68 0.009
Constant 13.669 9.496 1.44 0.155
Mean 32.52 22.59

(a) Identify the type of model


(b) Write down the estimated model
(c) Is the model stationary and invertible?

Solution 5.3.5
(a) This is an ARMA(1,1) with a non-zero mean

(b) (Yt − 32.52) = 0.5796(Yt−1 − 32.52) + Zt − 0.3871Zt−1

Yt = 32.52 + 0.5796Yt−1 − 18.8486 + Zt − 0.3871Zt−1

   = 13.671 + 0.5796Yt−1 + Zt − 0.3871Zt−1

(c) p(λ) = λ − 0.5796, i.e. λ − 0.5796 = 0 ⇒ |λ| = 0.5796 < 1,

hence stationary.

p(x) = x − 0.3871, i.e. x − 0.3871 = 0 ⇒ |x| = 0.3871 < 1,

hence invertible.
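For comparison, a rough Python analogue of this Minitab fit (an illustrative sketch assuming statsmodels; the simulated data below are invented for the example, and sign conventions for the MA term can differ between packages).

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Illustrative data only: simulate an ARMA(1,1)-like series around a mean of 32.52
rng = np.random.default_rng(0)
n = 300
z = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.58 * y[t - 1] + z[t] + 0.39 * z[t - 1]
y = y + 32.52

fit = ARIMA(y, order=(1, 0, 1), trend="c").fit()   # ARMA(1,1) with a non-zero mean
print(fit.summary())   # reports estimated AR(1) and MA(1) coefficients plus a constant/mean
                       # term, analogous to the AR 1 / MA 1 / Constant / Mean lines above;
                       # stationarity and invertibility can then be checked as in part (c)
```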

