Linear TS Models (Contd.)
Y_t = e_t + \theta_1 e_{t-1} + \theta_2 e_{t-2} + \cdots + \theta_q e_{t-q}    (29)
For q = 1, we have the MA(1) process

Y_t = e_t + \theta_1 e_{t-1}    (30)
Consider the variance and autocovariances of this process:

E(Y_t^2) = E(e_t + \theta_1 e_{t-1})^2 = \sigma_e^2 (1 + \theta_1^2)

E(Y_t Y_{t-1}) = E[(e_t + \theta_1 e_{t-1})(e_{t-1} + \theta_1 e_{t-2})] \Rightarrow \gamma_1 = \theta_1 \sigma_e^2

E(Y_t Y_{t-2}) = E[(e_t + \theta_1 e_{t-1})(e_{t-2} + \theta_1 e_{t-3})] \Rightarrow \gamma_2 = 0

Similarly, all higher-order autocovariances equal zero.
The autocorrelations of an MA(1) process are therefore:

\rho_1 = \frac{\theta_1}{1 + \theta_1^2}, \qquad \rho_2 = \rho_3 = \cdots = 0
If |\theta_1| < 1, the MA(1) process may be expressed as an infinite series in lagged Y_t; hence its partial autocorrelations do not cut off but damp toward zero.
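The MA(1) moments above are easy to check numerically. A minimal sketch (the value \theta_1 = 0.6, the seed, and the series length are arbitrary illustrative choices):

```python
import numpy as np

# Simulate Y_t = e_t + theta * e_{t-1} and compare the sample ACF with the
# theoretical values rho_1 = theta/(1 + theta^2) and rho_k = 0 for k > 1.
rng = np.random.default_rng(0)
theta = 0.6
e = rng.standard_normal(200_000)
y = e[1:] + theta * e[:-1]                  # MA(1) series

def sample_acf(x, k):
    """Sample autocorrelation of x at lag k."""
    x = x - x.mean()
    return float((x[k:] * x[:-k]).sum() / (x * x).sum())

rho1_theory = theta / (1 + theta**2)
rho1_hat = sample_acf(y, 1)
rho2_hat = sample_acf(y, 2)
print(rho1_theory, rho1_hat, rho2_hat)      # rho2_hat should be near zero
```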
y_t = \phi_1 y_{t-1} + \cdots + \phi_p y_{t-p} + e_t + \theta_1 e_{t-1} + \cdots + \theta_q e_{t-q}    (31)

For p = q = 1 we have the ARMA(1,1) process:

y_t = \phi_1 y_{t-1} + e_t + \theta_1 e_{t-1}    (32)
Squaring this and taking expectations, we can show that

\gamma_0 = \frac{1 + 2\phi_1\theta_1 + \theta_1^2}{1 - \phi_1^2}\,\sigma_e^2

Multiplying by y_{t-1} and taking expectations yields

\gamma_1 = \phi_1 \gamma_0 + \theta_1 \sigma_e^2 = \frac{(\phi_1 + \theta_1)(1 + \phi_1\theta_1)}{1 - \phi_1^2}\,\sigma_e^2
Higher-order autocovariances are given by

\gamma_k = \phi_1 \gamma_{k-1}, \qquad k = 2, 3, \ldots
The autocorrelation function of the ARMA(1,1) process is thus:

\rho_1 = \frac{(\phi_1 + \theta_1)(1 + \phi_1\theta_1)}{1 + 2\phi_1\theta_1 + \theta_1^2}, \qquad \rho_k = \phi_1 \rho_{k-1}, \quad k = 2, 3, \ldots
The first coefficient depends on the parameters of both the AR and MA parts of the process. Subsequent coefficients decline exponentially at a rate that depends on the AR parameter.
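The ARMA(1,1) autocorrelations can likewise be checked by simulation. A sketch, with \phi_1 = 0.7 and \theta_1 = 0.3 as arbitrary illustrative values:

```python
import numpy as np

# Simulate y_t = phi*y_{t-1} + e_t + theta*e_{t-1} and compare the sample ACF
# at lags 1 and 2 with rho_1 = (phi+theta)(1+phi*theta)/(1+2*phi*theta+theta^2)
# and the geometric decay rho_2 = phi * rho_1.
rng = np.random.default_rng(1)
phi, theta = 0.7, 0.3
e = rng.standard_normal(300_000)
y = np.zeros_like(e)
for t in range(1, len(e)):
    y[t] = phi * y[t - 1] + e[t] + theta * e[t - 1]
y = y[1000:]                                # drop burn-in

def sample_acf(x, k):
    x = x - x.mean()
    return float((x[k:] * x[:-k]).sum() / (x * x).sum())

rho1 = (phi + theta) * (1 + phi * theta) / (1 + 2 * phi * theta + theta**2)
rho2 = phi * rho1                           # decay at the AR rate
print(rho1, sample_acf(y, 1), rho2, sample_acf(y, 2))
```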
MA(1), i.e. p = 0, q = 1, \theta_1 > 0
- ACF exhibits a positive spike at lag 1; \rho_s = 0 for s > 1.
- PACF exhibits oscillating decay:

\phi_{ss} = \frac{-(-\theta_1)^s}{1 + \theta_1^2 + \cdots + \theta_1^{2s}}
MA(1), i.e. p = 0, q = 1, \theta_1 < 0
- ACF exhibits a negative spike at lag 1; \rho_s = 0 for s > 1.
- PACF exhibits monotonic decay.
ARMA(1,1), i.e. p = q = 1, 0 < \phi_1 < 1
- ACF exhibits monotonic decay beginning after lag 1.
- PACF exhibits oscillating decay beginning after lag 1; \phi_{11} = \rho_1.
ARMA(1,1), i.e. p = q = 1, -1 < \phi_1 < 0
- ACF exhibits oscillating decay beginning after lag 1.
- PACF exhibits monotonic decay beginning after lag 1; \phi_{11} = \rho_1.
ARMA(p,q)
- ACF decays (either monotonically or oscillating) beginning after lag q.
- PACF decays (either monotonically or oscillating) beginning after lag p.
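The contrast between the two MA(1) cases can be seen directly from the PACF formula \phi_{ss} = -(-\theta_1)^s / (1 + \theta_1^2 + \cdots + \theta_1^{2s}). A sketch evaluating it for both signs of \theta_1 (the values \pm 0.6 are arbitrary):

```python
# Evaluate the MA(1) PACF at lags 1..3 for positive and negative theta:
# positive theta gives alternating signs (oscillating decay), negative theta
# gives all-negative values (monotonic decay).
def ma1_pacf(theta, s):
    denom = sum(theta ** (2 * i) for i in range(s + 1))
    return -((-theta) ** s) / denom

pos = [ma1_pacf(0.6, s) for s in (1, 2, 3)]
neg = [ma1_pacf(-0.6, s) for s in (1, 2, 3)]
print(pos)   # signs alternate
print(neg)   # all negative, shrinking in magnitude
```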
6 Wold’s Theorem
Let \{x_t\} be any zero-mean covariance stationary process. Then we can write it as

x_t = B(L)\varepsilon_t = \sum_{i=0}^{\infty} b_i \varepsilon_{t-i}    (33)

where \varepsilon_t \sim WN(0, \sigma_\varepsilon^2), b_0 = 1, and \sum_{i=0}^{\infty} b_i^2 < \infty.
It is clear that an MA(q) attempts to approximate Wold's representation.

What about an AR(p)? Can it be seen as an approximation of Wold's representation?
x_t = \phi_1 x_{t-1} + \phi_2 x_{t-2} + \cdots + \phi_p x_{t-p} + \varepsilon_t

x_t - \phi_1 x_{t-1} - \phi_2 x_{t-2} - \cdots - \phi_p x_{t-p} = \varepsilon_t

\Phi(L) x_t = \varepsilon_t

where

\Phi(L) = 1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p
If x_t is covariance stationary, the roots of \Phi(L) all lie outside the unit circle. Then x_t can be expressed as a convergent infinite moving average of the innovations:

x_t = \frac{1}{\Phi(L)} \varepsilon_t
For example, in the case of AR(1),

\Phi(L) = 1 - \phi_1 L

and

\frac{1}{\Phi(L)} = \frac{1}{1 - \phi_1 L} = 1 + \phi_1 L + \phi_1^2 L^2 + \cdots

and

x_t = \frac{1}{\Phi(L)} \varepsilon_t = \varepsilon_t + \phi_1 \varepsilon_{t-1} + \phi_1^2 \varepsilon_{t-2} + \cdots
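The inversion of \Phi(L) can be carried out mechanically by matching coefficients of powers of L in \Phi(L)B(L) = 1. A sketch for the AR(1) case (\phi_1 = 0.5 is an arbitrary illustrative value; the recursion works for any finite lag polynomial):

```python
# Invert Phi(L) = 1 - phi*L by coefficient matching: in Phi(L)*B(L) = 1 the
# coefficient of L^j must be 1 for j = 0 and 0 for j > 0, which pins down the
# Wold weights b_j one at a time.
phi = 0.5
p = [1.0, -phi]                       # coefficients of Phi(L)
n = 6
b = [0.0] * n                         # Wold weights b_0, b_1, ...
for j in range(n):
    s = sum(p[i] * b[j - i] for i in range(1, min(j, len(p) - 1) + 1))
    b[j] = ((1.0 if j == 0 else 0.0) - s) / p[0]
print(b)                              # geometric weights 1, phi, phi^2, ...
```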
Both MA and AR processes impose restrictions on the estimation of the Wold representation. A more flexible form is provided by ARMA processes.
x_t = \phi_1 x_{t-1} + \cdots + \phi_p x_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q}

x_t - \phi_1 x_{t-1} - \cdots - \phi_p x_{t-p} = \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q}

\Phi(L) x_t = \Theta(L) \varepsilon_t

x_t = \frac{\Theta(L)}{\Phi(L)} \varepsilon_t
For example, an ARMA(1,1) gives

x_t = \frac{1 + \theta_1 L}{1 - \phi_1 L} \varepsilon_t = (1 + \theta_1 L)(1 + \phi_1 L + \phi_1^2 L^2 + \cdots)\varepsilon_t = [1 + (\phi_1 + \theta_1)L + (\phi_1^2 + \theta_1\phi_1)L^2 + \cdots]\varepsilon_t
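The ARMA(1,1) Wold weights follow the simple pattern \psi_0 = 1, \psi_1 = \phi_1 + \theta_1, and \psi_j = \phi_1 \psi_{j-1} thereafter, since each further power of L just picks up another factor \phi_1. A sketch (\phi_1 = 0.5, \theta_1 = 0.3 arbitrary):

```python
# Build the first few Wold weights of (1 + theta*L)/(1 - phi*L) by the
# recursion psi_0 = 1, psi_1 = phi + theta, psi_j = phi * psi_{j-1}.
phi, theta = 0.5, 0.3
psi = [1.0, phi + theta]
for _ in range(3):
    psi.append(phi * psi[-1])
print(psi)
```

Note that \psi_2 = \phi_1 \psi_1 = \phi_1^2 + \theta_1\phi_1, matching the coefficient of L^2 in the expansion above.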
Why do AR and MA processes have ACF and PACF that behave as described above?

Consider the MA(1) process x_t = \varepsilon_t + \theta_1 \varepsilon_{t-1}. Innovations up to lag 1 affect x_t, so values more than one period apart have no common influences. Therefore, the ACF cuts off abruptly after one period.

Similar reasoning explains why the ACF of an MA(q) process cuts off abruptly after q lags.
All MA processes are covariance stationary, regardless of the values of the parameters.
An MA(q) process x_t = \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q} is said to be invertible if and only if the roots of the lag polynomial 1 + \theta_1 L + \cdots + \theta_q L^q = 0 all lie outside the unit circle.

Invertibility means that the MA(q) process can be expressed as an infinite-order AR process.
For example, the MA(1) process x_t = \varepsilon_t + \theta_1 \varepsilon_{t-1} is invertible if the root of 1 + \theta_1 L = 0 is greater than 1 in absolute value, i.e. |\theta_1| < 1, in which case

\frac{1}{1 + \theta_1 L}\, x_t = \varepsilon_t

(1 - \theta_1 L + \theta_1^2 L^2 - \cdots)\, x_t = \varepsilon_t

x_t = \theta_1 x_{t-1} - \theta_1^2 x_{t-2} + \theta_1^3 x_{t-3} - \cdots + \varepsilon_t
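Invertibility can be illustrated by recovering an innovation from the observed series via the truncated AR expansion. A sketch (\theta_1 = 0.5, the seed, and the truncation order are arbitrary choices; the truncation error shrinks like \theta_1^K):

```python
import numpy as np

# Simulate an invertible MA(1) and recover the last innovation from the
# truncated AR(infinity) representation
# x_t = theta*x_{t-1} - theta^2*x_{t-2} + theta^3*x_{t-3} - ... + e_t,
# i.e. e_t ~= x_t + sum_k (-theta)^k x_{t-k}.
rng = np.random.default_rng(2)
theta = 0.5
e = rng.standard_normal(500)
x = e[1:] + theta * e[:-1]          # x[i] = e[i+1] + theta * e[i]

K = 30                              # truncation order
t = len(x) - 1
e_hat = x[t] + sum((-theta) ** k * x[t - k] for k in range(1, K + 1))
print(e_hat, e[t + 1])              # recovered vs. true innovation
```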
The PACF, \phi_{ss}, is defined as the regression coefficient on the last term in an autoregression:

x_t = \phi_{s1} x_{t-1} + \phi_{s2} x_{t-2} + \cdots + \phi_{ss} x_{t-s} + v_t

Since an invertible MA process has an infinite-order AR representation, its PACF will not cut off abruptly after some lag.
As shown above, a covariance stationary AR(1) process can be represented as an infinite-order MA process:

x_t = \frac{1}{\Phi(L)} \varepsilon_t = \varepsilon_t + \phi_1 \varepsilon_{t-1} + \phi_1^2 \varepsilon_{t-2} + \cdots

Values of the series at different points in time have common influences, no matter how far apart they are. Therefore, its ACF does not cut off abruptly.
Similar reasoning leads to a persistent ACF for higher-order AR processes.
The PACF of an AR(p) process cuts off abruptly after p lags, because if the model generating the series is

x_t = \phi_1 x_{t-1} + \phi_2 x_{t-2} + \cdots + \phi_p x_{t-p} + \varepsilon_t

then the coefficient \phi_{p+1} in the autoregression

x_t = \phi_1 x_{t-1} + \phi_2 x_{t-2} + \cdots + \phi_p x_{t-p} + \phi_{p+1} x_{t-p-1} + \varepsilon_t

must be zero.
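This cutoff can be seen by estimating \phi_{22} for a simulated AR(1) via OLS on a two-lag autoregression; the population coefficient on the extra lag is zero. A sketch (\phi_1 = 0.6, the seed, and the sample size are arbitrary choices):

```python
import numpy as np

# Simulate an AR(1) and regress x_t on (x_{t-1}, x_{t-2}); the coefficient
# on x_{t-2} estimates phi_22, which should be near zero since p = 1.
rng = np.random.default_rng(3)
phi = 0.6
e = rng.standard_normal(100_000)
x = np.zeros_like(e)
for t in range(1, len(e)):
    x[t] = phi * x[t - 1] + e[t]

X = np.column_stack([x[1:-1], x[:-2]])        # lags 1 and 2
coef, *_ = np.linalg.lstsq(X, x[2:], rcond=None)
print(coef)                                   # roughly [phi, 0]
```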