3.2: Causality and Invertibility
While a moving average process is weakly stationary for any choice of its coefficients, some deeper thoughts are required in the case of AR(p) and ARMA(p, q) processes. For simplicity, we start by investigating the autoregressive process of order one, which is given by the equations X_t = ϕX_{t−1} + Z_t (writing ϕ = ϕ_1). Iterating these equations backward N times yields

X_t = ϕX_{t−1} + Z_t = ϕ²X_{t−2} + Z_t + ϕZ_{t−1} = … = ϕ^N X_{t−N} + ∑_{j=0}^{N−1} ϕ^j Z_{t−j}.   (3.2.1)
Letting N → ∞, it can be shown that

X_t = ∑_{j=0}^{∞} ϕ^j Z_{t−j}   (3.2.2)
is the weakly stationary solution to the AR(1) equations, provided that |ϕ| < 1. These calculations would indicate, moreover, that an autoregressive process of order one can be represented as a linear process with coefficients ψ_j = ϕ^j.
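This representation can be verified numerically. The following R sketch (the values ϕ = 0.6, n = 500 and the truncation point M = 200 are arbitrary choices for illustration, not part of the text) builds X_t from the truncated sum ∑_{j=0}^{M} ϕ^j Z_{t−j} and checks that it satisfies the AR(1) equations up to the negligible truncation error ϕ^{M+1} Z_{t−M−1}.

> # Illustrative check of the linear-process representation (assumed phi = 0.6)
> set.seed(1)
> phi <- 0.6; n <- 500; M <- 200
> z <- rnorm(n + M)                                            # z[i] plays the role of Z_{i-M}
> x <- sapply(1:n, function(t) sum(phi^(0:M) * z[(t + M):t]))  # truncated sum over j = 0,...,M
> max(abs(x[2:n] - phi * x[1:(n - 1)] - z[(2:n) + M]))         # essentially zero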
Example 3.2.1: Mean and ACVF of an AR(1) Process

Let (X_t: t ∈ Z) be the weakly stationary AR(1) process in (3.2.2). Since E[Z_t] = 0 for all t, its mean is

E[X_t] = ∑_{j=0}^{∞} ϕ^j E[Z_{t−j}] = 0,   t ∈ Z,
and its ACVF is computed as

γ(h) = Cov(X_{t+h}, X_t)
     = E[ (∑_{j=0}^{∞} ϕ^j Z_{t+h−j}) (∑_{k=0}^{∞} ϕ^k Z_{t−k}) ]
     = σ² ∑_{k=0}^{∞} ϕ^{k+h} ϕ^k
     = σ² ϕ^h ∑_{k=0}^{∞} ϕ^{2k}
     = σ² ϕ^h / (1 − ϕ²),
where h ≥ 0. This determines the ACVF for all h using that γ(−h) = γ(h). It is also immediate that the ACF satisfies ρ(h) = ϕ^h for h ≥ 0. See also Example 3.1.1 for comparison.
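As a quick numerical illustration of this result (not part of the original example; the values ϕ = 0.8 and n = 10000 are arbitrary), one can simulate a causal AR(1) process in R and compare the sample ACF with ρ(h) = ϕ^h.

> # Sample versus theoretical ACF of a simulated AR(1) process (assumed phi = 0.8)
> set.seed(137)
> phi <- 0.8
> x <- arima.sim(model = list(ar = phi), n = 10000)         # causal AR(1) with sigma^2 = 1
> sample.acf <- acf(x, lag.max = 5, plot = FALSE)$acf[2:6]  # sample ACF at lags 1,...,5
> round(rbind(sample = sample.acf, theoretical = phi^(1:5)), 3)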
If, on the other hand, |ϕ| > 1, the series in (3.2.2) no longer converges and the above iteration cannot be used to represent (X_t: t ∈ Z) as a linear process. However, one may proceed as follows. Rewrite the defining equations of an AR(1) process as

X_t = −ϕ^{−1} Z_{t+1} + ϕ^{−1} X_{t+1},   t ∈ Z.   (3.2.3)

Iterating these equations forward N times yields

X_t = ϕ^{−N} X_{t+N} − ∑_{j=1}^{N} ϕ^{−j} Z_{t+j},   t ∈ Z.   (3.2.4)
Note that in the weakly stationary case, the present observation has been described in terms of past innovations. The representation in the last equation, however, contains only future observations with time lags larger than the present time t. From a statistical point of view this does not make much sense, even though, by identical arguments as above, we may obtain

X_t = −∑_{j=1}^{∞} ϕ^{−j} Z_{t+j},   t ∈ Z,   (3.2.5)

as the weakly stationary solution in the case |ϕ| > 1.
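This future-dependent solution can likewise be checked numerically; the sketch below (with the arbitrary choices ϕ = 2, n = 300 and a truncation after M = 50 future terms, all assumptions for illustration) verifies that the truncated series satisfies the AR(1) equations up to an error of order ϕ^{−M}.

> # Illustrative check of the non-causal solution (3.2.5) for an assumed phi = 2
> set.seed(2)
> phi <- 2; n <- 300; M <- 50
> z <- rnorm(n + M)                                              # z[t] plays the role of Z_t
> x <- sapply(1:n, function(t) -sum(phi^(-(1:M)) * z[t + 1:M]))  # X_t = -sum_{j>=1} phi^(-j) Z_{t+j}
> max(abs(x[2:n] - phi * x[1:(n - 1)] - z[2:n]))                 # essentially zero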
The result of the previous example leads to the notion of causality, which means that the process (X_t: t ∈ Z) has a representation in terms of the white noise (Z_s: s ≤ t) and that it is hence uncorrelated with the future as given by (Z_s: s > t). The precise definition is given next.
Definition: Causality
An ARMA(p, q) process given by (3.1.1) is causal if there is a sequence (ψ_j: j ∈ N_0) such that ∑_{j=0}^{∞} |ψ_j| < ∞ and

X_t = ∑_{j=0}^{∞} ψ_j Z_{t−j},   t ∈ Z.   (3.2.6)
Causality means that an ARMA time series can be represented as a linear process. It was seen earlier in this section how
an AR(1) process whose coefficient satisfies the condition |ϕ| < 1 can be converted into a linear process. It was also
shown that this is impossible if |ϕ| > 1 . The conditions on the autoregressive parameter ϕ can be restated in terms of the
corresponding autoregressive polynomial ϕ(z) = 1 − ϕz as follows. It holds that
|ϕ| < 1 if and only if ϕ(z) ≠ 0 for all |z| ≤ 1,
|ϕ| > 1 if and only if ϕ(z) ≠ 0 for all |z| ≥ 1.
Theorem 3.2.1
Let (X_t: t ∈ Z) be an ARMA(p, q) process such that the polynomials ϕ(z) and θ(z) have no common zeroes. Then (X_t: t ∈ Z) is causal if and only if ϕ(z) ≠ 0 for all z ∈ C with |z| ≤ 1. The coefficients (ψ_j: j ∈ N_0) are determined by the power series expansion

ψ(z) = ∑_{j=0}^{∞} ψ_j z^j = θ(z)/ϕ(z),   |z| ≤ 1.
A concept closely related to causality is invertibility. This notion is motivated by the following example, which studies
properties of a moving average time series of order 1.
Example 3.2.3
Let (X_t: t ∈ Z) be an MA(1) process with parameter θ = θ_1. It is an easy exercise to compute the ACVF and the ACF as

γ(h) = ⎧ (1 + θ²)σ²,   h = 0,        ρ(h) = ⎧ 1,                 h = 0,
       ⎨ θσ²,          h = 1,               ⎨ θ(1 + θ²)^{−1},    h = 1,        (3.2.8)
       ⎩ 0,            h > 1,               ⎩ 0,                 h > 1.
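These closed forms are easy to confirm in R with the function ARMAacf; the value θ = 0.5 below is an arbitrary choice for illustration.

> # Theoretical ACF of an MA(1) process versus the closed form (assumed theta = 0.5)
> theta <- 0.5
> ARMAacf(ma = theta, lag.max = 3)      # returns rho(0), rho(1), rho(2), rho(3)
> c(1, theta / (1 + theta^2), 0, 0)     # 1, 0.4, 0, 0 from (3.2.8)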
These results lead to the conclusion that ρ(h) does not change if the parameter θ is replaced with θ^{−1}. Moreover, there exist pairs (θ, σ²) that lead to the same ACVF, for example (5, 1) and (1/5, 25). Consequently, the two MA(1) models

X_t = Z_t + (1/5) Z_{t−1},   t ∈ Z,   (Z_t: t ∈ Z) ∼ iid N(0, 25),   (3.2.9)

and
X_t = Z̃_t + 5 Z̃_{t−1},   t ∈ Z,   (Z̃_t: t ∈ Z) ∼ iid N(0, 1),   (3.2.10)

are indistinguishable because we only observe X_t but not the noise variables Z_t and Z̃_t.
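The claim about identical ACVFs is a one-line computation; the following check simply plugs both parameter pairs into the formulas of (3.2.8) and is only included for illustration.

> # gamma(0) = (1 + theta^2) * sigma^2 and gamma(1) = theta * sigma^2 for both models
> c((1 + 5^2) * 1, 5 * 1)               # theta = 5,   sigma^2 = 1  -> 26 5
> c((1 + (1/5)^2) * 25, (1/5) * 25)     # theta = 1/5, sigma^2 = 25 -> 26 5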
For convenience, the statistician will pick the model which satisfies the invertibility criterion which is to be defined next.
It specifies that the noise sequence can be represented as a linear process in the observations.
Definition: Invertibility

An ARMA(p, q) process given by (3.1.1) is invertible if there is a sequence (π_j: j ∈ N_0) such that ∑_{j=0}^{∞} |π_j| < ∞ and

Z_t = ∑_{j=0}^{∞} π_j X_{t−j},   t ∈ Z.   (3.2.11)
Theorem 3.2.2
Let (X_t: t ∈ Z) be an ARMA(p, q) process such that the polynomials ϕ(z) and θ(z) have no common zeroes. Then (X_t: t ∈ Z) is invertible if and only if θ(z) ≠ 0 for all z ∈ C with |z| ≤ 1. The coefficients (π_j: j ∈ N_0) are determined by the power series expansion

π(z) = ∑_{j=0}^{∞} π_j z^j = ϕ(z)/θ(z),   |z| ≤ 1.
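Both criteria can be checked numerically with the base R function polyroot, which returns the complex zeroes of a polynomial specified by its coefficients in increasing order. The helper function below (its name and the example coefficients are my own illustration, not part of the text) tests whether all zeroes lie outside the closed unit disk.

> # Causality/invertibility check: every zero must have modulus greater than one
> outside.unit.circle <- function(coefs) all(Mod(polyroot(coefs)) > 1)
> outside.unit.circle(c(1, -0.7))   # phi(z) = 1 - 0.7z: TRUE, causal
> outside.unit.circle(c(1, 0.3))    # theta(z) = 1 + 0.3z: TRUE, invertible
> outside.unit.circle(c(1, -2))     # phi(z) = 1 - 2z: FALSE, not causal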
From now on it is assumed that all ARMA sequences specified in the sequel are causal and invertible unless explicitly
stated otherwise. The final example of this section highlights the usefulness of the established theory. It deals with
parameter redundancy and the calculation of the causality and invertibility sequences (ψ_j: j ∈ N_0) and (π_j: j ∈ N_0).
Example 3.2.4: Parameter redundancy

Consider the ARMA equations

X_t = .4 X_{t−1} + .21 X_{t−2} + Z_t + .6 Z_{t−1} + .09 Z_{t−2},   t ∈ Z,

which seem to generate an ARMA(2,2) sequence. However, the autoregressive and moving average polynomials have a common zero:
ϕ̃(z) = 1 − .4z − .21z² = (1 − .7z)(1 + .3z),
θ̃(z) = 1 + .6z + .09z² = (1 + .3z)².
Therefore, one can reset the ARMA equations to a sequence of order (1,1) and obtain
Xt = .7 Xt−1 + Zt + .3 Zt−1 . (3.2.14)
Now, the corresponding polynomials have no common roots. Note that the roots of ϕ(z) = 1 − .7z and
θ(z) = 1 + .3z are 10/7 > 1 and −10/3 < −1, respectively. Thus Theorems 3.2.1 and 3.2.2 imply that causal and
invertible solutions exist. In the following, the corresponding coefficients in the expansions
X_t = ∑_{j=0}^{∞} ψ_j Z_{t−j}   and   Z_t = ∑_{j=0}^{∞} π_j X_{t−j},   t ∈ Z,   (3.2.15)
are calculated, starting with the causality sequence (ψ_j: j ∈ N_0). Writing, for |z| ≤ 1,
∑_{j=0}^{∞} ψ_j z^j = ψ(z) = θ(z)/ϕ(z) = (1 + .3z)/(1 − .7z) = (1 + .3z) ∑_{j=0}^{∞} (.7z)^j,   (3.2.16)
it follows by comparing coefficients that ψ_0 = 1 and ψ_j = (.3 + .7)(.7)^{j−1} = (.7)^{j−1} for j ∈ N. Similarly, the invertibility coefficients (π_j: j ∈ N_0) are obtained by expanding

π(z) = ϕ(z)/θ(z) = (1 − .7z)/(1 + .3z) = (1 − .7z) ∑_{j=0}^{∞} (−.3z)^j

(|z| ≤ 1) as
π_0 = 1   and   π_j = (−1)^j (.3 + .7)(.3)^{j−1} = (−1)^j (.3)^{j−1},   j ∈ N.   (3.2.19)
Together, these calculations yield the explicit representations

X_t = Z_t + ∑_{j=1}^{∞} (.7)^{j−1} Z_{t−j}   and   Z_t = X_t + ∑_{j=1}^{∞} (−1)^j (.3)^{j−1} X_{t−j}.   (3.2.20)
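These expansions can be reproduced with the R function ARMAtoMA. The ψ-weights follow directly from the ARMA(1,1) specification; for the π-weights one can use the fact that π(z) = ϕ(z)/θ(z) is obtained by exchanging the roles of the two polynomials, which the sign flips below implement (this trick is my own illustration, not part of the text).

> # psi-weights of X_t = .7 X_{t-1} + Z_t + .3 Z_{t-1}: expect 1, .7, .49, ...
> ARMAtoMA(ar = 0.7, ma = 0.3, lag.max = 5)
> # pi-weights: expand phi(z)/theta(z) by swapping the polynomial roles
> ARMAtoMA(ar = -0.3, ma = -0.7, lag.max = 5)
> (-1)^(1:5) * 0.3^(0:4)                 # pi_j = (-1)^j (.3)^(j-1) for j = 1,...,5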
In the remainder of this section, a general way is provided to determine the weights (ψ_j: j ≥ 1) for a causal ARMA(p, q) process given by ϕ(B)X_t = θ(B)Z_t, where ϕ(z) ≠ 0 for all z ∈ C such that |z| ≤ 1. Since ψ(z) = θ(z)/ϕ(z) for these z, the weights ψ_j can be computed by matching the corresponding coefficients in the equation ψ(z)ϕ(z) = θ(z), that is,
(ψ_0 + ψ_1 z + ψ_2 z² + …)(1 − ϕ_1 z − … − ϕ_p z^p) = 1 + θ_1 z + … + θ_q z^q.   (3.2.21)
Equating the coefficients of z^j for j = 0, 1, 2, … gives

ψ_0 = 1,
ψ_1 − ϕ_1 ψ_0 = θ_1,
ψ_2 − ϕ_1 ψ_1 − ϕ_2 ψ_0 = θ_2,

and, in general,

ψ_j − ∑_{k=1}^{j} ϕ_k ψ_{j−k} = θ_j,   0 ≤ j < max{p, q + 1},   (3.2.22)

ψ_j − ∑_{k=1}^{p} ϕ_k ψ_{j−k} = 0,   j ≥ max{p, q + 1},   (3.2.23)
if we define ϕ_j = 0 if j > p and θ_j = 0 if j > q. To obtain the coefficients ψ_j one therefore has to solve the homogeneous linear difference equation (3.2.23) subject to the initial conditions specified by (3.2.22). For more on this
subject, see Section 3.6 of Brockwell and Davis (1991) and Section 3.3 of Shumway and Stoffer (2006).
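The recursions above translate into only a few lines of R. The function below is a minimal sketch (its name and structure are my own, not taken from the references) that solves ψ(z)ϕ(z) = θ(z) by coefficient matching; its output can be compared with the built-in command discussed next.

> # Compute psi_0, ..., psi_J from the recursions (3.2.22)-(3.2.23) (illustrative helper)
> psi.weights <- function(ar, ma, J) {
+   phi <- c(ar, rep(0, J)); theta <- c(ma, rep(0, J))  # pad with zeros beyond p and q
+   psi <- numeric(J + 1); psi[1] <- 1                  # psi[j + 1] stores psi_j, psi_0 = 1
+   for (j in 1:J) psi[j + 1] <- theta[j] + sum(phi[1:j] * psi[j:1])
+   psi
+ }
> psi.weights(ar = 0.7, ma = 0.3, J = 5)   # 1.0000 1.0000 0.7000 0.4900 0.3430 0.2401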
R calculations
In R, these computations can be performed using the command ARMAtoMA. For example, one can use the commands
>ARMAtoMA(ar=.7,ma=.3,25)
>plot(ARMAtoMA(ar=.7,ma=.3,25))
which will produce the output displayed in Figure 3.4. The plot shows nicely the exponential decay of the ψ-weights, which is typical for ARMA processes. The accompanying table lists these weights row-wise; note that ARMAtoMA returns ψ_1, …, ψ_25 and omits the leading coefficient ψ_0 = 1.
Contributors
Alexander Aue (Department of Statistics, University of California, Davis)