

4.5 Autoregressive Processes AR(p)


The idea behind autoregressive models is to explain the present value of the
series, Xt, by a function of p past values, Xt−1, Xt−2, . . . , Xt−p.

Definition 4.7. An autoregressive process of order p is written as

Xt = φ1 Xt−1 + φ2 Xt−2 + . . . + φp Xt−p + Zt , (4.20)

where {Zt} is white noise, i.e., {Zt} ∼ WN(0, σ^2), and Zt is uncorrelated with
Xs for each s < t.
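
As a numerical illustration of Definition 4.7, the following sketch simulates an AR(p) path directly from the recursion (4.20). It is a minimal sketch assuming numpy; the coefficients, sample size and burn-in length are illustrative choices, not taken from the text.

    import numpy as np

    def simulate_ar(phi, n, sigma=1.0, burn_in=200, seed=0):
        # Simulate n values of X_t = phi_1 X_{t-1} + ... + phi_p X_{t-p} + Z_t,
        # with Z_t ~ WN(0, sigma^2), starting from zeros and dropping a burn-in.
        rng = np.random.default_rng(seed)
        p = len(phi)
        z = rng.normal(0.0, sigma, size=n + burn_in)   # white noise
        x = np.zeros(n + burn_in)
        for t in range(p, n + burn_in):
            # phi @ (X_{t-1}, ..., X_{t-p}) is the linear function of the p past values
            x[t] = phi @ x[t - p:t][::-1] + z[t]
        return x[burn_in:]

    # Example: an AR(2) with illustrative coefficients phi_1 = 0.5, phi_2 = -0.3
    x = simulate_ar(np.array([0.5, -0.3]), n=500)
    print(x[:5])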

Remark 4.12. We assume (for simplicity of notation) that the mean of Xt is zero.
If the mean is E Xt = µ ≠ 0, then we replace Xt by Xt − µ to obtain

Xt − µ = φ1 (Xt−1 − µ) + φ2 (Xt−2 − µ) + . . . + φp (Xt−p − µ) + Zt ,

which can be written as

Xt = α + φ1 Xt−1 + φ2 Xt−2 + . . . + φp Xt−p + Zt ,

where
α = µ(1 − φ1 − . . . − φp ).
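
A quick numerical check of this relation, as a sketch assuming numpy (µ and the φ's are illustrative values, not from the text): simulating the intercept form with α = µ(1 − φ1 − . . . − φp) produces a series whose sample mean is close to µ.

    import numpy as np

    # Illustrative values: mu = 10, phi = (0.5, -0.3), chosen so the AR(2) is stable.
    mu, phi = 10.0, np.array([0.5, -0.3])
    alpha = mu * (1.0 - phi.sum())        # alpha = mu(1 - phi_1 - ... - phi_p) = 8

    rng = np.random.default_rng(3)
    n, p = 20000, len(phi)
    x = np.full(n, mu)                    # start the recursion at the mean level
    for t in range(p, n):
        x[t] = alpha + phi @ x[t - p:t][::-1] + rng.normal()

    print(alpha, round(x.mean(), 2))      # the sample mean should be close to mu = 10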

Other ways of writing the AR(p) model use:

Vector notation: Denote


φ = (φ1, φ2, . . . , φp)^T,
Xt−1 = (Xt−1, Xt−2, . . . , Xt−p)^T.

Then the formula (4.20) can be written as

Xt = φ^T Xt−1 + Zt.

Backshift operator: Writing the model (4.20) in the form

Xt − φ1 Xt−1 − φ2 Xt−2 − . . . − φp Xt−p = Zt ,

and applying the backshift operator B, defined by BXt = Xt−1 (so that B^j Xt = Xt−j), we get

(1 − φ1 B − φ2 B^2 − . . . − φp B^p)Xt = Zt

or, using concise notation, we write

φ(B)Xt = Zt , (4.21)

where φ(B) denotes the autoregressive operator

φ(B) = 1 − φ1 B − φ2 B^2 − . . . − φp B^p .

Then the AR(p) can be viewed as a solution to the equation (4.21), i.e.,

Xt = (1/φ(B)) Zt .    (4.22)
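
As a sanity check on (4.21), the sketch below (numpy assumed, illustrative AR(2) coefficients) simulates a series with known noise and verifies that applying the operator φ(B) to the series returns that noise.

    import numpy as np

    phi = np.array([0.5, -0.3])           # illustrative AR(2) coefficients
    p, n = len(phi), 500
    rng = np.random.default_rng(1)
    z = rng.normal(size=n)

    # Generate X_t = phi_1 X_{t-1} + phi_2 X_{t-2} + Z_t so the noise is known.
    x = np.zeros(n)
    for t in range(p, n):
        x[t] = phi @ x[t - p:t][::-1] + z[t]

    # Apply phi(B) = 1 - phi_1 B - ... - phi_p B^p:
    # phi(B) X_t = X_t - phi_1 X_{t-1} - ... - phi_p X_{t-p}.
    filtered = x[p:] - sum(phi[j] * x[p - (j + 1):n - (j + 1)] for j in range(p))
    print(np.allclose(filtered, z[p:]))   # True: phi(B) X_t recovers Z_t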

4.5.1 AR(1)
According to Definition 4.7, the autoregressive process of order 1 is given by

Xt = φXt−1 + Zt , (4.23)

where Zt ∼ WN(0, σ^2) and φ is a constant.

Is AR(1) a stationary TS?

Corollary 4.1 says that an infinite combination of white noise variables is a sta-
tionary process. Here, due to the recursive form of the TS we can write AR(1) in
such a form. Namely
Xt = φXt−1 + Zt
   = φ(φXt−2 + Zt−1) + Zt
   = φ^2 Xt−2 + φZt−1 + Zt
   ⋮
   = φ^k Xt−k + Σ_{j=0}^{k−1} φ^j Zt−j .

This can be rewritten as


φ^k Xt−k = Xt − Σ_{j=0}^{k−1} φ^j Zt−j .

What would we obtain if we continued the backward substitution, i.e., what happens as k → ∞?
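
The sketch below suggests the answer numerically for the case |φ| < 1 (φ = 0.7 is an illustrative value; numpy assumed): the remainder term φ^k Xt−k becomes negligible as k grows, so Xt is approximated ever more closely by the sum of past white noise terms.

    import numpy as np

    phi, n = 0.7, 2000                    # illustrative choice with |phi| < 1
    rng = np.random.default_rng(2)
    z = rng.normal(size=n)

    # Simulate AR(1): X_t = phi X_{t-1} + Z_t
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + z[t]

    t = n - 1
    for k in (1, 5, 10, 20, 40):
        ma_part = sum(phi**j * z[t - j] for j in range(k))  # sum_{j=0}^{k-1} phi^j Z_{t-j}
        remainder = phi**k * x[t - k]                        # phi^k X_{t-k}
        print(k, round(abs(x[t] - ma_part), 6), round(remainder, 6))
    # With |phi| < 1 the remainder phi^k X_{t-k} shrinks geometrically in k,
    # so X_t is recovered in the limit by the infinite sum of past white noise.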
