
Statistics 5350/7110

Forecasting

Lecture 17
Forecasting ARIMA Models

Professor Robert Stine


Preliminaries
• Questions?

• Grading

• Assignments
• Assignment 4 is posted

• No office hours today… Halloween!

• Quick review
• Forecasting ARMA models
• Role of the difference equation form
• Role of the moving average form

2
Today’s Topics
Text, §5.1

• Definition of an ARIMA model

• ARMA model fit to the differences

• Examples of ARIMA models

• Forecasting special cases of ARIMA models… the details

• Random walk versus stationary AR(1) model
• Integrated autoregression, ARIMA(1,1,0) or ARI(1,1)
• Integrated moving average, ARIMA(0,1,1) or IMA(1,1)

3
ARIMA Model
• Incorporate differencing into the specification of the model
• Three choices (p, d, q) specify the model  (Definition 5.1)
  p  Order of the autoregression
  d  Degree of differencing
  q  Order of the moving average

∇^d Xt = α + ϕ1(∇^d Xt−1) + ⋯ + ϕp(∇^d Xt−p) + wt + θ1 wt−1 + ⋯ + θq wt−q
ϕ(B) ∇^d Xt = α + θ(B) wt,   where ∇^d = (1 − B)^d
• Intercept in the model is
α = δ (1 − ϕ1 − ϕ2 − ⋯ − ϕp),   δ = E(∇^d Xt)
• Equivalent to saying that the differenced data (1 − B)^d Xt are ARMA(p,q)
• Practice: identify whether the series is nonstationary, then model the differences as needed as ARMA (see the sketch after this slide)

• Expanded scope

• A random walk is the trivial ARIMA(0,1,0) model
• Exponential smoothing is an ARIMA(0,1,1) model (next class)
4
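A minimal R sketch (added for illustration, not from the slides) of the differencing operator ∇^d = (1 − B)^d; base R's diff() with the differences argument applies it directly, and the simulated series is arbitrary.

```r
## Apply nabla^d = (1 - B)^d to an integrated-of-order-2 series (illustrative data)
set.seed(1)
x <- cumsum(cumsum(rnorm(200)))

d1 <- diff(x, differences = 1)           # (1 - B) x_t
d2 <- diff(x, differences = 2)           # (1 - B)^2 x_t

## (1 - B)^2 x_t = x_t - 2 x_{t-1} + x_{t-2}; check against a manual version
manual <- x[3:length(x)] - 2 * x[2:(length(x) - 1)] + x[1:(length(x) - 2)]
all.equal(as.numeric(d2), manual)        # TRUE
```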
Estimating an ARIMA Model
• Same as for ARMA models
• Difference the data
• Fit parameters to the differences as if starting with an ARMA process

• Advantage of the ARIMA style

• Why bother, since it is “essentially” an ARMA model? Computing forecasts
• Software sums the differences as needed
• Computes the prediction
• Computes the standard error of the prediction (MSPE, mean squared prediction error)

• R details (see the sketch after this slide)
• `arima` and `sarima` functions incorporate the choice of differencing parameter
• `ARMAtoMA` works for non-stationary models to get MA weights

5
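A short R sketch of the workflow named above. The order (1,1,1) and the coefficient values are illustrative, and `sarima` is assumed to come from the astsa package that accompanies the text; the series `x` is a placeholder.

```r
## Sketch, assuming a numeric series x is available
fit <- arima(x, order = c(1, 1, 1))      # stats::arima with (p, d, q) = (1, 1, 1)
predict(fit, n.ahead = 10)               # forecasts plus their standard errors

# library(astsa)
# sarima(x, p = 1, d = 1, q = 1)         # same fit with diagnostic plots (astsa assumed)

## MA (psi) weights of the ARIMA(1,1,1) model written in ARMA form; as the slide
## notes, ARMAtoMA still works even though the AR polynomial has a unit root
phi <- 0.5; theta <- 0.3                 # illustrative values
ARMAtoMA(ar = c(1 + phi, -phi), ma = theta, lag.max = 10)
```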
Examples of Nonstationary
Macro Time Series
• Gross domestic product (GDP)
• Returns on the stock market (SP500)
• Consumer price index
Examples of Nonstationary Time Series
• Nominal US Gross Domestic Product
• Seasonally adjusted
• Log scale shows long-term trend
• Differences look stationary, but for a huge Covid outlier
• Change in volatility associated with inflation

Change in log ≈ percent change:
∇ log Xt = log(Xt / Xt−1) = log( (Xt−1 + Xt − Xt−1) / Xt−1 ) = log( 1 + (Xt − Xt−1)/Xt−1 ) ≈ (Xt − Xt−1)/Xt−1

FRED data. Text, Example 5.6 goes through 2002.
7
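A tiny R check of the approximation above (made-up levels, not FRED data):

```r
## Quarterly levels (illustrative values only)
x <- c(100, 102.5, 104.0, 103.2)

log_change <- diff(log(x))               # nabla log(x_t)
pct_change <- diff(x) / head(x, -1)      # (x_t - x_{t-1}) / x_{t-1}

round(cbind(log_change, pct_change), 5)  # nearly identical for small changes
```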


Examples of Nonstationary Time Series
• SP 500 Stock Market Index
• Log scale shows long-term trend
• Return defined as (Pt − Pt−1)/Pt−1
• Periods of persistent high volatility

CRSP data, via WRDS  8
Examples of Nonstationary Time Series
• Consumer price index (CPI)
• Index growth, with different regimes (linear from 1980 through 2019)
• Differences appear more stationary
• But differencing reveals changes in volatility

CRSP data, via WRDS  9
Three Models for Nonstationary
Time Series
• Random walk versus AR(1)  (Example 5.3)
• Integrated autoregression  (Example 5.4)
• Integrated moving average  (Example 5.5)
Random Walk with Drift  (Example 5.3)

• Model
Xt = δ + Xt−1 + wt
• Forecast
• Assume we know the model and have data X1, X2, …, Xn.
• First forecast is obvious
X̂n+1 = δ + Xn
• Take the conditional expectation to find the next
Xn+2 = δ + Xn+1 + wn+2 ⇒ X̂n+2 = δ + (δ + Xn) + 0 = 2 δ + Xn
• Predictions are a linear trend from the last value
X̂n+m = m δ + Xn          (differences are δ + wt, so just add them up as we extrapolate)

• Mean squared prediction error (MSPE)

• Back-substitution shows the accumulation of errors (MA weights ψj = 1)
Xn+m = m δ + Xn + wn+m + wn+m−1 + ⋯ + wn+1 ⇒ E(Xn+m − X̂n+m)2 = m σw2
(Grows and grows: m terms, with no limit)
11
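A minimal R sketch of these formulas (simulated data; the drift, seed, and horizons are illustrative):

```r
## Simulate a random walk with drift delta = 0.1
set.seed(2)
delta <- 0.1
x <- cumsum(delta + rnorm(200))          # X_t = delta + X_{t-1} + w_t

n <- length(x)
m <- 1:20                                 # forecast horizons

delta_hat  <- mean(diff(x))               # estimate of the drift
sigma2_hat <- var(diff(x))                # estimate of sigma_w^2

fcst <- x[n] + m * delta_hat              # X-hat_{n+m} = m*delta + X_n
mspe <- m * sigma2_hat                    # E(X_{n+m} - X-hat_{n+m})^2 = m*sigma_w^2
cbind(horizon = m, forecast = fcst, se = sqrt(mspe))
```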
Random Walk with Drift  (Example 5.3)

• Random walk
• Difference equation sets ϕ = 1
Xt = δ + Xt−1 + wt
• Forecast does not revert to a mean (no constant mean to revert towards)
• Expected squared forecast error (MSPE) grows with extrapolation

• AR(1) model
• Constrains |ϕ| < 1
Xt = α + ϕ Xt−1 + wt
• Stationary, though ϕ might be close to 1.
• Forecast reverts to the mean: μ = α + ϕμ ⇒ μ = α/(1 − ϕ)
• MSPE grows to a limit at Var(Xt) = σw2/(1 − ϕ2)
• Distinguishing these models (see the sketch after this slide)
• Known as the “unit root” problem in statistics/econometrics (Text §8.2)
12
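A hedged sketch of how this unit-root question is often checked in R; the augmented Dickey–Fuller test here comes from the tseries package (an assumption — the lecture does not name a specific test function), and the simulated series are illustrative.

```r
## Assumes the tseries package is installed
library(tseries)

set.seed(3)
rw  <- cumsum(rnorm(300))                          # random walk (unit root)
ar1 <- arima.sim(list(ar = 0.95), n = 300)         # stationary AR(1), phi = 0.95

adf.test(rw)    # typically fails to reject the unit-root null
adf.test(ar1)   # often rejects, but with phi near 1 the test has low power
```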
Competing Risks
• Which is the worse mistake?
• The process is a random walk, but you model it as an AR(1)
• The process is an AR(1), but you model it as a random walk

• Short term
• AR(1) forecasts revert to the mean, whereas random walk forecasts drift
• AR(1) MSPE approaches the variance, whereas random walk MSPE grows without bound

• Example (see the simulation sketch after this slide)

13
(Not easy to distinguish)
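A simulation sketch of this ambiguity (parameters are illustrative, not from the slides): a random walk with drift 0.1 is generated and a stationary AR(1) with an intercept is fit to it.

```r
set.seed(4)
rw <- cumsum(0.1 + rnorm(200))            # random walk with drift delta = 0.1

fit_ar1 <- arima(rw, order = c(1, 0, 0))  # mis-specified stationary AR(1) fit
fit_ar1                                   # ar1 estimate is usually very close to 1

## AR(1) forecasts revert toward the fitted mean; the true process keeps drifting up
predict(fit_ar1, n.ahead = 20)$pred
```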
Competing Risks
• Situation
• Process is a random walk with a small positive drift δ = 0.1
• Fitted model is an AR(1), leaving a small amount of residual autocorrelation

• Forecasts
• AR(1) forecasts revert to the process mean
• Actual mean trends up (δ = 0.1)

• Econometric preference
• Common to assume RW unless convinced that it is not
• As if the null hypothesis is H0: Xt is a RW, rather than the other way around.
• Not the right standard error!

(Figure: RW versus AR(1) forecast paths)

14
Competing Risks
• Situation
• Process is a random walk (RW) with a small positive drift δ = 0.1
• Suppose we model it instead as a RW with possible drift

• Forecasts
• RW forecasts trend upward
• RW standard errors grow

• False sense of accuracy?

• AR(1) will overstate accuracy if the time series is a random walk.
• RW is more conservative: less confident about the future.

(Figure: RW versus AR(1) forecast paths and intervals)

15
Integrated Autoregression  (Example 5.4)

• ARIMA(1,1,0), a.k.a. ARI(1,1)
• Differences are AR(1), rather than uncorrelated as in a random walk
• Resembles an AR(2)
∇Xt = δ + ϕ (∇Xt−1) + wt ⇒ Xt = δ + (1 + ϕ) Xt−1 − ϕ Xt−2 + wt
• Forecasting
• 1 step ahead
X̂n+1 = [Xn+1] = [δ + (1 + ϕ)Xn − ϕXn−1 + wn+1] = δ + (1 + ϕ)Xn − ϕXn−1 + 0
error: Xn+1 − [Xn+1] = wn+1
• 2 steps ahead
[Xn+2] = [δ + (1 + ϕ)Xn+1 − ϕXn + wn+2] = δ + (1 + ϕ)[Xn+1] − ϕXn + 0
error: Xn+2 − [Xn+2] = wn+2 + (1 + ϕ)(Xn+1 − [Xn+1]) = wn+2 + (1 + ϕ)wn+1          (weight on wn+1 increases)
• 3 steps ahead
[Xn+3] = [δ + (1 + ϕ)Xn+2 − ϕXn+1 + wn+3] = δ + (1 + ϕ)[Xn+2] − ϕ[Xn+1] + 0
error: Xn+3 − [Xn+3] = wn+3 + (1 + ϕ)wn+2 + (1 + ϕ + ϕ2)wn+1          (Do you see a pattern in the MA weights?)

16
Integrated Autoregression
• Another way to see what's happening to the prediction error
• Moving average representation
• Not a stationary process, but we can still compute the MA weights.
• Very simple r.h.s. since θ(B) = 1
ϕ(B) ψ(B) = θ(B) = 1
• Solve by equating coefficients recursively (as before)
(1 − (1 + ϕ)B + ϕB2)(1 + ψ1B + ψ2B2 + ⋯) = 1

• Method is more “mechanical” than the direct manipulation on the prior slide.

• General pattern          (Do you see a pattern in these MA weights?)
• Geometric series
ψj = 1 + ϕ + ϕ2 + ⋯ + ϕj,   j = 1, 2, …
• Limiting value is
ψj → 1/(1 − ϕ)
(A sketch verifying these weights numerically follows this slide.)
17
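A brief numerical check of this pattern (ϕ = 0.6 is illustrative); as noted earlier, `ARMAtoMA` returns the ψ weights for the nonstationary ARMA form of the ARI(1,1) model.

```r
phi <- 0.6

## ARI(1,1) written as a (nonstationary) AR(2): coefficients (1 + phi, -phi)
psi <- ARMAtoMA(ar = c(1 + phi, -phi), ma = 0, lag.max = 10)

## Pattern from the slide: psi_j = 1 + phi + ... + phi^j -> 1/(1 - phi) = 2.5
geometric <- cumsum(phi^(0:10))[-1]
round(cbind(psi, geometric), 4)
```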
Example ARIMA(1,1,0)
• ARIMA(1,1,0) process
• Model is
∇Xt = 0.6 ∇Xt−1 + wt
• Observed time series

18
Example ARIMA(1,1,0)
• ARIMA(1,1,0) process
• Model is
∇Xt = 0.6 ∇Xt−1 + wt
• Observed differences

19
Estimates and Forecasts
• Summary of estimates

• Forecasts

20
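A reproducible sketch of this kind of example (simulated data with ϕ = 0.6, matching the model on the previous slides; the seed and sample size are arbitrary):

```r
## Simulate an ARIMA(1,1,0) series with phi = 0.6 and no drift
set.seed(5)
x <- cumsum(arima.sim(list(ar = 0.6), n = 200))   # integrate an AR(1)

fit <- arima(x, order = c(1, 1, 0))               # summary of estimates
fit

fc <- predict(fit, n.ahead = 12)                  # forecasts and standard errors
cbind(forecast = fc$pred, se = fc$se)
```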
Integrated Moving Average  (Example 5.5)

• ARIMA(0,1,1), a.k.a. IMA(1,1)

• Differences are a moving average          (Text uses different notation: Xt = Xt−1 + wt − λ wt−1)
• Noteworthy because of the connection to exponential smoothing (next class)
∇Xt = δ + θwt−1 + wt ⇒ Xt = δ + Xt−1 + θwt−1 + wt
• Forecasting
• 1 step ahead
X̂n+1 = [Xn+1] = [δ + Xn + θwn + wn+1] = δ + Xn + θwn + 0
error: Xn+1 − [Xn+1] = wn+1
• 2 steps ahead
[Xn+2] = [δ + Xn+1 + θwn+1 + wn+2] = δ + [Xn+1] + 0 + 0 = 2 δ + Xn + θwn
error: Xn+2 − [Xn+2] = wn+2 + θwn+1 + wn+1 = wn+2 + (1 + θ) wn+1
• 3 steps ahead
[Xn+3] = [δ + Xn+2 + θwn+2 + wn+3] = δ + [Xn+2] + 0 + 0 = 3 δ + Xn + θwn
error: Xn+3 − [Xn+3] = wn+3 + (1 + θ) wn+2 + (1 + θ) wn+1
21
Integrated Moving Average
• Another way to see what's happening to the prediction error
• Polynomial algebra replaces the expressions for the prediction error

• Moving average representation

• Backshift polynomial form is
ϕ(B) ψ(B) = θ(B) ⇒ (1 − B) ψ(B) = 1 + θB
• Solve by equating coefficients recursively (as before)
(1 − B)(1 + ψ1B + ψ2B2 + ⋯) = 1 + θB
• General pattern emerges quickly
ψ0 = 1, ψ1 = 1 + θ, ψ2 = ψ1, ψ3 = ψ2, …   or   ψj = 1 + θ, j = 1, 2, …
• Mean squared prediction error
• Squared error grows linearly with increasing extrapolation (see the numerical check after this slide)
E(Xn+m − X̂n+m)2 = σw2 (1 + (m − 1)(1 + θ)2)

22
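A brief numerical check (θ = 0.6 and σw2 = 1 are illustrative); the ψ weights and the MSPE formula above can be reproduced directly.

```r
theta <- 0.6

## IMA(1,1) in ARMA form has a unit AR root; ARMAtoMA still returns the psi weights
psi <- ARMAtoMA(ar = 1, ma = theta, lag.max = 8)
psi                                       # all equal to 1 + theta = 1.6

## m-step MSPE: sigma_w^2 * (1 + psi_1^2 + ... + psi_{m-1}^2)
sigma2 <- 1
m <- 1:8
mspe <- sigma2 * (1 + c(0, cumsum(psi^2))[m])
all.equal(mspe, sigma2 * (1 + (m - 1) * (1 + theta)^2))   # TRUE
```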
Example ARIMA(0,1,1)
• IMA(1,1) process
• Model is
∇Xt = 0.6 wt−1 + wt
• Observed time series
• Sequence plot looks stationary
• ACF/PACF both appear to decay geometrically

23
Example ARIMA(0,1,1)
• IMA(1,1) process
• Model is
∇Xt = 0.6 wt−1 + wt
• Observed differences
• ACF/PACF suggest an MA(1) if we ignore the ACF after lag 6

24
Estimates and Forecasts
• Summary of estimates
• Small estimate of δ

• Forecasts
• Small estimate of δ leads to a very gradual increase
• Very wide prediction intervals

25
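A reproducible sketch of this kind of example (simulated IMA(1,1) data with θ = 0.6; seed and sample size are arbitrary). Note that stats::arima with d = 1 does not include a drift term, so δ is effectively fixed at zero in this sketch.

```r
## Simulate an IMA(1,1) series: differences are MA(1) with theta = 0.6
set.seed(6)
x <- cumsum(arima.sim(list(ma = 0.6), n = 300))

fit <- arima(x, order = c(0, 1, 1))        # summary of estimates
fit

fc <- predict(fit, n.ahead = 12)           # forecasts; note how quickly the se grows
cbind(forecast = fc$pred, lower = fc$pred - 2 * fc$se, upper = fc$pred + 2 * fc$se)
```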
What’s next?
• Modeling examples
• More model diagnostics (we've seen these in the output)
• Applying these ideas to real data

• Model testing
• How do we decide when a model is good enough?
• How much does it matter?

26