
Chapter 15: Time Series Regression and Forecasting

> Time Series Data: What's Different?

· Examples of time series data.
· A time series variable is serially correlated: it is correlated with its own past values.

> Introduction to Time Series Data & Serial Correlation:

> Notation: lags, first differences, and growth rates:

· (1) The first lag of Y_t is Y_{t-1}; the j-th lag is Y_{t-j}.
· (2) The first difference is ΔY_t = Y_t - Y_{t-1}.
· (3) The growth rate is Δln(Y_t) = ln(Y_t) - ln(Y_{t-1}), which approximates the proportional change (Y_t - Y_{t-1})/Y_{t-1}.
· Note: because the data are quarterly, we multiply the quarterly growth rate by 4 to annualize it.
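· A minimal pandas sketch of these transformations (the quarterly GDP numbers below are made up purely for illustration):

```python
import numpy as np
import pandas as pd

# Made-up quarterly GDP levels, only to illustrate the transformations.
gdp = pd.Series([19882.0, 20143.0, 20411.0, 20662.0, 20990.0],
                index=pd.period_range("2016Q3", periods=5, freq="Q"))

lag1 = gdp.shift(1)                # first lag: Y_{t-1}
diff1 = gdp.diff()                 # first difference: Y_t - Y_{t-1}
growth = 400 * np.log(gdp).diff()  # 100 x quarterly log growth, times 4 to annualize

print(growth.round(2))
```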

> Autocorrelation (Serial Correlation):

· The j-th autocorrelation is corr(Y_t, Y_{t-j}): the correlation of a series with its own j-th lag.
· This is the same population correlation concept we used before, just applied to a variable and its own past values.
> Sample Autocorrelations:

· E.g., corr(Y_t, Y_{t-14}) uses the value 14 months ago, while corr(Y_t, Y_{t-1}) uses the value 1 month ago.
· As you can see, as j increases, corr(Y_t, Y_{t-j}) falls, so Y_{t-j} has less predictive power for Y_t.
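· A quick sketch of computing sample autocorrelations in Python (the monthly series is simulated, not real data):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Simulate a persistent monthly series so the autocorrelations are visible.
y = np.zeros(240)
for t in range(1, 240):
    y[t] = 0.7 * y[t - 1] + rng.normal()
y = pd.Series(y)

# Sample autocorrelation corr(Y_t, Y_{t-j}) for a few lags j.
for j in (1, 6, 14):
    print(f"lag {j}: {y.autocorr(lag=j):.3f}")
# The estimates shrink as j grows: distant lags carry less predictive power.
```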

> Forecasting, Stationarity, and the Mean Squared Forecast Error:

· Forecasting (out-of-sample prediction) is what we care about in this chapter.
· Stationarity is an important assumption we have to make about the series; we return to it later in the chapter.
· So think of stationarity as follows:

· You have a time series of data: (1) Y_1, Y_2, Y_3, ..., Y_T.
· And another series of data of the same length that starts s periods later, when series (1) is at date 1+s: (2) Y_{1+s}, Y_{2+s}, ..., Y_{T+s}.
· Then the series is stationary if the joint probability distribution of the two time series, which have the same length and the same frequency but start at different periods, is the same.
· Note: they may have different realized values of the data, but they are the same in a probability distribution (probabilistic) sense.
· ... so a time series starting at different points in time having the same distribution (and hence the same mean and variance) is equivalent to the series being stationary.
· Since each of Y_1, ..., Y_T has the same sampling (probability) distribution, they all have the same mean and variance:
  E(Y_1) = ... = E(Y_T) and var(Y_1) = ... = var(Y_T).
> This is the same idea as i.i.d. sampling, but for time series data (data across time).
> Forecasts and Forecast Errors:

· We use data up to period T to forecast Y_{T+1}.
· The forecast Ŷ_{T+1|T} is an out-of-sample prediction; the forecast error is the actual realization minus the forecast, Y_{T+1} - Ŷ_{T+1|T}.
· That is why it is called an out-of-sample forecast.
· MSFE = E[(Y_{T+1} - Ŷ_{T+1|T})²]; we care about this a lot in time series regression.
> Autoregressions:

· A regression of the series on its own lags; this follows from the autocorrelation / serial correlation above.
· (1) AR(1)   (2) AR(p)

> The first-order autoregressive model, AR(1):

· Y_t = β_0 + β_1 Y_{t-1} + u_t, where u_t is a random error (noise).

> Example: AR(1) model for the growth rate of GDP:

· So H_0: β_1 = 0 vs. H_1: β_1 ≠ 0; this t-test is only valid if the series is stationary.

> Estimated model: the AR(1) fitted to the GDP growth data.

> Forecast for 2017:Q3 using the estimated model and data through 2017:Q2:
  GDPGR-hat(2017:Q3 | 2017:Q2) = β̂_0 + β̂_1 × GDPGR(2017:Q2).
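· A sketch of estimating an AR(1) and forecasting the next period with statsmodels; gdp_growth below is a simulated stand-in for the GDPGR series, not the actual data:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(1)

# Placeholder for annualized GDP growth: a simulated stationary AR(1).
g = np.zeros(240)
for t in range(1, 240):
    g[t] = 1.0 + 0.4 * g[t - 1] + rng.normal(scale=2.0)
gdp_growth = pd.Series(g)

ar1 = AutoReg(gdp_growth, lags=1).fit()
print(ar1.params)             # estimates of beta_0 and beta_1
print(ar1.forecast(steps=1))  # out-of-sample forecast of Y_{T+1} given data through T
```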
> The AR(p) model: using multiple lags for forecasting:

· Y_t = β_0 + β_1 Y_{t-1} + β_2 Y_{t-2} + ... + β_p Y_{t-p} + u_t.
· So the joint hypothesis that the lags have no predictive power is H_0: β_1 = 0, ..., β_p = 0, tested with an F-statistic.

> More on the AR(1) model:

· Consider the model: Y_t = β_0 + β_1 Y_{t-1} + u_t.
· Cases of β_1:

(1) β_1 = 0 → history does not matter, because Y_t = β_0 + u_t: intercept + random noise.
(2) 0 < β_1 < 1 → the series (model) is stationary.
(3) β_1 = 1 → Y_t = β_0 + Y_{t-1} + u_t → history is permanent.
  · This is known as a random walk model.
  · We will show that if β_1 = 1, the series is nonstationary.
  · Note: later on, we will come back to cases (2) and (3) in the nonstationarity section of this chapter.
(4) There is also a case where β_1 > 1 and another where β_1 < 0; however, in economics we do not study those cases.
  · They are rare in economics, so we will focus more on (2) and (3).

> Implications of 0 < β_1 < 1 and β_0 = 0:

· Other than the concept of stationarity, let's look at the implications of β_0 = 0 and 0 < β_1 < 1.
· So consider the following model: Y_t = β_0 + β_1 Y_{t-1} + u_t.
· Suppose that Y_0 = 0 (the initial condition).
· Also suppose that u_1 = 0.3, u_2 = -0.2, u_3 = -0.1, u_4 = 0.2.
· So we are just assuming E(u_t) = 0: the shocks cancel out on average.
· Now suppose that β_0 = 0 and β_1 = 0.5, so the series is stationary:

  Y_1 = 0 + 0.5(0) + 0.3 = 0.3
  Y_2 = 0 + 0.5(0.3) - 0.2 = -0.05
  Y_3 = 0 + 0.5(-0.05) - 0.1 = -0.125
  Y_4 = 0 + 0.5(-0.125) + 0.2 = 0.1375

· So as you see: you start at Y_0 = 0, a shock occurs (u_1 = 0.3), and you get Y_1 = 0.3.
· Then Y_2 is obtained as half of the first shock (0.5 × 0.3) plus a second shock (u_2 = -0.2).
· Y_3 is obtained as half of the half of the first shock, plus half of the second shock, plus a third shock (u_3 = -0.1).
· Y_4 is obtained as half of the half of the half of the first shock, plus half of the half of the second shock, plus half of the third shock, plus a fourth shock (u_4 = 0.2).
· So as you see, the first shock's impact gets smaller and smaller as t increases (the first shock matters less and less).
  > The same goes for the 2nd shock, 3rd shock, etc.
· This shows that "today" is the result of the initial condition plus all previous shocks, with declining importance on earlier shocks (later shocks matter more).
· So what this tells us is that for stationary series (models), history matters, but recent history matters more and earlier history matters less.
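· A tiny sketch of the recursion used above; changing beta0 and beta1 reproduces the other three cases worked out below:

```python
def ar1_path(beta0, beta1, shocks, y0=0.0):
    """Iterate Y_t = beta0 + beta1 * Y_{t-1} + u_t starting from Y_0 = y0."""
    y, path = y0, []
    for u in shocks:
        y = beta0 + beta1 * y + u
        path.append(round(y, 4))
    return path

shocks = [0.3, -0.2, -0.1, 0.2]
print(ar1_path(0.0, 0.5, shocks))  # stationary, no intercept: 0.3, -0.05, -0.125, 0.1375
print(ar1_path(1.0, 0.5, shocks))  # stationary with intercept: 1.3, 1.45, 1.625, 2.0125
print(ar1_path(0.0, 1.0, shocks))  # random walk: 0.3, 0.1, 0.0, 0.2
print(ar1_path(0.5, 1.0, shocks))  # random walk with drift: 0.8, 1.1, 1.5, 2.2
```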

> Implications of 0 < β_1 < 1 with an intercept:

· Now we will study the same implications as above, but with the introduction of an intercept.
· Say β_0 = 1 and β_1 = 0.5, with the same shocks:

  Y_1 = 1 + 0.5(0) + 0.3 = 1.3
  Y_2 = 1 + 0.5(1.3) - 0.2 = 1.45
  Y_3 = 1 + 0.5(1.45) - 0.1 = 1.625
  Y_4 = 1 + 0.5(1.625) + 0.2 = 2.0125

· So as you see, the implications are the same as when β_0 = 0.
· You can think of β_0 as a drift: it pulls the series up toward a long-run level of β_0/(1 - β_1) = 2.
· So history continues to matter here, at decreasing rates, the same way as when β_0 = 0; but on top of that we have the drift, and the declining shocks happen around it.
· Note: stationary AR(1) models actually allow β_1 in (-1, 1); however, in this course we rule out the negative values (see more advanced classes), since they are not common in economics.

> Implications of β_1 = 1 and β_0 = 0:

· So here we assume a non-stationary model, since β_1 = 1:

  Y_1 = 0 + (1)(0) + 0.3 = 0.3
  Y_2 = 0 + (1)(0.3) - 0.2 = 0.1
  Y_3 = 0 + (1)(0.1) - 0.1 = 0
  Y_4 = 0 + (1)(0) + 0.2 = 0.2

· So this is known as the random walk model; more on this in later sections.
· But as you see, Y_4 carries a perfect history of all previous shocks (they are not declining):
  Y_4 = 0 + 0.3 - 0.2 - 0.1 + 0.2 = 0.2 = initial condition + sum of all previous shocks.
· So "today" is the sum of all previous shocks.
· So all of history is incorporated into today; there is no dying down of history.

> Implications of β_1 = 1 with an introduction of an intercept:

· Now we will study the same implications as above, but with the introduction of an intercept.
· Say β_0 = 0.5 and β_1 = 1, with the same shocks:

  Y_1 = 0.5 + (1)(0) + 0.3 = 0.8
  Y_2 = 0.5 + (1)(0.8) - 0.2 = 1.1
  Y_3 = 0.5 + (1)(1.1) - 0.1 = 1.5
  Y_4 = 0.5 + (1)(1.5) + 0.2 = 2.2

· So history matters the same way as when β_0 = 0, but on top of it comes a trend from β_0 = 0.5.
· So: Y_4 = 4(0.5) + 0 + (0.3 - 0.2 - 0.1 + 0.2) = trend + initial condition + history of shocks = 2.2.
· So this is also a non-stationary series.
· If β_0 = 0 → a simple random walk model.
· If β_0 ≠ 0 → a random walk model with a trend (drift).

> Time series regressions with additional predictors and the Autoregressive Distributed Lag (ADL) model:

· ADL(p, q): Y_t = β_0 + β_1 Y_{t-1} + ... + β_p Y_{t-p} + δ_1 X_{t-1} + ... + δ_q X_{t-q} + u_t, where u_t is the error (shock).

> Example of an ADL model: interest rates and the term spread:

· The term spread is the spread between two interest rates (a long-term rate and a short-term rate), and its lags are added as extra predictors.
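· A sketch of estimating an ADL(1,1) by OLS with lagged regressors; gdp_growth and term_spread here are simulated placeholders, not the textbook data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 240
term_spread = pd.Series(rng.normal(1.5, 0.8, n))       # placeholder X_t
gdp_growth = pd.Series(2.0 + rng.normal(0.0, 2.0, n))  # placeholder Y_t

df = pd.DataFrame({"y": gdp_growth,
                   "y_lag1": gdp_growth.shift(1),
                   "x_lag1": term_spread.shift(1)}).dropna()

adl11 = sm.OLS(df["y"], sm.add_constant(df[["y_lag1", "x_lag1"]])).fit()
print(adl11.params)  # beta_0, beta_1 (own lag), delta_1 (lagged spread)
```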
> Estimation of the MSFE and forecast intervals:

> Forecast error:

· Y_{T+1} - Ŷ_{T+1|T}, where the forecast uses the vector of Y's up to T and the vector of X's up to T.
· It has two parts: the unexplained future error u_{T+1}, and the error that comes from the estimated coefficients being wrong (β̂ ≠ β).

> The mean squared forecast error (MSFE) of the ADL model:

· MSFE = E[(Y_{T+1} - Ŷ_{T+1|T})²] = E(u_{T+1}²) + E{[(β̂_0 - β_0) + (β̂_1 - β_1) Y_T + (δ̂_1 - δ_1) X_T]²}.
· So if the β's were the true population β's, the MSFE would just be E(u_{T+1}²).
· i.e., what we are saying above is that MSFE = E(u_{T+1}²) if the coefficients were estimated without error.
· Also, if the number of predictors is small relative to the sample size (most of the time), most of the MSFE arises from E(u_{T+1}²), and the term E{[(β̂_0 - β_0) + (β̂_1 - β_1) Y_T + (δ̂_1 - δ_1) X_T]²} is very small.
· In the opposite case (many predictors relative to the sample size), that term would be large.

> The root mean squared forecast error (RMSFE):

· RMSFE = sqrt(MSFE); i.e., it is similar to the standard error of the regression, but for out-of-sample forecasts.

> Using the RMSFE to construct forecast intervals:

· An approximate 95% forecast interval is Ŷ_{T+1|T} ± 1.96 × RMSFE (the 1.96 comes from a normal distribution).

> Example: forecast interval (fan chart):

· So here, forecasts are more likely to fall in the darker regions of the fan chart.
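· A sketch of estimating the RMSFE by pseudo out-of-sample forecasting and forming the 95% interval; the AR(1) model and the simulated series are stand-ins for whatever forecasting model is being evaluated:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(3)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 1.0 + 0.5 * y[t - 1] + rng.normal(scale=1.5)
y = pd.Series(y)

# Re-estimate on data through t and forecast t+1, for the last 40 periods.
errors = []
for t in range(160, 200):
    fit = AutoReg(y.iloc[:t], lags=1).fit()
    fcast = float(np.asarray(fit.forecast(steps=1))[0])
    errors.append(y.iloc[t] - fcast)

rmsfe = float(np.sqrt(np.mean(np.square(errors))))
point = float(np.asarray(AutoReg(y, lags=1).fit().forecast(steps=1))[0])
print(f"95% forecast interval: {point - 1.96 * rmsfe:.2f} to {point + 1.96 * rmsfe:.2f}")
```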
> Lag Length Selection Using Information Criteria:

> Lag selection in AR(p) models:

· Here we don't need to worry about which variables to include; the question is how many lags p of Y to use.
· One option is making multiple t-tests or an F-test, like in chapter 8, but there is a trade-off in choosing p:
· If p is too large, the standard errors of the estimated β's are large (not efficient) → less precise estimates, and the variance of the forecast of Y is large.
· If p is too small, the estimated β's are biased, because we have omitted lags that still have strong predictive power (they are still strongly serially correlated with the included lags), and they end up in u_t.
· So that is where the BIC and the AIC come in handy.

> The Bayes Information Criterion (BIC):

· BIC(p) = ln[SSR(p)/T] + (p + 1) ln(T)/T.
· So think of the true lag length p as a population quantity; the p that minimizes BIC(p) estimates it.

> The Akaike Information Criterion (AIC):

· AIC(p) = ln[SSR(p)/T] + (p + 1)(2/T).
· As you see, the AIC uses a different (smaller) penalty term, so it tends to pick more lags than the BIC.
· The set of candidate lag lengths to consider depends on the data frequency (e.g., monthly vs. quarterly data).
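· A sketch of computing BIC(p) and AIC(p) over candidate AR(p) models using the formulas above (simulated data; with a real series, swap it in):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(4)
y = np.zeros(300)
for t in range(2, 300):
    y[t] = 0.5 * y[t - 1] + 0.2 * y[t - 2] + rng.normal()
y = pd.Series(y)

# Note: for a strict comparison, the estimation sample should be held fixed across p.
for p in range(1, 7):
    fit = AutoReg(y, lags=p).fit()
    ssr = float(np.sum(fit.resid ** 2))
    n = len(fit.resid)
    bic = np.log(ssr / n) + (p + 1) * np.log(n) / n
    aic = np.log(ssr / n) + (p + 1) * 2 / n
    print(f"p={p}: BIC={bic:.3f}, AIC={aic:.3f}")
# Choose the p with the smallest BIC (or AIC).
```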
> Example: BIC(p) in an AR(p) model:

· We choose the p that minimizes BIC(p).
· There is a trade-off here: an extra lag (predictor/regressor) lowers the SSR but raises the penalty term.
· In the example, BIC(p) has chosen p = 2; that is the lag length that should be used.
· That is why, again, the information criterion is the handy way to pick p: it balances fit against estimation error.

> Lag selection in ADL(p, q) models:

· Note: the general ADL(p, q) model looks like this:
  Y_t = β_0 + β_1 Y_{t-1} + ... + β_p Y_{t-p} + δ_1 X_{t-1} + ... + δ_q X_{t-q} + u_t
· The same idea applies: choose the numbers of lags that minimize the information criterion, where the penalty now counts all estimated coefficients.

> Nonstationarity I: Trends:

· [Figure: four example time series, panels (A)-(D), some trending and some not.]

> What is a Trend?:

· A trend is a persistent long-term movement of a variable over time.
· In the figure: one series has a clear trend line, one has some sort of quadratic trend, one has no observable trend (hard to tell), and one does not have a trend.
> Deterministic vs. Stochastic Trends:

· A deterministic trend is a nonrandom function of time, so the first series appeared to follow something like a polynomial in time.
· A stochastic trend is random: it comes from the accumulation of random shocks u_t that are serially uncorrelated.
· As we will show later, random walk models are a special type of AR(1) model where β_1 = 1.
· Series with stochastic trends have different variances at different points in time, so if you compare the series across time, its distribution changes.

> Proof:

· Since u_t is serially uncorrelated, E(u_t | Y_{t-1}, Y_{t-2}, ...) = 0.
· And we know that stationarity requires var(Y_1) = var(Y_2) = ... = var(Y_T): the same variance at every date.
· The random walk model is: Y_t = Y_{t-1} + u_t.
· So var(Y_t) = var(Y_{t-1}) + var(u_t), i.e. var(Y_t) = var(Y_{t-1}) + σ_u².
· Since σ_u² ≠ 0, var(Y_t) ≠ var(Y_{t-1}).
· So, assuming Y_0 = 0:
  Y_1 = Y_0 + u_1 = u_1
  Y_2 = Y_1 + u_2 = u_1 + u_2
  Y_3 = Y_2 + u_3 = u_1 + u_2 + u_3
  ...
  Y_t = u_1 + u_2 + ... + u_t
· ∴ var(Y_t) = var(u_1 + ... + u_t) = t·σ_u², which is a function of time t.
· So what we saw, again, is that the joint distribution of (Y_{1+s}, ..., Y_{T+s}) depends on s, since var(Y_t), for example, differs across time.
· This motivates tests for whether Y_t follows a random walk (and is thus non-stationary) or not.
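· A quick simulation consistent with this derivation: across many simulated random-walk paths, the sample variance of Y_t grows roughly like t·σ_u² (numbers purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
sigma_u = 1.0

# 10,000 random-walk paths of length 100: Y_t = u_1 + ... + u_t.
paths = np.cumsum(rng.normal(scale=sigma_u, size=(10_000, 100)), axis=1)

for t in (10, 50, 100):
    print(f"t={t}: sample var(Y_t) = {paths[:, t - 1].var():.2f}  vs  t*sigma_u^2 = {t * sigma_u**2}")
```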

> Stochastic trends and autoregressive unit roots:

> What problems are caused by trends?

· Example: regressing the US unemployment rate (URATE) on Japanese industrial production (JAP IP) gives a positive, statistically significant coefficient.
· Both series look strongly correlated and the estimates are highly "statistically significant"; but in reality, since both are trending variables, they only appear correlated, and this correlation is spurious.
> How do you detect stochastic trends?

· As discussed earlier, the random walk is a special case of the AR(1) model with β_1 = 1, and it contains a stochastic trend.
· If β_1 = 1, the series Y_t is non-stationary.
· So, in the AR(1) model, the hypothesis that the series has a stochastic trend is:
  H_0: β_1 = 1 → non-stationary (unit root, stochastic trend)
  H_1: β_1 < 1 → stationary (no unit root, no stochastic trend)
· This hypothesis test uses the AR(1) model: Y_t = β_0 + β_1 Y_{t-1} + u_t.
· If we reject the null hypothesis, we conclude the series is stationary; if we fail to reject, we cannot rule out that it is non-stationary.
· Now, this test is most easily implemented by estimating a modified version of the AR(1) model, which is obtained by subtracting Y_{t-1} from both sides:

  Y_t = β_0 + β_1 Y_{t-1} + u_t
  Y_t - Y_{t-1} = β_0 + β_1 Y_{t-1} - Y_{t-1} + u_t
  ΔY_t = β_0 + (β_1 - 1) Y_{t-1} + u_t
  ΔY_t = β_0 + δ Y_{t-1} + u_t,   where δ = β_1 - 1

· ∴ H_0: δ = 0 → non-stationary; H_1: δ < 0 → stationary.
· So β_1 = 1 ⇔ δ = 0 → a unit root exists (non-stationary).
· And β_1 < 1 ⇔ δ < 0 → no unit root (stationary).
· Note: the t-statistic for this test (the Dickey-Fuller statistic) is not normally distributed, even in large samples, so it has its own critical values.
· If the series also has a deterministic (linear) time trend, we add time t as a regressor, with its own coefficient α:
  ΔY_t = β_0 + α t + δ Y_{t-1} + u_t

> The Dickey-Fuller Test in AR(p):

· ΔY_t = β_0 + δ Y_{t-1} + γ_1 ΔY_{t-1} + ... + γ_{p-1} ΔY_{t-p+1} + u_t
· H_0: δ = 0 vs. H_1: δ < 0 (with or without the deterministic time trend included).

> When should you include a time trend in the DF test?

· [Figure: the series plotted over time.]
· Here we must include a time trend, because the series clearly trends upward.
· The test is 1-sided: we reject H_0 only for large negative values of the DF statistic.
· If we fail to reject H_0: δ = 0, then β_1 = 1.
· So ln(GDP) is non-stationary (we saw that there was a trend graphically).
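· A sketch of the (augmented) Dickey-Fuller test with statsmodels; log_gdp here is a simulated random walk with drift standing in for ln(GDP):

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(6)
log_gdp = np.cumsum(0.005 + rng.normal(scale=0.01, size=300))  # placeholder for ln(GDP)

# regression="ct" includes an intercept and a deterministic time trend.
stat, pvalue, usedlag, nobs, crit_values, _ = adfuller(log_gdp, regression="ct")
print(f"ADF stat = {stat:.2f}, p-value = {pvalue:.3f}, 5% critical value = {crit_values['5%']:.2f}")
# Failing to reject H0: delta = 0 is consistent with a unit root (non-stationary series).
```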
> How to mitigate the problems raised by trends:

· If Y_t has a stochastic trend (unit root), use its first difference ΔY_t in the regression; ΔY_t is stationary.

> Nonstationarity II: Breaks and model stability:

· So this is non-stationarity caused by the coefficients (e.g., β_1) differing across time.

> Tests for breaks (changes) in regression coefficients:

> (1) Case 1: the break date τ is known:

· Let D_t(τ) be a dummy variable with D_t(τ) = 0 for t ≤ τ and D_t(τ) = 1 for t > τ.
· γ_0 is the break in the intercept, γ_1 is the break in β_1, and γ_2 is the break in δ_1.
· If γ_0 = γ_1 = γ_2 = 0 → no structural break.
· If we reject this null hypothesis → a structural break occurs at τ.
· This diagram will help in understanding structural breaks in this case:

  t:       0 .......... τ .......... T
  D_t(τ):  0  0  0 ... 0 | 1  1 ... 1

· So suppose we know that a structural break occurs at t = τ.
· Say some government policy is introduced, or there is a technological change, an innovation, or a pandemic; something at τ causes a structural break.

· So if there is no structural change, then our ADL(1,1) model is:
  Y_t = β_0 + β_1 Y_{t-1} + δ_1 X_{t-1} + u_t
· If there is a structural break, then before τ the model will look as follows:
  Y_t = β_0 + β_1 Y_{t-1} + δ_1 X_{t-1} + u_t
· However, after τ, the model will look as follows:
  Y_t = (β_0 + γ_0) + (β_1 + γ_1) Y_{t-1} + (δ_1 + γ_2) X_{t-1} + u_t
· In one equation, using the break dummy and its interactions:
  Y_t = β_0 + β_1 Y_{t-1} + δ_1 X_{t-1} + γ_0 D_t(τ) + γ_1 [D_t(τ) × Y_{t-1}] + γ_2 [D_t(τ) × X_{t-1}] + u_t
· Since this is a joint hypothesis, we test it with an F-test.
· If we reject H_0: γ_0 = γ_1 = γ_2 = 0 → a structural break occurs at τ.
· If we fail to reject → there is no evidence of structural breaks.
· Read the slides for case (2), where the break date is unknown.
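· A sketch of the known-break-date test for case (1): estimate the ADL(1,1) with the break dummy and its interactions, then F-test the three γ's jointly (the data and break date below are made up for illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n, tau = 200, 120                          # hypothetical sample size and known break date
x = pd.Series(rng.normal(size=n))          # placeholder X_t
y = pd.Series(rng.normal(size=n))          # placeholder Y_t

df = pd.DataFrame({"y": y, "y_lag1": y.shift(1), "x_lag1": x.shift(1)}).dropna()
df["D"] = (df.index > tau).astype(float)   # D_t(tau) = 1 after the break date
df["D_y_lag1"] = df["D"] * df["y_lag1"]
df["D_x_lag1"] = df["D"] * df["x_lag1"]

X = sm.add_constant(df[["y_lag1", "x_lag1", "D", "D_y_lag1", "D_x_lag1"]])
fit = sm.OLS(df["y"], X).fit()

# Joint F-test of H0: gamma_0 = gamma_1 = gamma_2 = 0 (no structural break at tau).
print(fit.f_test("D = 0, D_y_lag1 = 0, D_x_lag1 = 0"))
```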
