Time Series Lecture 2
Peter Bartlett

Last lecture:
1. Objectives of time series analysis.
2. Time series models.
3. Time series modelling: Chasing stationarity.
Stationarity
{X_t} is strictly stationary if for all k, t_1, ..., t_k, x_1, ..., x_k, and h,

    P(X_{t_1} \le x_1, ..., X_{t_k} \le x_k) = P(X_{t_1+h} \le x_1, ..., X_{t_k+h} \le x_k),

i.e., shifting the time axis does not affect the distribution. We shall consider second-order properties only.
Weak Stationarity
We say that {X_t} is (weakly) stationary if
1. \mu_t = E[X_t] is independent of t, and
2. for each h, \gamma_X(t + h, t) = Cov(X_{t+h}, X_t) is independent of t.
In that case, we write \gamma_X(h) = \gamma_X(h, 0).
Stationarity
The autocorrelation function (ACF) of {X_t} is defined as

    \rho_X(h) = \gamma_X(h) / \gamma_X(0)
              = Cov(X_{t+h}, X_t) / Cov(X_t, X_t)
              = Corr(X_{t+h}, X_t).
Stationarity
Example: i.i.d. noise, E[X_t] = 0, E[X_t^2] = \sigma^2. We have

    \gamma_X(t + h, t) = \sigma^2 if h = 0, and 0 otherwise.
Thus,
1. \mu_t = 0 is independent of t.
2. \gamma_X(t + h, t) = \gamma_X(h, 0) for all t.
So {X_t} is stationary.
Similarly for any white noise (uncorrelated, zero mean), X_t \sim WN(0, \sigma^2).
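As a quick numerical check (our own sketch, not part of the lecture; it assumes numpy, and the seed and sample size are arbitrary choices), we can simulate i.i.d. noise and verify that the lag-h sample covariances are close to \sigma^2 at h = 0 and close to zero otherwise:

    import numpy as np

    rng = np.random.default_rng(0)
    sigma = 1.0
    x = rng.normal(0.0, sigma, size=100_000)   # i.i.d. noise: mean 0, variance sigma^2

    xc = x - x.mean()
    for h in range(4):
        # sample Cov(X_{t+h}, X_t): ~ sigma^2 at h = 0, ~ 0 otherwise
        print(h, round(np.mean(xc[h:] * xc[:len(x) - h]), 4))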
Stationarity
Example: Random walk, S_t = \sum_{i=1}^{t} X_i for i.i.d., mean-zero {X_t}.
We have E[S_t] = 0, E[S_t^2] = t\sigma^2, and, for h \ge 0,

    \gamma_S(t + h, t) = Cov(S_{t+h}, S_t)
                       = Cov(S_t + \sum_{s=1}^{h} X_{t+s}, S_t)
                       = Cov(S_t, S_t)        (using the covariance facts below)
                       = t\sigma^2.

This depends on t, so {S_t} is not stationary.
An aside: covariances
Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z),
Cov(aX, Y) = a Cov(X, Y).
Also, if X and Y are independent (e.g., X = c), then Cov(X, Y) = 0.
[Figure: a simulated sample path of a random walk.]
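To see the non-stationarity numerically, the following sketch (ours; numpy assumed, seed and dimensions arbitrary) simulates many independent walks and checks that Var(S_t) grows like t\sigma^2:

    import numpy as np

    rng = np.random.default_rng(1)
    sigma = 1.0
    steps = rng.normal(0.0, sigma, size=(10_000, 50))   # 10,000 walks, 50 steps each
    S = steps.cumsum(axis=1)                            # S_t = X_1 + ... + X_t

    for t in (1, 10, 25, 50):
        print(t, round(S[:, t - 1].var(), 2))           # approximately t * sigma^2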
Stationarity
Example: MA(1) process (Moving Average):

    X_t = W_t + \theta W_{t-1},    {W_t} \sim WN(0, \sigma^2).

We have E[X_t] = 0, and

    \gamma_X(t + h, t) = E(X_{t+h} X_t)
                       = E[(W_{t+h} + \theta W_{t+h-1})(W_t + \theta W_{t-1})]
                       = \sigma^2 (1 + \theta^2)  if h = 0,
                         \sigma^2 \theta          if |h| = 1,
                         0                        otherwise.

So {X_t} is stationary.
[Figure: ACF of the MA(1) process; \rho_X(0) = 1 and \rho_X(\pm 1) = \theta/(1 + \theta^2).]
Stationarity
Example: AR(1) process (AutoRegressive):

    X_t = \phi X_{t-1} + W_t,    {W_t} \sim WN(0, \sigma^2).
Assume that X_t is stationary and |\phi| < 1. Then we have

    E[X_t] = \phi E[X_{t-1}]    =>    E[X_t] = 0    (from stationarity),

    E[X_t^2] = \phi^2 E[X_{t-1}^2] + \sigma^2    =>    E[X_t^2] = \sigma^2 / (1 - \phi^2)    (from stationarity).
Stationarity
Example: AR(1) process, X_t = \phi X_{t-1} + W_t, {W_t} \sim WN(0, \sigma^2).
Assume that X_t is stationary and |\phi| < 1. Then we have E[X_t] = 0, E[X_t^2] = \sigma^2 / (1 - \phi^2), and, for h > 0,

    \gamma_X(h) = Cov(X_{t+h}, X_t)
                = Cov(\phi X_{t+h-1} + W_{t+h}, X_t)
                = \phi \gamma_X(h - 1)
                = \phi^{|h|} \gamma_X(0) = \phi^{|h|} \sigma^2 / (1 - \phi^2).
[Figure: ACF of the AR(1) process, \rho_X(h) = \phi^{|h|}.]
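A simulation check of the geometric ACF decay (our sketch; \phi = 0.8, the seed, and the sample size are arbitrary, and numpy is assumed):

    import numpy as np

    rng = np.random.default_rng(3)
    phi, sigma, n = 0.8, 1.0, 100_000
    w = rng.normal(0.0, sigma, size=n)
    x = np.empty(n)
    x[0] = w[0] / np.sqrt(1 - phi**2)      # start at the stationary variance
    for t in range(1, n):
        x[t] = phi * x[t - 1] + w[t]       # X_t = phi * X_{t-1} + W_t

    xc = x - x.mean()
    gamma0 = np.mean(xc * xc)              # approx sigma^2 / (1 - phi^2) = 2.78
    for h in range(4):
        rho_h = np.mean(xc[h:] * xc[:n - h]) / gamma0
        print(h, round(rho_h, 3), round(phi**h, 3))   # sample vs phi^|h|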
Linear Processes
An important class of stationary time series:
    X_t = \mu + \sum_{j=-\infty}^{\infty} \psi_j W_{t-j},

where {W_t} \sim WN(0, \sigma_w^2) and \sum_{j=-\infty}^{\infty} |\psi_j| < \infty.
Linear Processes
    X_t = \mu + \sum_{j=-\infty}^{\infty} \psi_j W_{t-j}.

We have

    \mu_X = \mu,    \gamma_X(h) = \sigma_w^2 \sum_{j=-\infty}^{\infty} \psi_j \psi_{h+j}.    (why?)
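A numerical sanity check of the autocovariance formula (our sketch, not from the slides; the coefficient sequence \psi = (1, 0.5, 0.25), with zeros elsewhere, and the sample size are arbitrary):

    import numpy as np

    rng = np.random.default_rng(4)
    sigma_w, n = 1.0, 500_000
    psi = np.array([1.0, 0.5, 0.25])       # psi_0, psi_1, psi_2; zero elsewhere

    w = rng.normal(0.0, sigma_w, size=n + len(psi) - 1)
    # X_t = sum_j psi_j W_{t-j}; np.convolve realizes this finite sum
    x = np.convolve(w, psi, mode="valid")

    xc = x - x.mean()
    for h in range(3):
        sample = np.mean(xc[h:] * xc[:len(xc) - h])
        theory = sigma_w**2 * np.sum(psi[:len(psi) - h] * psi[h:])
        print(h, round(sample, 3), round(theory, 3))   # agree closely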
    X_t = \mu + \sum_{j=-\infty}^{\infty} \psi_j W_{t-j}.

Choose

    \psi_j = 1 if j = 0, and 0 otherwise.

Then {X_t} \sim WN(\mu, \sigma_W^2).    (why?)
    X_t = \mu + \sum_{j=-\infty}^{\infty} \psi_j W_{t-j}.

Choose \mu = 0 and

    \psi_j = 1 if j = 0, \theta if j = 1, and 0 otherwise.

Then X_t = W_t + \theta W_{t-1}.    (why?)
    X_t = \mu + \sum_{j=-\infty}^{\infty} \psi_j W_{t-j}.

Choose \mu = 0 and

    \psi_j = \phi^j if j \ge 0, and 0 otherwise.

Then X_t = \sum_{j=0}^{\infty} \phi^j W_{t-j}, the stationary AR(1) process (for |\phi| < 1).    (why?)
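One way to see this numerically (our own illustration): truncate \psi_j = \phi^j at a large J and check that the resulting series satisfies the AR(1) recursion up to a geometrically small truncation error.

    import numpy as np

    rng = np.random.default_rng(5)
    phi, J, n = 0.8, 60, 1_000
    psi = phi ** np.arange(J)              # psi_j = phi^j, j = 0, ..., J-1

    w = rng.normal(size=n + J - 1)
    x = np.convolve(w, psi, mode="valid")  # X_t ~ sum_{j>=0} phi^j W_{t-j}

    # X_t - phi * X_{t-1} should be (nearly) W_t:
    resid = x[1:] - phi * x[:-1]
    print(np.max(np.abs(resid - w[J:])))   # tiny: O(phi^J) truncation error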
Sample ACF

For observations x_1, ..., x_n, the sample autocovariance function is

    \hat{\gamma}(h) = (1/n) \sum_{t=1}^{n-|h|} (x_{t+|h|} - \bar{x})(x_t - \bar{x}),

i.e., the sample covariance of (x_1, x_{h+1}), ..., (x_{n-h}, x_n), except that we normalize by n instead of n - h, and we subtract the full sample mean.
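A minimal Python sketch of these estimators, normalizing by n and subtracting the full sample mean as just described (the function names are ours):

    import numpy as np

    def sample_acvf(x, h):
        # gamma_hat(h) = (1/n) sum_{t=1}^{n-|h|} (x_{t+|h|} - xbar)(x_t - xbar)
        x = np.asarray(x, dtype=float)
        n, h = len(x), abs(h)
        xc = x - x.mean()                      # subtract the full sample mean
        return np.sum(xc[h:] * xc[:n - h]) / n # normalize by n, not n - h

    def sample_acf(x, h):
        # rho_hat(h) = gamma_hat(h) / gamma_hat(0), by analogy with the ACF above
        return sample_acvf(x, h) / sample_acvf(x, 0)

For instance, applied to a long simulated MA(1) series, sample_acf(x, 1) should be close to \theta/(1 + \theta^2).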