STA457 Lecture 4

The lecture covers time series statistical models, focusing on measures of dependence, autocorrelation, and stationarity. Key concepts include the definitions and properties of univariate time series, autocovariance, and the distinction between weak and strong stationarity. Examples of time series models such as white noise, moving averages, and random walks are discussed to illustrate these concepts.

STA457: Time Series Analysis

Lecture 4

Lijia Wang

Department of Statistical Sciences


University of Toronto

Lijia Wang (UofT) STA457: Time Series Analysis 1 / 33


Overview

Last Time:
1 ETS Models
2 White noise
Today:
1 Time series statistical models
2 Autocorrelation
3 Stationarity



Outline

1 Time series statistical models


Measure of dependence
Examples of time series models

2 Autocorrelation
Autocovariance
Autocorrelation function
Other autocorrelation functions

3 Stationarity
Weak stationarity
Strong stationarity
Other forms of stationarity



Review: Time series Data

Univariate Time Series:

A univariate time series is a sequence of measurements of the same variable collected over time. Most often, the measurements are made at regular time intervals.

Data are not necessarily independent and not necessarily identically distributed.
Dependence is important.
The ordering matters.



Measures of Dependence

A complete description of a time series, observed as a collection of n random variables at arbitrary time points t1 , t2 , . . . , tn , for any positive integer n, is provided by the joint distribution function, evaluated as the probability that the values of the series are jointly less than the n constants c1 , c2 , . . . , cn ; i.e.,

Definition:

Ft1 ,t2 ,...,tn (c1 , c2 , . . . , cn ) = P (xt1 ≤ c1 , xt2 ≤ c2 , . . . , xtn ≤ cn )

Unfortunately, these multidimensional distribution functions cannot be investigated easily unless the random variables are jointly normal.



The marginal distributions

Definition: Let Ft (x) be the marginal distribution function of xt , defined by

Ft (x) = P {xt ≤ x} .

The corresponding marginal density function is then defined as

ft (x) = ∂Ft (x)/∂x,

provided that the density exists.



Examples of time series models

Some typical examples of time series models:

White noises
Moving Averages
Autoregressions
Random Walk



White Noise (Review)

Definition: Suppose that {wt , t ∈ Z} is a sequence of independent and identically distributed random variables with mean zero and variance E[wt²] = σw² < ∞. Then

xt = wt

is called a strong white noise series in L2 (abbreviated xt ∼ iid(0, σw²)).

If, in addition, we assume wt ∼ N(0, σw²), then we say xt is a Gaussian white noise, abbreviated xt ∼ iid N(0, σw²).



Random Walk (Review)

Definition: Consider a time series model where the present state xt equals
xt−1 plus some constant and an error term:

xt = δ + xt−1 + εt

This is sometimes called a random walk with drift model. Simple arithmetic yields

xt = δt + x0 + Σ_{i=1}^t εi

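The closed form above is easy to check against the recursion by simulation. A minimal sketch in Python (the drift δ = 0.1, unit noise variance, series length, and seed are illustrative choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)        # illustrative seed
delta, x0, n = 0.1, 0.0, 500          # illustrative drift, start value, length
eps = rng.normal(0.0, 1.0, size=n)

# Recursion: x_t = delta + x_{t-1} + eps_t
x = np.empty(n)
prev = x0
for t in range(n):
    prev = delta + prev + eps[t]
    x[t] = prev

# Closed form from the slide: x_t = delta*t + x_0 + sum_{i=1}^t eps_i
closed = delta * np.arange(1, n + 1) + x0 + np.cumsum(eps)
print(np.allclose(x, closed))         # the two constructions agree
```

The loop and the cumulative sum produce the same path, which is exactly the "simple arithmetic" identity stated above.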


Example: Moving Average

A moving average is a tool to smooth the series. A moving average of order 3 for the white noise process wt can be written as

vt = (wt−1 + wt + wt+1 )/3

A more rigorous definition will be given next week.

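As a quick numerical illustration (the sample size and seed are arbitrary choices, not from the slides), the order-3 moving average can be computed with array slicing, and the smoothing visibly shrinks the variance of the white noise:

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(0.0, 1.0, size=500)    # white noise with sigma_w = 1

# v_t = (w_{t-1} + w_t + w_{t+1}) / 3, defined for interior points only
# (the two endpoints have no neighbour on one side)
v = (w[:-2] + w[1:-1] + w[2:]) / 3

# Averaging reduces the variance: Var(v_t) = sigma_w^2 / 3
print(round(w.var(), 2), round(v.var(), 2))
```

The smoothed series has roughly one third of the white-noise variance, which is why moving averages look much less choppy than the raw noise.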


Example: Autoregressions

The output variable xt of an autoregressive model depends on past values of the series. An example (an AR model of order 2) is

xt = xt−1 − 0.9xt−2 + wt for t = 1, 2, · · · , 500.

A more rigorous definition will be given next week.

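The AR(2) example can be simulated with a simple loop. In this sketch the zero initial values, the burn-in length, and the seed are my own choices (the slides only give the recursion):

```python
import numpy as np

rng = np.random.default_rng(2)
n, burn = 500, 100                    # burn-in washes out the start-up values
w = rng.normal(size=n + burn)

# x_t = x_{t-1} - 0.9 x_{t-2} + w_t
x = np.zeros(n + burn)
for t in range(2, n + burn):
    x[t] = x[t - 1] - 0.9 * x[t - 2] + w[t]
x = x[burn:]                          # keep the last 500 points

# These coefficients produce a quasi-periodic, choppy-looking series
print(x[:5])
```

With these coefficients the recursion is stable, so the simulated path stays bounded and oscillates rather than exploding.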


Mean function

We further define the mean of a time series random variable.

Definition: The mean function is defined as


µxt = E (xt ) = ∫_{−∞}^{∞} x ft (x) dx,

provided it exists, where E denotes the usual expected value operator.


When no confusion exists about which time series we are referring to, we will drop the subscript x and write µxt as µt .



Example: Moving average series

Example: A moving average is a tool to smooth the series. A moving average of order 3 for the white noise process wt can be written as

ϵt = (wt−1 + wt + wt+1 )/3,

where wt ∼ iid N(0, σw²).

Question: Derive the mean of ϵt .


µϵt = E (ϵt ) = (1/3)[E (wt−1 ) + E (wt ) + E (wt+1 )] = 0.



Example: Random Walk series

Example: Consider the following random walk series:

xt = xt−1 + wt ,

with initial condition x0 = 0 and where wt ∼ iid N(0, σw²).

Question: Derive the mean of xt .


µxt = E (xt ) = E (xt−1 ) + E (wt ) = E (x0 ) + Σ_{i=1}^t E (wi ) = 0.



Outline

1 Time series statistical models


Measure of dependence
Examples of time series models

2 Autocorrelation
Autocovariance
Autocorrelation function
Other autocorrelation functions

3 Stationarity
Weak stationarity
Strong stationarity
Other forms of stationarity



Autocovariance

Definition: For a given time series xt , the autocovariance function is


defined as the second-moment product

γx (s, t) = Cov (xs , xt ) = E [(xs − µs ) (xt − µt )] ,


for all s and t.

We may write γx (s, t) as γ(s, t) when no confusion arises.


Note that γx (s, t) = γx (t, s) for all time points s and t.

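A sample analogue of the autocovariance is straightforward to code. The helper name below and the 1/n normalisation are my own conventions (1/n is the usual choice in time series texts, but the slides do not specify an estimator):

```python
import numpy as np

def sample_autocov(x, h):
    """Sample version of gamma(h) = E[(x_{t+h} - mu)(x_t - mu)],
    using a 1/n normalisation (a common convention, not from the slides)."""
    x = np.asarray(x, dtype=float)
    n, mu = len(x), x.mean()
    h = abs(h)                        # gamma(s, t) = gamma(t, s), so gamma(h) = gamma(-h)
    return float(np.sum((x[h:] - mu) * (x[: n - h] - mu)) / n)

rng = np.random.default_rng(3)
w = rng.normal(size=2000)
# For white noise: gamma(0) ~ sigma_w^2 = 1 and gamma(h) ~ 0 for h != 0
print(round(sample_autocov(w, 0), 2), round(sample_autocov(w, 1), 2))
```

On a simulated white noise series the lag-0 value is close to the variance and the lag-1 value is close to zero, matching the definition.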


Autocovariance: properties

The autocovariance has the following properties:

The autocovariance measures the linear dependence between two points on the same series observed at different times.
Very smooth series exhibit autocovariance functions that stay large even when t and s are far apart, whereas choppy series tend to have autocovariance functions that are nearly zero for large separations.
If γx (s, t) = 0, then xs and xt are not linearly related, but there still may be some dependence structure between them. If, however, xs and xt are bivariate normal, γx (s, t) = 0 ensures their independence.



Variance

Definition: The variance function is defined as


var (xt ) = E [(xt − µt )²],

for all t.

Obviously, for s = t, the autocovariance reduces to the (assumed finite) variance, because

γx (t, t) = E [(xt − µt )²] = Var (xt )



Examples

Example: Compute the mean and autocovariance function of a:


1 White noise sequence wt
2 “Moving average sequence” xt = wt + θwt−1
3 The time series xt = cos(2πt) + sin(2πt)wt



Autocorrelation function (ACF)

Definition: The autocorrelation function (ACF) is defined as

ρ(s, t) = γ(s, t) / √(γ(s, s)γ(t, t)).

The ACF measures the linear predictability of the series at time t, say xt ,
using only the value xs .

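The sample ACF follows directly by dividing sample autocovariances by the lag-0 value. A sketch, checked on an MA(1) series where the lag-1 autocorrelation should be near θ/(1 + θ²); the helper name, θ = 0.6, and the seed are my own choices:

```python
import numpy as np

def sample_acf(x, max_lag):
    """rho(h) = gamma(h) / gamma(0) for h = 0, ..., max_lag."""
    x = np.asarray(x, dtype=float)
    n, mu = len(x), x.mean()
    gamma0 = np.sum((x - mu) ** 2) / n
    return np.array([np.sum((x[h:] - mu) * (x[: n - h] - mu)) / n / gamma0
                     for h in range(max_lag + 1)])

rng = np.random.default_rng(4)
w = rng.normal(size=2000)
x = w[1:] + 0.6 * w[:-1]              # MA(1): rho(1) = 0.6 / (1 + 0.6^2) ~ 0.44
acf = sample_acf(x, 3)
print(np.round(acf, 2))
```

By construction ρ(0) = 1, the lag-1 estimate lands near the theoretical 0.44, and higher lags are near zero, which is the MA(1) signature.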


Autocorrelation function (ACF): properties

The ACF has the following properties:

1 −1 ≤ ρ(s, t) ≤ 1
2 Suppose xt = β0 + β1 xs .
1 If β1 > 0, then ρ(s, t) = 1;
2 If β1 < 0, then ρ(s, t) = −1.



The cross-covariance function

Definition: The cross-covariance function between two series, xt and yt , is

γxy (s, t) = cov (xs , yt ) = E [(xs − µxs ) (yt − µyt )] .

There is also a scaled version of the cross-covariance function, called the cross-correlation function (CCF), which is given by

ρxy (s, t) = γxy (s, t) / √(γx (s, s)γy (t, t)).



Outline

1 Time series statistical models


Measure of dependence
Examples of time series models

2 Autocorrelation
Autocovariance
Autocorrelation function
Other autocorrelation functions

3 Stationarity
Weak stationarity
Strong stationarity
Other forms of stationarity



Weak Stationarity

Definition:
A weakly stationary time series, xt , is a finite variance process such that
(i) the mean value function, µt , is constant and does not depend on time
t, and
(ii) the autocovariance function, γ(s, t), depends on s and t only through
their difference |s − t|.
We will use the term stationary to mean weakly stationary.



Weak Stationarity: Notations

Note that for a stationary process:


The mean function, E (xt ) = µt , of a stationary time series is independent of time t, so we write µt = µ.
The autocovariance function, γ(s, t), of a stationary time series, xt , depends on s and t only through their difference |s − t|, so we may simplify the notation. Let s = t + h, where h represents the time shift or lag. We have

γ(t + h, t) = Cov (xt+h , xt ) = Cov (xh , x0 ) = γ(h, 0) = γ(h)



Weak Stationarity: Cont.

Definition: The autocovariance function of a stationary time series will be


written as

γ(h) = cov (xt+h , xt ) = E [(xt+h − µ) (xt − µ)] .

Definition: The autocorrelation function (ACF) of a stationary time series


will be written as
ρ(h) = γ(t + h, t) / √(γ(t + h, t + h)γ(t, t)) = γ(h)/γ(0).



Weak Stationarity: properties

For a stationary series, we have


0 ≤ var (a1 x1 + · · · + an xn ) = Σ_{j=1}^n Σ_{k=1}^n aj ak γ(j − k)
|γ(h)| ≤ γ(0)
γ(h) = γ(−h).



Additional properties of Stationarity

Properties and additional terminology:


The property that the mean is time-invariant is also referred to as
first order stationarity or mean stationarity.
First order stationarity along with the property that the
autocovariance is a function of only the lag is also referred to as
second order stationarity.
In this course, unless stated otherwise, we use the word "stationary" to mean "weakly stationary" or "second order stationary".



Stationary Examples

Examples:
1 The white noise wt is stationary
2 The moving average of a white noise series,
xt = wt + θwt−1 , θ ∈ R, is stationary.
3 A random walk series is NOT stationary
1 Though a random walk with mean zero increments is first-order
stationary, it is not second-order stationary.
2 A random walk with drift is neither first-order nor second-order
stationary.
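The failure of second-order stationarity for the random walk is easy to see by simulation: since Var(xt ) = tσw² grows with t, the marginal variance depends on the time index. The number of paths, the horizon, and the seed below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(5)
# 2000 independent driftless random walks of length 400, with x_0 = 0
paths = rng.normal(size=(2000, 400)).cumsum(axis=1)

# Var(x_t) = t * sigma_w^2, so the variance depends on t: not stationary
v100, v400 = paths[:, 99].var(), paths[:, 399].var()
print(round(v100), round(v400))       # roughly 100 and 400
```

The cross-sectional variance at t = 400 is about four times the variance at t = 100, exactly the linear growth that rules out weak stationarity.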



Strong Stationarity

Definition: We say that the time series xt is strictly stationary or


strongly stationary if its finite dimensional distributions are
shift-invariant. Namely, the probability distribution of

{xt1 , xt2 , . . . , xtk }

is identical to that of the time shifted set

{xt1 +h , xt2 +h , . . . , xtk +h } ,

That is,

P {xt1 ≤ c1 , . . . , xtk ≤ ck } = P {xt1 +h ≤ c1 , . . . , xtk +h ≤ ck }

for all k = 1, 2, . . ., all time points t1 , t2 , . . . , tk , all numbers c1 , c2 , . . . , ck ,


and all time shifts h = 0, ±1, ±2, . . ..



Weak vs. Strong Stationarity

Relationships between weak and strong stationarity:


1 xt strictly stationary ⇏ xt weakly stationary (e.g., an iid Cauchy series is strictly stationary but has infinite variance).
2 xt strictly stationary and E[xt2 ] < ∞ ⇒ xt weakly stationary.
3 xt weakly stationary and xt a Gaussian process ⇒ xt strictly
stationary.
4 If xt ∈ L2 for all t ∈ Z, then xt not weakly stationary ⇒ xt is not
strictly stationary.



Joint Stationarity

Definition: Two time series, say, xt and yt , are said to be jointly


stationary if they are each stationary, and the cross-covariance function

γxy (h) = cov (xt+h , yt ) = E [(xt+h − µx ) (yt − µy )]

is a function only of lag h.

Definition: The cross-correlation function (CCF) of jointly stationary


time series xt and yt is defined as

ρxy (h) = γxy (h) / √(γx (0)γy (0))

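A sample CCF can be computed the same way as the sample ACF, with one series shifted against the other. The helper below is a sketch under my own conventions (1/n normalisation, hypothetical function name); the test series is built so that yt = xt+1 , which makes the cross-correlation peak at lag h = 1:

```python
import numpy as np

def sample_ccf(x, y, h):
    """Sample rho_xy(h) = gamma_xy(h) / sqrt(gamma_x(0) * gamma_y(0)),
    where gamma_xy(h) = E[(x_{t+h} - mu_x)(y_t - mu_y)]."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n, mux, muy = len(x), x.mean(), y.mean()
    if h >= 0:
        g = np.sum((x[h:] - mux) * (y[: n - h] - muy)) / n
    else:
        g = np.sum((x[: n + h] - mux) * (y[-h:] - muy)) / n
    gx0 = np.sum((x - mux) ** 2) / n
    gy0 = np.sum((y - muy) ** 2) / n
    return float(g / np.sqrt(gx0 * gy0))

rng = np.random.default_rng(6)
w = rng.normal(size=2000)
x, y = w[:-1], w[1:]                  # y_t = x_{t+1}: y leads x by one step
print(round(sample_ccf(x, y, 1), 2), round(sample_ccf(x, y, 0), 2))
```

The lag-1 cross-correlation is essentially 1 while the other lags are near zero, so the CCF correctly identifies the one-step lead relationship between the two series.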


Trend Stationarity

For time series xt = α + βt + yt , where yt is stationary:


The mean function is µx,t = E (xt ) = α + βt + µy , which is not
independent of time.
Therefore, the process is not stationary.
The autocovariance function, however, is independent of time, because

γx (h) = cov (xt+h , xt ) = E [(xt+h − µx,t+h ) (xt − µx,t )] = E [(yt+h − µy ) (yt − µy )] = γy (h).
Thus, the model may be considered as having stationary behavior around a
linear trend; this behavior is sometimes called trend stationarity.
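One practical consequence of trend stationarity is that fitting and removing the linear trend recovers a stationary series. A sketch using ordinary least squares, where the coefficients α = 2, β = 0.1 and the MA(1) noise are illustrative choices of mine:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
t = np.arange(n)
w = rng.normal(size=n + 1)
y = w[1:] + 0.5 * w[:-1]              # stationary MA(1) noise y_t
x = 2.0 + 0.1 * t + y                 # x_t = alpha + beta*t + y_t

# OLS fit of the linear trend; the residuals estimate the stationary part y_t
beta_hat, alpha_hat = np.polyfit(t, x, 1)
resid = x - (alpha_hat + beta_hat * t)
print(round(beta_hat, 3))             # close to the true slope 0.1
```

The estimated slope recovers β closely, and the detrended residuals behave like the stationary yt , which is exactly the "stationary behavior around a linear trend" described above.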

