
Time Series Analysis - Part 1

Dr. Esam Mahdi
Islamic University of Gaza - Department of Mathematics
April 19, 2017

Outline:
Introduction to time series
First and second order properties
Stationary and non-stationary models
Autoregressive Models
List of Some Useful R Functions and Homework

What is a Time Series?

A collection of data {Xt } recorded over a period of time
(daily, weekly, monthly, quarterly, yearly, etc.), analyzed to
understand the past in order to predict the future (forecast),
helping managers and policy makers make well-informed and
sound decisions.
An important feature of most time series is that observations
close together in time tend to be correlated (serially
dependent).
Time can be discrete: t = 1, 2, ..., or continuous: t > 0.


The Nature of Time Series Data - Example 1


Quarterly earnings per share for 1960Q1 to 1980Q4 of the U.S.
company, Johnson & Johnson, Inc. (Upward trend).
[Figure: time series plot of the quarterly earnings per share vs. Time, 1960-1980.]


The Nature of Time Series Data - Example 2


A 0.1-second (1000-point) sample of recorded speech for the
phrase "aaa...hhh". (Regular repetition of small wavelets.)

[Figure: time series plot of the speech record vs. Time.]


The Nature of Time Series Data - Example 3


Annual numbers of lynx trappings on the McKenzie River in the
Northwest Territories of Canada over the years 1821-1934. (Aperiodic
cycles of approximately 10 years.)

[Figure: time series plot of the annual lynx trappings vs. Time.]

The Nature of Time Series Data - Example 4


Monthly Airline Passenger Numbers, 1949-1960. (Seasonality
appears to increase with the general trend.)

[Figure: time series plot of the monthly AirPassengers counts vs. Time.]


The Nature of Time Series Data - Example 5


Returns of the New York Stock Exchange (NYSE) from February
2, 1984 to December 31, 1991. (The average return is approximately
zero; however, the volatility (variability) of the data changes over time.)

[Figure: time series plot of the NYSE returns vs. Time.]


Components of a Time Series

In general, a time series can be decomposed into 4 components:
Secular Trend, Seasonal Variation, Cyclical Variation, and
Irregular Variation, which can be modelled deterministically with
mathematical functions of time (see the next slide).
Secular (General) Trend: The smooth long-term direction (upward or
downward) of a time series.
Seasonal Variation: Patterns of change in a time series within
a year which tend to repeat each year.
Cyclical Variation: The rise and fall of a time series over
periods longer than one year.
Irregular Variation: Random movements that follow no regularity
in their occurrence pattern.


Three Types of Time Series Decomposition


1 The additive decomposition model:
Xt = Tt + St + Ct + It ,
where
Xt = original data at time t,
Tt = trend value at time t,
St = seasonal fluctuation at time t,
Ct = cyclical fluctuation at time t,
It = irregular variation at time t.
2 The multiplicative decomposition model:
Xt = Tt × St × Ct × It
3 The mixed decomposition model: for example, if Tt and
Ct are correlated with each other but independent of
St and It , the model will be Xt = Tt × Ct + St + It

Secular (General) Trend: Sample Chart


The increase or decrease in the movements of a time series that
does not appear to be periodic is known as a trend.


Seasonal Variation: Sample Chart


Short-term fluctuations in a time series which occur periodically within
a year. These continue to repeat year after year, although the term is
applied more generally to repeating patterns within any fixed
period, such as restaurant bookings on different days of the week.


Cyclical Variation: Sample Chart


Recurrent upward or downward movements in a time series where
the period of the cycle is greater than a year. Unlike seasonal
variation, these movements are not regular.


Time Series Decomposition in R


In R, the function decompose() estimates trend, seasonal, and
irregular effects using the Moving Averages method (MA). In this
case, the series is decomposed into 3 components: a trend-cycle
component, a seasonal component, and an irregular component.
1 The additive decomposition model:

Xt = Tt + St + It

2 The multiplicative decomposition model:

Xt = Tt × St × It

where Tt is the trend-cycle component (containing both


trend and cycle components).
Continue to next slide ...

The Function decompose() in R - Example


Atmospheric concentrations of CO2 (monthly from 1959 to 1997).
R> D <- decompose(co2)
R> season.term <- D$figure; trend.term <- D$trend
R> random.term <- D$random; plot(D)
[Figure: output of plot(D), titled "Decomposition of additive time series", showing the observed, trend, seasonal, and random components vs. Time.]


Procedure for Decomposition


1 Decompose the series into its components (say the multiplicative
model: Xt = Tt × St × Ct × It ).
2 Create the centered moving averages series (MAt = Tt × Ct ).
3 Deseasonalize the series (deseasonalizing means: remove the
seasonal effects when a seasonal pattern of period m exists).
This is done as follows:
Find the ratio Xt /MAt = St × It .
Take the average of each corresponding season to eliminate
the irregular term (randomness) and get the seasonal indices.
Normalize the seasonal indices to get SIt (divide each seasonal
index's value by the sum of all indices' values and then
multiply by m) to make sure they average to 1 (sum = m).
Find the ratio Xt /SIt (this is the deseasonalized series Xt* ;
see the R sketch after this list).
4 Estimate the trend for the deseasonalized series.
5 Forecast future values of each component (X̂t* ).
6 Reseasonalize the predictions (ResP = SIt × X̂t* ).
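A minimal R sketch of steps 2-3 for a quarterly series (m = 4), assuming a
multiplicative model; the data are the quarterly revenues used in the
regression example later, and the object names are illustrative:

R> x <- ts(c(1026,1056,1182,2861,1172,1249,1346,3402,1286,1317,
+ 1449,3893,1462,1452,1631,4200), frequency=4)
R> ma <- filter(x, c(0.5,1,1,1,0.5)/4)   # centered MA of span 4 (Tt x Ct)
R> ratio <- x/ma                         # St x It
R> raw.si <- tapply(ratio, cycle(x), mean, na.rm=TRUE) # average per quarter
R> si <- 4*raw.si/sum(raw.si)            # normalize: indices sum to m = 4
R> deseason <- x/si[cycle(x)]            # the deseasonalized series Xt*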

Estimating the Seasonality: Moving Averages Method


The Moving Averages method (MA) is used to smooth out the
short-term fluctuations and identify the long-term trend in the
original series (i.e., to create a smooth series in order to reduce
the random fluctuations in the original series and estimate
the trend-cycle component). The moving average of length
m ∈ Z is obtained by taking the average of the initial subset
of m consecutive observations (usually m denotes the seasonal
pattern of m periods). Then the subset is modified by
"shifting forward"; that is, excluding the first observation of
the series and including the next observation following the
initial subset in the series. This creates a new subset of m
consecutive observations, which is averaged. This shifting
forward is repeated until the end of the series, producing
the moving averages of span m.
Continue to next slide ...

Centered Moving Averages (Centered MA)


If m is odd, the MA is termed a centered moving average. For
example, for the observations a, b, c, d, e, f, g, the
moving averages with span 3 (or length 3) are
{(a+b+c)/3, (b+c+d)/3, (c+d+e)/3, (d+e+f)/3, (e+f+g)/3}.
If m is even, then apply a moving average of span 2 to the
series created by the moving average of even span m
to get the centered moving averages series. For example, the
moving averages with span 4 are
m1 = (a+b+c+d)/4, m2 = (b+c+d+e)/4,
m3 = (c+d+e+f)/4, m4 = (d+e+f+g)/4,
so that the centered moving averages are
{(m1+m2)/2, (m2+m3)/2, (m3+m4)/2}.
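These centered averages can also be produced with stats::filter(); a small
sketch on toy numbers (purely illustrative):

R> obs <- c(10,14,11,21,11,16,10)        # stands in for a,...,g
R> filter(obs, rep(1/3,3))               # span-3 moving average
R> filter(obs, c(0.5,1,1,1,0.5)/4)       # centered MA for even span m = 4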

Example: Moving Averages Method

Time  Xt  three-years moving averages  four-years moving averages (four-years | two-years centered)
1986 10 - - -
1987 14 (10 + 14 + 11) ÷ 3 = 11.7 - -
(10 + 14 + 11 + 21) ÷ 4 = 14.00
1988 11 (14 + 11 + 21) ÷ 3 = 15.3 (14.0 + 14.25) ÷ 2 = 14.125
(14 + 11 + 21 + 11) ÷ 4 = 14.25
1989 21 (11 + 21 + 11) ÷ 3 = 14.3 (14.25 + 14.75) ÷ 2 = 14.500
(11 + 21 + 11 + 16) ÷ 4 = 14.75
1990 11 (21 + 11 + 16) ÷ 3 = 16.0 (14.75 + 14.50) ÷ 2 = 14.625
(21 + 11 + 16 + 10) ÷ 4 = 14.50
1991 16 (11 + 16 + 10) ÷ 3 = 12.3 (14.50 + 14.75) ÷ 2 = 14.625
(11 + 16 + 10 + 22) ÷ 4 = 14.75
1992 10 (16 + 10 + 22) ÷ 3 = 16.0 (14.75 + 15.50) ÷ 2 = 15.125
(16 + 10 + 22 + 14) ÷ 4 = 15.50
1993 22 (10 + 22 + 14) ÷ 3 = 15.3 (15.50 + 16.00) ÷ 2 = 15.750
(10 + 22 + 14 + 18) ÷ 4 = 16.00
1994 14 (22 + 14 + 18) ÷ 3 = 18.0 (16.00 + 16.75) ÷ 2 = 16.375
(22 + 14 + 18 + 13) ÷ 4 = 16.75
1995 18 (14 + 18 + 13) ÷ 3 = 15.0 (16.75 + 16.75) ÷ 2 = 16.750
(14 + 18 + 13 + 22) ÷ 4 = 16.75
1996 13 (18 + 13 + 22) ÷ 3 = 17.7 (16.75 + 16.50) ÷ 2 = 16.625
(18 + 13 + 22 + 13) ÷ 4 = 16.50
1997 22 (13 + 22 + 13) ÷ 3 = 16.0 - -
1998 13 - - -


R-code: The Centered Moving Averages Series

R> x <- c(10,14,11,21,11,16,10,22,14,18,13,22,13)
R> y <- numeric(); z <- numeric(); w <- numeric()
R> ## Compute three-years moving averages (say: y)
R> for (i in 1:(length(x)-2)){y[i] <- sum(x[i:(i+2)])/3}
R> ## Compute four-years moving averages (say: z)
R> for (i in 1:(length(x)-3)){z[i] <- sum(x[i:(i+3)])/4}
R> ## Compute two-years moving averages of z (say: w)
R> for (i in 1:(length(z)-1)){w[i] <- sum(z[i:(i+1)])/2}
R> round(y,2); round(z,2); round(w,2)
 [1] 11.67 15.33 14.33 16.00 12.33 16.00 15.33 18.00 15.00 17.67 16.00
 [1] 14.00 14.25 14.75 14.50 14.75 15.50 16.00 16.75 16.75 16.50
 [1] 14.12 14.50 14.62 14.62 15.12 15.75 16.38 16.75 16.62

R> plot(ts(x,start=1986),ylab="original series and MA")
R> points(ts(y,start=1987),pch=1,type="o",col="red")
R> points(ts(w,start=1988),pch=8,type="b",col="blue")
R> legend(x="bottomright",c("original series","MA(m=3)",
+ "MA (m=4)"),col=c(1,2,4),text.col="green4",
+ pch=c(46,1,8),lty=c(1,19,15),merge=TRUE,bg='gray95')
R> title(main="Smoothing curve: centered moving averages")

[Figure: "Smoothing curve: centered moving averages" - the original series with the MA(m=3) and centered MA(m=4) curves overlaid, 1986-1998.]


Example - Deseasonalizing Using Moving Averages Method


Example: Computing Moving Averages


Example: Centered moving averages


Example: Computing Seasonal Ratios


Example: Calculating Raw Seasonal Indices


Example: Normalizing Seasonal Indices (Make the Sum of SI = 4)

4 × 0.7135 / (0.7135 + 0.7077 + 0.7406 + 1.844) = 0.7124.
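The same normalization in R (a sketch; the raw indices are those shown above):

R> raw.si <- c(0.7135, 0.7077, 0.7406, 1.8440)
R> si <- 4*raw.si/sum(raw.si)   # rescale so the four indices sum to 4
R> round(si, 4)                 # ~ 0.7124 0.7066 0.7394 1.8415, up to rounding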


Example: Deseasonalizing Raw Data


Estimating the Trend


In the absence of the seasonal effect, the time series model may be
written in the Wold representation:

Xt = µt + εt ,

where µt = Tt is a deterministic component (trend), εt = It is an
independent and identically distributed (i.i.d.) stochastic
component, µt and εt are uncorrelated, and µt can be modeled
in many ways:
Linear: µt = a + bt.
Quadratic: µt = a + bt + ct².
There are two methods that can be used to estimate µt :
Method 1: Least squares method.
Method 2: Smoothing by means of moving averages.

Estimating the Trend - Least Squares Method

The long-term trend of many time series often approximates a
straight line. If Tt is the dependent variable and t is the
independent variable (representing the time index), the least
squares fitted line (simple linear regression) is:

T̂t = β̂0 + β̂1 t,

where

β̂1 = ( ∑_{i=1}^{n} ti Ti − n t̄ T̄ ) / ( ∑_{i=1}^{n} ti² − n t̄² ),
β̂0 = T̄ − β̂1 t̄.

A residual is the difference between the true and predicted
values:
êt = Tt − T̂t .

Least Squares Method - Root Mean Square Error

Root Mean Square Error is a goodness-of-fit measure defined as
the square root of the mean of the squared residuals:

RMSE = √( (1/n) ∑_{t=1}^{n} êt² )

Year Qtr Revenue   Xt*      t    tXt*      t²
1992  1  1026.00   1440.2   1    1440.2    1
1992  2  1056.00   1494.4   2    2988.8    4
1992  3  1182.00   1598.5   3    4795.5    9
1992  4  2861.00   1553.6   4    6214.4   16
1993  1  1172.00   1645.1   5    8225.5   25
1993  2  1249.00   1767.6   6   10605.6   36
1993  3  1346.00   1820.3   7   12742.1   49
1993  4  3402.00   1847.4   8   14779.2   64
1994  1  1286.00   1805.1   9   16245.9   81
1994  2  1317.00   1863.8  10   18638.0  100
1994  3  1449.00   1959.6  11   21555.6  121
1994  4  3893.00   2114.0  12   25368.0  144
1995  1  1462.00   2052.2  13   26678.6  169
1995  2  1452.00   2054.9  14   28768.6  196
1995  3  1631.00   2205.7  15   33085.5  225
1995  4  4200.00   2280.7  16   36491.2  256
1996  1  1776.25   2493.3  17   42386.1  289
1996  2  1808.25   2559.0  18   46062.0  324
1996  3  1941.75   2626.0  19   49894.0  361
1996  4  4128.75   2242.0  20   44840.0  400
Sum   -  -        39423.4  210 451804.8 2870

β̂1 = ( 451804.8 − 20(210/20)(39423.4/20) ) / ( 2870 − 20(210/20)² ) = 56.93,

β̂0 = 39423.4/20 − 56.931(210/20) = 1373.39,

giving the regression line fitted to the deseasonalized data:

X̂t* = 1373.39 + 56.93t



R> t <- 1:20
R> x<-c(1440.2,1494.4,1598.5,1553.6,1645.1,1767.6,1820.3,
+ 1847.4,1805.1,1863.8,1959.6,2114.0,2052.2,2054.9,
+ 2205.7,2280.7,2493.3,2559.0,2626.0,2242.0)
R> trend.line <- lm(x~t)
R> trend.line
Call:
lm(formula = x ~ t)

Coefficients:
(Intercept) t
1373.39 56.93


Regression Line for Deseasonalized Data


[Figure: scatter plot of the deseasonalized revenue against time with the fitted line.]

X̂t* = 1373.39 + 56.93t


Year  t  Revenue (Tt)  Seasonal Index (SI)  Forecast (X̂t*)  ResP = SI × X̂t*  Square Error (Tt − ResP)²
1992 1 1026.00 0.7124 1430.3 1018.983 97.005
1992 2 1056.00 0.7066 1487.3 1050.927 51.550
1992 3 1182.00 0.7394 1544.2 1141.832 2950.947
1992 4 2861.00 1.8415 1601.1 2948.503 2257.815
1993 5 1172.00 0.7124 1658.0 1181.217 167.376
1993 6 1249.00 0.7066 1715.0 1211.841 2765.401
1993 7 1346.00 0.7394 1771.9 1310.219 2341.483
1993 8 3402.00 1.8415 1828.8 3367.862 343.651
1994 9 1286.00 0.7124 1885.8 1343.450 6503.070
1994 10 1317.00 0.7066 1942.7 1372.755 6225.828
1994 11 1449.00 0.7394 1999.6 1478.607 1603.193
1994 12 3893.00 1.8415 2056.6 3787.221 3299.437
1995 13 1462.00 0.7124 2113.5 1505.684 3759.864
1995 14 1452.00 0.7066 2170.4 1533.669 13358.150
1995 15 1631.00 0.7394 2227.4 1646.995 467.893
1995 16 4200.00 1.8415 2284.3 4206.581 12.770
1996 17 1776.25 0.7124 2341.2 1667.917 23123.700
1996 18 1808.25 0.7066 2398.2 1694.584 25875.590
1996 19 1941.75 0.7394 2455.1 1815.382 29205.720
1996 20 4128.75 1.8415 2512.0 4625.940 72893.410
Sum - - - - - 197303.852

X̂t* = 1373.39 + 56.93t

RMSE = √(197303.852/20) = 99.32
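A short R sketch of this computation (the seasonal indices and revenues are
those tabulated above; object names are illustrative):

R> revenue <- c(1026,1056,1182,2861,1172,1249,1346,3402,1286,1317,
+ 1449,3893,1462,1452,1631,4200,1776.25,1808.25,1941.75,4128.75)
R> si <- rep(c(0.7124,0.7066,0.7394,1.8415), times=5) # quarterly indices
R> t <- 1:20
R> forecast <- 1373.39 + 56.93*t          # trend for the deseasonalized data
R> resp <- si*forecast                    # reseasonalized predictions
R> rmse <- sqrt(mean((revenue - resp)^2)) # root mean square error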

Plot the Forecast - 2 Years (8 Quarters) Ahead


[Figure: Toys R US quarterly revenue, 1992-1996, with the reseasonalized forecasts extended 2 years (8 quarters) ahead to 1998.]


Differencing to Detrend and Deseasonalize a Series

Sometimes the first difference operator, ∆Xt = Xt − Xt−1 , is
used to remove the trend of the series (as we will discuss
later).
e.g., if Xt = β0 + β1 t then Xt−1 = β0 + β1 (t − 1),
so that ∆Xt = β1 (a constant), as the sketch below illustrates.
Similarly, a seasonal component with a span period m ∈ Z+ in
the time series can be removed by differencing the series at
lag m; that is, ∆m Xt = Xt − Xt−m (as we will discuss later).
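A quick R illustration on a toy linear series (assuming β0 = 3, β1 = 2):

R> x <- 3 + 2*(1:10)   # purely linear series Xt = 3 + 2t
R> diff(x)             # the first differences equal the slope β1 = 2
[1] 2 2 2 2 2 2 2 2 2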


Eliminating the Cycle Component (Multiplicative Models)

After creating the moving averages (MAt = Tt Ct ) and estimating
the trend (Tt ) series, we can estimate the cycle component from
the following equation:

Ct = MAt / Tt .


Forecasts Using Exponential Smoothing Method


Definition
For a sequence of observations {X1 , X2 , ...} (assuming there are no
systematic trend or seasonal effects), the exponential smoothing
series {X̂t : t ≥ 1} is given by the formula

X̂t+1 = X̂t + α(Xt − X̂t ),

or equivalently,
X̂t+1 = αXt + (1 − α)X̂t ,
where 0 < α < 1 is the smoothing parameter (the resulting series is
an exponentially weighted moving average).

Xt is the actual observed value at time t.
X̂t+1 is the one-step-ahead forecast value, where X̂1 = X1 .
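A minimal R sketch of this recursion (the function name ewma is illustrative):

R> ewma <- function(x, alpha) {
+   xhat <- numeric(length(x))
+   xhat[1] <- x[1]                  # initialize with the first observation
+   for (t in 1:(length(x)-1))
+     xhat[t+1] <- alpha*x[t] + (1-alpha)*xhat[t]
+   xhat                             # the one-step-ahead forecasts
+ }
R> ewma(c(39,44,40,45,38,43,39), alpha=0.2)  # reproduces the worked example below
[1] 39.00 39.00 40.00 40.00 41.00 40.40 40.92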

Exponential Smoothing Weighted MA (EWMA)


Whereas the smoothing based on the moving averages
method puts equal weights on the past observations,
exponential smoothing method assign exponentially decreasing
weights over time.
The value of α determines the amount of smoothing.
A value of α near 1 gives a little smoothing for estimating the
mean level and X̂t+1 ≈ Xt .
A value of α near 0 gives highly smoothed estimates of the
mean level and takes little account of the most recent
observation.
A typical value for α is 0.2.
In practice, we determine the value of α for which the mean
squared error, MSE = (1/(n−1)) ∑_{t=2}^{n} (Xt − X̂t )², is minimized
(averaging over the n − 1 one-step-ahead errors).
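A hedged sketch of this selection by grid search, reusing the ewma() helper
sketched earlier:

R> mse <- function(alpha, x) {
+   xhat <- ewma(x, alpha)
+   mean((x[-1] - xhat[-1])^2)       # average the n-1 one-step errors
+ }
R> x <- c(39,44,40,45,38,43,39)
R> alphas <- seq(0.05, 0.95, by=0.05)
R> alphas[which.min(sapply(alphas, mse, x=x))]  # alpha with the smallest MSE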

Example - Forecasts Using Exponential Smoothing

For the following daily observations, use the exponential smoothing
method with two different smoothing parameters, α = 0.2 and α = 0.4,
to forecast 1 day ahead. What is the optimal value of α?

Day Sales (Xt )


1 39
2 44
3 40
4 45
5 38
6 43
7 39


Example - Forecasts Using Exponential Smoothing

Day Sales α = 0.2 α = 0.4


t  Xt  Forecast (X̂t)  Square errors (e²)  Forecast (X̂t)  Square errors (e²)
1 39 39 - 39 -
2 44 39 25 39 25
3 40 40 0 41 1
4 45 40 25 40.6 19.36
5 38 41 9 42.36 19.0096
6 43 40.4 6.76 40.616 5.6834
7 39 40.92 3.69 41.5696 6.6028

For α = 0.2: X̂t+1 = 0.2Xt + 0.8X̂t . For α = 0.4: X̂t+1 = 0.4Xt + 0.6X̂t .
X̂1 = X1 = 39 X̂1 = X1 = 39
X̂2 = 0.2(39) + 0.8(39) = 39 X̂2 = 0.4(39) + 0.6(39) = 39
X̂3 = 0.2(44) + 0.8(39) = 40 X̂3 = 0.4(44) + 0.6(39) = 41
X̂4 = 0.2(40) + 0.8(40) = 40 X̂4 = 0.4(40) + 0.6(41) = 40.6
X̂5 = 0.2(45) + 0.8(40) = 41 X̂5 = 0.4(45) + 0.6(40.6) = 42.36
X̂6 = 0.2(38) + 0.8(41) = 40.4 X̂6 = 0.4(38) + 0.6(42.36) = 40.616
X̂7 = 0.2(43) + 0.8(40.4) = 40.92 X̂7 = 0.4(43) + 0.6(40.616) = 41.5696
X̂8 = 0.2(39) + 0.8(40.92) = 40.54 X̂8 = 0.4(39) + 0.6(41.5696) = 40.5418
MSE(α = 0.2) = 11.58 and MSE(α = 0.4) = 12.78, so α = 0.2 is optimal.


Exponential Smoothing in R

HoltWinters(x, alpha = NULL, beta = NULL, gamma =
NULL, seasonal = c("additive", "multiplicative"),
start.periods = 2, l.start = NULL, b.start = NULL,
s.start = NULL, optim.start = c(alpha = 0.3, beta =
0.1, gamma = 0.1), optim.control = list())
Exponential smoothing is a special case of the Holt-Winters
algorithm, implemented in the R function HoltWinters()
with the additional parameters beta and gamma set to
FALSE.
If we do not specify a value for alpha, the function
HoltWinters() will estimate the value that minimises the
MSE.


Example - Forecasts Using Exponential Smoothing in R

R> x <- c(27,34,31,24,18,19,17,12,26,14,18,33,31,31,19,
+ 17,10,25,26,18,18,10,4,20,28,21,18,23,19,16)
R> y <- ts(x)
R> ## Consider alpha=0.2
R> out1 <- HoltWinters(y,alpha=0.2,beta=FALSE,gamma=FALSE)
R> #out1$fitted ## Output 1st column is the smoothing series
R> ## Let R estimate alpha
R> out2 <- HoltWinters(y,beta=FALSE,gamma=FALSE)
R> ## The following code plots the original series and EWMA
R> plot(y)
R> lines(out1$fitted[,1],col="red")  # alpha=0.2
R> lines(out2$fitted[,1],col="blue") # estimated alpha

Example - Plot the Forecasts with Confidence Intervals

R> plot(out1,xlim=c(2,38))
R> pred <- predict(out1,n.ahead=6,prediction.interval=T)
R> lines(pred[,1],col="blue")
R> lines(pred[,2],col="green")
R> lines(pred[,3],col="green")
[Figure: "Holt-Winters filtering" plot of the Observed/Fitted values vs. Time, with the 6-step-ahead forecasts (blue) and prediction intervals (green). Caption: Exponentially weighted moving average.]



The Mean and the Autocovariance Functions


Definition
The mean function of a time series {Xt } is defined to be
µt = E(Xt ), whereas the variance function is defined to be
Var (Xt ) = E[(Xt − µt )²].

The function µt specifies the first order properties of the time
series.

Definition
The autocovariance function of a time series {Xt } is defined to be
γX (s, t) = Cov (Xs , Xt ) = E[(Xs − µs )(Xt − µt )], for any two time
points t and s.

The function γX (s, t) specifies the second order properties of the
time series.

Remarks

If h = t − s, the parameter γX (h) is called the h-th order or lag-h
autocovariance of {Xt }. Thus, γX (0) = Var (Xt ).
The difference between two moments in time is called the lag.
∀h, γX (h) = Cov (Xt , Xt+h ) = Cov (Xt , Xt−h ) = γX (−h).
∀h, the autocorrelation ρX (h) = ρX (−h).
If the ai 's are constants and the Xi 's and Yj 's are random
variables, i = 1, . . . , n, j = 1, . . . , m, then

Var ( ∑_{i=1}^{n} ai Xi ) = ∑_{i=1}^{n} ai² Var (Xi ) + 2 ∑∑_{i<j} ai aj Cov (Xi , Xj ),

Cov ( ∑_{i=1}^{n} Xi , ∑_{j=1}^{m} Yj ) = ∑_{i=1}^{n} ∑_{j=1}^{m} Cov (Xi , Yj ).


The Plot of the Autocovariance and Autocorrelation (Correlogram) Functions
The plot of γ(h) against the lag h values is called the
autocovariance function (ACVF ).
The plot of ρ(h) against the lag h values is called the
autocorrelation function (ACF ).
Note that ρ(0) = 1 and −1 ≤ ρ(h) ≤ 1 for all h.
[Figure: side-by-side plots of an autocovariance function (ACVF) and an autocorrelation function (ACF) against the lag.]


Sample Autocovariance and Autocorrelation


The lag-h sample autocovariance is defined as

γ̂X (h) = (1/n) ∑_{t=h+1}^{n} (xt − x̄)(xt−h − x̄),

or equivalently,

γ̂X (h) = (1/n) ∑_{t=1}^{n−h} (xt − x̄)(xt+h − x̄),

where x̄ = (1/n) ∑_{t=1}^{n} xt .
The lag-h sample autocorrelation is defined as

−1 ≤ ρ̂X (h) = γ̂X (h)/γ̂X (0) = ∑_{t=1}^{n−h} (xt − x̄)(xt+h − x̄) / ∑_{t=1}^{n} (xt − x̄)² ≤ 1
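A small R sketch of these formulas, checked against the values that acf()
reports in the example below (the helper name gamma.hat is illustrative):

R> gamma.hat <- function(x, h) {
+   n <- length(x); xbar <- mean(x)
+   sum((x[(h+1):n] - xbar)*(x[1:(n-h)] - xbar))/n  # note: divide by n
+ }
R> x <- c(2,1,3,5,1,6,2,6,2,4,3,7,8,1)
R> gamma.hat(x, 1)                  # -1.152, the lag-1 sample autocovariance
R> gamma.hat(x, 1)/gamma.hat(x, 0)  # -0.22, the lag-1 sample autocorrelation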

Example - Sample Autocorrelation


Find ρ̂(h) where h = 1, 2, and 3


Example - Sample Autocovariance and Autocorrelation

Find γ̂X (h) and ρ̂X (h) for h = 0, 1, and 3.

t     Xt | t     Xt
1990   2 | 1997   6
1991   1 | 1998   2
1992   3 | 1999   4
1993   5 | 2000   3
1994   1 | 2001   7
1995   6 | 2002   8
1996   2 | 2003   1

γ̂X (0) = (1/14) ∑_{t=1}^{14} (xt − x̄)² = 5.23
γ̂X (1) = (1/14) ∑_{t=2}^{14} (xt − x̄)(xt−1 − x̄) = −1.152
γ̂X (3) = (1/14) ∑_{t=4}^{14} (xt − x̄)(xt−3 − x̄) = −0.961
ρ̂X (1) = γ̂X (1)/γ̂X (0) = −0.22
ρ̂X (3) = γ̂X (3)/γ̂X (0) = −0.184


R-Code for the Previous Example


R> x <- c(2,1,3,5,1,6,2,6,2,4,3,7,8,1)
R> Cov <- acf(x, type ="covariance",lag.max=3,plot=F)
R> Cor <- acf(x, lag.max=3,plot=FALSE)
R> Cov
Autocovariances of series 'x', by lag

0 1 2 3
5.230 -1.152 0.456 -0.961
R> Cor
Autocorrelations of series 'x', by lag

0 1 2 3
1.000 -0.220 0.087 -0.184

Stationary Models
Definition
Stationary models: assume that the process remains in statistical
equilibrium with probabilistic properties that do not change over
time, in particular varying about a fixed constant mean level and
with constant variance.


Why does a time series have to be stationary?

Stationarity is defined uniquely, so there is only one way for
time series data to be stationary, but many ways for it to be
non-stationary. Thus stationarity is needed to model the
dependence structure uniquely.
It is preferred that the estimators of parameters such as the
mean and variance, if they exist, not change over time.
In most cases, stationary data can be approximated with a
stationary ARMA model, as we will discuss later.
Stationary processes avoid the problem of spurious regression.


Strong (Strictly) Stationarity

Definition
The joint distribution of Xt1 , Xt2 , . . . , Xtn is defined as:
FXt1 ,Xt2 ,...,Xtn (x1 , x2 , . . . , xn ) = P(Xt1 ≤ x1 , Xt2 ≤ x2 , . . . , Xtn ≤ xn ),
where x1 , x2 , . . . , xn are any real numbers.

Definition
A time series {Xt } is said to be Strong (or Strictly) Stationary if
for any time points t1 , t2 , . . . , tn ∈ Z, where n ≥ 1, and any scalar
shift (lag) h ∈ Z, the joint distribution of Xt1 , Xt2 , . . . , Xtn is the
same as the joint distribution of Xt1 +h , Xt2 +h , . . . , Xtn +h ; i.e.,
FXt1 ,Xt2 ,...,Xtn (x1 , x2 , . . . , xn ) = FXt1 +h ,Xt2 +h ,...,Xtn +h (x1 , x2 , . . . , xn )


Weak (Covariance) Stationarity

Definition
Time Invariant Process: a process is time invariant if its properties
do not depend on time.

Definition
A time series {Xt } is said to be Weakly Stationary or Covariance
Stationary or Second-order Stationary if
The mean µt = E(Xt ) = µ is independent of t.
For all t and h, γX (h) = Cov (Xt , Xt+h ) = E[(Xt − µ)(Xt+h − µ)]
is time-invariant; i.e., the covariance function depends only on
the time separation h and not on the actual time t.


Remarks

An independent and identically distributed (i.i.d.) stochastic
process satisfies γ(h) = ρ(h) = 0 ∀h ≠ 0.
As the joint distribution of Xt and Xt+h determines µt and
γX (h) if both exist, strict stationarity implies weak
stationarity, while the converse is not true in general.
If {Xt } is a Gaussian stochastic process, then weak
stationarity is equivalent to strong stationarity (why?).
A stationary time series {Xt } is Ergodic if sample moments
converge in probability to population moments; i.e., if
x̄ → µ, γ̂X (h) → γX (h), and ρ̂X (h) → ρX (h) in probability.


Some Useful Trigonometric Functions

cos(x ± y ) = cos(x)cos(y ) ∓ sin(x)sin(y )


sin(x ± y ) = sin(x)cos(y ) ± cos(x)sin(y )
sin(2x) = 2sin(x)cos(x), cos(2x) = cos²(x) − sin²(x)
2sin(x)sin(y ) = cos(x − y ) − cos(x + y )
2cos(x)cos(y ) = cos(x − y ) + cos(x + y )
sin(2πk + x) = sin(x), where k = 0, ±1, ±2, ±3, . . .
cos(2πk + x) = cos(x), where k = 0, ±1, ±2, ±3, . . .
sin(π − x) = +sin(x), sin(π + x) = −sin(x)
cos(π − x) = −cos(x), cos(π + x) = −cos(x)
sin(π/2 − x) = +cos(x), sin(π/2 + x) = +cos(x)
cos(π/2 − x) = +sin(x), cos(π/2 + x) = −sin(x)

Stationary and Non-stationary Examples


Example 1:
Which of the following time sequences is weakly stationary?
Xt = εt , where εt ∼ i.i.d.(0, 1) for all t ∈ R.
Solution:
E(Xt ) = E(εt ) = 0, which is independent of t; and
γX (h) = E(εt εt+h ) = 0 ∀h ≠ 0, with γX (0) = 1,
which is also independent of t. Hence, the process is weakly
stationary.
Xt = t + εt , where εt ∼ i.i.d.(0, 1) for all t ∈ R.
Solution:
E(Xt ) = E(t + εt ) = t, which depends on t. Hence, the process is
not stationary. Note that γX (h) is independent of t because
γX (h) = E([t + εt − t][t + h + εt+h − t − h]) = E(εt εt+h ) = 0 ∀h ≠ 0,
with γX (0) = 1.

Stationary and Non-stationary Examples


Example 2:
Check whether the following model is weakly stationary or not:
Xt = A sin(t + B), where A is a random variable with zero mean
and unit variance and B is a random variable with a Uniform
distribution on (−π, π), independent of A.
Solution:
E(Xt ) = E(A sin(t + B)) = E(A)E(sin(t + B)) = 0, which is
independent of t.

γX (h) = E(Xt Xt+h ) = E{ A sin(t + B) · A sin(t + h + B) }
       = E(A²) E{ (1/2)(cos(h) − cos(2t + 2B + h)) }

(see the next slide)

Stationary and Non-stationary Examples (Cont.)


Recall that Y ∼ Uniform(a, b) if its probability density function is
fY (y) = 1/(b − a) for a ≤ y ≤ b, and 0 otherwise,
where E(Y ) = (a + b)/2 and Var (Y ) = (b − a)²/12.

γX (h) = (1/2)cos(h) − (1/2) E{ cos(2t + 2B + h) }
       = (1/2)cos(h) − (1/2) ∫_{−π}^{π} cos(2t + 2B + h) · (1/(2π)) dB
       = (1/2)cos(h) − (1/(8π)) [sin(2t + 2B + h)]_{−π}^{π}
       = (1/2)cos(h), which is independent of t.

Hence, the process is second-order stationary.
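A quick simulation sketch consistent with this result (purely illustrative):
for a fixed t and lag h, the sample mean of Xt Xt+h over many draws of A
and B should be close to cos(h)/2.

R> set.seed(1)
R> n <- 1e5; t <- 5; h <- 2
R> A <- rnorm(n)                 # mean 0, variance 1
R> B <- runif(n, -pi, pi)        # Uniform(-pi, pi), independent of A
R> mean(A*sin(t + B) * A*sin(t + h + B))  # close to cos(2)/2 = -0.208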

White Noise Process


Definition
The stochastic process {εt } is said to be a strong White Noise
process with mean zero and variance σε², written
εt ∼ i.i.d. WN(0, σε²), if and only if it is independent and
identically distributed (i.i.d.) with zero mean and covariance function

γε (h) = E(εt εt+h ) = σε² for h = 0, and 0 for h ≠ 0.

Thus an i.i.d. white noise process has
constant mean (usually this constant equals zero),
constant variance,
zero autocovariance, except at lag zero.

White Noise Process


Definition
For second-order properties, the assumption of i.i.d. is not
required; only the absence of correlation is needed. In this case the
process {εt } is said to be a weak White Noise process.

Remarks:
If εt ∼ N(0, σε²), {εt } is a Gaussian White Noise process.
Two random variables X and Y are uncorrelated when their
covariance is zero.
Two random variables X and Y are independent when their
joint probability distribution is the product of their marginal
probability distributions: ∀x, y ⇒ PX,Y (x, y) = PX (x)PY (y).
If X and Y are independent, then they are also uncorrelated.
However, in general, the converse is not true.

A Simulated Gaussian White Noise Series

R> set.seed(646)
R> white.noise <- rnorm(200)
R> ts.wn<- ts(white.noise)
R> par(mfrow=c(1,3))
R> plot(ts.wn,ylab="Gaussian Series",main="White Noise")
R> acf(ts.wn, main="Autocorrelation Function")
R> acf(ts.wn, type="partial", main="Partial ACF")
[Figure: three panels - the simulated Gaussian white noise series, its autocorrelation function, and its partial ACF.]


Backward (Lag), Forward (Lead) and Difference Operators


The lag, lead and difference notations are a very convenient way to
write linear time series models and to characterize their properties.
Definition
The lag (backshift) operator, denoted by B(.), applied to an element
of a time series produces the previous element.

Definition
The lead (forward) operator, denoted by F(.), applied to an element
of a time series shifts the time index forward by one unit.

Definition
The difference operator, denoted by ∆, expresses the difference
between two consecutive random variables (or their realizations).

BXt = Xt−1 ,
B²Xt = B(BXt ) = BXt−1 = Xt−2 ,
More generally, B^h Xt = Xt−h , where h is any integer.
FXt = Xt+1 ,
F²Xt = F(FXt ) = FXt+1 = Xt+2 ,
More generally, F^h Xt = Xt+h , where h is any integer.
∆Xt = Xt − Xt−1 = (1 − B)Xt ,
∆²Xt = ∆(∆Xt ) = ∆Xt − ∆Xt−1 = (1 − B)∆Xt = (1 − B)²Xt
     = Xt − 2Xt−1 + Xt−2 ,
More generally, ∆^d Xt = (1 − B)^d Xt , where d is any integer.
Note that
Positive values of h define lags, while negative values define
leads: B^(−h) Xt = F^h Xt = Xt+h ,
B⁰Xt = F⁰Xt = ∆⁰Xt = Xt .

Backshift, Forward and Difference Operators in R


To get (1 − B^h)Xt = Xt − Xt−h for any integer lag 1 ≤ h < t,
type the R code:
> diff(X, lag=h)
Note that the opposite of diff() is cumsum().
To get ∆^d Xt = (1 − B)^d Xt for any integer order of difference
1 ≤ d < t, type the R code:
> diff(X, differences = d)
In general, to get (1 − B^h)^d Xt for all lag and difference
integers 1 ≤ h × d < t, type the R code:
> diff(X, lag=h, differences = d)
Note that the R code
> diff(diff(X))
produces the same result as
> diff(X, differences = 2)

Transforming Data to Stationarity: Detrending vs. Differencing


Example: Global temperature data for the years 1880-2009. The
model is Xt = β0 + β1 t + εt . Detrending gives êt = Xt − β̂0 − β̂1 t.
Differencing gives ∆Xt = Xt − Xt−1 .

[Figure: three panels - the global temperature series with its fitted line (1880-2009), the detrended global temperature, and the first differences of the global temperature.]

Detrending vs. Differencing


[Figure: sample ACFs (lags 0-40) of the global temperature series, the detrended global temperature, and the first differences.]


Linear Processes

For a stationary time series without trends or seasonal effects
(that is, if necessary, any trends or seasonal or cyclical effects have
already been removed from the series), we might construct a linear
model for a time series with autocorrelation. The most important
special cases of linear processes are:
Autoregressive Model (AR),
Moving-Average Model (MA),
Autoregressive Moving Average Model (ARMA),
Autoregressive-Integrated-Moving Average Model (ARIMA).
These models can be used for predicting the future (forecasting)
development of a time series.


Autoregressive (AR ) Models With Zero Mean

Definition
A time series {Xt }, with zero mean, that satisfies

Xt = φ1 Xt−1 + φ2 Xt−2 + . . . + φp Xt−p + εt

(εt is White Noise and the φi are parameters, i = 1, 2, . . . , p) is
called an Autoregressive Process of order p, denoted by AR(p).

With the backshift operator, the process is: Φp (B)Xt = εt , where
Φp (B) = 1 − φ1 B − φ2 B² − . . . − φp B^p is a polynomial of degree p.
The AR(p) model, ∀p ≥ 1, is a stationary model if and only if
all roots of 1 − φ1 B − φ2 B² − . . . − φp B^p = 0 lie outside the
unit circle (i.e., all |roots| > 1).
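This condition can be checked numerically with polyroot(); a small sketch
for the AR(2) model with φ1 = 1.1, φ2 = −0.3 that is simulated later:

R> phi <- c(1.1, -0.3)
R> roots <- polyroot(c(1, -phi))  # roots of 1 - 1.1B + 0.3B^2 = 0
R> Mod(roots)                     # moduli 5/3 and 2, both > 1: stationary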


Autoregressive (AR ) Models With Mean µ


If the mean, µ, of Xt is not zero, replace Xt by Xt − µ to get:
Xt −µ = φ1 (Xt−1 −µ)+φ2 (Xt−2 −µ)+. . .+φp (Xt−p −µ)+t
or write
Xt = α + φ1 Xt−1 + φ2 Xt−2 + . . . + φp Xt−p + t ,
where α = µ(1 − φ1 − φ2 − . . . − φp ).
For example, the AR (2) model
Xt = 1.5 + 1.2Xt−1 − 0.5Xt−2 + t
is Xt − µ = 1.2(Xt−1 − µ) − 0.5(Xt−2 − µ) + t , where
1.5 = µ(1 − 1.2 − (−0.5)). Thus the model has mean µ = 5
Xt − 5 = 1.2(Xt−1 − 5) − 0.5(Xt−2 − 5) + t

A Simulated AR(1) Series: Xt = 0.8Xt−1 + εt

R> set.seed(12345)
R> ar1.sim<-arima.sim(n=100,list(order=c(1,0,0),ar=0.8))
R> par(mfrow=c(1,3))
R> plot(ar1.sim, ylab="X(t)", main="A simulated AR(1)")
R> acf(ar1.sim, main="Autocorrelation Function")
R> acf(ar1.sim, type="partial", main="Partial ACF")
[Figure: three panels - the simulated AR(1) series X(t), its autocorrelation function, and its partial ACF.]


A Simulated AR(1) Series: Xt = −0.8Xt−1 + εt

R> set.seed(12345)
R> ar11.sim<-arima.sim(n=100,list(order=c(1,0,0),ar=-0.8))
R> par(mfrow=c(1,3))
R> plot(ar11.sim, ylab="X(t)", main="A simulated AR(1)")
R> acf(ar11.sim, main="Autocorrelation Function")
R> acf(ar11.sim, type="partial", main="Partial ACF")
[Figure: three panels - the simulated AR(1) series X(t), its autocorrelation function, and its partial ACF.]


A Simulated AR (2) Series

R> set.seed(1965)
R> ar2<-arima.sim(n=200,list(order=c(2,0,0),ar=c(1.1,-0.3)))
R> par(mfrow=c(1,3))
R> plot(ar2, ylab="X(t)", main="A simulated AR(2)")
R> acf(ar2, main="Autocorrelation Function")
R> acf(ar2, type="partial", main="Partial ACF")
[Figure: three panels - the simulated AR(2) series, its autocorrelation function, and its partial ACF.]

Figure: A simulated AR(2) process: Xt = 1.1Xt−1 − 0.3Xt−2 + εt



Moving Average (MA ) Models


Definition
A time series {Xt }, with zero mean, that satisfies

Xt = εt + θ1 εt−1 + θ2 εt−2 + . . . + θq εt−q

(εt is White Noise and the θj are parameters, j = 1, 2, . . . , q) is
called a Moving Average Process of order q, denoted by MA(q).

With the backshift operator, the process is: Xt = Θq (B)εt , where
Θq (B) = 1 + θ1 B + θ2 B² + . . . + θq B^q is a polynomial of degree q.
The MA(q) process, ∀q ≥ 1, is always stationary (regardless
of the values of θj , j = 1, 2, . . . , q).
If all roots of 1 + θ1 B + θ2 B² + . . . + θq B^q = 0 lie outside the
unit circle (all |roots| > 1), the MA(q) process is said to be
invertible.

A Simulated MA(1) Series: Xt = εt + 0.8εt−1

R> set.seed(6436)
R> ma1.sim<-arima.sim(n=200,list(order=c(0,0,1),ma=0.8))
R> par(mfrow=c(1,3))
R> plot(ma1.sim, ylab="X(t)", main="A simulated MA(1)")
R> acf(ma1.sim, main="Autocorrelation Function")
R> acf(ma1.sim, type="partial", main="Partial ACF")
[Figure: three panels - the simulated MA(1) series, its autocorrelation function, and its partial ACF.]


A Simulated MA(1) Series: Xt = εt − 0.8εt−1

R> set.seed(6436)
R> ma11.sim<-arima.sim(n=200,list(order=c(0,0,1),ma=-0.8))
R> par(mfrow=c(1,3))
R> plot(ma11.sim, ylab="X(t)", main="A simulated MA(1)")
R> acf(ma11.sim, main="Autocorrelation Function")
R> acf(ma11.sim, type="partial", main="Partial ACF")
[Figure: three panels - the simulated MA(1) series, its autocorrelation function, and its partial ACF.]

77 of 189
Introduction to time series
First and second order properties Weak and Strong (Strictly) Stationarity
Stationary and non-stationary models White Noise Process
Autoregressive Models ARIMA and SARIMA Models
List of Some Useful R Functions and Homework

A Simulated MA (2) Series

R> set.seed(14762)
R> ma2<-arima.sim(n=200,list(order=c(0,0,2),ma=c(0.8,0.6)))
R> par(mfrow=c(1,3))
R> plot(ma2, ylab="X(t)", main="A simulated MA(2)")
R> acf(ma2, main="Autocorrelation Function")
R> acf(ma2, type="partial", main="Partial ACF")
Figure: A simulated MA(2) process: Xt = εt + 0.8εt−1 + 0.6εt−2 (time plot, ACF, and PACF)

Autoregressive-Moving Average (ARMA ) Models

Definition
A time series {Xt}, with zero mean, is called Autoregressive-Moving Average of order (p, q), denoted by ARMA(p, q), if it can be written as

Xt = Σ_{i=1}^{p} φi Xt−i + Σ_{j=0}^{q} θj εt−j,

where the first sum is the Autoregressive part and the second the Moving Average part; εt is White Noise, and φi & θj, for i = 1, 2, . . . , p, j = 0, 1, . . . , q, are the parameters of the Autoregressive & the Moving Average parts respectively, with θ0 = 1.


Autoregressive-Moving Average (ARMA ) Models


With the backshift operator, the process can be rewritten as:

Φp(B) Xt = Θq(B) εt,

where Φp(B) = 1 − Σ_{i=1}^{p} φi B^i (the Autoregressive part) and Θq(B) = 1 + Σ_{j=1}^{q} θj B^j (the Moving Average part).
If the mean, µ, of Xt is not zero, replace Xt by Xt − µ to get: Φp(B)(Xt − µ) = Θq(B)εt.
It can also be written as:

Xt = α + φ1 Xt−1 + . . . + φp Xt−p + εt + θ1 εt−1 + . . . + θq εt−q,

where α = µ(1 − φ1 − φ2 − . . . − φp) and µ is called the intercept in the output of the R function arima().
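As a quick illustration (a minimal sketch; the series z and the AR order are arbitrary choices), the value labelled "intercept" by arima() estimates the mean µ, from which α can be recovered:

R> set.seed(1)
R> z <- arima.sim(n=500, list(ar=0.5)) + 10  # AR(1) with mean mu = 10
R> fit <- arima(z, order=c(1,0,0))
R> fit$coef                                  # "intercept" estimates mu, not alpha
R> mu <- fit$coef["intercept"]
R> alpha <- mu*(1 - fit$coef["ar1"])         # alpha = mu(1 - phi1)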

Three Conditions for ARMA Models

The ARMA model is assumed to be stationary, invertible and identifiable, where:
The condition for stationarity is the same as for the pure AR(p) process: i.e., the roots of 1 − φ1B − φ2B² − . . . − φpB^p = 0 lie outside the unit circle.
The condition for invertibility is the same as for the pure MA(q) process: i.e., the roots of 1 + θ1B + θ2B² + . . . + θqB^q = 0 lie outside the unit circle.
The identifiability condition means that the model is not redundant: i.e., Φp(B) = 0 and Θq(B) = 0 have no common roots. (see next page)


Example: Model Redundancy (Shared Common Roots in ARMA Models)
Consider the ARMA(1,2):

Xt = 0.2Xt−1 + εt − 1.1εt−1 + 0.18εt−2.

This model can be written as

(1 − 0.2B)Xt = (1 − 1.1B + 0.18B²)εt,

or equivalently

(1 − 0.2B)Xt = (1 − 0.2B)(1 − 0.9B)εt.

Cancelling (1 − 0.2B) from both sides gives

Xt = (1 − 0.9B)εt.

Thus, the process is not really an ARMA(1,2); it is a MA(1) ≡ ARMA(0,1).
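Such redundancy can also be spotted numerically (a minimal sketch using base R's polyroot(); the coefficient vectors are those of the model above):

R> polyroot(c(1, -0.2))        # AR polynomial: root B = 5
R> polyroot(c(1, -1.1, 0.18))  # MA polynomial: roots B = 10/9 and B = 5
R> # the shared root B = 5 signals that (1 - 0.2B) cancels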

A Simulated ARMA (1,1) Series

R> set.seed(6436);par(mfrow=c(1,3))
R> arma1.sim<-arima.sim(n=200,list(order=c(1,0,1),
+ ar=0.8,ma=-0.6))
R> plot(arma1.sim,ylab="X(t)",main="A simulated ARMA(1,1)")
R> acf(arma1.sim, main="Autocorrelation Function")
R> acf(arma1.sim, type="partial", main="Partial ACF")
Figure: A simulated ARMA(1,1) process: Xt = 0.8Xt−1 + εt − 0.6εt−1 (time plot, ACF, and PACF)

A Simulated ARMA (2,1) Series

R> set.seed(6436);par(mfrow=c(1,3))
R> arma21.sim<-arima.sim(n=200,list(order=c(2,0,1),
+ ar=c(0.8,-0.4),ma=-0.6))
R> plot(arma21.sim,ylab="X(t)",main="A simulated ARMA(2,1)")
R> acf(arma21.sim, main="Autocorrelation Function")
R> acf(arma21.sim, type="partial", main="Partial ACF")
Figure: A simulated ARMA(2,1) process: Xt = 0.8Xt−1 − 0.4Xt−2 + εt − 0.6εt−1 (time plot, ACF, and PACF)

A Simulated ARMA (1,2) Series

R> set.seed(6436);par(mfrow=c(1,3))
R> arma12.sim<-arima.sim(n=200,list(order=c(1,0,2),
+ ar=0.8,ma=c(-0.6,0.4)))
R> plot(arma12.sim,ylab="X(t)",main="A simulated ARMA(1,2)")
R> acf(arma12.sim, main="Autocorrelation Function")
R> acf(arma12.sim, type="partial", main="Partial ACF")
Figure: A simulated ARMA(1,2) process: Xt = 0.8Xt−1 + εt − 0.6εt−1 + 0.4εt−2 (time plot, ACF, and PACF)

Some Examples of ARMA (p, q) Process

AR(1) ≡ ARMA(1,0) process may be written as:
Xt = φXt−1 + εt or (1 − φB)Xt = εt
AR(2) ≡ ARMA(2,0) process may be written as:
Xt = φ1 Xt−1 + φ2 Xt−2 + εt or (1 − φ1B − φ2B²)Xt = εt
MA(1) ≡ ARMA(0,1) process may be written as:
Xt = εt + θεt−1 or Xt = (1 + θB)εt
MA(2) ≡ ARMA(0,2) process may be written as:
Xt = εt + θ1 εt−1 + θ2 εt−2 or Xt = (1 + θ1B + θ2B²)εt



Some Examples of ARMA (p, q) Process


ARMA(1,1) process may be written as:
Xt = φXt−1 + εt + θεt−1 or (1 − φB)Xt = (1 + θB)εt
ARMA(2,1) process may be written as:
Xt = φ1Xt−1 + φ2Xt−2 + εt + θεt−1 or (1 − φ1B − φ2B²)Xt = (1 + θB)εt
ARMA(1,2) process may be written as:
Xt = φXt−1 + εt + θ1εt−1 + θ2εt−2 or (1 − φB)Xt = (1 + θ1B + θ2B²)εt
ARMA(2,2) process may be written as:
Xt = φ1Xt−1 + φ2Xt−2 + εt + θ1εt−1 + θ2εt−2 or Φ2(B)Xt = Θ2(B)εt,
where Φ2(B) = 1 − φ1B − φ2B² & Θ2(B) = 1 + θ1B + θ2B²

Autoregressive-Integrated-Moving Average (ARIMA) Models

Definition
A time series {Xt}, with zero mean, is called Autoregressive-Integrated-Moving Average of order (p, d, q), denoted by ARIMA(p, d, q), if the d-th differences of the {Xt} series form an ARMA(p, q) process.
The process in the backshift operator may be written as:

Φp(B)(1 − B)^d Xt = Θq(B)εt,

where Φp(B) = 1 − φ1B − φ2B² − . . . − φpB^p is a polynomial of degree p and Θq(B) = 1 + θ1B + θ2B² + . . . + θqB^q is a polynomial of degree q.


Remarks
ARIMA models are applied in cases where the data show evidence of non-stationarity; an initial differencing step can be applied one or more times to eliminate the non-stationarity.
AR(p) ≡ ARIMA(p, 0, 0),
MA(q) ≡ ARIMA(0, 0, q),
ARI(p, d) ≡ ARIMA(p, d, 0),
IMA(d, q) ≡ ARIMA(0, d, q),
ARMA(p, q) ≡ ARIMA(p, 0, q),
WN ≡ ARIMA(0,0,0),
I(d) ≡ ARIMA(0, d, 0). Note that I(1) is the well-known random walk, Xt = Xt−1 + εt, that is widely used to describe the behavior of a stock-price series (as we will discuss later).

Some Examples of ARIMA (p, d, q) Process

ARI(1,1) ≡ ARIMA(1,1,0) process may be written as:
Xt = φXt−1 + Xt−1 − φXt−2 + εt ≡ (1 − φB)(1 − B)Xt = εt
ARIMA(1,1,1) process may be written as:
(1 − φB)(1 − B)Xt = (1 + θB)εt
ARIMA(2,1,1) process may be written as:
(1 − φ1B − φ2B²)(1 − B)Xt = (1 + θB)εt
ARIMA(1,2,2) process may be written as:
(1 − φB)(1 − B)²Xt = (1 + θ1B + θ2B²)εt



A Simulated ARIMA (2,1,1) Series

R> set.seed(6436);par(mfrow=c(1,3))
R> arima211.sim<-arima.sim(n=200,list(order=c(2,1,1),
+ ar=c(0.8,-0.4),ma=-0.6))
R> plot(arima211.sim,ylab="X(t)",main="A simulated ARIMA(2,1,1)")
R> acf(arima211.sim, main="Autocorrelation Function")
R> acf(arima211.sim, type="partial", main="Partial ACF")
Figure: A simulated ARIMA(2,1,1) process: (1 − 0.8B + 0.4B²)(Xt − Xt−1) = εt − 0.6εt−1 (time plot, ACF, and PACF)

Seasonal ARIMA (SARIMA ) Models

A non-stationary process often possesses a seasonal component that repeats itself after a regular period of time, where the smallest such time period, denoted by s, is called the seasonal period.

Monthly observations: s = 12 (12 observations per year).
Quarterly observations: s = 4 (4 observations per year).
Daily observations: s = 365 (365 observations per year).
Weekly observations: s = 52 (52 observations per year).
Daily observations by week: s = 5 (5 working days).


Seasonal ARIMA (SARIMA ) Models


Definition
The multiplicative Seasonal ARIMA (SARIMA) model, denoted by ARIMA(p, d, q) × (P, D, Q)s, where s is the number of seasons:

Φp(B) ΦP(B^s) (1 − B)^d (1 − B^s)^D Xt = Θq(B) ΘQ(B^s) εt,

where
Φp(B) = 1 − Σ_{i=1}^{p} φi B^i is a polynomial in B of degree p,
Θq(B) = 1 + Σ_{i=1}^{q} θi B^i is a polynomial in B of degree q,
ΦP(B^s) = 1 − Σ_{i=1}^{P} Φi B^{is} is a polynomial in B^s of degree P,
ΘQ(B^s) = 1 + Σ_{i=1}^{Q} Θi B^{is} is a polynomial in B^s of degree Q,
with no common roots between ΦP(B^s) and ΘQ(B^s); p, d, and q are the orders of the non-seasonal AR model, MA model and ordinary differencing respectively, whereas P, D, and Q are the orders of the Seasonal Autoregressive (SAR) model, Seasonal Moving Average (SMA) model, and seasonal differencing respectively.

Remarks

The idea is that SARIMA models are ARIMA(p, d, q) models whose residuals εt are ARIMA(P, D, Q), with operators defined on B^s and its successive powers; here p, q, and d are the non-seasonal AR order, non-seasonal MA order, and non-seasonal differencing order respectively, while P, Q and D are the seasonal AR (SAR) order, seasonal MA (SMA) order, and seasonal differencing at lag s respectively.
Seasonal differencing ∆sXt = (1 − B^s)Xt = Xt − Xt−s will remove seasonality in the same way that ordinary differencing ∆Xt = Xt − Xt−1 will remove a polynomial trend, as the short sketch below illustrates.
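For instance (a minimal sketch on the built-in AirPassengers data, also used on the correlogram slide below):

R> x <- log(AirPassengers)   # log stabilizes the variance
R> dx <- diff(x, lag=12)     # seasonal differencing (1 - B^12)Xt
R> ddx <- diff(dx)           # ordinary differencing (1 - B) of the result
R> plot.ts(cbind(x, dx, ddx))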


Some Examples of SARIMA (p, d, q) × (P, D, Q)s Process

ARIMA(0,1,0) × (0,1,0)5 model can be written as:
(1 − B)(1 − B^5)Xt = εt
ARIMA(0,1,0) × (0,1,1)4 model can be written as:
(1 − B)(1 − B^4)Xt = (1 + ΘB^4)εt
ARIMA(1,0,0) × (0,1,1)12 model can be written as:
(1 − φB)(1 − B^12)Xt = (1 + ΘB^12)εt
ARIMA(0,1,1) × (0,1,1)12 model can be written as:
(1 − B)(1 − B^12)Xt = (1 + θB)(1 + ΘB^12)εt
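The last of these is the classical "airline model"; in R it can be fitted with the seasonal argument of arima() (a minimal sketch on log(AirPassengers)):

R> fit <- arima(log(AirPassengers), order=c(0,1,1),
+ seasonal=list(order=c(0,1,1), period=12))
R> fit   # estimates of theta and Theta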



Seasonal Patterns and Correlogram

Figure: International air passenger bookings (in 1000's) in the US for the years 1949−1960, with the correlogram to detect autocorrelations.

There is a clear seasonal variation. At the time, bookings were highest during the summer months June-August and lowest during the autumn month of November and winter month of February.

Remove Trend and Seasonality Using decompose()

R> par(mfrow=c(1,2))
R> AP.decom <- decompose(AirPassengers, "multiplicative")
R> plot(ts(AP.decom$random[7:138]))
R> acf(AP.decom$random[7:138]); par(mfrow=c(1,1))
[Figure: time plot and correlogram of the random component AP.decom$random[7:138]]
The correlogram suggests either MA(2) or that the seasonal adjustment has not been entirely effective. (Try SARIMA models!)

Example of SARIMA (p, d, q) × (P, D, Q)s Process


The plots of the average monthly sea level at Darwin, Australia, from January 1988 to December 1999 (in metres), available from
http://www.sci.usq.edu.au/courses/STA3303/resources/climatology/darwinsl.txt, show a consistent seasonal pattern.


A Simulated SARIMA (2, 0, 0) × (0, 1, 1)12 Series


portes source: http://site.iugaza.edu.ps/emahdi/courses/
time-series-analysis-stat3305/lectures/
R> library("portes")#Try sim_sarima in sarima package!
R> set.seed(13); phi<-c(1.3,-0.35); theta.season<-0.8
R> Z1<-varima.sim(list(ar=phi,d=0,ma.season=theta.season,
+ d.season=1,period=12),n=100,trunc.lag=50)
R> par(mfrow=c(1,2)); plot(Z1); acf(Z1,lag.max =40)
[Figure: time plot and ACF (lags up to 40) of the simulated series]
(1 − 1.3B + 0.35B²)(1 − B^12)Xt = εt + 0.8εt−12



Second order properties of MA (1) Models


For the MA(1) process Xt = εt + θεt−1, where εt ∼ WN(0, σ²):
The mean of Xt: E(Xt) = 0 (which is independent of t).
The variance: γX(0) = E(εt² + 2θεtεt−1 + θ²εt−1²) = (1 + θ²)σ².
The autocovariance and the autocorrelation are independent of t, without any restrictions on the parameter θ, as shown below:

γX(h) = E(εtεt+h) + θE(εt−1εt+h) + θE(εtεt+h−1) + θ²E(εt−1εt+h−1)
      = (1 + θ²)σ² for h = 0; θσ² for h = ±1; 0 for h = ±2, ±3, . . .

ρX(h) = γX(h)/γX(0) = 1 for h = 0; θ/(1 + θ²) for h = ±1; 0 for h = ±2, ±3, . . .
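These values can be checked against the theoretical ACF returned by R's built-in ARMAacf() (a minimal sketch for θ = 0.8, so ρ(1) = 0.8/1.64 ≈ 0.488):

R> ARMAacf(ma=0.8, lag.max=4)   # lag 1: 0.8/(1+0.8^2); zero from lag 2 on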

Second Order Properties of MA (q) Models


For the MA(q) process Xt = Σ_{i=0}^{q} θi εt−i, θ0 = 1, εt ∼ WN(0, σ²):
The mean of Xt: E(Xt) = 0 (which is independent of t).
The variance: γX(0) = (1 + θ1² + θ2² + . . . + θq²)σ² = σ² Σ_{i=0}^{q} θi².
The autocovariance and the autocorrelation are independent of t, without any restrictions on the parameters θ, as shown below:

γX(h) = σ² Σ_{i=h}^{q} θi θi−h for h = 0, ±1, ±2, . . . , ±q; and γX(h) = 0 for |h| > q.

ρX(h) = γX(h)/γX(0) = Σ_{i=h}^{q} θi θi−h / Σ_{i=0}^{q} θi² for h = 0, ±1, ±2, . . . , ±q; and ρX(h) = 0 for |h| > q.


ACF for Simulated MA (1) and MA (2) process


[Figure: ACFs of simulated MA(1) series with θ = 0.7 and θ = −0.7, and MA(2) series with (θ1, θ2) = (0.7, 0.3) and (−0.7, 0.3)]

Example: The ACF of MA (q) Models


Example: Find the first and second moments of the process Xt = εt + 0.6εt−1 − 0.3εt−2, where εt ∼ WN(0, σ²).
Solution:
The process is MA(2) with θ0 = 1, θ1 = 0.6, and θ2 = −0.3.
E(Xt) = 0
Var(Xt) = γX(0) = (1 + θ1² + θ2²)σ² = 1.45σ²
γX(1) = σ² Σ_{i=1}^{2} θi θi−1 = (θ1θ0 + θ2θ1)σ² = 0.42σ²
γX(2) = σ² Σ_{i=2}^{2} θi θi−2 = θ2θ0 σ² = −0.30σ²
ρX(1) = γX(1)/γX(0) = 0.42/1.45 ≈ 0.29, ρX(2) = γX(2)/γX(0) = −0.30/1.45 ≈ −0.21, ρX(h) = 0 ∀h > 2.
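A quick numerical check with ARMAacf() (a minimal sketch) reproduces these values:

R> round(ARMAacf(ma=c(0.6,-0.3), lag.max=4), 2)
R> # lags 0-4: 1.00 0.29 -0.21 0.00 0.00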

Invertibility of MA (q) Models

If we replace θ by 1/θ, the autocorrelation function of the MA(1), ρ = θ/(1 + θ²), does not change. There are thus two processes, Xt = εt + θεt−1 and Xt = εt + (1/θ)εt−1, that show identical autocorrelation patterns, and hence the MA(1) coefficient is not uniquely identified.
For the general moving average process, MA(q), there is a similar identifiability problem.
The problem is resolved by requiring the operator 1 + θ1B + θ2B² + . . . + θqB^q to be invertible, i.e.,
all roots of Θq(B) = 1 + θ1B + θ2B² + . . . + θqB^q = 0 must lie outside the unit circle.


Invertibility of MA (q) Models


Definition
The MA(q) process is invertible if it can be represented as a convergent infinite AR form, AR(∞).

The MA(q), Xt = Θq(B)εt, where Θq(B) = 1 + Σ_{j=1}^{q} θj B^j and εt ∼ WN(0, σ²), has the infinite AR representation AR(∞):

Θq(B)−1 Xt = Θq(B)−1 Θq(B) εt = εt ⇒ Π∞(B) Xt = εt,

where Π∞(B) = Θq(B)−1 = 1 − Σ_{i=1}^{∞} πi B^i = −Σ_{i=0}^{∞} πi B^i, with π0 = −1, and the πi satisfy Σ_{i=0}^{∞} |πi| < ∞.
Note that the condition of a finite sum (Σ_{i=0}^{∞} |πi| < ∞) ensures that the AR(∞) series converges (i.e., the MA(q) is invertible).

MA (1) in an Infinite AR Representation AR (∞)


Consider the MA(1) process Xt = (1 + θB)εt, where we multiply both sides by (1 + θB)−1 to get

εt = (1/(1 + θB)) Xt = Π∞(B) Xt,

where Π∞(B) = (1 + θB)−1 = 1 − Σ_{i=1}^{∞} πi B^i = −Σ_{i=0}^{∞} πi B^i.
Recall that the Taylor series of 1/(1 − x), |x| < 1, is the geometric series 1 + x + x² + x³ + . . ., which implies

1/(1 + θB) = 1/(1 − (−θB)) = 1 − θB + θ²B² − θ³B³ + . . . = Σ_{i=0}^{∞} (−θ)^i B^i.

So that, (see the next slide)

MA (1) in an Infinite AR Representation AR (∞)


Σ_{i=0}^{∞} (−θ)^i B^i = −Σ_{i=0}^{∞} πi B^i.

By equating the coefficients of the corresponding powers B^i we get

πi = (−1)^{i+1} θ^i ∀i ≥ 0.

⇒ The MA(1) process Xt = (1 + θB)εt in the AR(∞) representation is

Xt = Σ_{i=1}^{∞} πi Xt−i + εt, where πi = (−1)^{i+1} θ^i, i = 1, 2, . . . .

Note that the root of 1 + θB = 0 in absolute value is |1/θ| > 1, or equivalently |θ| < 1, which ensures that the MA(1) is an invertible series.

Examples - Invertibility of MA (1) Models

1.) The MA(1) process Xt = εt + 0.4εt−1 is an invertible process because |θ| = |0.4| < 1 (or equivalently, the root of (1 + 0.4B) = 0 in absolute value is |B| = 1/0.4 = 2.5 > 1). This process can be written in the AR(∞) representation

Xt = Σ_{i=1}^{∞} πi Xt−i + εt, where πi = (−1)^{i+1}(0.4)^i, i = 1, 2, . . . ,
Xt = εt + 0.4Xt−1 − (0.4)²Xt−2 + (0.4)³Xt−3 − . . .

2.) The MA(1) process Xt = εt + 1.8εt−1 is not an invertible process because the root of (1 + 1.8B) = 0 in absolute value is |B| = 1/1.8 ≈ 0.56 < 1. Thus, we cannot write this process in an AR(∞) representation.

MA (q) in an Infinite AR Representation AR (∞)

Because Π∞(B) = Θq(B)−1, the coefficients πi can be obtained by equating the coefficients in the relation Θq(B)Π∞(B) = 1. Thus,

Θq(B)Π∞(B) = (1 + θ1B + . . . + θqB^q)(1 − π1B − π2B² − . . .)
           = 1 − (π1 − θ1)B − (π2 + θ1π1 − θ2)B² − . . .
             − (πj + θ1πj−1 + . . . + θq−1πj−q+1 + θqπj−q)B^j − . . .

By equating the coefficients of the various powers B^j in the relation Θq(B)Π∞(B) = 1, for j = 1, 2, . . ., we have

πj = −θ1πj−1 − θ2πj−2 − . . . − θq−1πj−q+1 − θqπj−q,

where π0 = −1 and πj = 0, for j < 0.
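This recursion is easy to code (a minimal sketch; pi.weights() is a hypothetical helper, not a library function):

R> pi.weights <- function(theta, lag.max=10) {
+   q <- length(theta)
+   pi.w <- c(-1, rep(0, lag.max))   # pi.w[1] holds pi_0 = -1
+   for (j in seq_len(lag.max)) {
+     for (i in seq_len(q)) {
+       if (j-i >= 0) pi.w[j+1] <- pi.w[j+1] - theta[i]*pi.w[j-i+1]
+     }
+   }
+   pi.w[-1]   # pi_1, pi_2, ...
+ }
R> round(pi.weights(c(-0.1, 0.42), 4), 4)   # matches the MA(2) example below
R> # -0.1000 0.4100 0.0830 -0.1639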


Example: MA (2) in an AR (∞) Representation


Example: An example of an invertible MA(2) process is

Xt = εt − 0.1εt−1 + 0.42εt−2.

The roots of 1 − 0.1B + 0.42B² = 0 are the complex pair B = (0.1 ± i√1.67)/0.84, whose common modulus is |B| = √(1/0.42) ≈ 1.54 > 1. Thus, the process is invertible.
Now the process in an AR(∞) representation is written as follows:

Xt = Σ_{i=1}^{∞} πi Xt−i + εt,

where πj = −θ1πj−1 − θ2πj−2 − . . . − θqπj−q, q = 2, θ1 = −0.1, θ2 = 0.42, θj = 0 ∀j > q, π0 = −1, and πj = 0 ∀j < 0. (see the next slide)

Continue with the Previous Example


πj = −θ1πj−1 − θ2πj−2 − . . . − θqπj−q, where q = 2, θ1 = −0.1, θ2 = 0.42, θj = 0 ∀j > q, π0 = −1, and πj = 0 ∀j < 0.
π1 = −θ1π0 = −(−0.1)(−1) = −0.1
π2 = −θ1π1 − θ2π0 = −(−0.1)(−0.1) − (0.42)(−1) = 0.41
π3 = −θ1π2 − θ2π1 = −(−0.1)(0.41) − (0.42)(−0.1) = 0.083
π4 = −θ1π3 − θ2π2 = −(−0.1)(0.083) − (0.42)(0.41) = −0.1639
. . .
Thus, the MA(2) process in the AR(∞) representation is
Xt = εt − 0.1Xt−1 + 0.41Xt−2 + 0.083Xt−3 − 0.1639Xt−4 + . . .

Example: MA (2) in an AR (∞) Representation

Example: Consider the MA(2) process

Xt = εt − εt−1 + 0.5εt−2.

Recall that the roots of the quadratic equation ax² + bx + c = 0 are x = (−b ± √(b² − 4ac))/(2a). Thus the |roots| of 1 − B + 0.5B² = 0 are

|B| = |(−(−1) ± √((−1)² − 4(0.5)(1)))/(2(0.5))| = |1 ± i|, where i = √−1.

Recall that the absolute value of a complex number is |a + ib| = √(a² + b²), so the absolute values of the two roots are √2 ≈ 1.41 > 1; hence, the process is invertible. (see the next slide)


Continue with the Previous Example


Now the process in an AR(∞) representation is written as follows:

Xt = Σ_{i=1}^{∞} πi Xt−i + εt,

πj = −θ1πj−1 − θ2πj−2, θ1 = −1, θ2 = 0.5, π0 = −1 & π−1 = 0.
π1 = −θ1π0 = −(−1)(−1) = −1
π2 = −θ1π1 − θ2π0 = −(−1)(−1) − (0.5)(−1) = −0.5
π3 = −θ1π2 − θ2π1 = −(−1)(−0.5) − (0.5)(−1) = 0
π4 = −θ1π3 − θ2π2 = −(−1)(0) − (0.5)(−0.5) = 0.25
. . .
Thus, the MA(2) process in the AR(∞) representation is
Xt = εt − Xt−1 − 0.5Xt−2 + 0.25Xt−4 + . . .

Finding the Roots in R


In R, the function polyroot(a), where a is the vector of polynomial coefficients in increasing order, can be used to find the zeros of a real or complex polynomial of degree n − 1: p(x) = a1 + a2x + . . . + anx^{n−1}.
The function Mod() can be used to check whether the roots of p(x) have modulus > 1 or not.
Example
The MA(4) process Xt = εt − 0.3εt−1 + 0.7εt−2 − 1.2εt−3 + 0.1εt−4 is not invertible, as two roots have modulus < 1:
R> roots <- polyroot(c(1,-0.3, 0.7,-1.2,0.1))
R> Mod(roots)
[1] 0.8829242 0.8829242 1.1250090 11.4024243

AR (1) in MA (∞) Representation


Let Xt be an AR(1) process, where εt ∼ WN(0, σ²):
Xt = φXt−1 + εt
   = φ(φXt−2 + εt−1) + εt
   = εt + φεt−1 + φ²Xt−2
   = εt + φεt−1 + φ²(φXt−3 + εt−2)
   = εt + φεt−1 + φ²εt−2 + φ³Xt−3
   . . .
Xt = Σ_{j=0}^{∞} φ^j εt−j,

which is the MA(∞) representation. Note that (apart from εt−j) this representation may be seen as a geometric series, which converges if and only if |φ| < 1 (see the next slide).

AR (1) in MA (∞) Representation - Second Order Properties of AR (1) Models

The AR(1) process with the backshift operator may be written as

(1 − φB)Xt = εt, where εt ∼ WN(0, σ²) ⇒ Xt = εt/(1 − φB).

If |φB| < 1, the Taylor series of 1/(1 − φB) is 1 + φB + φ²B² + . . .

⇒ Xt = (1 + φB + φ²B² + φ³B³ + . . .)εt = Σ_{j=0}^{∞} φ^j εt−j,

which is the MA(∞) Wold representation. (See the next slide.)


The AR(1) process in the MA(∞) representation: Xt = Σ_{i=0}^{∞} φ^i εt−i.
The mean of Xt: E(Xt) = Σ_{i=0}^{∞} φ^i E(εt−i) = 0 (independent of t).
The variance: γX(0) = E(Σ_{i=0}^{∞} φ^{2i} εt−i²) = σ² Σ_{i=0}^{∞} φ^{2i} = σ²/(1 − φ²).
The autocovariance is independent of t, as follows:

γX(h) = E(XtXt+h) = Σ_{i=0}^{∞} Σ_{j=h}^{∞} φ^i φ^j E(εt−i εt+h−j)
      = σ² Σ_{i=0}^{∞} φ^i φ^{i+h} = φ^h σ² Σ_{i=0}^{∞} φ^{2i} = σ² φ^h/(1 − φ²).

The autocorrelation: ρX(h) = γX(h)/γX(0) = φ^h (independent of t).

Autocorrelation Function of AR (1) Model- Example

R> set.seed(125)
R> ar.1<-arima.sim(n=200,list(order=c(1,0,0),ar=0.8))
R> ar.2<-arima.sim(n=200,list(order=c(1,0,0),ar=-0.5))
R> par(mfrow=c(1,2))
R> acf(ar.1,main="Phi = 0.8"); acf(ar.2,main="Phi = -0.5")
R> par(mfrow=c(1,1))
[Figure: sample ACFs of the two simulated AR(1) series, φ = 0.8 and φ = −0.5]

Example: The Autocorrelation of AR (1)

Example: The autocorrelations of the stationary AR(1) process

Xt = 0.6Xt−1 + εt, εt ∼ WN(0, 4)

are ρX(h) = φ^h = 0.6^h ∀h ≥ 0. Thus ρX(0) = 1, ρX(1) = 0.6, ρX(2) = 0.6² = 0.36, . . .
R> Lag <- 0:8
R> rho <- 0.6^Lag
R> round(rho,3)
[1] 1.000 0.600 0.360 0.216 0.130 0.078 0.047 0.028 0.017
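The same values come out of ARMAacf(), which computes the theoretical ACF of any ARMA model (a minimal sketch):

R> round(ARMAacf(ar=0.6, lag.max=8), 3)   # identical to 0.6^(0:8)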


Second Order Properties of AR (p) Models

Consider the AR(p) model:

Xt = Σ_{i=1}^{p} φi Xt−i + εt, where εt ∼ WN(0, σ²).

The expectation of Xt is E(Xt) = 0 (Why?).
The variance of Xt is Var(Xt) = γX(0) = Σ_{i=1}^{p} φi γX(i) + σ², where γX(i) = Cov(Xt, Xt−i). Note that the variance of Xt is obtained by multiplying both sides of the above equation by Xt and taking expectations, where E(Xtεt) = σ² (Why?). (see the next slide)!


Second Order Properties of AR (p) Models (Cont.)


Multiplying both sides of the equation Xt = Σ_{i=1}^{p} φi Xt−i + εt by Xt−h and taking expectations gives

γX(h) = Σ_{i=1}^{p} φi γX(h − i) ∀h > 0.

Now divide both sides by γX(0) to get the autocorrelations

ρX(h) = Σ_{i=1}^{p} φi ρX(h − i) ∀h > 0.

These are the Yule-Walker equations that we will discuss later (see the Partial Autocorrelation Function (PACF) section).
The population autocorrelations ρX(h) can be found recursively by solving the Yule-Walker equations given the values of φi. In practice, as we will see later, we use the sample autocorrelation coefficients to estimate the values of φi.

Stationarity of AR (1) Models

Definition
The condition for the Autoregressive process of order 1, AR(1): Xt = φXt−1 + εt, to be stationary is that the absolute value of the root of (1 − φB) = 0 must lie outside the unit circle. That is, the AR(1) is stationary if |B| = |1/φ| > 1, or equivalently, if |φ| < 1.

(1 − 0.4B)Xt = εt is a stationary process because the absolute value of the root of (1 − 0.4B) = 0 is |B| = |1/0.4| = 2.5 > 1.
(1 + 1.8B)Xt = εt is not a stationary process because the |root| of (1 + 1.8B) = 0 is |B| = |1/(−1.8)| ≈ 0.56 < 1.
Xt = 0.5Xt−1 + εt is a stationary process because |0.5| < 1.

Stationarity of AR (2) Models


Theorem
For the AR(2) model, Xt = φ1Xt−1 + φ2Xt−2 + εt, the necessary and sufficient conditions for stationarity are:
|φ2| < 1
φ1 + φ2 < 1
φ2 − φ1 < 1

Examples (a numerical check follows the list):
Xt = 1.1Xt−1 − 0.4Xt−2 + εt is stationary.
Xt = 0.6Xt−1 − 1.3Xt−2 + εt is not stationary (|φ2| ≮ 1).
Xt = 0.6Xt−1 + 0.8Xt−2 + εt is not stationary (φ1 + φ2 ≮ 1).
Xt = −0.4Xt−1 + 0.7Xt−2 + εt is not stationary (φ2 − φ1 ≮ 1).
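Equivalently, one can inspect the roots of the AR polynomial directly (a minimal sketch for the first and last examples above):

R> Mod(polyroot(c(1,-1.1, 0.4)))   # both moduli approx 1.58 > 1: stationary
R> Mod(polyroot(c(1, 0.4,-0.7)))   # one modulus approx 0.94 < 1: not stationary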

Stationarity of AR (p) Models


Definition
The condition for the Autoregressive process of order p, AR(p): Xt = φ1Xt−1 + . . . + φpXt−p + εt, to be stationary is that the absolute values of the roots of (1 − φ1B − φ2B² − . . . − φpB^p) = 0 must all lie outside the unit circle.

Xt = −0.4Xt−1 + 0.21Xt−2 + εt is a stationary process, as the |roots| of (1 + 0.4B − 0.21B²) = (1 − 0.3B)(1 + 0.7B) = 0 are |1/0.3| ≈ 3.3 > 1 and |1/(−0.7)| ≈ 1.4 > 1.
Xt = 1.7Xt−1 − 0.42Xt−2 + εt is not a stationary process because the absolute value of one of the two roots of (1 − 1.7B + 0.42B²) = (1 − 0.3B)(1 − 1.4B) = 0 lies inside the unit circle: |1/1.4| ≈ 0.71 < 1.
Xt = −0.25Xt−2 + εt is a stationary process because the |roots| of 1 + 0.25B² = 0 are |±2i| = 2 > 1, where i = √−1.

Causality of AR (p) in an Infinite MA (∞) Representation

The AR(p), Φp(B)Xt = εt, where Φp(B) = 1 − Σ_{j=1}^{p} φj B^j and εt ∼ WN(0, σ²), is a causal process if it can be written in an infinite MA representation MA(∞):

Φp(B)−1 Φp(B) Xt = Φp(B)−1 εt

⇒ Xt = Φp(B)−1 εt = Ψ∞(B) εt = Σ_{i=0}^{∞} ψi εt−i,

where Ψ∞(B) = Φp(B)−1 = 1 + Σ_{i=1}^{∞} ψi B^i, and the ψi satisfy Σ_{i=0}^{∞} |ψi| < ∞ with ψ0 = 1. (see the next slide).


AR (p) in an Infinite MA (∞) Representation

Because Ψ∞(B) = Φp(B)−1, the coefficients ψi can be obtained by equating the coefficients in the relation Φp(B)Ψ∞(B) = 1. Thus,

Φp(B)Ψ∞(B) = (1 − φ1B − . . . − φpB^p)(1 + ψ1B + ψ2B² + . . .)
           = 1 + (ψ1 − φ1)B + (ψ2 − φ1ψ1 − φ2)B² + . . .
             + (ψj − φ1ψj−1 − . . . − φpψj−p)B^j + . . .

By equating the coefficients of the various powers B^j in the relation Φp(B)Ψ∞(B) = 1, for j = 1, 2, . . ., we have

ψj = φ1ψj−1 + φ2ψj−2 + . . . + φp−1ψj−p+1 + φpψj−p,

where ψ0 = 1 and ψj = 0, for j < 0.

Example: AR (p) in an Infinite MA (∞) Representation


Example: An example of a stationary AR(2) process:

Xt = 0.1Xt−1 − 0.42Xt−2 + εt,

where p = 2, φ1 = 0.1, φ2 = −0.42, φj = 0 ∀j > p.
The roots of 1 − 0.1B + 0.42B² = 0 are the complex pair B = (0.1 ± i√1.67)/0.84, whose common modulus is |B| = √(1/0.42) ≈ 1.54 > 1. Thus, the process is stationary-causal.
Now the process in a MA(∞) representation is written as follows:

Xt = Σ_{i=0}^{∞} ψi εt−i,

where ψj = φ1ψj−1 + φ2ψj−2 + . . . + φp−1ψj−p+1 + φpψj−p, with ψ0 = 1 and ψj = 0 ∀j < 0. (see the next slide).

Continue with the Previous Example


Use ψj = φ1ψj−1 + φ2ψj−2 + . . . + φpψj−p, where p = 2, φ1 = 0.1, φ2 = −0.42, φj = 0 ∀j > p, ψ0 = 1 and ψj = 0 ∀j < 0:
ψ1 = φ1ψ0 = (0.1)(1) = 0.1
ψ2 = φ1ψ1 + φ2ψ0 = (0.1)(0.1) + (−0.42)(1) = −0.41
ψ3 = φ1ψ2 + φ2ψ1 = (0.1)(−0.41) + (−0.42)(0.1) = −0.083
ψ4 = φ1ψ3 + φ2ψ2 = (0.1)(−0.083) + (−0.42)(−0.41) = 0.1639
. . .
Thus, the AR(2) process in the MA(∞) representation is:
Xt = εt + 0.1εt−1 − 0.41εt−2 − 0.083εt−3 + 0.1639εt−4 + . . .
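R's built-in ARMAtoMA() computes exactly these ψ-weights (a minimal sketch):

R> ARMAtoMA(ar=c(0.1,-0.42), lag.max=4)
R> # 0.1000 -0.4100 -0.0830 0.1639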

Impulse Response Sequence of ARMA (p, q) Models


Recall that the ARMA(p, q) models may be represented as

Φp(B)Xt = Θq(B)εt,

where Φp(B) = 1 − Σ_{i=1}^{p} φi B^i and Θq(B) = 1 + Σ_{j=1}^{q} θj B^j.
Under the stationarity conditions, we have the MA(∞) form

Xt = Ψ∞(B)εt, where Ψ∞(B) = Φp(B)−1 Θq(B) = Σ_{i=0}^{∞} ψi B^i.

Ψ is known as the impulse response sequence. (see next page)

Impulse Response Sequence of ARMA (p, q) Models

Similar to the pure AR model situation, the coefficients ψi can be derived by equating the coefficients of the relation

Φp(B)Ψ∞(B) = Θq(B).

The ψj satisfy

ψj = φ1ψj−1 + φ2ψj−2 + . . . + φpψj−p + θj, j = 1, 2, . . . ,

where ψ0 = 1, ψj = 0 ∀j < 0, and θj = 0 ∀j > q. (see next page)


Impulse Response Sequence of ARMA (p, q) Models


Under the invertibility conditions, we have the convergent causal infinite AR representation

εt = Π∞(B)Xt, where Π∞(B) = Θq(B)−1 Φp(B) = 1 − Σ_{i=1}^{∞} πi B^i.

Similar to the pure MA model situation, the coefficients πi can be determined by equating the coefficients of the relation Θq(B)Π∞(B) = Φp(B). The πj satisfy

πj = −θ1πj−1 − θ2πj−2 − . . . − θqπj−q + φj, j = 1, 2, . . . ,

where π0 = −1, πj = 0 ∀j < 0, and φj = 0 ∀j > p.

ACF of ARMA (1, 1) Models


Consider the causal ARMA(1,1) model:

Xt = φXt−1 + εt + θεt−1, where |φ| < 1. Then

γ(h) = Cov(Xt+h, Xt) = E(Xt+hXt)
     = E[(φXt+h−1 + εt+h + θεt+h−1)Xt]
     = φE(Xt+h−1Xt) + E(εt+hXt) + θE(εt+h−1Xt)   (1)

Recall that Xt = Σ_{j=0}^{∞} ψj εt−j, where the coefficients ψj can be calculated from the relation Ψ∞(B) = Φp(B)−1Θq(B) = (1 + θB)/(1 − φB) (see the impulse response sequence of ARMA(p, q) models), with ψ0 = 1 and ψ1 = φ + θ in the ARMA(1,1) case.

ACF of ARMA (1, 1) Models


E(εt+hXt) = E(εt+h Σ_{j=0}^{∞} ψj εt−j) = Σ_{j=0}^{∞} ψj E(εt+h εt−j)
          = ψ0σ² for h = 0; 0 for h ≥ 1.   (2)

E(εt+h−1Xt) = E(εt+h−1 Σ_{j=0}^{∞} ψj εt−j) = Σ_{j=0}^{∞} ψj E(εt+h−1 εt−j)
            = ψ1σ² for h = 0; ψ0σ² for h = 1; 0 for h ≥ 2.   (3)

Furthermore, ψ0 = 1 and ψ1 = φ + θ.

ACF of ARMA (1, 1) Models


Putting all Equations (1)-(3) together we obtain

γ(h) = φE(Xt+h−1Xt) + E(εt+hXt) + θE(εt+h−1Xt)
     = φγ(1) + σ²(1 + φθ + θ²) for h = 0;
     = φγ(0) + σ²θ for h = 1;
     = φγ(h − 1) for h ≥ 2.

Note that γ(h) = φγ(h − 1), h ≥ 2, has the iterative form

γ(2) = φγ(1), γ(3) = φγ(2) = φ²γ(1), . . . , γ(h) = φ^{h−1}γ(1),

with initial conditions:
γ(0) = φγ(1) + σ²(1 + φθ + θ²)
γ(1) = φγ(0) + σ²θ



ACF of ARMA (1, 1) Models


Solving for γ(0) and γ(1) we obtain

γ(0) = σ²(1 + 2φθ + θ²)/(1 − φ²)
γ(1) = σ²(1 + φθ)(φ + θ)/(1 − φ²)

This gives us

γ(h) = σ²(1 + φθ)(φ + θ)φ^{h−1}/(1 − φ²), for h ≥ 1.

Finally, dividing by γ(0) gives the ACF of the ARMA(1,1):

ρ(h) = (1 + φθ)(φ + θ)φ^{h−1}/(1 + 2φθ + θ²), for h ≥ 1.
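As a sanity check (a minimal sketch with φ = 0.9 and θ = 0.5, matching the simulation on the next slide), the closed form agrees with ARMAacf():

R> phi <- 0.9; theta <- 0.5; h <- 1:5
R> rho <- (1+phi*theta)*(phi+theta)*phi^(h-1)/(1+2*phi*theta+theta^2)
R> round(rho, 3)
R> round(ARMAacf(ar=phi, ma=theta, lag.max=5)[-1], 3)   # drop lag 0; same values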

An ARMA (1,1) Simulated Process - Example

R> set.seed(13675)
R> sim<-arima.sim(n=200,list(order=c(1,0,1),ar=0.9,ma=0.5))
R> par(mfrow=c(1,2))
R> plot(sim,ylab=""); acf(sim,lag.max=85)
R> par(mfrow=c(1,1))
Figure: A simulated ARMA(1,1) process: Xt = 0.9Xt−1 + εt + 0.5εt−1 (time plot and ACF up to lag 85)



ACF for ARMA (p, q) Models


Assume that the ARMA(p, q) model

Φp(B)Xt = Θq(B)εt

is causal-stationary, that is, the roots of Φp(B) are outside the unit circle. Then we can write

Xt = Σ_{j=0}^{∞} ψj εt−j,

where the coefficients ψj can be calculated from the relation Ψ∞(B) = Φp(B)−1Θq(B). It follows that E(Xt) = 0.
Note that Xt and εt+j are uncorrelated for j > 0, and ψj = 0 for j < 0.
As in the ARMA(1,1) case, we can obtain a homogeneous difference equation in terms of γ(h), with some initial conditions, as follows.

ACF for ARMA (p, q) Models


For an ARMA model Xt = Σ_{j=1}^{p} φj Xt−j + Σ_{j=0}^{q} θj εt−j, with θ0 = 1:

γ(h) = Cov(Xt+h, Xt) = E(Xt+hXt)
     = E[(Σ_{j=1}^{p} φj Xt+h−j + Σ_{j=0}^{q} θj εt+h−j) Xt]
     = Σ_{j=1}^{p} φj E[Xt+h−jXt] + Σ_{j=0}^{q} θj E[εt+h−j Σ_{i=0}^{∞} ψi εt−i]
     = Σ_{j=1}^{p} φj γ(h − j) + σ² Σ_{j=h}^{q} θj ψj−h, for h ≥ 0.   (4)

This gives the general homogeneous difference equation for γ(h). (see the next slide!)

ACF for ARMA (p, q) Models

γ(h) − φ1γ(h − 1) − · · · − φpγ(h − p) = 0, for h ≥ max(p, q + 1),

with initial conditions

γ(h) − Σ_{j=1}^{p} φj γ(h − j) = σ² Σ_{j=h}^{q} θj ψj−h, for 0 ≤ h < max(p, q + 1).

Dividing these two equations through by γ(0) allows us to solve for the ACF of the ARMA(p, q) model, ρ(h) = γ(h)/γ(0):

ρ(h) − φ1ρ(h − 1) − · · · − φpρ(h − p) = 0, for h ≥ max(p, q + 1),

with initial conditions

ρ(h) − Σ_{j=1}^{p} φj ρ(h − j) = (σ²/γ(0)) Σ_{j=h}^{q} θj ψj−h, for 0 ≤ h < max(p, q + 1).

ACF of an AR (p) Model


For a causal AR(p), it follows from the previous slide that

ρ(h) − φ1ρ(h − 1) − · · · − φpρ(h − p) = 0, for h ≥ p,

with initial conditions

ρ(0) − φ1ρ(−1) − φ2ρ(−2) − · · · − φpρ(−p) = σ²/γ(0)
ρ(1) − φ1ρ(0) − φ2ρ(−1) − · · · − φpρ(1 − p) = 0
ρ(2) − φ1ρ(1) − φ2ρ(0) − · · · − φpρ(2 − p) = 0
. . .
ρ(p − 1) − φ1ρ(p − 2) − φ2ρ(p − 3) − · · · − φpρ(−1) = 0
ρ(p) − φ1ρ(p − 1) − φ2ρ(p − 2) − · · · − φpρ(0) = 0

where ρ(−h) = ρ(h), for h = 1, 2, . . . , p.



Example - ACF of an AR (2) Model


Consider the case AR(2). Then

ρ(h) − φ1ρ(h − 1) − φ2ρ(h − 2) = 0, for h ≥ 2,

with initial conditions

ρ(0) − φ1ρ(−1) − φ2ρ(−2) = σ²/γ(0)   (what is the value of γ(0)?)
ρ(1) − φ1ρ(0) − φ2ρ(−1) = 0
ρ(2) − φ1ρ(1) − φ2ρ(0) = 0   (you may neglect this condition!)

where ρ(−h) = ρ(h), ∀h. Hence,

ρ(0) = 1
ρ(1) = φ1/(1 − φ2)
ρ(h) = φ1ρ(h − 1) + φ2ρ(h − 2), for h ≥ 2.

Cramer’s Rule

Definition
Consider the system of n linear equations for n unknowns, represented in matrix multiplication form: AX = b, where the n × n matrix A has a nonzero determinant, and X = (x1, x2, . . . , xn)T is the column vector of the unknown variables. Then the system has a unique solution, whose individual values for the unknowns are given by

xi = det(Ai)/det(A), i = 1, 2, . . . , n,

where Ai is the matrix formed by replacing the i-th column of A by the column vector b.


Partial Autocorrelation Function (PACF )


The partial autocorrelation function (PACF), φh,h, is a useful tool for identifying the order of an AR(p) process.
The partial autocorrelation measures the correlation between two random variables Xt and Xt+h at lag h after removing the linear dependence of Xt+1 through Xt+h−1; the PACF thus represents the sequence of conditional correlations:
φh,h = Corr(Xt, Xt+h | Xt+1, . . . , Xt+h−1), h = 1, 2, . . .
(see the next slide for the definition of the conditional correlation!)
The autocorrelation function (ACF) between two random variables Xt and Xt+h at lag h does not adjust for the influence of the intervening lags; the ACF thus represents the sequence of unconditional correlations.

Partial Autocorrelation Function (PACF )

φh,h = Corr(Xt, Xt+h | Xt+1, . . . , Xt+h−1)
     = Cov[(Xt | Xt+1, . . . , Xt+h−1), (Xt+h | Xt+1, . . . , Xt+h−1)] / [√Var(Xt | Xt+1, . . . , Xt+h−1) √Var(Xt+h | Xt+1, . . . , Xt+h−1)]
     = Cov[(Xt − X̂t), (Xt+h − X̂t+h)] / [√Var(Xt − X̂t) √Var(Xt+h − X̂t+h)],

where
X̂t = α1Xt+1 + α2Xt+2 + . . . + αh−1Xt+h−1,
X̂t+h = β1Xt+1 + β2Xt+2 + . . . + βh−1Xt+h−1,
and αi, βi (1 ≤ i ≤ h − 1) are the mean squared linear regression coefficients obtained by minimizing E(Xt − X̂t)² and E(Xt+h − X̂t+h)² respectively.

Yule-Walker Equations and PACF for AR (p) Process


The Yule-Walker equations can be used to derive the partial autocorrelation coefficients at lags 1, 2, . . . , h as follows:
1. Fit the regression model in which the dependent variable Xt, from a zero-mean stationary process, is regressed on the h lagged variables Xt−1, Xt−2, . . . , Xt−h, i.e.,
Xt = φh,1Xt−1 + φh,2Xt−2 + . . . + φh,hXt−h + εt,
where φh,h denotes the h-th regression parameter and εt is an error term with mean 0 and uncorrelated with Xt−h, for h ≠ 0.
2. Multiply this equation by Xt−1, take expectations, and divide the result by the variance of Xt. Do the same operation with Xt−2, Xt−3, . . . , Xt−h successively to get the following set of h Yule-Walker equations. (see the next slide)!
The Yule-Walker equations are a technique that can be used to estimate the autoregression parameters of the AR(h) model Xt = Σ_{i=1}^{h} φi Xt−i + εt from data.

Yule-Walker Equations and PACF for AR (p) Process


The h × h Yule-Walker equations associated with the AR(h) model:

ρ1 = φh,1 + φh,2ρ1 + φh,3ρ2 + . . . + φh,hρh−1
ρ2 = φh,1ρ1 + φh,2 + φh,3ρ1 + . . . + φh,hρh−2
. . .
ρh = φh,1ρh−1 + φh,2ρh−2 + . . . + φh,h−1ρ1 + φh,h

These may be represented in matrix form as AX = b:

| 1     ρ1    ρ2    . . .  ρh−1 |  | φh,1 |   | ρ1 |
| ρ1    1     ρ1    . . .  ρh−2 |  | φh,2 |   | ρ2 |
| ρ2    ρ1    1     . . .  ρh−3 |  | φh,3 | = | ρ3 |
| . . .                         |  | . . .|   | . . .|
| ρh−1  ρh−2  . . .  ρ1    1    |  | φh,h |   | ρh |

X is the column vector of variables (φh,1, φh,2, . . . , φh,h)T and b is the column vector (ρ1, ρ2, . . . , ρh)T (see the next slide)!
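In R this Toeplitz system can be solved directly with solve() and toeplitz() (a minimal sketch; here the ρ's are those of an AR(1) with φ = 0.6, so the solution is (0.6, 0, 0) and in particular φ3,3 = 0):

R> rho <- 0.6^(1:3)                 # rho_1, rho_2, rho_3 of an AR(1), phi = 0.6
R> A <- toeplitz(c(1, rho[1:2]))    # the 3 x 3 autocorrelation matrix
R> solve(A, rho)                    # (phi_{3,1}, phi_{3,2}, phi_{3,3}) = (0.6, 0, 0)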

Yule-Walker Equations and PACF for AR (p) Process


Use Cramer's rule successively for h = 1, 2, . . . to get (rows of each determinant separated by semicolons):

φ1,1 = ρ1

φ2,2 = det[1 ρ1; ρ1 ρ2] / det[1 ρ1; ρ1 1] = (ρ2 − ρ1²)/(1 − ρ1²)

φ3,3 = det[1 ρ1 ρ1; ρ1 1 ρ2; ρ2 ρ1 ρ3] / det[1 ρ1 ρ2; ρ1 1 ρ1; ρ2 ρ1 1]
     = [ρ3(1 − ρ1²) + ρ1ρ2(ρ2 − 2) + ρ1³] / [1 − 2ρ1² + 2ρ1²ρ2 − ρ2²]

. . . (see next page)!

Yule-Walker Equations and PACF for AR (p) Process


In general, φh,h is the ratio of two h × h determinants: the denominator is the Toeplitz matrix of autocorrelations with first row (1, ρ1, ρ2, . . . , ρh−1), and the numerator is the same matrix with its last column replaced by (ρ1, ρ2, . . . , ρh)T:

φh,h = det[1 ρ1 . . . ρh−2 ρ1; ρ1 1 . . . ρh−3 ρ2; . . . ; ρh−1 ρh−2 . . . ρ1 ρh]
       / det[1 ρ1 . . . ρh−2 ρh−1; ρ1 1 . . . ρh−3 ρh−2; . . . ; ρh−1 ρh−2 . . . ρ1 1]

A Numerical Example: How to Calculate PACF


Suppose that ρ1 = 0.7, ρ2 = 0.5, and ρ3 = 0.2. Then

φ1,1 = ρ1 = 0.7

φ2,2 = det[1 0.7; 0.7 0.5] / det[1 0.7; 0.7 1] = (0.5 − 0.49)/(1 − 0.49) ≈ 0.0196

φ3,3 = det[1 0.7 0.7; 0.7 1 0.5; 0.5 0.7 0.2] / det[1 0.7 0.5; 0.7 1 0.7; 0.5 0.7 1]
     = (0.102 − 0.525 + 0.343)/(1 − 0.98 + 0.49 − 0.25) = −0.08/0.26 ≈ −0.308
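The same value drops out of solving the Yule-Walker system numerically, as on the earlier slide (a minimal sketch):

R> rho <- c(0.7, 0.5, 0.2)
R> solve(toeplitz(c(1, rho[1:2])), rho)[3]   # phi_{3,3}, approx -0.308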

Remarks

The partial autocorrelation function φh,h is a function of the autocorrelations ρ1, . . . , ρh. Thus, −1 ≤ φh,h ≤ 1 ∀h > 0.
If {εt} is a white noise process, then the partial autocorrelation function φh,h = 0 for all h ≠ 0, whereas φ0,0 = ρ0 = 1.
If the underlying process is AR(p), then φh,h = 0 ∀h > p, so the plot of the PACF should show a cutoff after lag p (see the next slide!).
Replacing ρh (population autocorrelations) by ρ̂h (sample autocorrelations) ∀h gives the sample PACF φ̂h,h (see the Levinson-Durbin recursion method).

150 of 189

PACF for Simulated AR (1) and AR (2) Process


[Figure: sample PACF plots of four simulated series — AR(1) with phi = 0.7, AR(1) with phi = −0.7, AR(2) with phi1 = 0.4 and phi2 = 0.5, and AR(2) with phi1 = 1.1 and phi2 = −0.4 — each plotted against lag; in every panel the PACF cuts off after lag p.]
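Plots of this kind can be reproduced with a few lines of R using arima.sim() and pacf(); in the sketch below, the φ values come from the panel titles, while the series length n = 500 and the seed are our own choices.

set.seed(1)
par(mfrow = c(2, 2))
pacf(arima.sim(list(ar = 0.7),  n = 500), main = "AR(1), phi = 0.7")
pacf(arima.sim(list(ar = -0.7), n = 500), main = "AR(1), phi = -0.7")
pacf(arima.sim(list(ar = c(0.4, 0.5)),  n = 500),
     main = "AR(2), phi1 = 0.4 and phi2 = 0.5")
pacf(arima.sim(list(ar = c(1.1, -0.4)), n = 500),
     main = "AR(2), phi1 = 1.1 and phi2 = -0.4")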

Sample PACF : Levinson-Durbin Recursive Method


In practice the sample PACF is obtained by the Levinson-Durbin recursion method, starting with φ̂1,1 = ρ̂1 , as follows:

$$
\hat{\phi}_{h+1,h+1} = \frac{\hat{\rho}_{h+1} - \sum_{j=1}^{h} \hat{\phi}_{h,j}\,\hat{\rho}_{h+1-j}}{1 - \sum_{j=1}^{h} \hat{\phi}_{h,j}\,\hat{\rho}_j},
$$

and $\hat{\phi}_{h+1,j} = \hat{\phi}_{h,j} - \hat{\phi}_{h+1,h+1}\,\hat{\phi}_{h,h+1-j}$, for j = 1, 2, . . . , h. Hence

$$
\hat{\phi}_{1,1} = \hat{\rho}_1, \qquad
\hat{\phi}_{2,2} = \frac{\hat{\rho}_2 - \hat{\rho}_1^2}{1 - \hat{\rho}_1^2}, \qquad
\hat{\phi}_{2,1} = \hat{\phi}_{1,1} - \hat{\phi}_{2,2}\hat{\phi}_{1,1},
$$

$$
\hat{\phi}_{3,3} = \frac{\hat{\rho}_3 - \hat{\phi}_{2,1}\hat{\rho}_2 - \hat{\phi}_{2,2}\hat{\rho}_1}{1 - \hat{\phi}_{2,1}\hat{\rho}_1 - \hat{\phi}_{2,2}\hat{\rho}_2}, \qquad
\hat{\phi}_{3,2} = \hat{\phi}_{2,2} - \hat{\phi}_{3,3}\hat{\phi}_{2,1}, \qquad
\hat{\phi}_{3,1} = \hat{\phi}_{2,1} - \hat{\phi}_{3,3}\hat{\phi}_{2,2},
$$

and the remaining φ̂h,h and φ̂h,j can be calculated similarly.
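A minimal R sketch of this recursion (ld_pacf() is our own illustrative helper name, not a built-in function; it takes the sample autocorrelations ρ̂1 , . . . , ρ̂H and returns φ̂1,1 , . . . , φ̂H,H):

## Levinson-Durbin recursion: rho holds (rho.hat_1, ..., rho.hat_H).
ld_pacf <- function(rho) {
  H <- length(rho)
  phi <- matrix(0, H, H)              # phi[h, j] holds phi.hat_{h,j}
  phi[1, 1] <- rho[1]
  if (H > 1) for (h in 1:(H - 1)) {
    phi[h + 1, h + 1] <- (rho[h + 1] - sum(phi[h, 1:h] * rho[h:1])) /
                         (1 - sum(phi[h, 1:h] * rho[1:h]))
    phi[h + 1, 1:h]   <- phi[h, 1:h] - phi[h + 1, h + 1] * phi[h, h:1]
  }
  diag(phi)                           # the PACF values phi.hat_{h,h}
}
round(ld_pacf(c(0.7, 0.5, 0.2)), 4)   # matches the earlier example
## [1]  0.7000  0.0196 -0.3077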

Levinson-Durbin Algorithm for PACF in AR Models


For AR (1),
$$\hat{\phi}_{1,1} = \hat{\rho}_1, \qquad \hat{\phi}_{h,h} = 0 \ \ \forall h > 1.$$
For AR (2),
$$\hat{\phi}_{1,1} = \hat{\rho}_1, \qquad \hat{\phi}_{2,2} = \frac{\hat{\rho}_2 - \hat{\rho}_1^2}{1 - \hat{\rho}_1^2}, \qquad \hat{\phi}_{2,1} = \hat{\phi}_{1,1} - \hat{\phi}_{2,2}\hat{\phi}_{1,1}, \qquad \hat{\phi}_{h,h} = 0 \ \ \forall h > 2.$$
For AR (3),
$$\hat{\phi}_{1,1} = \hat{\rho}_1, \qquad \hat{\phi}_{2,2} = \frac{\hat{\rho}_2 - \hat{\rho}_1^2}{1 - \hat{\rho}_1^2}, \qquad \hat{\phi}_{2,1} = \hat{\phi}_{1,1} - \hat{\phi}_{2,2}\hat{\phi}_{1,1},$$
$$\hat{\phi}_{3,3} = \frac{\hat{\rho}_3 - \hat{\phi}_{2,1}\hat{\rho}_2 - \hat{\phi}_{2,2}\hat{\rho}_1}{1 - \hat{\phi}_{2,1}\hat{\rho}_1 - \hat{\phi}_{2,2}\hat{\rho}_2}, \qquad \hat{\phi}_{3,2} = \hat{\phi}_{2,2} - \hat{\phi}_{3,3}\hat{\phi}_{2,1}, \qquad \hat{\phi}_{3,1} = \hat{\phi}_{2,1} - \hat{\phi}_{3,3}\hat{\phi}_{2,2},$$
and φ̂h,h = 0 ∀h > 3.

A Numerical Example: How to Calculate PACF


Suppose that ρ̂1 = −0.188, ρ̂2 = −0.201, and ρ̂3 = 0.181. Then

$$\hat{\phi}_{1,1} = \hat{\rho}_1 = -0.188,$$
$$\hat{\phi}_{2,2} = \frac{\hat{\rho}_2 - \hat{\rho}_1^2}{1 - \hat{\rho}_1^2} = \frac{-0.201 - (-0.188)^2}{1 - (-0.188)^2} = -0.245,$$
$$\hat{\phi}_{2,1} = \hat{\phi}_{1,1} - \hat{\phi}_{2,2}\hat{\phi}_{1,1} = -0.188 - (-0.245)(-0.188) = -0.234,$$
$$\hat{\phi}_{3,3} = \frac{\hat{\rho}_3 - \hat{\phi}_{2,1}\hat{\rho}_2 - \hat{\phi}_{2,2}\hat{\rho}_1}{1 - \hat{\phi}_{2,1}\hat{\rho}_1 - \hat{\phi}_{2,2}\hat{\rho}_2} = \frac{0.181 - (-0.234)(-0.201) - (-0.245)(-0.188)}{1 - (-0.234)(-0.188) - (-0.245)(-0.201)} = 0.097,$$
$$\hat{\phi}_{3,2} = \hat{\phi}_{2,2} - \hat{\phi}_{3,3}\hat{\phi}_{2,1} = -0.245 - (0.097)(-0.234) = -0.222,$$
$$\hat{\phi}_{3,1} = \hat{\phi}_{2,1} - \hat{\phi}_{3,3}\hat{\phi}_{2,2} = -0.234 - (0.097)(-0.245) = -0.210.$$
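With the ld_pacf() helper sketched on the Levinson-Durbin slide above, the same values drop out directly:

round(ld_pacf(c(-0.188, -0.201, 0.181)), 3)
## [1] -0.188 -0.245  0.097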

The PACF of Invertible MA and ARMA models


For an MA (1), Xt = εt + θεt−1 , |θ| < 1, one can show that
$$
\phi_{h,h} = \frac{-(-\theta)^h (1 - \theta^2)}{1 - \theta^{2(h+1)}}, \qquad h \ge 1
$$
(a numerical check is sketched after this list).
For an invertible MA (q), no finite AR representation exists; hence the PACF never cuts off, in contrast to the AR (p) case.
Because an invertible ARMA model has an infinite AR representation, its PACF tails off (does not cut off).
The PACF of MA models behaves much like the ACF of AR models (it tails off).
Likewise, the PACF of AR models behaves much like the ACF of MA models: they cut off after lag p and lag q for the AR (p) and MA (q) models, respectively.
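The closed-form MA (1) PACF above can be checked against the built-in stats::ARMAacf() function; θ = 0.5 is an arbitrary illustrative value:

theta <- 0.5
h <- 1:10
closed.form <- -(-theta)^h * (1 - theta^2) / (1 - theta^(2 * (h + 1)))
pacf.ma1 <- ARMAacf(ma = theta, lag.max = 10, pacf = TRUE)
round(cbind(closed.form, pacf.ma1), 4)   # the two columns agree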

Testing Randomness Based on Individual Autocorrelations


To determine whether the values of the ACF or the PACF are negligible, we can use the approximation that both have standard deviation ≈ 1/√n.
At the 5% significance level, the approximate confidence bounds are ±1.96/√n ≈ ±2/√n; values outside this range can be regarded as significant.
In R, the acf() and pacf() functions compute and plot the sample ACF and the sample PACF respectively, with the confidence bounds shown as blue dotted lines.
Testing randomness based on individual ACF or PACF values is misleading because ρ̂(h), h = 1, 2, . . ., are not independent; this may produce a large number of ρ̂(h) values exceeding ±2/√n even if the underlying time series is a white noise series (a small simulation sketch follows below).
We prefer the portmanteau test statistic for testing randomness (as we will see later).

Plot of Autocorrelation & Partial Autocorrelation Functions


Consider the regular time series data available from R with the name lh. The data consist of 48 samples giving the luteinizing hormone level in blood samples taken at 10-minute intervals from a human female.
R> par(mfrow=c(1,2))
R> acf(lh) ##Try this code: acf(lh,plot=FALSE)
R> pacf(lh) ##Try this code: pacf(lh,plot=FALSE)
[Figure: sample ACF (left panel) and sample PACF (right panel) of the lh series, plotted against lag with the confidence bounds shown as dashed lines.]

Random Walk Process

Definition
The process Xt = Xt−1 + εt , where εt is a white noise WN(0, σ²) process for all t ≥ 1 (i.e., an AR (1) process with φ = 1), is called a random walk (unit-root) process.
In words, the value of X in period t is equal to its value in period t − 1, plus a random step due to the white-noise shock εt .

Random Walk Process is Not Stationary


The process Xt = Xt−1 + εt is not stationary, as seen below. Consider X0 = a, where a is a constant. Then

$$
\begin{aligned}
X_1 &= X_0 + \epsilon_1 = a + \epsilon_1\\
X_2 &= X_1 + \epsilon_2 = a + \epsilon_1 + \epsilon_2\\
X_3 &= X_2 + \epsilon_3 = a + \epsilon_1 + \epsilon_2 + \epsilon_3\\
&\ \ \vdots\\
X_t &= X_{t-1} + \epsilon_t = a + \sum_{i=1}^{t} \epsilon_i
\end{aligned}
$$

Although the expected value E(Xt ) = a is independent of time, the autocovariance and autocorrelation are functions of time (see the next slide!)

Random Walk Process is Not Stationary

The autocovariance function is:

$$
\gamma_X(h) = \mathrm{Cov}(X_t, X_{t+h}) = \mathrm{Cov}\Big(a + \sum_{i=1}^{t} \epsilon_i,\; a + \sum_{j=1}^{t+h} \epsilon_j\Big) = t\sigma^2.
$$

The autocorrelation function is:

$$
\rho_X(h) = \frac{\mathrm{Cov}(X_t, X_{t+h})}{\sqrt{\mathrm{Var}(X_t)\,\mathrm{Var}(X_{t+h})}} = \frac{t\sigma^2}{\sqrt{t\sigma^2 (t+h)\sigma^2}} = \frac{1}{\sqrt{1 + h/t}}.
$$

Note that for large t with h considerably less than t, ρX (h) is nearly 1; hence the correlogram for a random walk is characterised by positive autocorrelations with very slow decay down from unity.

Test for Non-stationarity: Dickey-Fuller Test Statistics

For a simple AR (1) model Xt = φXt−1 + εt , a unit root is present if φ = 1, in which case the model is non-stationary.
The regression model can be written as:

$$\Delta X_t = (\phi - 1)X_{t-1} + \epsilon_t = \delta X_{t-1} + \epsilon_t,$$

where Δ is the first-difference operator.
This model can be estimated, and testing for a unit root is equivalent to testing:

$$H_0: \delta = 0 \ (\text{or } \phi = 1) \quad \text{vs.} \quad H_1: \delta < 0 \ (\text{or } \phi < 1).$$

The test is a one-sided left-tail test.

Dickey-Fuller Unit Root Test Statistics

The Dickey-Fuller test statistics were derived (as a modified t-distribution) to test whether a unit root is present in an AR model.
There are 3 versions of the Dickey-Fuller unit root test:
1 Test for a unit root without drift (constant) and without trend:
ΔXt = δXt−1 + εt .
2 Test for a unit root with drift only (no trend):
ΔXt = a0 + δXt−1 + εt .
3 Test for a unit root with drift and deterministic time trend:
ΔXt = a0 + a1 t + δXt−1 + εt .

Test for Non-stationarity: Dickey-Fuller Test Statistic


If Xt is stationary (i.e., |φ| < 1), then it can be shown that
$$\hat{\phi} \sim N\!\left(\phi, \tfrac{1}{n}(1 - \phi^2)\right).$$
Under the null hypothesis this result gives φ̂ ∼ N (1, 0), which clearly does not make any sense.
Phillips (1987) showed that the sample moments of Xt converge to random functions of Brownian motion, so that
$$n(\hat{\phi} - 1) \xrightarrow{d} \frac{\int_0^1 W(r)\,dW(r)}{\int_0^1 W(r)^2\,dr}.$$
Thus φ̂ is not asymptotically normally distributed; consequently, the critical values of the test were computed by simulation (see the sketch below).
We reject H0 if n(φ̂ − 1) is ≤ the tabulated critical value.
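A rough Monte Carlo sketch of how such critical values can be tabulated (the sample size, replication count, and seed are arbitrary choices of ours):

set.seed(1)
n <- 100; nrep <- 10000
stat <- replicate(nrep, {
  x <- cumsum(rnorm(n))                         # a random walk under H0
  phi.hat <- sum(x[-n] * x[-1]) / sum(x[-n]^2)  # OLS slope, no intercept
  n * (phi.hat - 1)                             # the Dickey-Fuller statistic
})
quantile(stat, c(0.01, 0.05, 0.10))   # compare with the tabulated values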

Critical Values for the Dickey-Fuller Unit Root Statistics

[Table of simulated critical values for n(φ̂ − 1), not recoverable from the text extraction; the entry used in the later example, n = 25 at α = 0.05, is −1.95.]

Test for Unit Roots in ARMA Models: Augmented Dickey-Fuller Test

The augmented Dickey-Fuller (ADF ) test tests the null hypothesis that a time series Xt is not stationary against the alternative that it is stationary, assuming that the dynamics in the data have an ARMA structure.
In R, the ADF tests are implemented in the functions:
1 The function adf.test() in the package tseries.
2 The function nsdiffs() in the package forecast.
3 The function ur.df() in the package urca.
4 adfTest() and urdfTest() in the package fUnitRoots.
In each case, we reject the null hypothesis if the p-value is less than or equal to the significance level (usually α = 0.05).

Transform a Non-stationary Series to a Stationary Series

We can eliminate the non-stationarity in a random walk process by taking the first difference of Xt . Thus Yt = ΔXt = Xt − Xt−1 = εt is a stationary white noise process.
To achieve stationarity in the presence of d unit roots, we must apply d differences to Xt : Yt = Δᵈ Xt = (1 − B)ᵈ Xt .
Note that a random walk often provides a good fit to data with stochastic trends, such as stock price changes, although even better fits are usually obtained from more general model formulations, such as the ARIMA models.

Simulated Random Walk Series

R> set.seed(348)
R> n<-100; epsilon<-rnorm(n); x<-epsilon[1]
R> for (i in 1:(n-1)){x[i+1] <- x[i]+epsilon[i+1]}
R> par(mfrow=c(1,4)) ## Try x <- cumsum(epsilon)
R> plot(ts(x),ylab="X(t)",main="Random walk series")
R> acf(ts(x), main="Autocorrelation function")
R> ts.plot(diff(x),main="First difference")
R> acf(ts(diff(x)),main="Autocorrelation-first difference")
[Figure: four panels — "Random walk series", "Autocorrelation function", "First difference", and "Autocorrelation-first difference" — showing the slowly decaying ACF of the random walk and the white-noise-like ACF of its first difference.]

Example: Test for Unit root in an AR (1) Model

Example
Consider the fitted AR (1) model X̂t = 0.946Xt−1 , where n = 34.
The Dickey-Fuller test statistic is

n(φ̂ − 1) = 34(0.946 − 1) = −1.836.

At the significance level α = 0.05, the tabulated critical value associated with n = 25 is −1.95.
Since the value of the test statistic is not less than or equal to the critical value, we do not reject H0 ; hence, a unit root exists.
We need to take a difference to transform the series to stationarity before a model can be estimated for it.

Test for a unit root with drift and deterministic time trend

R> set.seed(348);n<-100;epsilon<-rnorm(n);x<-epsilon[1]
R> x <-cumsum(epsilon);plot(ts(x),ylab="X(t)")
R> library("tseries");adf.test(x)
Augmented Dickey-Fuller Test

data: x
Dickey-Fuller = -1.849, Lag order = 4, p-value = 0.6392
alternative hypothesis: stationary
[Figure: time plot of the simulated random walk X(t) for t = 0, . . . , 100.]

List of Some Useful R Functions

R-Function Description
read.table() reads data into a data frame.
scan() reads data into a vector or list.
attach() makes the names of column variables available.
ts() produces a time series object.
ts.plot() produces a time plot for one or more series.
points() adds a sequence of points at specified coordinates.
window() extracts a subset of a time series.
aggregate() creates an aggregated series.
time() extracts the time from a time series object.
decompose() decomposes a series into the components trend, seasonal effect, and residual.
summary() summarises an R object.
mean() returns the mean (average).
var() returns the variance with denominator n − 1.
sd() returns the standard deviation.
polyroot() finds zeros of a real or complex polynomial.
Mod() returns the modulus of a complex number, used to check whether the roots of a polynomial have modulus > 1 or not.
adf.test() computes the Augmented Dickey-Fuller test for the null that x has a unit root (tseries package).
nsdiffs() number of differences required for a stationary series; if test="adf", the Augmented Dickey-Fuller test is used (forecast package).
ur.df() performs the augmented Dickey-Fuller unit root test (urca package).
adfTest() augmented Dickey-Fuller test for unit roots (fUnitRoots package).
urdfTest() augmented Dickey-Fuller test for unit roots (fUnitRoots package).


List of Some Useful R Functions

R-Function Description
str() compactly displays the structure of an arbitrary R object.
cov() returns the covariance with denominator n − 1.
cor() returns the correlation.
acf() returns the correlogram (or set the type argument to obtain the autocovariance function).
pacf() returns the partial autocorrelation function.
lm() fits linear models (least squares fit).
predict() forecasts future values.
forecast() forecasting time series (forecast package).
HoltWinters() estimates the parameters of the Holt-Winters or exponential smoothing model.
coef() extracts the coefficients of a fitted model.
resid() extracts the residuals from a fitted model.
diff() returns suitably lagged and iterated differences.
round() rounds the values in its first argument to the specified number of decimal places.
set.seed() provides a seed for simulations, ensuring that the simulations can be reproduced.
par(mfrow=c(i, j)) sets the graphical device to insert i × j pictures on one plot.
ar() fits an autoregressive time series model to the data, by default selecting the complexity by AIC .
arima() fits an ARIMA model to a univariate time series.
auto.arima() returns the best ARIMA model according to either the AIC , AICc or BIC value (forecast package).
arima.sim() simulates from an ARIMA model.
varima.sim() simulates data from seasonal/nonseasonal ARIMA models (portes package).

List of Some Useful R Functions

R-Function Description
AIC() Akaike information criterion for model selection.
BIC() Bayesian information criterion for model selection.
Box.test() computes the Box-Pierce or Ljung-Box portmanteau tests.
BoxPierce() computes the univariate or multivariate Box-Pierce portmanteau test (portes package).
LjungBox() computes the univariate or multivariate Ljung-Box portmanteau test (portes package).
MahdiMcLeod() computes the univariate Peña-Rodríguez or multivariate Mahdi-McLeod portmanteau test (portes package).
tsdiag() produces diagnostic plots for time series fits.
qqplot() produces a Q-Q plot of two datasets.
qqnorm() produces a normal Q-Q plot of points.
qqline() draws the diagonal line for normal Q-Q plots produced by qqnorm().
shapiro.test() performs the Shapiro-Wilk test of normality.
layout() divides the device into as many rows and columns as there are in matrix mat, with the column widths and row heights specified in the respective arguments.
boxplot() produces box-and-whisker plot(s) of the given (grouped) values.
start() extracts and encodes the time the first observation was taken.
end() extracts and encodes the time the last observation was taken.
frequency() returns the number of samples per unit time.
polyroot() finds zeros of polynomials and roots of the characteristic equation to check for stationarity.
det() calculates the determinant of a matrix.
matrix() creates a matrix from a given set of values.

Homework

Install R on your own PC (laptop).
Install the following R packages: tseries, forecast, TSA, zoo, astsa, timeSeries, portes, sarima, MASS, lattice, nlme, MTS, vars, mvtnorm, fracdiff, sspir, akima, fGarch, FitAR, FGN.
Download (as txt files) all data sets needed for this course, create a "ts" object for each, and plot and explain each one:
http://www.stat.pitt.edu/stoffer/tsa4/,
http://staff.elena.aut.ac.nz/Paul-Cowpertwait/ts/,
and http://astro.temple.edu/~wwei/data.html
Read the summarized time series analysis CRAN task view:
https://cran.r-project.org/web/views/TimeSeries.html.

Homework
For the macroeconomic Longley's Economic Regression Data with the name longley available from the R package datasets:
1 Import the data into R.
2 Use the summary() function with this data and explain the output.
3 Calculate the correlation coefficient between the number of unemployed (Unemployed) and the population size (Population) for those who are ≥ 14 years of age.
4 Plot Unemployed (Y-axis) versus Population (X-axis) using the function plot().
5 Plot Unemployed (Y-axis) as time series data versus Year (X-axis).
6 Fit the linear regression using R with the lm() function, where Unemployed is the dependent variable.
7 Interpret the results you get from 6.

Homework

The data jj (quarterly earnings per share for 1960Q1 to 1980Q4 of the U.S. company Johnson & Johnson, Inc.) is available from the R package astsa.
1 Compute the base trend using centered moving averages.
2 Use the result of 1 to estimate the normalized seasonal indices.
3 Deseasonalize the series.
4 Fit a regression line to the deseasonalized observations.
5 Use the trend to make deseasonalized predictions.
6 Reseasonalize the predictions and plot the forecasts.

Homework
The data with the name w6, available from the link http://astro.temple.edu/~wwei/data.html, represent the yearly US tobacco production from 1871 to 1984 (in millions of pounds).
1 Import the data into R using the scan() R function.
2 Plot the data as a time series object.
3 Estimate the trend line using the least squares method.
4 Compute the mean squares of the residuals.
5 Plot the forecast 4 years ahead.
6 Smooth the series using a three-year moving average.
7 Smooth the series using a four-year moving average.
8 Use the decompose() R function to estimate trends and seasonal effects. Explain!

Homework

For the data with the name w1 available from the link http://astro.temple.edu/~wwei/data.html:
1 Use the exponential smoothing method with four different smoothing parameters α = 0.2, 0.4, 0.5 and 0.8 to forecast 1-day ahead.
2 What is the optimal value of the smoothing parameter α?
3 Use the R function HoltWinters() to find the exponential smoothing model for the above values of α.
4 Plot the original series together with the smoothed series.

Homework
Compute and plot the values of γ̂X (h), ρ̂X (h), and φ̂h,h for lag
h = 0, 1, 2, 3 and 4. Check your results using R.
t Xt t Xt
1 2 11 1
2 1 12 1
3 4 13 4
4 3 14 3
5 3 15 5
6 5 16 6
7 2 17 4
8 1 18 2
9 0 19 5
10 3 20 3

Homework

Which of the following time sequences are stationary (weakly stationary)?
Xt = a + bt + εt , where a and b are constants and εt ∼ i.i.d.(0, 1) for all t ∈ R.
Xt = εt + t εt−1 , εt ∼ i.i.d.(0, 1) for all t ∈ R.
Xt = (−1)ᵗ A, where A is a random variable with a zero mean and a unit variance.
Xt = cos(t + Y ), where Y is a random variable with a Uniform(0, 2π) distribution and t ∈ R.
Xt = U sin(2πt) + V cos(2πt), where U and V are independent random variables, each with mean 0 and variance 1.

Homework

1 Classify the following AR models (that is, state if they are AR (1), or AR (2), etc.), determine the mean of each model, and rewrite each one using the backshift operator.
Yt − 0.17Yt−1 + 0.19Yt−2 = εt
Zn+1 = 77 − 0.55Zn − 0.24Zn−1 + 0.19Zn−2 + εn+1
At + 0.2At−1 + 0.035At−2 − 0.13At−3 = 5 + εt
2 Classify the following MA models (that is, state if they are MA (1), or MA (2), etc.), determine the mean of each model, and rewrite each one using the backshift operator.
Bt+1 = 10 + εt+1 − 0.06εt + 0.35εt−1
Qt − εt − 0.29εt−1 + 0.19εt−2 + 0.61εt−3 − 0.26εt−4 = 0
Wt − εt + 0.4εt−1 + 0.25εt−2 − 0.2εt−3 = 9

Homework

1 Find the autocovariance and autocorrelation functions of the time series models:
Xt = 0.45Xt−1 − 0.2Xt−2 + εt , where εt ∼ (0, 9)
Xt = εt + 0.2εt−1 − 0.1εt−2 , where εt ∼ (0, 9)
2 Derive γX (h) and ρX (h) for the ARMA (1,1).

Homework
1 Classify the following ARMA models (that is, state if they are ARMA (1,1), or ARMA (1,2), etc.), determine the mean of each model, rewrite each one using the backshift operator, and finally convert each model into a pure AR model and into a pure MA model.
Yt − 0.17Yt−1 + 0.19Yt−2 = 12 + εt + 0.2εt−1
Zn+1 = 0.5Zn − 0.2Zn−1 + εn+1 − 0.6εn − 0.18εn−1 + 0.1εn−2
At + 0.2At−1 + 0.04At−2 = εt + 0.4εt−1 + 0.3εt−2 − 0.2εt−3
2 Find the autocovariance and autocorrelation functions of the time series models:
Xt = 0.45Xt−1 − 0.2Xt−2 + εt , where εt ∼ (0, 9)
Xt = εt + 0.2εt−1 − 0.1εt−2 , where εt ∼ (0, 9)

Homework

1 Which of the following MA (q) processes are invertible? Represent each invertible process in an AR (∞) form.
Xt = (1 + B)εt
Xt = εt − 1.3εt−1 + 0.36εt−2
Xt = εt + 1.6εt−1 − 0.36εt−2
Xt = (1 + 0.8B − 0.6B^2)εt
2 Which of the following AR (p) processes are stationary? Represent each stationary process in an MA (∞) form.
Xt = 1.3Xt−1 + εt
(1 + 0.5B)Xt = εt
Xt = 1.25Xt−1 − 0.25Xt−2 + εt
(1 − 1.5B + 0.56B^2)Xt = εt
(1 − 0.2B + 0.8B^2)Xt = εt

Homework

1 Which of the following ARMA (p, q) processes are invertible and stationary? Represent each invertible process in an AR (∞) form and each stationary process in an MA (∞) form.
Xt = −0.5Xt−1 + εt + 0.7εt−1
Xt = 0.7Xt−1 + 0.2Xt−2 + εt − 0.5εt−1
Xt = 0.9Xt−1 − 0.4Xt−2 + εt + 1.2εt−1 − 0.3εt−2
2 Determine the ACF of the following processes, εt ∼ WN(0, σ²), and represent each of them in an infinite form.
Xt = εt − 1.1εt−1 + 0.28εt−2
Xt = 0.7Xt−1 + εt
Xt = 0.1Xt−1 + 0.3Xt−2 + εt
Xt = 0.3Xt−1 + 0.54Xt−2 − 0.112Xt−3 + εt

Homework

1 Write the following equations in operator form:
Ẋt = φ1 Ẋt−1 + . . . + φp Ẋt−p + εt , where Ẋ = X − µ.
$\dot{X}_t = \sum_{i=1}^{p} \phi_i \dot{X}_{t-i} + \sum_{j=0}^{q} \theta_j \epsilon_{t-j}$, where Ẋ = X − µ and θ0 = 1.
2 Consider the ARMA (1,1): Xt = φXt−1 + εt + θεt−1 , where φ = −θ. Show that this model is not really an ARMA (1,1), but a white noise model ARMA (0,0).
3 Consider the ARMA (2,1): Xt = −0.3Xt−1 + 0.18Xt−2 + εt + 0.6εt−1 . Show that this model is not really an ARMA (2,1), but an AR (1) ≡ ARMA (1,0).

Homework

For the following models written using backshift operators, expand the model and write down the model in standard form. In addition, write down the model using ARIMA notation.
Xt = (1 − φB − θB^2)εt
(1 − φB)Xt = εt
(1 − φ1 B − φ2 B^2)Xt = (1 + θ1 B + θ2 B^2)εt
(1 − ΦB)(1 − B)^2 Xt = (1 + θB)εt
(1 − ΦB^4)Xt = εt
$\Phi_p(B)(1-B)^3 (Z_t - \mu) = \Theta_q(B)\epsilon_t$, where p = 3, q = 2, $\Phi_p(B) = 1 - \sum_{i=1}^{p} \phi_i B^i$, and $\Theta_q(B) = 1 + \sum_{i=1}^{q} \theta_i B^i$.

Homework

For the following models written using backshift operators, expand the model and write down the model in standard form. In addition, write down the model using ARIMA notation.
(1 + 0.3B)(1 − B)^2 Xt = εt
(1 + 0.3B)(1 − B)(1 − B^12)Xt = εt
Xt = (1 − 0.4B)(1 + 0.3B^7)εt
(1 − B^7)Xt = (1 − 0.5B − 0.2B^2)εt
(1 − 0.27B)(1 − 0.51B^365)(1 − B)(1 − B^365)^2 Xt = (1 − 0.46B)εt
(1 − 0.4B)(1 − B)Xt = (1 + 0.5B^12 + 0.6B^24)εt
Xt = (1 + 0.7B)(1 + 0.7B^12)εt
(1 − 0.7B)(1 − 0.1B^12)Xt = (1 + 0.6B^4)εt

Homework
Use the built-in R function arima.sim() to simulate the following models (assume that n = 300 and εt ∼ i.i.d. N(0, 1)). Plot each one and compute the autocovariance, autocorrelation, and partial autocorrelation functions using the acf() and pacf() R functions.
Xt = 0.4Xt−1 + εt
Xt = −0.7Xt−1 + εt
Xt = εt + 0.2εt−1
Xt = 0.1Xt−1 + 0.3Xt−2 + εt
Xt = εt − 0.6εt−1 + 0.3εt−2
Xt = 0.2Xt−1 + εt + 0.8εt−1
Xt = 0.7Xt−1 + 0.2Xt−2 + εt + 0.5εt−1
Xt = 0.9Xt−1 − 0.4Xt−2 + εt + 1.2εt−1 − 0.3εt−2

Homework
Write a short piece of R code (use set.seed(345)) to simulate the AR (1) model Xt = φXt−1 + εt of length 500, where εt ∼ N(2, 8), for each φ = 0.9, −0.9, 0.6, −0.6, 0.3, −0.3, 1.5, −1.5, 0.1, −0.1. Plot each simulated series and plot each ACF and PACF . Comment on your findings: what effect does the value of φ have on the stationarity of the series?
Write a short piece of R code (use set.seed(345)) to simulate the MA (1) model Xt = εt + θεt−1 of length 500, where εt ∼ N(0, 1), for each θ = 0.9, −0.9, 0.6, −0.6, 0.3, −0.3, 1.5, −1.5, 0.1, −0.1. Plot each simulated series and plot each ACF and PACF . Comment on your findings: what effect does the value of θ have on the stationarity of the series?
