Notes2 - Quiz 2


Auto-Regressive Model: AR(p)

• In an autoregression model, we forecast the variable of interest using a linear combination of past values of the variable. If the model uses the last p values, it is called an AR(p) model:

Y(t) = c + φ1·Y(t−1) + φ2·Y(t−2) + … + φp·Y(t−p) + ε(t)

• We need some restrictions on the parameters for such a series to be stationary.


• For an AR(1) model: −1 < φ1 < 1.

• For an AR(2) model: −1 < φ2 < 1, φ1 + φ2 < 1, φ2 − φ1 < 1.

• For higher orders the conditions are more complex; however, R takes care of them.
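
A minimal sketch of simulating and fitting an AR(2) model in R (the coefficients and series below are illustrative, not taken from the notes):

# simulate an AR(2) series with illustrative coefficients phi1 = 0.5, phi2 = 0.3
set.seed(123)
y_ar <- arima.sim(model = list(ar = c(0.5, 0.3)), n = 200)

# fit an AR(2) model: order = c(p, d, q) with d = 0 and q = 0
fit_ar2 <- arima(y_ar, order = c(2, 0, 0))
fit_ar2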
Moving Average Model: MA(q)

• Rather than using past values of the forecast variable in a regression, a moving average model uses past forecast errors in a regression-like model. If the model uses the last q errors for forecasting, we call it an MA(q) model:

Y(t) = c + ε(t) + θ1·ε(t−1) + … + θq·ε(t−q)

• Notice that each value of Y(t) can be thought of as a weighted moving average of the past few forecast errors. However, moving average models should not be confused with moving average smoothing:
• a moving average model is used for the irregular component,
• while moving average smoothing is used for estimating the trend-cycle of past values.

• An MA(q) model is always stationary; however, for it to be invertible we need restrictions on the parameters.

For an MA(1) model: −1 < θ1 < 1.

For an MA(2) model: −1 < θ2 < 1, θ1 + θ2 > −1, θ1 − θ2 < 1.
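
A matching sketch for an MA(2) model (again with illustrative coefficients):

# simulate an MA(2) series with illustrative coefficients theta1 = 0.6, theta2 = 0.3
set.seed(123)
y_ma <- arima.sim(model = list(ma = c(0.6, 0.3)), n = 200)

# fit an MA(2) model: order = c(p, d, q) with p = 0 and d = 0
fit_ma2 <- arima(y_ma, order = c(0, 0, 2))
fit_ma2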
Difference between AR and MA
[Figures: ACF and PACF of an AR(2) series and an MA(2) series]

• AR models show a gradual decline in the ACF and a single peak at the order (p) in the PACF.
• MA models show a gradual decline in the PACF and a single peak at the order (q) in the ACF.
Note: If both plots (ACF & PACF) show a gradual decline, the series might be non-stationary and differencing is required.
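
A short sketch of this identification step in R, reusing the simulated AR(2) series y_ar from the earlier sketch:

# correlograms used to choose between AR and MA
acf(y_ar)    # gradual decline in the ACF suggests an AR component
pacf(y_ar)   # a cut-off after lag 2 in the PACF suggests AR(2)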
Autoregressive Moving Average Model: ARMA(p,q)
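
For reference, the ARMA(p,q) model combines the two parts above:

Y(t) = c + φ1·Y(t−1) + … + φp·Y(t−p) + θ1·ε(t−1) + … + θq·ε(t−q) + ε(t)

In R it can be fitted with arima(y, order = c(p, 0, q)).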
Non-stationary time series

The first step in the analysis of any TS is to plot the data. If the TS behaves as if it has no fixed mean level, or has a changing variance, then the data is non-stationary.
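
A minimal sketch of this first step in R, using AirPassengers purely as an illustration (log-differencing is one common way to stabilise the variance and remove the trend):

# always plot the raw series first
plot.ts(AirPassengers)

# log to stabilise the growing variance, then difference to remove the trend
dlog_air <- diff(log(AirPassengers))
plot.ts(dlog_air)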
[Figures: non-stationary time series, linear trend examples, ARIMA models]
Time Series Decomposition
Decomposition Basics

• The widely used nottem data set (Average Monthly Temperatures at Nottingham, 1920–1939).
Decomposition Basics

• We decompose any time series into three parts:

• Seasonal component
• Trend
• Random Fluctuations.

• Use the stl() function in R:


nottem_stl = stl(nottem, s.window="periodic")
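
A quick way to inspect the result (plot() on an stl object shows the data, seasonal, trend and remainder panels):

plot(nottem_stl)
head(nottem_stl$time.series)   # columns: seasonal, trend, remainder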
Decomposition Basics

Decompose the AirPassengers dataset (see the sketch below).
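
A possible sketch: AirPassengers has seasonal swings that grow with the level, so a multiplicative decomposition (or an additive one on the log scale) is a reasonable choice here.

# multiplicative decomposition of the raw series
air_dec <- decompose(AirPassengers, type = "multiplicative")
plot(air_dec)

# or: additive decomposition of the log-transformed series
air_stl <- stl(log(AirPassengers), s.window = "periodic")
plot(air_stl)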


Decomposition: Seasonality

• Seasonal effect
  • A systematic and calendar-related effect.
    • Higher sales during the festive season (lunar calendar in India).
    • Higher water consumption during summer.
  • Complex seasonal effects:
    • Weekly sales for the last week of the month or the first week of the month depend on whether the month has four weeks or five.
    • Trading-day effect: the number of working days varies for every month from year to year, which in turn affects productivity and the level of activity.

• Seasonal adjustment
  • The process of estimating and removing systematic and calendar-related influences.
  • A time series needs to be seasonally adjusted, as the seasonal effect may hide the true underlying movement.
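
A minimal sketch of seasonal adjustment in R, assuming an additive decomposition of AirPassengers (the seasonal component is simply subtracted):

air_dec <- decompose(AirPassengers, type = "additive")
air_adjusted <- AirPassengers - air_dec$seasonal   # seasonally adjusted series
plot.ts(air_adjusted)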
Decomposition: Seasonality

• Seasonality arises from systematic, periodic and calendar related influences:

• Natural conditions.
• Higher water consumption during summer.
• Higher production of milk during winter.

• Business and Administrative procedures.


• Higher travel during summer, synced to the school term.
• Higher sales on the first weekend of the month, synced to monthly salary payments.

• Social and Cultural Behaviour.


• Gold purchase on Dhanteras.
• Muhurat Trading on Diwali.

• Calendar related effect.


• Trading-day effect: the number of working days in a month changes year to year, e.g. 4 weekends in June 2020 whereas 5 weekends in June 2019.
• Moving-holiday effects, more prominent in India as most holidays are linked to the lunar calendar.
Decomposition: Trend

• Long term movement in a time series.

• Without calendar related influences.


• Without random movement.

• Change in average/ mean level.


Decomposition: Residual

• The irregular component.

• What remains after removing the seasonal and trend components.


• It results from short term fluctuations, which are neither systematic nor predictable.

• In a time series, any component can be the dominant one.


Time Series Forecasting Techniques
Time Series Forecasting: Quick Framework

1. Plot the data and see if it has seasonality and a trend.

2. Identify whether you need an additive or a multiplicative decomposition.

3. Choose an appropriate algorithm:

Seasonality   Trend   Correlations   Suggested algorithm
No            Yes     No             Simple moving average smoothing
Yes           Yes     No             Make a seasonal adjustment first
No            No      No             Simple exponential smoothing
No            Yes     No             Holt's exponential smoothing
Yes           Yes     No             Holt-Winters exponential smoothing
Yes           Yes     Yes            ARIMA
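
For the ARIMA row, a minimal sketch using auto.arima() from the forecast package (the notes themselves do not show ARIMA code; this is an assumption of convenience):

library(forecast)

# let auto.arima() choose the (seasonal) ARIMA orders
fit <- auto.arima(AirPassengers)
fit
plot(forecast(fit, h = 12))   # forecast one year ahead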
Simple Moving Average: SMA(k)

For example, the file http://robjhyndman.com/tsdldata/misc/kings.dat contains data on the age of death of successive kings of England, starting with William the Conqueror (source: Hipel and McLeod, 1994).

[Figures: the kings series smoothed with SMA(3) and SMA(8), each shown with its remainder]
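
A sketch of computing these moving averages with the TTR package; the URL is the one given above, and skip = 3 assumes three header lines at the top of the file:

library(TTR)

# read the series of ages at death and turn it into a time series object
kings <- scan("http://robjhyndman.com/tsdldata/misc/kings.dat", skip = 3)
kings_ts <- ts(kings)

# simple moving averages of order 3 and 8
plot.ts(SMA(kings_ts, n = 3))
plot.ts(SMA(kings_ts, n = 8))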
Seasonality Adjustments

For example, the AirPassengers data (the first 25 observations are shown in the table below).

• Find the period of seasonality.
• Take the average for that period.
• Subtract the periodic average from each time period.
• Take a (weighted) average of these differences to find the seasonal effect (see the sketch after the table).

OR

decompose_air <- decompose(AirPassengers, type="additive")
decompose_air$seasonal

stl_air <- stl(AirPassengers, s.window = "periodic")
stl_air$time.series[,1]

time  year  month  passenger
  1     1      1        112
  2     1      2        118
  3     1      3        132
  4     1      4        129
  5     1      5        121
  6     1      6        135
  7     1      7        148
  8     1      8        148
  9     1      9        136
 10     1     10        119
 11     1     11        104
 12     1     12        118
 13     2      1        115
 14     2      2        126
 15     2      3        141
 16     2      4        135
 17     2      5        125
 18     2      6        149
 19     2      7        170
 20     2      8        170
 21     2      9        158
 22     2     10        133
 23     2     11        114
 24     2     12        140
 25     3      1        145
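
A minimal sketch of the manual procedure above, assuming an additive seasonal effect with period 12:

# arrange the series as years x months (AirPassengers covers 12 complete years)
m <- matrix(AirPassengers, ncol = 12, byrow = TRUE)

# subtract each year's average from that year's observations
dev <- m - rowMeans(m)

# average the deviations month by month to obtain the seasonal effect
seasonal_effect <- colMeans(dev)

# subtract the seasonal effect from every year to get the adjusted series
air_adjusted <- AirPassengers - rep(seasonal_effect, times = nrow(m))
plot.ts(air_adjusted)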
Exponential Moving Average: EMA(k)

• If you have a time series with a constant level and no seasonality, you can use simple exponential smoothing for short-term forecasts.

• Example: total annual rainfall in inches for London, from 1813 to 1912.

• The most common value used for n is 2.

• One can decide to give more weight to the most recent Y(t).


Exponential Moving Average: EMA(k)

[Figures: EMA(3) and EMA(8) smoothing of the series]
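
A sketch of both flavours in R; the scan() URL and skip value follow the commonly used version of the London rainfall example and are an assumption here:

library(TTR)

# annual rainfall for London, 1813-1912 (skip = 1 assumes one header line)
rain <- scan("http://robjhyndman.com/tsdldata/hurst/precip1.dat", skip = 1)
rain_ts <- ts(rain, start = 1813)

# exponential moving averages of order 3 and 8
plot.ts(EMA(rain_ts, n = 3))
plot.ts(EMA(rain_ts, n = 8))

# simple exponential smoothing: no trend (beta) and no seasonality (gamma)
rain_ses <- HoltWinters(rain_ts, beta = FALSE, gamma = FALSE)
rain_ses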
Holt's Exponential Moving Average

• Holt (1957) extended simple exponential smoothing to allow the forecasting of data with a trend.
• It uses two different smoothing equations.

The h-period-ahead forecast: ŷ(t+h) = l(t) + h·b(t)

‘l’ is like a local mean: l(t) = α·Y(t) + (1 − α)·(l(t−1) + b(t−1))

‘b’ is like a trend (slope): b(t) = β·(l(t) − l(t−1)) + (1 − β)·b(t−1)

• Local mean changes are smoothed.

• Slope changes are also smoothed.
Holt's Exponential Moving Average

• Annual diameter of women's skirts at the hem, from 1866 to 1911.
skirtsF <- HoltWinters(skirts, gamma=F)
skirtsF; skirtsF$SSE

## Holt-Winters exponential smoothing with trend and without seasonal component.
##
## Call:
## HoltWinters(x = skirts, gamma = F)
##
## Smoothing parameters:
## alpha: 0.8383481
## beta : 1
## gamma: FALSE
##
## Coefficients:
## [,1]
## a 529.308585
## b 5.690464

## [1] 16954.18
Holt's Exponential Moving Average

• Data with trend, no seasonality.

# Forecast into the future
skirtsF19 <- forecast(skirtsF, h=19)
plot(skirtsF19)
Holt-Winters Exponential Moving Average

• Holt (1957) and Winters (1960) extended Holt’s method to capture seasonality.
• The Holt-Winters seasonal method comprises the forecast equation and three smoothing equations

The h-period-ahead forecast: ŷ(t+h) = l(t) + h·b(t) + s(t + h − m·(k+1)), where k is the integer part of (h−1)/m.

‘l’ is like a local mean: l(t) = α·(Y(t) − s(t−m)) + (1 − α)·(l(t−1) + b(t−1))

‘b’ is like a time trend: b(t) = β·(l(t) − l(t−1)) + (1 − β)·b(t−1)

‘s’ is the seasonal influence: s(t) = γ·(Y(t) − l(t−1) − b(t−1)) + (1 − γ)·s(t−m)

• Here ‘m’ is the frequency of the seasonality, e.g. for AirPassengers m is 12.

• In the forecast we use the last known value for the same season.
Holt-Winters Exponential Moving Average
hw_air<-HoltWinters(log(AirPassengers))
hw_air

## Holt-Winters exponential smoothing with trend and additive seasonal component.
##
## Call:
## HoltWinters(x = log(AirPassengers))
##
## Smoothing parameters:
## alpha: 0.3266015
## beta : 0.005744138
## gamma: 0.8206654
##
## Coefficients:
## [,1]
## a 6.172308435
## b 0.008981893
## s1 -0.073201087
## s2 -0.140973564
## s3 -0.036703294
## s4 0.014522733
## s5 0.032554237
## s6 0.154873570
## s7 0.294317062
## s8 0.276063997
## s9 0.088237657
## s10 -0.032657089
## s11 -0.198012716
## s12 -0.102863837
hw_air$SSE

## [1] 0.2030765
Holt-Winters Exponential Moving Average

airF10<-forecast(hw_air, h= 10)
plot(airF10)
How to measure Goodness of Fit

• Most popular methods are
