Forecasting - Introduction
Dinesh Kumar
Those who have knowledge don’t predict. Those who predict
don’t have knowledge
- Lao Tzu
I think there is a world market for maybe 5 computers
- Thomas Watson, Chairman of IBM, 1943
[Diagram: the role of forecasting in production planning — aggregate forecasting feeds aggregate production planning and resource planning; item-level forecasting feeds master production planning and capacity planning]
• Quantitative Techniques.
– Time series techniques
• Causal Models.
– Use information about relationships between system
elements (e.g. regression).
Time Series Techniques
• Moving Average.
• Exponential Smoothing.
• Extrapolation.
• Trend Estimation.
• Auto-regression.
Time Series Analysis - Application
Components of a time series: Trend, Cyclical, Seasonal, Irregular
Trend Component
• Persistent, overall upward or downward
pattern
• Due to population, economy, technology etc.
• Several years duration
Cyclical Component
• Repeating up and down movements
• Due to interaction of factors influencing
economy
• Usually 2-10 years duration
Seasonal Component
• Regular pattern of up and down movements
• Due to weather, customs etc.
• Occurs within one year
Irregular Component
• Erratic, unsystematic fluctuations
• Due to random variation or unforeseen events
– Strike
– Floods
• Short duration and non-repeating
[Figure: demand plotted against time — (a) trend with random movement, (b) cycle, (c) seasonal pattern, (d) trend with seasonal pattern]
Time series techniques for Forecasting
Why Time Series Analysis ?
• Time series analysis helps to identify and
explain:
– Any systematic variation in the series of data which
is due to seasonality.
– Cyclical patterns that repeat.
– Trends in the data.
– Growth rates in the trends.
Seasonal Vs Cyclical
• When a cyclical pattern in the data has a period of one
year, it is referred to as seasonal variation.
• Decomposition models (T = trend, S = seasonal, C = cyclical, I = irregular):
– Multiplicative: Y_t = T_t × S_t × C_t × I_t
– Additive: Y_t = T_t + S_t + C_t + I_t
– Mixed: a combination of the two (e.g. Y_t = (T_t × S_t) + I_t)
• Smoothing Techniques:
– Moving average smoothing
– Exponential smoothing
Moving Average (Rolling Average)
• Simple moving average.
– Used mainly to capture trend and smooth short-term
fluctuations.
– The n most recent observations are given equal weights.
• Weighted moving average.
– Uses unequal weights for the observations.
Simple moving average
• The forecast for period t+1 (F_{t+1}) is given by
the average of the n most recent observations:

F_{t+1} = (1/n) Σ_{i=t−n+1}^{t} D_i

where F_{t+1} = forecast for period t+1 and
D_i = data corresponding to time period i.
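The simple moving average forecast above can be sketched in Python; the sales figures are invented for illustration:

```python
# Simple moving average: forecast the next period as the mean of
# the n most recent observations. The sales data are made up.

def sma_forecast(data, n):
    """F_{t+1} = (1/n) * sum of the last n observations."""
    if len(data) < n:
        raise ValueError("need at least n observations")
    return sum(data[-n:]) / n

sales = [100, 110, 90, 105, 95, 115]
print(sma_forecast(sales, 2))  # mean of 95 and 115 -> 105.0
print(sma_forecast(sales, 4))  # mean of 90, 105, 95, 115 -> 101.25
```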
Moving average example – Electra TV Sales
F_{t+1,2} = (1/2) Σ_{i=t−1}^{t} D_i

F_{t+1,4} = (1/4) Σ_{i=t−3}^{t} D_i
Moving Average with n = 2 and n = 4
[Chart: TV sales against time with the two moving-average forecasts]
E_t = Y_t − F_t

where E_t = forecast error at period t,
Y_t = actual value at time period t, and
F_t = forecast for time period t.
Measures of aggregate error
Mean absolute error: MAE = (1/n) Σ_{t=1}^{n} |E_t|
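The error measures can be sketched directly; the actual and forecast values below are invented:

```python
# Forecast error E_t = Y_t - F_t and mean absolute error
# MAE = (1/n) * sum(|E_t|). Values are illustrative.

actual   = [100, 110, 90, 105]   # Y_t
forecast = [ 95, 105, 95, 100]   # F_t

errors = [y - f for y, f in zip(actual, forecast)]
mae = sum(abs(e) for e in errors) / len(errors)
print(errors)  # [5, 5, -5, 5]
print(mae)     # 5.0
```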
Simple Exponential Smoothing Equations
• Smoothing Equations
L_t = α·Y_t + (1 − α)·L_{t−1}
L_1 = Y_1
– L_t : level at time period t (smoothed value)
– α : smoothing constant, 0 < α < 1
• Forecast Equation
F_{t+1} = L_t
[Chart: actual demand and the exponentially smoothed forecast over time]
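A minimal sketch of simple exponential smoothing; the choice α = 0.3 and the demand series are illustrative:

```python
# Simple exponential smoothing: L_t = a*Y_t + (1-a)*L_{t-1},
# initialised with L_1 = Y_1; the forecast is F_{t+1} = L_t.

def ses(y, alpha):
    level = y[0]                 # L_1 = Y_1
    for obs in y[1:]:
        level = alpha * obs + (1 - alpha) * level
    return level                 # this is the forecast F_{t+1}

demand = [100, 110, 90, 105, 95]
print(round(ses(demand, 0.3), 2))  # about 99.11
```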
Level Model
x_t = a + u_t
Trend Model
The linear trend b·t is added to the level model's equation:
x_t = a + b·t + u_t
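The trend model x_t = a + b·t + u_t can be fitted by ordinary least squares; a closed-form sketch on an invented, perfectly linear series:

```python
# OLS fit of x_t = a + b*t + u_t using the closed-form
# slope/intercept formulas for a single regressor t = 1..n.

def fit_trend(x):
    n = len(x)
    t = list(range(1, n + 1))
    t_bar = sum(t) / n
    x_bar = sum(x) / n
    b = sum((ti - t_bar) * (xi - x_bar) for ti, xi in zip(t, x)) / \
        sum((ti - t_bar) ** 2 for ti in t)
    a = x_bar - b * t_bar
    return a, b

x = [12, 14, 16, 18, 20]   # exactly linear: a = 10, b = 2
a, b = fit_trend(x)
print(a, b)  # 10.0 2.0
```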
Holt’s Method
• Holt’s Equations
(i) L_t = α·Y_t + (1 − α)·(L_{t−1} + T_{t−1})
(ii) T_t = β·(L_t − L_{t−1}) + (1 − β)·T_{t−1}
• Forecast Equation
F_{t+1} = L_t + T_t
F_{t+m} = L_t + m·T_t
Initial values of Lt and Tt
• L_1 is in general set to Y_1. Common choices for T_1:
T_1 = Y_2 − Y_1
T_1 = [(Y_2 − Y_1) + (Y_3 − Y_2) + (Y_4 − Y_3)] / 3
T_1 = (Y_n − Y_1) / (n − 1)
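Holt's method can be sketched as below. The initialisation L = Y_2, T = Y_2 − Y_1 is one common variant of the choices above, and α = β = 0.5 and the demand series are illustrative; on perfectly linear data the method tracks the trend exactly:

```python
# Holt's linear method: level and trend smoothed separately.
#   L_t = a*Y_t + (1-a)*(L_{t-1} + T_{t-1})
#   T_t = b*(L_t - L_{t-1}) + (1-b)*T_{t-1}
#   F_{t+m} = L_t + m*T_t

def holt(y, alpha, beta, m=1):
    level, trend = y[1], y[1] - y[0]   # initialise from first two points
    for obs in y[2:]:
        prev_level = level
        level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + m * trend           # m-step-ahead forecast

demand = [10, 12, 14, 16, 18]          # linear, slope 2
print(holt(demand, 0.5, 0.5, m=1))     # 20.0
```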
Double Exponential Smoothing
[Chart: actual demand vs. the double-exponential-smoothing forecast over time]
Forecasting Power of a Model
Theil’s Coefficients

U1 = √( Σ_{t=1}^{n} (Y_t − F_t)² ) / [ √( Σ_{t=1}^{n} Y_t² ) + √( Σ_{t=1}^{n} F_t² ) ]

U2 = √[ Σ_{t=1}^{n−1} ((F_{t+1} − Y_{t+1}) / Y_t)² / Σ_{t=1}^{n−1} ((Y_{t+1} − Y_t) / Y_t)² ]

• U1 lies between 0 and 1; the closer to 0, the better the forecast.
• U2 compares the model with the naive no-change forecast; U2 < 1 means the model outperforms it.
Method                          U1      U2
Moving Average with 2 periods   0.1488  0.8966
Double Exponential Smoothing    0.1304  0.7864
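Theil's coefficients as defined above can be computed directly (note that textbook definitions of U1 vary slightly); the series below are invented:

```python
# Theil's U1 (0 = perfect forecast) and U2 (< 1 beats the naive
# no-change forecast). The actual/forecast values are illustrative.
from math import sqrt

def theil_u1(y, f):
    num = sqrt(sum((yt - ft) ** 2 for yt, ft in zip(y, f)))
    den = sqrt(sum(yt ** 2 for yt in y)) + sqrt(sum(ft ** 2 for ft in f))
    return num / den

def theil_u2(y, f):
    num = sum(((f[t + 1] - y[t + 1]) / y[t]) ** 2 for t in range(len(y) - 1))
    den = sum(((y[t + 1] - y[t]) / y[t]) ** 2 for t in range(len(y) - 1))
    return sqrt(num / den)

y = [100, 110, 90, 105, 95]
f = [ 98, 108, 95, 100, 97]
print(round(theil_u1(y, f), 4), round(theil_u2(y, f), 4))
```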
Auto-Regressive Integrated Moving Average (ARIMA)
• ARIMA has the following three components:
– Auto-regressive component: a function of the past values of the time series.
– Integrated component: differencing of the series to make it stationary.
– Moving average component: a function of past error terms ε_t, assumed to be white noise with mean 0 and s.d. σ.
• The autoregressive process of order p, or
AR(p) process, is
Y_t = β_0 + β_1·Y_{t−1} + β_2·Y_{t−2} + ... + β_p·Y_{t−p} + ε_t
AR(p) Model: Y_t = β_0 + β_1·Y_{t−1} + ... + β_p·Y_{t−p} + ε_t
MA(q) Model: Y_t = θ_0 + θ_1·ε_{t−1} + θ_2·ε_{t−2} + ... + θ_q·ε_{t−q} + ε_t
ARIMA(p,1,q) Process
X_t = β_0 + β_1·X_{t−1} + β_2·X_{t−2} + ... + β_p·X_{t−p} + ε_t + θ_1·ε_{t−1} + θ_2·ε_{t−2} + ... + θ_q·ε_{t−q}
where X_t = Y_t − Y_{t−1}
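The "integrated" step can be illustrated with a sketch: first differencing X_t = Y_t − Y_{t−1} turns a simulated random walk with drift (non-stationary) into a stationary series. The series, its drift of 2, and the unit-variance noise are all invented for illustration:

```python
# First differencing of a non-stationary series.
import random

random.seed(42)

# Random walk with drift: Y_t = Y_{t-1} + 2 + noise (non-stationary).
y = [0.0]
for _ in range(500):
    y.append(y[-1] + 2 + random.gauss(0, 1))

# First difference: X_t = Y_t - Y_{t-1} (stationary around the drift).
x = [y[t] - y[t - 1] for t in range(1, len(y))]
mean_x = sum(x) / len(x)

# Lag-1 autocorrelation of the differenced series (should be near 0).
num = sum((x[t] - mean_x) * (x[t + 1] - mean_x) for t in range(len(x) - 1))
den = sum((xi - mean_x) ** 2 for xi in x)
r1 = num / den

print(round(mean_x, 2), round(r1, 2))  # drift estimate near 2, r1 near 0
```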
Auto-Correlation
• Auto-correlation is the correlation between successive
observations over time.
r_k = Σ_{t=1}^{n−k} (Y_t − Ȳ)(Y_{t+k} − Ȳ) / Σ_{t=1}^{n} (Y_t − Ȳ)²
Auto-correlation Function
• A k-period plot of autocorrelations is called an
autocorrelation function (ACF) or a
correlogram.
[Figure: ACF plot for Harmon Foods]
[Figure: PACF plot for Harmon Foods]
Hypothesis test for autocorrelation
• To test whether the autocorrelation at lag k is significantly
different from 0, the following hypothesis test is used:
• H0: ρ_k = 0
• HA: ρ_k ≠ 0
• For any k, reject H0 if |r_k| > 1.96/√n, where n is the number
of observations.
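The sample autocorrelation r_k and the significance bound can be sketched as below; the strongly trending series is invented, so the lag-1 autocorrelation comes out significant:

```python
# Sample autocorrelation r_k and the 1.96/sqrt(n) significance bound.
from math import sqrt

def acf(y, k):
    n = len(y)
    y_bar = sum(y) / n
    num = sum((y[t] - y_bar) * (y[t + k] - y_bar) for t in range(n - k))
    den = sum((yt - y_bar) ** 2 for yt in y)
    return num / den

y = list(range(1, 21))          # 1, 2, ..., 20: a pure trend
bound = 1.96 / sqrt(len(y))     # reject H0: rho_k = 0 if |r_k| > bound
r1 = acf(y, 1)
print(round(r1, 3), round(bound, 3), abs(r1) > bound)
# r1 = 0.85 exceeds the bound of about 0.438, so lag-1
# autocorrelation is significant.
```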
Harmon Foods Example
Critical value = 1.96/√48 ≈ 0.28
Partial Auto-Correlation
• The partial auto-correlation at lag k is the auto-correlation between Y_t
and Y_{t+k} after the removal of the linear dependence on Y_{t+1} through Y_{t+k−1}.
• To test whether the partial autocorrelation at lag k is significantly
different from 0, the following hypothesis test is used:
• H0: φ_kk = 0
• HA: φ_kk ≠ 0
• For any k, reject H0 if |φ_kk| > 1.96/√n, where n is the number
of observations.
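Partial autocorrelations can be computed from the autocorrelations via the Durbin-Levinson recursion, a common approach (not stated in the slides); as a check, the theoretical ACF of an AR(1) process with coefficient 0.8 is used as input, for which the PACF should cut off after lag 1:

```python
# PACF via the Durbin-Levinson recursion on autocorrelations r_1..r_K.

def pacf(rho):
    """rho: [r_1, ..., r_K]; returns the PACF values phi_11..phi_KK."""
    K = len(rho)
    phi = [[0.0] * (K + 1) for _ in range(K + 1)]
    out = []
    for k in range(1, K + 1):
        if k == 1:
            phi[1][1] = rho[0]
        else:
            num = rho[k - 1] - sum(phi[k - 1][j] * rho[k - 1 - j]
                                   for j in range(1, k))
            den = 1 - sum(phi[k - 1][j] * rho[j - 1] for j in range(1, k))
            phi[k][k] = num / den
            for j in range(1, k):
                phi[k][j] = phi[k - 1][j] - phi[k][k] * phi[k - 1][k - j]
        out.append(phi[k][k])
    return out

rho = [0.8 ** k for k in range(1, 6)]    # theoretical ACF of AR(1), phi = 0.8
p = pacf(rho)
print([round(v, 4) for v in p])          # first value 0.8, the rest ~0
```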
Harmon Foods: Partial Auto-Correlation
Pure AR and MA Processes
• A non-stationary process has an ACF that remains significant for a large
number of lags.
AIC = -2LL + 2m
where LL is the maximised log-likelihood and m is the number of parameters estimated in the model.
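As a toy illustration of the trade-off AIC encodes (the log-likelihood values below are invented): a better fit only wins if it is worth the extra parameters.

```python
# AIC = -2*LL + 2*m: the 2*m term penalises model size.

def aic(ll, m):
    return -2 * ll + 2 * m

print(aic(-100.0, 3))  # 206.0
print(aic(-98.0, 6))   # 208.0: the better-fitting but larger model
                       # loses, since lower AIC is preferred
```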