Time Series Analysis and Forecasting
Quantitative Approaches to Forecasting
Time Series Patterns
Forecast Accuracy
Moving Averages and Exponential Smoothing
Forecasting Methods
Forecasting methods can be classified as
qualitative or quantitative.
Qualitative methods generally involve the use
of expert judgment to develop forecasts.
Such methods are appropriate when historical
data on the variable being forecast are either
not applicable or unavailable.
We will focus exclusively on quantitative
forecasting methods in this chapter.
Forecasting Methods
Quantitative forecasting methods can be used
when:
past information about the variable being
forecast is available,
the information can be quantified, and
it is reasonable to assume that the pattern
of the past will continue into the future.
In such cases, a forecast can be developed
using a time series method or a causal
method.
Quantitative Forecasting Methods
Quantitative methods are based on an
analysis of historical data concerning one or
more time series.
A time series is a set of observations
measured at successive points in time or over
successive periods of time.
If the historical data used are restricted to
past values of the series that we are trying to
forecast, the procedure is called a time series
method.
If the historical data used involve other time
series that are believed to be related to the
time series that we are trying to forecast, the
procedure is called a causal method.
Time Series Methods
The objective of time series analysis is to
discover a pattern in the historical data or
time series and then extrapolate the pattern
into the future.
The forecast is based solely on past values of
the variable and/or past forecast errors.
Causal Methods
Causal forecasting methods are based on the
assumption that the variable we are
forecasting has a cause-effect relationship
with one or more other variables.
Looking at regression analysis as a forecasting
tool, we can view the time series value that
we want to forecast as the dependent
variable.
If we can identify a good set of related
independent, or explanatory, variables we
may be able to develop an estimated
regression equation for forecasting the time
series.
Regression Analysis
By treating time as the independent variable
and the time series as a dependent variable,
regression analysis can also be used as a time
series method.
Time-series regression refers to the use of
regression analysis when the sole
independent variable is time.
Cross-sectional regression refers to the use of
regression analysis when the independent
variable(s) is(are) something other than time.
Forecasting Methods
[Diagram: Forecasting Methods divide into Qualitative and Quantitative; Quantitative methods divide into Causal and Time Series.]
Time Series Patterns
A time series is a sequence of measurements
taken every hour, day, week, month, quarter,
year, or at any other regular time interval.
The pattern of the data is an important factor
in understanding how the time series has
behaved in the past.
If such behavior can be expected to continue
in the future, we can use it to guide us in
selecting an appropriate forecasting method.
Time Series Plot
A useful first step in selecting an appropriate
forecasting method is to construct a time
series plot.
A time series plot is a graphical presentation
of the relationship between time and the time
series variable.
Time is on the horizontal axis, and the time
series values are shown on the vertical axis.
Components of Time Series
• Trend (Secular Trend or Long Term Variation): It is a longer-term change. Here
we take into account the number of observations available and make a subjective
assessment of what is long term. It represents a relatively smooth, steady, and
gradual movement of a time series in the same direction. The trend may be linear or
non-linear (curvilinear). Some examples of secular trends are increase in prices,
increase in pollution, increase in literacy rate, decrease in deaths due to advances in
science etc.
• Seasonal Fluctuations (Seasonal Variation): Many of the time series data
exhibit a seasonal variation within the annual period. Seasonal Fluctuations describe
any regular variation with a period of less than one year. For example, the cost of
various types of fruits and vegetables, clothes, unemployment figures, average daily
rainfall, increase in the sale of tea in winter, increase in the sale of ice cream in
summer, etc., all show seasonal variations.
• Cyclic Fluctuations (Cyclical Variations): Cyclical variation is a non-seasonal
component that varies in a recognizable cycle. These variations can have a serious
effect on business and economic activity; for example, economic data are often
affected by business cycles with a period of roughly 5 to 7 years. Cyclical variation
is periodic in nature and repeats itself like a business cycle.
Components of Time Series
• Irregular Fluctuations (Random Variations): These are variations that do not
repeat in a definite pattern; they include all variation other than the trend, seasonal,
and cyclical movements. Examples include a supply shortage due to a factory strike,
a power failure, a flood, an earthquake, or a war. These random movements may
occur along with trend, seasonal, and cyclical variations.
Time Series Plot
Example: Rosco Drugs
Sales of Comfort brand headache tonic (in
bottles) for the past 10 weeks at Rosco Drugs are
shown below.
Rosco Drugs would like to identify the
underlying pattern in the data to guide it in
selecting an appropriate forecasting method.

Week  Sales    Week  Sales
 1     110      6     120
 2     115      7     130
 3     125      8     115
 4     120      9     110
 5     125     10     130
Time Series Plot
Time Series Plot for the Comfort Sales Data
[Plot: Sales (Bottles), 100-135, on the vertical axis vs. Week, 0-12, on the horizontal axis]
Time Series Patterns
The common types of data patterns that can
be identified when examining a time series
plot include:
Horizontal
Trend
Seasonal
Trend & Seasonal
Cyclical
Time Series Patterns
Horizontal Pattern
• A horizontal pattern exists when the data
fluctuate around a constant mean.
• Changes in business conditions can often
result in a time series that has a horizontal
pattern shifting to a new level.
• A change in the level of the time series makes
it more difficult to choose an appropriate
forecasting method.
Time Series Patterns
Trend Pattern
• A time series may show gradual shifts or
movements to relatively higher or lower
values over a longer period of time.
• Trend is usually the result of long-term
factors such as changes in the
population, demographics, technology, or
consumer preferences.
• A systematic increase or decrease might
be linear or nonlinear.
• A trend pattern can be identified by analyzing
multiyear movements in historical data.
Time Series Patterns
Seasonal Pattern
• Seasonal patterns are recognized by seeing the
same repeating pattern of highs and lows over
successive periods of time within a year.
• A seasonal pattern might occur within
a day, week, month, quarter, year, or some
other interval no greater than a year.
• A seasonal pattern does not necessarily
refer to the four seasons of the year
(spring, summer, fall, and winter).
Time Series Patterns
Trend and Seasonal Pattern
• Some time series include a combination of a
trend and seasonal pattern.
• In such cases we need to use a forecasting
method that has the capability to deal with
both trend and seasonality.
• Time series decomposition can be used to
separate or decompose a time series into
trend and seasonal components.
Time Series Patterns
Cyclical Pattern
• A cyclical pattern exists if the time series plot
shows an alternating sequence of points
below and above the trend line lasting more
than one year.
• Often, the cyclical component of a time series
is due to multiyear business cycles.
• Business cycles are extremely difficult, if not
impossible, to forecast.
• In this chapter we do not deal with cyclical
effects that may be present in the time series.
Selecting a Forecasting Method
The underlying pattern in the time series is an
important factor in selecting a forecasting
method.
Thus, a time series plot should be one of the
first things developed when trying to
determine what forecasting method to use.
If we see a horizontal pattern, then we need to
select a method appropriate for this type of
pattern.
If we observe a trend in the data, then we
need to use a method that has the capability
to handle trend effectively.
Forecast Accuracy
Measures of forecast accuracy are used to
determine how well a particular forecasting
method is able to reproduce the time series
data that are already available.
Measures of forecast accuracy are important
factors in comparing different forecasting
methods.
By selecting the method that has the best
accuracy for the data already known, we hope
to increase the likelihood that we will obtain
better forecasts for future time periods.
Forecast Accuracy
The key concept associated with measuring
forecast accuracy is forecast error.
Forecast Error = Actual Value − Forecast
A positive forecast error indicates the
forecasting method underestimated the actual
value.
A negative forecast error indicates the
forecasting method overestimated the actual
value.
Forecast Accuracy
Mean Error
A simple measure of forecast accuracy is the
mean, or average, of the forecast errors. Because
positive and negative forecast errors tend to
offset one another, the mean error is likely to be
small. Thus, the mean error is not a very useful
measure.
Mean Absolute Error (MAE)
This measure avoids the problem of positive
and negative errors offsetting one another. It is
the mean of the absolute values of the forecast
errors.
Forecast Accuracy
Mean Squared Error (MSE)
This is another measure that avoids the
problem of positive and negative errors
offsetting one another. It is the average of the
squared forecast errors.
Mean Absolute Percentage Error (MAPE)
The sizes of the MAE and MSE depend on the
scale of the data, so it is difficult to make
comparisons across different time intervals. To
make such comparisons we need to work with
relative, or percentage, error measures. The
MAPE is the average of the absolute
percentage errors of the forecasts.
Forecast Accuracy
To demonstrate the computation of these
measures of forecast accuracy we will
introduce the simplest of forecasting methods.
The naïve forecasting method uses the most
recent observation in the time series as the
forecast for the next time period.
Ft+1 = Actual Value in Period t
Forecast Accuracy
Example: Rosco Drugs
Sales of Comfort brand headache tonic (in
bottles) for the past 10 weeks at Rosco Drugs are
shown below.
If Rosco uses the naïve forecast method to
forecast sales for weeks 2-10, what are the
resulting MAE, MSE, and MAPE values?

Week  Sales    Week  Sales
 1     110      6     120
 2     115      7     130
 3     125      8     115
 4     120      9     110
 5     125     10     130
Forecast Accuracy
Week  Sales  Naïve Forecast  Error  Abs. Error  Squared Error  Abs. % Error
  1    110
  2    115       110            5        5            25           4.35
  3    125       115           10       10           100           8.00
  4    120       125           -5        5            25           4.17
  5    125       120            5        5            25           4.00
  6    120       125           -5        5            25           4.17
  7    130       120           10       10           100           7.69
  8    115       130          -15       15           225          13.04
  9    110       115           -5        5            25           4.55
 10    130       110           20       20           400          15.38
Total                                   80           950          65.35
Forecast Accuracy
Naive Forecast Accuracy
MAE = 80/9 = 8.89
MSE = 950/9 = 105.56
MAPE = 65.35/9 = 7.26%
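As a check on the arithmetic, here is a minimal Python sketch (plain standard library; the variable names are our own) that reproduces the naïve forecasts and the three accuracy measures for the Rosco data:

```python
# Naive forecasts and accuracy measures for the Rosco Drugs sales data.
sales = [110, 115, 125, 120, 125, 120, 130, 115, 110, 130]

# Naive method: the forecast for period t+1 is the actual value in period t.
forecasts = sales[:-1]          # F2..F10
actuals = sales[1:]             # Y2..Y10
errors = [y - f for y, f in zip(actuals, forecasts)]

n = len(errors)
mae = sum(abs(e) for e in errors) / n
mse = sum(e**2 for e in errors) / n
mape = sum(abs(e) / y * 100 for e, y in zip(errors, actuals)) / n
# MAE = 8.89, MSE = 105.56, MAPE = 7.26 (rounded to two decimals)
```

Note that the week-8 squared error is (−15)² = 225, so the squared-error total is 950 and the MSE is 105.56.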
Moving Averages and Exponential Smoothing
Now we discuss three forecasting methods
that are appropriate for a time series with a
horizontal pattern:
• Moving Averages
• Weighted Moving Averages
• Exponential Smoothing
They are called smoothing methods
because their objective is to smooth out the
random fluctuations in the time series.
They are most appropriate for short-range
forecasts.
Moving Averages
The moving averages method uses the
average of the most recent k data values in the
time series as the forecast for the next period.
Ft+1 = (sum of the most recent k data values)/k
     = (Yt + Yt-1 + ... + Yt-k+1)/k
where:
Ft+1 = forecast of the time series for period t + 1
Each observation in the moving average
calculation receives the same weight.
Moving Averages
The term moving is used because every
time a new observation becomes available
for the time series, it replaces the oldest
observation in the equation.
As a result, the average will change, or
move, as new observations become
available.
Moving Averages
To use moving averages to forecast, we
must first select the order k, or number of
time series values, to be included in the
moving average.
A smaller value of k will track shifts in a
time series more quickly than a larger value
of k.
If more past observations are considered
relevant, then a larger value of k is better.
Moving Averages
Example: Rosco Drugs
If Rosco Drugs uses a 3-period moving average to
forecast sales, what are the forecasts for weeks 4-11?

Week  Sales    Week  Sales
 1     110      6     120
 2     115      7     130
 3     125      8     115
 4     120      9     110
 5     125     10     130
Moving Averages
Week  Sales  3-MA Forecast
  1    110
  2    115
  3    125
  4    120    116.7   (= (110 + 115 + 125)/3)
  5    125    120.0
  6    120    123.3
  7    130    121.7
  8    115    125.0
  9    110    121.7
 10    130    118.3
 11           118.3
Moving Averages
Moving Average
[Plot: actual sales and 3-week moving average forecasts (value vs. data point, weeks 1-10)]
Moving Averages
Week  Sales  3-MA Forecast  Error  Abs. Error  Squared Error  Abs. % Error
  1    110
  2    115
  3    125
  4    120    116.7           3.3      3.3         10.89          2.75
  5    125    120.0           5.0      5.0         25.00          4.00
  6    120    123.3          -3.3      3.3         10.89          2.75
  7    130    121.7           8.3      8.3         68.89          6.38
  8    115    125.0         -10.0     10.0        100.00          8.70
  9    110    121.7         -11.7     11.7        136.89         10.64
 10    130    118.3          11.7     11.7        136.89          9.00
Total                         3.3     53.3        489.45         44.22
Moving Averages
3-MA Forecast Accuracy
MAE = 53.3/7 = 7.61
MSE = 489.45/7 = 69.92
MAPE = 44.22/7 = 6.32%
The 3-week moving average approach provided
more accurate forecasts than the naïve approach.
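The moving average computation can be sketched in a few lines of Python (names are our own). Because the code carries full precision rather than the one-decimal rounding used in the table, the MAE and MSE come out as 7.62 and 69.84 rather than 7.61 and 69.92; the conclusions are unchanged:

```python
# 3-week moving average forecasts for the Rosco Drugs data.
sales = [110, 115, 125, 120, 125, 120, 130, 115, 110, 130]
k = 3

# The forecast for period t+1 is the mean of the k most recent values.
forecasts = [sum(sales[t - k:t]) / k for t in range(k, len(sales) + 1)]
# forecasts[0] is F4, ..., forecasts[-1] is F11 (next week's forecast).

errors = [y - f for y, f in zip(sales[k:], forecasts)]  # weeks 4-10
n = len(errors)
mae = sum(abs(e) for e in errors) / n
mse = sum(e**2 for e in errors) / n
mape = sum(abs(e) / y * 100 for e, y in zip(errors, sales[k:])) / n
```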
Weighted Moving Averages
• To use this method we must first select the
number of data values to be included in the
average.
• Next, we must choose the weight for each of
the data values.
• The more recent observations are typically
given more weight than older observations.
• For convenience, the weights should sum to 1.
Weighted Moving Averages
• An example of a 3-period weighted moving
average (3WMA) is:
3WMA = .2(110) + .3(115) + .5(125) = 119
The weights (.2, .3, and .5) sum to 1, and the
largest weight (.5) is applied to the most recent
of the three observations.
Weighted Moving Averages
Week  Sales  3-WMA Forecast  Error  Abs. Error  Squared Error  Abs. % Error
  1    110
  2    115
  3    125
  4    120    119.0            1.0      1.0          1.00         0.83
  5    125    120.5            4.5      4.5         20.25         3.60
  6    120    123.5           -3.5      3.5         12.25         2.92
  7    130    121.5            8.5      8.5         72.25         6.54
  8    115    126.0          -11.0     11.0        121.00         9.57
  9    110    120.5          -10.5     10.5        110.25         9.55
 10    130    115.5           14.5     14.5        210.25        11.15
Total                          3.5     53.5        547.25        44.15

MAE = 53.5/7 = 7.64    MSE = 547.25/7 = 78.18    MAPE = 44.15/7 = 6.31%
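Only the weighting step changes relative to the simple moving average. A Python sketch (our own variable names) with the weights .2, .3, .5 applied from oldest to newest:

```python
# 3-period weighted moving average for the Rosco Drugs data.
sales = [110, 115, 125, 120, 125, 120, 130, 115, 110, 130]
weights = [0.2, 0.3, 0.5]   # oldest to newest; the weights sum to 1

forecasts = [sum(w * y for w, y in zip(weights, sales[t - 3:t]))
             for t in range(3, len(sales))]      # F4..F10

errors = [y - f for y, f in zip(sales[3:], forecasts)]
n = len(errors)
mae = sum(abs(e) for e in errors) / n            # 7.64
mse = sum(e**2 for e in errors) / n              # 78.18
mape = sum(abs(e) / y * 100 for e, y in zip(errors, sales[3:])) / n  # 6.31
```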
Exponential Smoothing
• This method is a special case of a weighted
moving averages method; we select only the
weight for the most recent observation.
• The weights for the other data values are
computed automatically and become smaller
as the observations grow older.
• The exponential smoothing forecast is a
weighted average of all the observations in
the time series.
• The term exponential smoothing comes from
the exponential nature of the weighting
scheme for the historical values.
Exponential Smoothing
• Exponential Smoothing Forecast
Ft+1 = aYt + (1 – a)Ft
where:
Ft+1 = forecast of the time series for period t + 1
Yt = actual value of the time series in period t
Ft = forecast of the time series for period t
a = smoothing constant (0 < a < 1)
and let:
F2 = Y1 (to initiate the computations)
Exponential Smoothing
Exponential Smoothing
Forecast
• With some algebraic manipulation, we can
rewrite Ft+1 = aYt + (1 – a)Ft as:
Ft+1 = Ft + a(Yt – Ft)
• We see that the new forecast Ft+1 is equal to
the previous forecast Ft plus an adjustment,
which is a times the most recent forecast
error, Yt – Ft.
Exponential Smoothing
Example: Rosco Drugs
If Rosco Drugs uses exponential smoothing to
forecast sales, which value for the smoothing
constant a, .1 or .8, gives better forecasts?

Week  Sales    Week  Sales
 1     110      6     120
 2     115      7     130
 3     125      8     115
 4     120      9     110
 5     125     10     130
Exponential Smoothing
Using Smoothing Constant Value a = .1
F2  = Y1 = 110
F3  = .1Y2 + .9F2 = .1(115) + .9(110)    = 110.5
F4  = .1Y3 + .9F3 = .1(125) + .9(110.5)  = 111.95
F5  = .1Y4 + .9F4 = .1(120) + .9(111.95) = 112.76
F6  = .1Y5 + .9F5 = .1(125) + .9(112.76) = 113.98
F7  = .1Y6 + .9F6 = .1(120) + .9(113.98) = 114.58
F8  = .1Y7 + .9F7 = .1(130) + .9(114.58) = 116.12
F9  = .1Y8 + .9F8 = .1(115) + .9(116.12) = 116.01
F10 = .1Y9 + .9F9 = .1(110) + .9(116.01) = 115.41
Exponential Smoothing
Using Smoothing Constant Value a = .8
F2  = Y1 = 110
F3  = .8(115) + .2(110)    = 114
F4  = .8(125) + .2(114)    = 122.80
F5  = .8(120) + .2(122.80) = 120.56
F6  = .8(125) + .2(120.56) = 124.11
F7  = .8(120) + .2(124.11) = 120.82
F8  = .8(130) + .2(120.82) = 128.16
F9  = .8(115) + .2(128.16) = 117.63
F10 = .8(110) + .2(117.63) = 111.53
Exponential Smoothing (a = .1)
a = .1
Week  Sales  Forecast  Error  Abs. Error  Squared Error  Abs. % Error
  1    110
  2    115   110.00     5.00     5.00        25.00          4.35
  3    125   110.50    14.50    14.50       210.25         11.60
  4    120   111.95     8.05     8.05        64.80          6.71
  5    125   112.76    12.24    12.24       149.94          9.79
  6    120   113.98     6.02     6.02        36.25          5.02
  7    130   114.58    15.42    15.42       237.73         11.86
  8    115   116.12    -1.12     1.12         1.26          0.97
  9    110   116.01    -6.01     6.01        36.12          5.46
 10    130   115.41    14.59    14.59       212.87         11.22
Total                           82.95       974.22         66.98
Exponential Smoothing (a = .1)
Forecast Accuracy
MAE = 82.95/9 = 9.22
MSE = 974.22/9 = 108.25
MAPE = 66.98/9 = 7.44%
Exponential smoothing (with a = .1)
provided less accurate forecasts than the
3-MA approach.
Exponential Smoothing (a = .8)
a = .8
Week  Sales  Forecast  Error  Abs. Error  Squared Error  Abs. % Error
  1    110
  2    115   110.00     5.00     5.00        25.00          4.35
  3    125   114.00    11.00    11.00       121.00          8.80
  4    120   122.80    -2.80     2.80         7.84          2.33
  5    125   120.56     4.44     4.44        19.71          3.55
  6    120   124.11    -4.11     4.11        16.91          3.43
  7    130   120.82     9.18     9.18        84.23          7.06
  8    115   128.16   -13.16    13.16       173.30         11.44
  9    110   117.63    -7.63     7.63        58.26          6.94
 10    130   111.53    18.47    18.47       341.27         14.21
Total                           75.79       847.52         62.11
Exponential Smoothing (a = .8)
Forecast Accuracy
MAE = 75.79/9 = 8.42
MSE = 847.52/9 = 94.17
MAPE = 62.11/9 = 6.90%
Exponential smoothing with a = .8 provided
more accurate forecasts than exponential
smoothing with a = .1, but less accurate
forecasts than the moving average (with k = 3).
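The recursion Ft+1 = aYt + (1 − a)Ft is a three-line loop in Python. This sketch (function names are our own) runs both smoothing constants and computes the accuracy measures at full precision; note the week-4 error for a = .8 is 120 − 122.80 = −2.80:

```python
# Exponential smoothing of the Rosco Drugs data for two smoothing constants.
sales = [110, 115, 125, 120, 125, 120, 130, 115, 110, 130]

def smooth(series, a):
    """Return the forecasts F2..Fn, with F2 initialized to Y1."""
    f = [series[0]]                        # F2 = Y1
    for y in series[1:-1]:
        f.append(a * y + (1 - a) * f[-1])  # F(t+1) = a*Yt + (1-a)*Ft
    return f

def accuracy(series, forecasts):
    errs = [y - f for y, f in zip(series[1:], forecasts)]
    n = len(errs)
    mae = sum(abs(e) for e in errs) / n
    mse = sum(e**2 for e in errs) / n
    mape = sum(abs(e) / y * 100 for e, y in zip(errs, series[1:])) / n
    return mae, mse, mape

mae1, mse1, mape1 = accuracy(sales, smooth(sales, 0.1))  # a = .1
mae8, mse8, mape8 = accuracy(sales, smooth(sales, 0.8))  # a = .8
```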
Trend Projection
If a time series exhibits a linear trend, the
method of least squares may be used to
determine a trend line (projection) for future
forecasts.
Least squares, also used in regression analysis,
determines the unique trend line forecast which
minimizes the mean square error between the
trend line forecasts and the actual observed
values for the time series.
The independent variable is the time period and
the dependent variable is the actual observed
value in the time series.
Linear Trend Regression
Using the method of least squares, the formula
for the trend projection is:
Tt = b0 + b1t
where:
Tt = linear trend forecast in period t
b0 = intercept of the linear trend line
b1 = slope of the linear trend line
t = time period
Linear Trend Regression
For the trend projection equation Tt = b0 + b1t:

b1 = Σ(t − t̄)(Yt − Ȳ) / Σ(t − t̄)²   (sums over t = 1, ..., n)
b0 = Ȳ − b1t̄

where:
Yt = value of the time series in period t
n = number of time periods
Ȳ = average value of the time series (observations)
t̄ = average value of t
Linear Trend Regression
Example: Auger's Plumbing Service
The number of plumbing repair jobs performed by
Auger's Plumbing Service in the last nine months is
listed below. Forecast the number of repair jobs
Auger's will perform in December using the least
squares method.

Month  Jobs    Month      Jobs
March   353    August      409
April   387    September   399
May     342    October     412
June    374    November    408
July    396
Linear Trend Regression
Month   t   (t − t̄)   (t − t̄)²   Yt    (Yt − Ȳ)   (t − t̄)(Yt − Ȳ)
(Mar.)  1     -4         16       353    -33.67       134.68
(Apr.)  2     -3          9       387      0.33        -0.99
(May)   3     -2          4       342    -44.67        89.34
(June)  4     -1          1       374    -12.67        12.67
(July)  5      0          0       396      9.33         0
(Aug.)  6      1          1       409     22.33        22.33
(Sep.)  7      2          4       399     12.33        24.66
(Oct.)  8      3          9       412     25.33        75.99
(Nov.)  9      4         16       408     21.33        85.32
Sum    45                60      3480                 444.00
Linear Trend Regression
t̄ = 45/9 = 5     Ȳ = 3480/9 = 386.67
b1 = 444/60 = 7.4
b0 = Ȳ − b1t̄ = 386.67 − 7.4(5) = 349.67
Trend line: Tt = 349.67 + 7.4t
T10 = 349.67 + (7.4)(10) = 423.67
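The least squares formulas translate directly into Python (a plain-stdlib sketch; names are our own):

```python
# Least squares trend line for the Auger's Plumbing Service data
# (t = 1 for March, ..., t = 9 for November; forecast December, t = 10).
jobs = [353, 387, 342, 374, 396, 409, 399, 412, 408]
n = len(jobs)
t = list(range(1, n + 1))

t_bar = sum(t) / n                       # 5
y_bar = sum(jobs) / n                    # 386.67

b1 = sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, jobs)) \
     / sum((ti - t_bar) ** 2 for ti in t)          # 444/60 = 7.4
b0 = y_bar - b1 * t_bar                            # 349.67

forecast_dec = b0 + b1 * 10                        # T10 = 423.67
```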
Nonlinear Trend Regression
Sometimes time series have a curvilinear or
nonlinear trend.
A variety of nonlinear functions can be used to
develop an estimate of the trend in a time
series.
One example is this quadratic trend equation:
Tt = b0 + b1t + b2t²
Another example is this exponential trend
equation:
Tt = b0(b1)^t
Nonlinear Trend Regression
Example: Cholesterol Drug Revenue
The annual revenue in millions of dollars for a
cholesterol drug for the first 10 years of sales is shown
below. A curvilinear function appears to be needed to
model the long-term trend.

Year  Revenue    Year  Revenue
 1     23.1       6     43.2
 2     21.3       7     59.5
 3     27.4       8     64.4
 4     34.6       9     74.2
 5     33.8      10     99.3
Nonlinear Trend Regression
Cholesterol Drug Revenue
[Plot: revenue ($ millions) vs. year (0-12), with fitted quadratic trend
f(x) = 0.9216x² − 2.1060x + 24.1817]
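A quadratic trend can be fit with ordinary least squares; `numpy.polyfit` does this in one call. The sketch below (our own variable names, assuming NumPy is available) reproduces the coefficients shown on the fitted curve:

```python
# Fit a quadratic trend to the cholesterol drug revenue series.
import numpy as np

year = np.arange(1, 11)
revenue = np.array([23.1, 21.3, 27.4, 34.6, 33.8,
                    43.2, 59.5, 64.4, 74.2, 99.3])

# polyfit returns coefficients highest power first: [b2, b1, b0].
b2, b1, b0 = np.polyfit(year, revenue, deg=2)
# Fitted trend: Tt = 24.1817 - 2.1060*t + 0.9216*t^2

t11 = b0 + b1 * 11 + b2 * 11**2   # trend projection for year 11
```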
Seasonality without Trend
To the extent that seasonality exists, we need
to incorporate it into our forecasting models to
ensure accurate forecasts.
We will first look at the case of a seasonal
time series with no trend and then discuss
how to model seasonality with trend.
Seasonality without Trend
Example: Umbrella Sales
Year Quarter 1 Quarter 2 Quarter 3 Quarter 4
1 125 153 106 88
2 118 161 133 102
3 138 144 113 80
4 109 137 125 109
5 130 165 128 96
Sometimes it is difficult to identify patterns in
a time series presented in a table.
Plotting the time series can be very
informative.
Seasonality without Trend
Umbrella Sales Time Series Plot
Seasonality without Trend
The time series plot does not indicate any long-
term trend in sales.
However, close inspection of the plot does
reveal a seasonal pattern.
The first and third quarters have moderate
sales,
the second quarter the highest sales, and
the fourth quarter tends to be the lowest
quarter in terms of sales.
Seasonality without Trend
Recall from an earlier chapter that dummy
variables can be used to deal with categorical
independent variables in a multiple regression
model.
We will treat the season as a categorical
variable.
Recall that when a categorical variable has k
levels, k – 1 dummy variables are required.
If there are four seasons, we need three
dummy variables.
Qtr1 = 1 if Quarter 1, 0 otherwise
Qtr2 = 1 if Quarter 2, 0 otherwise
Qtr3 = 1 if Quarter 3, 0 otherwise
Seasonality without Trend
General form of the estimated regression equation:
Ŷ = b0 + b1(Qtr1) + b2(Qtr2) + b3(Qtr3)
Estimated regression equation:
Sales = 95.0 + 29.0(Qtr1) + 57.0(Qtr2) + 26.0(Qtr3)
The forecasts of quarterly sales in year 6 are:
Quarter 1: Sales = 95 + 29(1) + 57(0) + 26(0) = 124
Quarter 2: Sales = 95 + 29(0) + 57(1) + 26(0) = 152
Quarter 3: Sales = 95 + 29(0) + 57(0) + 26(1) = 121
Quarter 4: Sales = 95 + 29(0) + 57(0) + 26(0) = 95
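The dummy variable regression can be sketched with NumPy's least squares solver (our own variable names; with three dummies and an intercept, the coefficients work out to the quarterly means relative to the Quarter 4 baseline):

```python
# Seasonal (dummy variable) regression for the umbrella sales data, no trend.
import numpy as np

# Quarterly sales, years 1-5, quarters 1-4 within each year.
sales = np.array([125, 153, 106,  88,
                  118, 161, 133, 102,
                  138, 144, 113,  80,
                  109, 137, 125, 109,
                  130, 165, 128,  96], dtype=float)

quarter = np.tile([1, 2, 3, 4], 5)
X = np.column_stack([np.ones(20),
                     (quarter == 1).astype(float),   # Qtr1 dummy
                     (quarter == 2).astype(float),   # Qtr2 dummy
                     (quarter == 3).astype(float)])  # Qtr3 dummy

b, *_ = np.linalg.lstsq(X, sales, rcond=None)
b0, b1, b2, b3 = b                    # 95, 29, 57, 26

# Year-6 forecasts, one per quarter.
q1, q2, q3, q4 = b0 + b1, b0 + b2, b0 + b3, b0   # 124, 152, 121, 95
```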
Seasonality and Trend
We will now extend the regression approach to
include situations where the time series
contains both a seasonal effect and a linear
trend.
We will introduce an additional independent
variable to represent time.
Seasonality and Trend
Example: Terry's Tie Shop
Business at Terry's Tie Shop can be viewed as
falling into three distinct seasons: (1) Christmas
(November and December); (2) Father's Day (late
May to mid-June); and (3) all other times.
Average weekly sales ($) during each of the
three seasons during the past four years are shown on
the next slide.
Seasonality and Trend
Example: Terry’s Tie Shop
Season
Year 1 2 3
1 1856 2012 985
2 1995 2168 1072
3 2241 2306 1105
4 2280 2408 1120
Determine a forecast for the average weekly
sales in year 5 for each of the three seasons.
Seasonality and Trend
There are three seasons, so we will need two
dummy variables.
Seas1 = 1 if Season 1, 0 otherwise
Seas2 = 1 if Season 2, 0 otherwise
General form of the estimated regression equation:
Ŷ = b0 + b1(Seas1) + b2(Seas2) + b3(t)
Estimated regression equation:
Sales = 797.0 + 1095.43(Seas1) + 1189.47(Seas2) + 36.47(t)
Seasonality and Trend
The forecasts of average weekly sales in the
three seasons of year 5 are:
Seas. 1: Sales = 797 + 1095.43(1) + 1189.47(0) + 36.47(13)
= 2366.5
Seas. 2: Sales = 797 + 1095.43(0) + 1189.47(1) + 36.47(14)
= 2497.0
Seas. 3: Sales = 797 + 1095.43(0) + 1189.47(0) + 36.47(15)
= 1344.0
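Adding the time period t as one more column to the dummy variable design matrix captures the trend. A NumPy sketch (our own variable names) reproduces the coefficients and the year-5 forecasts:

```python
# Seasonal dummies plus a linear trend for the Terry's Tie Shop data.
import numpy as np

# Average weekly sales, years 1-4, seasons 1-3 (t = 1, ..., 12).
sales = np.array([1856, 2012,  985,
                  1995, 2168, 1072,
                  2241, 2306, 1105,
                  2280, 2408, 1120], dtype=float)

t = np.arange(1, 13)
season = np.tile([1, 2, 3], 4)
X = np.column_stack([np.ones(12),
                     (season == 1).astype(float),  # Seas1 dummy
                     (season == 2).astype(float),  # Seas2 dummy
                     t])                           # trend term

b, *_ = np.linalg.lstsq(X, sales, rcond=None)
b0, b1, b2, b3 = b          # 797.0, 1095.43, 1189.47, 36.47

# Year-5 forecasts (t = 13, 14, 15).
f1 = b0 + b1 + b3 * 13      # Season 1: 2366.5
f2 = b0 + b2 + b3 * 14      # Season 2: 2497.0
f3 = b0 + b3 * 15           # Season 3: 1344.0
```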
Time Series Decomposition
Time series decomposition can be used to
separate or decompose a time series into
seasonal, trend, and irregular (error)
components.
While this method can be used for forecasting,
its primary applicability is to get a better
understanding of the time series.
Understanding what is really going on with a
time series often depends upon the use of
deseasonalized data.
Time Series Decomposition
Decomposition methods assume that the
actual time series value at period t is a
function of three components: trend,
seasonal, and irregular.
How these three components are combined to
give the observed values of the time series
depends upon whether we assume the
relationship is best described by an additive
or a multiplicative model.
Time Series Decomposition
Additive Model
An additive model follows the form:
Yt = Trendt + Seasonalt + Irregulart
where:
Trendt = trend value at time period t
Seasonalt = seasonal value at time period t
Irregulart = irregular value at time period t
An additive model is appropriate in situations
where the seasonal fluctuations do not
depend upon the level of the time series.
Time Series Decomposition
Multiplicative Model
A multiplicative model follows the form:
Yt = Trendt x Seasonalt x Irregulart
where:
Trendt = trend value at time period t
Seasonalt = seasonal value at time period t
Irregulart = irregular value at time period t
A multiplicative model is appropriate, for
example, if the seasonal fluctuations grow
larger as the sales volume increases because
of a long-term linear trend.
Time Series Decomposition
• Example: Terry’s Tie Shop
Season
Year 1 2 3
1 1856 2012 985
2 1995 2168 1072
3 2241 2306 1105
4 2280 2408 1120
Determine a forecast for the average weekly
sales in year 5 for each of the three seasons.
Calculating the Seasonal Indexes
Step 1. Calculate the centered moving
averages.
There are three distinct seasons in each
year. Hence, take a three-season moving
average to eliminate seasonal and irregular
factors. For example:
1st CMA = (1856 + 2012 + 985)/3 = 1617.67
2nd CMA = (2012 + 985 + 1995)/3 = 1664.00
Etc.
Calculating the Seasonal Indexes
Step 2. Center the CMAs on integer-valued periods.
The first centered moving average computed in
step 1 (1617.67) is centered on season 2 of
year 1. Note that the moving averages from
step 1 center themselves on integer-valued
periods because n is an odd number.
Calculating the Seasonal Indexes
Year  Season  Dollar Sales (Yt)  Moving Average
 1      1          1856
        2          2012            1617.67   (= (1856 + 2012 + 985)/3)
        3           985            1664.00
 2      1          1995            1716.00
        2          2168            1745.00
        3          1072            1827.00
 3      1          2241            1873.00
        2          2306            1884.00
        3          1105            1897.00
 4      1          2280            1931.00
        2          2408            1936.00
        3          1120
Calculating the Seasonal Indexes
The centered moving average values tend to
“smooth out” both the seasonal and irregular
fluctuations in the time series.
The centered moving averages represent the
trend in the data and any random variation that
was not removed by using the moving averages
to smooth the data.
Calculating the Seasonal Indexes
Step 3. Determine the seasonal & irregular factors
(St It ). By dividing each actual by the moving
average for the same time period, we
identify the combined seasonal-irregular
effect in the time series.
St It = Yt /(Moving Average for period t )
Calculating the Seasonal Indexes
Year  Season  Dollar Sales (Yt)  Moving Average   StIt
 1      1          1856
        2          2012            1617.67         1.244   (= 2012/1617.67)
        3           985            1664.00          .592
 2      1          1995            1716.00         1.163
        2          2168            1745.00         1.242
        3          1072            1827.00          .587
 3      1          2241            1873.00         1.196
        2          2306            1884.00         1.224
        3          1105            1897.00          .582
 4      1          2280            1931.00         1.181
        2          2408            1936.00         1.244
        3          1120
Calculating the Seasonal Indexes
Step 4. Determine the average seasonal factors.
Averaging all St It values corresponding to
that season:
Season 1: (1.163 + 1.196 + 1.181) /3 = 1.180
Season 2: (1.244 + 1.242 + 1.224 + 1.244) /4 = 1.238
Season 3: (.592 + .587 + .582) /3 = .587
Total: 3.005
Calculating the Seasonal Indexes
Step 5. Scale the seasonal factors (St ).
Average the seasonal factors = (1.180 +
1.238 + .587)/3 = 1.002. Then, divide each
seasonal factor by the average of the
seasonal factors.
Season 1: 1.180/1.002 = 1.178
Season 2: 1.238/1.002 = 1.236
Season 3: .587/1.002 = .586
Total: 3.000
Calculating the Seasonal Indexes
Year  Season  Dollar Sales (Yt)  Moving Average   StIt   Scaled St
 1      1          1856                                    1.178
        2          2012            1617.67         1.244   1.236
        3           985            1664.00          .592    .586
 2      1          1995            1716.00         1.163   1.178
        2          2168            1745.00         1.242   1.236
        3          1072            1827.00          .587    .586
 3      1          2241            1873.00         1.196   1.178
        2          2306            1884.00         1.224   1.236
        3          1105            1897.00          .582    .586
 4      1          2280            1931.00         1.181   1.178
        2          2408            1936.00         1.244   1.236
        3          1120                                     .586
Using the Deseasonalized Time Series
to Identify Trend
Step 6. Determine the deseasonalized data.
Divide the data point values, Yt , by St .
Using the Deseasonalized Time Series
to Identify Trend
Year  Season  Dollar Sales (Yt)  Moving Average   StIt   Scaled St   Yt/St
 1      1          1856                                    1.178      1576   (= 1856/1.178)
        2          2012            1617.67         1.244   1.236      1628
        3           985            1664.00          .592    .586      1681
 2      1          1995            1716.00         1.163   1.178      1694
        2          2168            1745.00         1.242   1.236      1754
        3          1072            1827.00          .587    .586      1829
 3      1          2241            1873.00         1.196   1.178      1902
        2          2306            1884.00         1.224   1.236      1866
        3          1105            1897.00          .582    .586      1886
 4      1          2280            1931.00         1.181   1.178      1935
        2          2408            1936.00         1.244   1.236      1948
        3          1120                                     .586      1911
Using the Deseasonalized Time Series
to Identify Trend
Step 7. Determine a trend line of the
deseasonalized data.
Using the least squares method for
t = 1, 2, ..., 12 gives:
Tt = b0 + b1t
Deseasonalized Salest = 1580.11 + 33.96t
Using the Deseasonalized Time Series
to Identify Trend
Step 8. Determine the deseasonalized predictions.
Substitute t = 13, 14, and 15 into the least
squares equation:
T13 = 1580.11 + (33.96)(13) = 2022
T14 = 1580.11 + (33.96)(14) = 2056
T15 = 1580.11 + (33.96)(15) = 2090
Seasonal Adjustments
Step 9. Take into account the
seasonality.
Multiply each deseasonalized prediction
by its seasonal factor to give the following
forecasts for year 5:
Season 1: (1.178)(2022) = 2382
Season 2: (1.236)(2056) = 2541
Season 3: ( .586)(2090) = 1225
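Steps 1 through 9 can be strung together in a short NumPy sketch (our own variable names). Because it carries full precision instead of the rounded intermediate values in the worked slides, its trend coefficients and forecasts differ from the slide values by a point or two:

```python
# Multiplicative decomposition of the Terry's Tie Shop series (steps 1-9).
import numpy as np

sales = np.array([1856, 2012,  985,
                  1995, 2168, 1072,
                  2241, 2306, 1105,
                  2280, 2408, 1120], dtype=float)
k = 3                                            # three seasons per year

# Steps 1-2: centered moving averages (k is odd, so they center themselves).
cma = np.convolve(sales, np.ones(k) / k, mode="valid")   # periods 2..11

# Step 3: combined seasonal-irregular factors St*It.
si = sales[1:-1] / cma

# Steps 4-5: average the factors by season, then scale them to average 1.
periods = np.arange(2, 12)                       # periods covered by si
raw = np.array([si[(periods - 1) % k == j].mean() for j in range(k)])
st = raw / raw.mean()                            # ~ [1.178, 1.236, 0.586]

# Step 6: deseasonalize the data.
deseason = sales / st[(np.arange(1, 13) - 1) % k]

# Step 7: least squares trend on the deseasonalized data.
b1, b0 = np.polyfit(np.arange(1, 13), deseason, 1)

# Steps 8-9: project the trend for t = 13, 14, 15 and reseasonalize.
forecasts = (b0 + b1 * np.arange(13, 16)) * st   # ~ [2381, 2541, 1225]
```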
End of Chapter 17, Part B