Time Series Analysis Methods

… it is now possible to get the R garbage collector to do this via a finalizer.

Peter J. Brockwell and Richard A. Davis. Introduction to Time Series and Forecasting. Springer-Verlag, New York, 1996.

P. J. Diggle. Time Series: A Biostatistical Introduction. Oxford University Press, Oxford, 1990.

J. Durbin and S. J. Koopman, editors. Time Series Analysis by State Space Methods. Oxford University Press, Oxford, 2001. ISBN 0-19-852354-8.

G. Gardner, A. C. Harvey, and G. D. A. Phillips. Algorithm AS 154. An algorithm for exact maximum likelihood estimation of autoregressive-moving average models by means of Kalman filtering. Applied Statistics, 29:311–322, 1980.

A. C. Harvey. Forecasting, Structural Time Series Models and the Kalman Filter. Cambridge University Press, 1989.

W. N. Venables and B. D. Ripley. Modern Applied Statistics with S-PLUS. Springer-Verlag, New York, third edition, 1999.

W. N. Venables and B. D. Ripley. Modern Applied Statistics with S. Springer-Verlag, New York, fourth edition, 2002.

Mike West and Jeff Harrison. Bayesian Forecasting and Dynamic Models. Springer-Verlag, New York, second edition, 1997. ISBN 0-387-94725-6.

Brian D. Ripley
University of Oxford, UK
[email protected]
Exponential smoothing
by David Meyer
Exponential smoothing methods forecast time series by discounting past observations. They have become very popular because of their (relative) simplicity combined with their good overall performance. Common applications range from business tasks (e.g., forecasting of sales or stock fluctuations) to environmental studies (e.g., measurements of atmospheric components or rainfall data), typically with no more a priori knowledge than the possible existence of trend or seasonal patterns. Such methods are sometimes also called naive because no covariates are used in the models, i.e., the data are assumed to be self-explaining. Their success is rooted in the fact that they belong to a class of local models which automatically adapt their parameters to the data during the estimation procedure and therefore implicitly account for (slow) structural changes in the training data. Moreover, because the influence of new data is controlled by hyperparameters, the effect is a smoothing of the original time series.

Among the simplest methods is ordinary exponential smoothing, which assumes no trend and no seasonality. Holt's linear trend method (see Holt, 1957) and Winters' extensions (see Winters, 1960) add a trend and a seasonal component (the latter either additive or multiplicative). These methods are still surprisingly popular, although many extensions and more general frameworks do exist. We describe briefly the methods implemented in package ts.

Let Yt denote a univariate time series. Exponential smoothing assumes that the forecast Ŷ for period t + h based on period t is given by a variable level â at period t:

    Ŷt+h = ât                                          (1)

which is recursively estimated by a weighted average of the observed and the predicted value for Yt:

    ât = α Yt + (1 − α) Ŷt
       = α Yt + (1 − α) ât−1

with 0 < α < 1 called the smoothing parameter; the smaller it is chosen, the less sensitive Ŷ becomes to changes (i.e., the smoother the forecast time series will be). The initial value â1 is usually chosen as Y1.

An equivalent formulation for ât is the so-called error-correction form:

    ât = ât−1 + α (Yt − ât−1)
       = ât−1 + α (Yt − Ŷt)                            (2)
       = ât−1 + α et

showing that ât can be estimated by ât−1 plus an error et made in period t. (We will come back to this later on.)

Exponential smoothing can be seen as a special case of the Holt-Winters method with no trend and no seasonal component.
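These recursions are easy to verify outside of R as well. The following Python sketch (an illustration added here, not code from package ts; the data are made up) computes the level ât both via the weighted-average form and via the error-correction form (2), and checks that the two formulations agree:

```python
def ses_levels(y, alpha):
    """Ordinary exponential smoothing, weighted-average form:
    a_t = alpha * Y_t + (1 - alpha) * a_{t-1}, with a_1 = Y_1."""
    a = [y[0]]                       # initial level a_1 = Y_1
    for t in range(1, len(y)):
        a.append(alpha * y[t] + (1 - alpha) * a[-1])
    return a

def ses_levels_ec(y, alpha):
    """Equivalent error-correction form: a_t = a_{t-1} + alpha * e_t,
    where e_t = Y_t - Yhat_t and Yhat_t = a_{t-1} is the one-step forecast."""
    a = [y[0]]
    for t in range(1, len(y)):
        e = y[t] - a[-1]             # one-step-ahead forecast error
        a.append(a[-1] + alpha * e)
    return a

y = [1130.0, 1100.0, 774.0, 840.0, 874.0]   # made-up, level-only data
la = ses_levels(y, alpha=0.2)
lb = ses_levels_ec(y, alpha=0.2)
assert all(abs(u - v) < 1e-9 for u, v in zip(la, lb))
```

Note that the h-step-ahead forecast from period t is flat, Ŷt+h = ât for every h, which is why this method is only appropriate for series without trend and seasonality.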
As an example, we use the Nile data, for which an intercept-only model seems appropriate¹. We will try to predict the values from 1937 on, choosing α = 0.2:

library(ts)
data(Nile)
past <- window(Nile, end = 1936)
future <- window(Nile, start = 1937)
alpha <- 0.2

To obtain a level-only model, we set the other hyperparameters β and γ (see the next sections) to 0:

model <- HoltWinters(past, alpha = alpha,
                     beta = 0, gamma = 0)

The object model contains an object of class "HoltWinters". The fitted values (obtained by fitted(model)) should of course be identical² to those given by filter:

filter(alpha * past, filter = 1 - alpha,
       method = "recursive", init = past[1])

We predict 43 periods ahead (until the end of the series):

pred <- predict(model, n.ahead = 43)

The plot method for "HoltWinters" objects shows the observed and fitted values, and optionally adds the predicted values (see figure 1):

plot(model, predicted.values = pred)
lines(future)

Figure 1: Exponential smoothing for the Nile data up to 1937, and the predictions in 1937 of the rest of the series. The original series is shown in gray.

The coefficients (here only the level) are listed in the output produced by print() or can be extracted with coef().

¹ At least piecewise: note that one might expect a structural break in 1899 due to the construction of the first Aswan dam.
² Currently the initial value is only observation 2, as it is for the trend model (for which this is sensible). This will be changed in R 1.5.1.

Holt's linear trend method

An obvious extension is to include an additional trend component, say b̂t. The forecast equation becomes:

    Ŷt+h = ât + h · b̂t

with updating formulas expressing similar ideas to those for exponential smoothing. There is an additional formula for the trend component:

    ât = α Yt + (1 − α)(ât−1 + b̂t−1)
    b̂t = β (ât − ât−1) + (1 − β) b̂t−1

The result is a local fit of a straight line with the coefficients adjusted for new values. We now have one more parameter to choose: β. A straightforward approach to finding the optimal values for both α and β is to look for the OLS estimates, i.e., the parameter combination minimizing the sum of squared errors of the one-step-ahead predictions. That is what HoltWinters does for all the unspecified parameters, illustrated below for the Australian residents data:

data(austres)
past <- window(austres, start = c(1985, 1),
               end = c(1989, 4))
future <- window(austres, start = c(1990, 1))
(model <- HoltWinters(past, gamma = 0))

Holt-Winters exponential smoothing with trend

Call:

Smoothing parameters:
 alpha: 0.931416
 beta : 0.494141
 gamma: 0

Coefficients:
        [,1]
a 16956.68845
b    63.58486

The start values for ât and b̂t are Y[2] and Y[2] − Y[1], respectively; the parameters are estimated via optim using method = "L-BFGS-B", restricting the parameters to the unit cube.

Another feature is the optional computation of confidence intervals for the predicted values, which by default are plotted along with them (see figure 2):

pred <- predict(model, n.ahead = 14,
                prediction.interval = TRUE)
plot(model, pred); lines(future)
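The updating formulas can likewise be checked numerically. The following Python sketch (an illustration only, not the HoltWinters implementation: α and β are fixed by hand here instead of being estimated by least squares, and the data are made up) runs the level and trend recursions with the start values described in the text, â = Y[2] and b̂ = Y[2] − Y[1]:

```python
def holt(y, alpha, beta, h=1):
    """Holt's linear trend recursions with fixed smoothing parameters.
    Start values as in the text: a = Y[2], b = Y[2] - Y[1] (1-based),
    i.e. y[1] and y[1] - y[0] here.  Returns (level, trend, h-step forecast)."""
    a, b = y[1], y[1] - y[0]
    for t in range(2, len(y)):
        a_prev = a
        # level:  a_t = alpha * Y_t + (1 - alpha) * (a_{t-1} + b_{t-1})
        a = alpha * y[t] + (1 - alpha) * (a + b)
        # trend:  b_t = beta * (a_t - a_{t-1}) + (1 - beta) * b_{t-1}
        b = beta * (a - a_prev) + (1 - beta) * b
    return a, b, a + h * b          # forecast: Yhat_{t+h} = a_t + h * b_t

# On an exactly linear series the recursion reproduces the slope,
# and the two-step forecast simply extrapolates the line.
y = [10.0, 12.0, 14.0, 16.0, 18.0, 20.0]
level, trend, forecast = holt(y, alpha=0.5, beta=0.3, h=2)
```

For the perfectly linear input above, the level ends at 20, the trend at 2, and the two-step-ahead forecast at 24, matching the forecast equation Ŷt+h = ât + h · b̂t.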
Figure 2: Holt's linear trend method applied to data on Australian residents, with 95% confidence limits in dashed blue lines. Original data are plotted in gray, fitted and predicted values in red.

… trend component. As an example, we apply the multiplicative version:

data(UKgas)
past <- window(UKgas, end = c(1980, 4))
future <- window(UKgas, start = c(1981, 1))
par(mfrow = c(2,1))
plot(UKgas, main = "Original Time Series")
plot(model, pred)

The error-correction form (2) gives

    (1 − L) Ŷt = α et−1

and hence, since Yt = Ŷt + et,

    (1 − L) Yt = et − (1 − α) et−1 = (1 − θ L) et

(θ = 1 − α), which is (in terms of estimates) the standard form of an ARIMA(0,1,1) process. Finally, we note that we actually face state-space models: given the process Yt, we try to estimate the underlying processes at, bt and st, which cannot be observed directly. But this is the story of structural time series and the function structTS, told in another article by Brian D. Ripley …

Hyndman, R., Koehler, A., Snyder, R., & Grose, S. (2001). A state space framework for automatic forecasting using exponential smoothing methods. Journal of Forecasting. To appear.

Winters, P. (1960). Forecasting sales by exponentially weighted moving averages. Management Science, 6, 324–342.

David Meyer
Technische Universität Wien, Austria
[email protected]