Elements of Forecasting
Introduction to Forecasting:
1. Forecasting in Action
a. Operations planning and control
b. Marketing
c. Economics
d. Financial speculation
e. Financial risk management
f. Capacity planning
g. Business and government budgeting
i. Demography
j. Crisis management
Forecasts and decisions
The object to be forecast
Forecast types
The forecast horizon
The information set
Methods and complexity, the parsimony principle, and the shrinkage principle
Why graphical analysis is important
Simple graphical techniques
Elements of graphical style
Application: graphing four components of real GNP
Modeling trend
Estimating trend models
Forecasting trend
Selecting forecasting models using the Akaike and Schwarz criteria
Application: forecasting retail sales
The nature and sources of seasonality
Modeling seasonality
Forecasting seasonal series
Application: forecasting housing starts
Characterizing Cycles
Covariance stationary time series
White noise
The lag operator
Wold's theorem, the general linear process, and rational distributed lags
Estimation and inference for the mean, autocorrelation and partial autocorrelation functions
Application: characterizing Canadian employment dynamics
Moving-average (MA) models
Autoregressive (AR) models
Autoregressive moving average (ARMA) models
Application: specifying and estimating models for forecasting employment
Forecasting Cycles
Optimal forecasts
Forecasting moving average processes
Forecasting infinite-ordered moving averages
Making the forecasts operational
The chain rule of forecasting
Application: forecasting employment
Putting it all Together: A Forecasting Model with Trend, Seasonal and Cyclical Components
Assembling what we've learned
Application: forecasting liquor sales
Recursive estimation procedures for diagnosing and selecting forecasting models
Forecasting with Regression Models
Conditional forecasting models and scenario analysis
Accounting for parameter uncertainty in confidence intervals for conditional forecasts
Unconditional forecasting models
Distributed lags, polynomial distributed lags, and rational distributed lags
Regressions with lagged dependent variables, regressions with ARMA disturbances, and transfer function models
Vector autoregressions
Predictive causality
Impulse-response functions and variance decompositions
Application: housing starts and completions
Evaluating a single forecast
Evaluating two or more forecasts: comparing forecast accuracy
Forecast encompassing and forecast combination
Application: OverSea shipping volume on the Atlantic East trade lane
Stochastic trends and forecasting
Unit roots: estimation and testing
Application: modeling and forecasting the yen/dollar exchange rate
Smoothing
Exchange rates, continued
Volatility Measurement, Modeling and Forecasting
The basic ARCH process
The GARCH process
Extensions of ARCH and GARCH models
Estimating, forecasting and diagnosing GARCH models
Application: stock market volatility
Chatfield, C. (1996), The Analysis of Time Series: An Introduction, Fifth Edition. London: Chapman and Hall.
Granger, C.W.J. and Newbold, P. (1986), Forecasting Economic Time Series, Second Edition. Orlando, Florida: Academic Press.
Harvey, A.C. (1993), Time Series Models, Second Edition. Cambridge, Mass.: MIT Press.
Hamilton, J.D. (1994), Time Series Analysis. Princeton: Princeton University Press.
Special insights:
Armstrong, J.S. (Ed.) (1999), The Principles of Forecasting. Norwell, Mass.: Kluwer Academic Publishers.
Makridakis, S. and Wheelwright, S.C. (1997), Forecasting: Methods and Applications, Third Edition. New York: John Wiley.
Bails, D.G. and Peppers, L.C. (1997), Business Fluctuations. Englewood Cliffs: Prentice Hall.
Taylor, S. (1996), Modeling Financial Time Series, Second Edition. New York: Wiley.
Journals
Journal of Forecasting
International Journal of Forecasting
Journal of Business Forecasting Methods and Systems
Journal of Business and Economic Statistics
Review of Economics and Statistics
Journal of Applied Econometrics
Software
Online Information
Discrete Random Variable
Discrete Probability Distribution
Continuous Random Variable
Probability Density Function
Moment
Mean, or Expected Value
Location, or Central Tendency
Variance
Dispersion, or Scale
Standard Deviation
Skewness (Asymmetry)
Kurtosis (Leptokurtosis)
Normal, or Gaussian, Distribution
Marginal Distribution
Joint Distribution
Covariance
Correlation
Conditional Distribution
Conditional Moment
Conditional Mean
Conditional Variance
Population Distribution
Sample
Estimator
Statistic, or Sample Statistic
Sample Mean
Sample Variance
Sample Standard Deviation
Sample Skewness
Sample Kurtosis
Fitted values: ŷ_t = β̂_0 + β̂_1 x_t
Residuals: e_t = y_t − ŷ_t
Multiple regression: y_t = β_0 + β_1 x_t + β_2 z_t + ε_t
F-statistic 30.89
R-squared 0.58
Regression of y on x and z
LS // Dependent Variable is Y
Sample: 1960 2007
Included observations: 48

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C           9.884732     0.190297     51.94359      0.0000
X           1.073140     0.150341      7.138031     0.0000
Z          -0.638011     0.172499     -3.698642     0.0006

Mean dependent var      10.08241
S.D. dependent var       1.908842
Akaike info criterion    3.429780
Schwarz criterion        3.546730
F-statistic             27.82752
Prob(F-statistic)        0.000000
Scatterplot of y versus x
Six Considerations Basic to Successful Forecasting 1. The Decision Environment and Loss Function
2. The Forecast Object Event outcome, event timing, time series. 3. The Forecast Statement Point forecast, interval forecast, density forecast, probability forecast
4. The Forecast Horizon h-step ahead forecast h-step-ahead extrapolation forecast 5. The Information Set
6. Methods and Complexity, the Parsimony Principle, and the Shrinkage Principle Signal vs. noise Smaller is often better Even incorrect restrictions can help
Decision Making with Symmetric Loss

                     Demand High   Demand Low
Build Inventory          0          $10,000
Reduce Inventory      $10,000          0

Decision Making with Asymmetric Loss

                     Demand High   Demand Low
Build Inventory          0          $10,000
Reduce Inventory      $20,000          0

Forecasting with Symmetric Loss

                        High Actual Sales   Low Actual Sales
High Forecasted Sales          0               $10,000
Low Forecasted Sales        $10,000                0

Forecasting with Asymmetric Loss

                        High Actual Sales   Low Actual Sales
High Forecasted Sales          0               $10,000
Low Forecasted Sales        $20,000                0
Quadratic Loss: L(e) = e²
Absolute Loss: L(e) = |e|
Asymmetric Loss: e.g., L(e) = a|e| for e > 0 and b|e| for e < 0, with a ≠ b
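The three loss functions can be sketched numerically. This is a minimal illustration: the asymmetric case uses the piecewise-linear "linlin" form, and the slope values a and b are illustrative, not from the text.

```python
import numpy as np

def quadratic_loss(e):
    # L(e) = e^2: symmetric, penalizes large errors heavily
    return e ** 2

def absolute_loss(e):
    # L(e) = |e|: symmetric, linear in the error
    return np.abs(e)

def linlin_loss(e, a=2.0, b=1.0):
    # Asymmetric "linlin" loss: slope a for positive errors,
    # slope b for negative errors (a and b are illustrative).
    return np.where(e > 0, a * e, -b * e)

errors = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
print(quadratic_loss(errors))
print(linlin_loss(errors))   # positive errors cost twice as much here
```

With a ≠ b, the optimal point forecast is no longer the conditional mean; it shifts in the direction whose errors are cheaper.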
1. Why Graphical Analysis is Important
Graphics helps us summarize and reveal patterns in data
Graphics helps us identify anomalies in data
Graphics facilitates and encourages comparison of different pieces of data
Graphics enables us to present a huge amount of data in a small space, and it enables us to make huge datasets coherent
2. Simple Graphical Techniques Univariate, multivariate Time series vs. distributional shape Relational graphics
3. Elements of Graphical Style Know your audience, and know your goals. Show the data, and appeal to the viewer. Revise and edit, again and again.
Anscombe's Quartet

      (1)            (2)            (3)            (4)
  x1     y1      x2     y2      x3     y3      x4     y4
 10.0   8.04    10.0   9.14    10.0   7.46     8.0   6.58
  8.0   6.95     8.0   8.14     8.0   6.77     8.0   5.76
 13.0   7.58    13.0   8.74    13.0  12.74     8.0   7.71
  9.0   8.81     9.0   8.77     9.0   7.11     8.0   8.84
 11.0   8.33    11.0   9.26    11.0   7.81     8.0   8.47
 14.0   9.96    14.0   8.10    14.0   8.84     8.0   7.04
  6.0   7.24     6.0   6.13     6.0   6.08     8.0   5.25
  4.0   4.26     4.0   3.10     4.0   5.39    19.0  12.50
 12.0  10.84    12.0   9.13    12.0   8.15     8.0   5.56
  7.0   4.82     7.0   7.26     7.0   6.42     8.0   7.91
  5.0   5.68     5.0   4.74     5.0   5.73     8.0   6.89
Anscombe's Quartet
Regressions of yi on xi, i = 1, ..., 4.

LS // Dependent Variable is Y1
Variable   Coefficient   Std. Error   T-Statistic
C             3.00          1.12         2.67
X1            0.50          0.12         4.24
R-squared 0.67   S.E. of regression 1.24

LS // Dependent Variable is Y2
Variable   Coefficient   Std. Error   T-Statistic
C             3.00          1.12         2.67
X2            0.50          0.12         4.24
R-squared 0.67   S.E. of regression 1.24

LS // Dependent Variable is Y3
Variable   Coefficient   Std. Error   T-Statistic
C             3.00          1.12         2.67
X3            0.50          0.12         4.24
R-squared 0.67   S.E. of regression 1.24

LS // Dependent Variable is Y4
Variable   Coefficient   Std. Error   T-Statistic
C             3.00          1.12         2.67
X4            0.50          0.12         4.24
R-squared 0.67   S.E. of regression 1.24
Liquor Sales
Scatterplot Matrix 1-, 10-, 20-, and 30-Year Treasury Bond Rates
3. Forecasting Trend
Consistency Efficiency
Retail Sales
Retail Sales: Linear Trend Regression
Dependent Variable is RTRR
Sample: 1955:01 1993:12
Included observations: 468

Variable   Coefficient   Std. Error   T-Statistic   Prob.
C          -16391.25     1469.177     -11.15676     0.0000
TIME         349.7731       5.428670    64.43073    0.0000

R-squared            0.899076    Mean dependent var      65630.56
Adjusted R-squared   0.898859    S.D. dependent var      49889.26
S.E. of regression   15866.12    Akaike info criterion   19.34815
Sum squared resid    1.17E+11    Schwarz criterion       19.36587
Log likelihood      -5189.529    F-statistic             4151.319
Durbin-Watson stat   0.004682    Prob(F-statistic)       0.000000
Retail Sales: Quadratic Trend Regression
Dependent Variable is RTRR
Sample: 1955:01 1993:12
Included observations: 468

Variable   Coefficient   Std. Error   T-Statistic   Prob.
C          18708.70      379.9566      49.23905     0.0000
TIME         -98.31130     3.741388   -26.27669     0.0000
TIME2          0.955404    0.007725   123.6754      0.0000

Mean dependent var      65630.56
S.D. dependent var      49889.26
Akaike info criterion   15.82919
Schwarz criterion       15.85578
F-statistic             77848.80
Prob(F-statistic)        0.000000
Retail Sales: Log Linear Trend Regression
Dependent Variable is LRTRR
Sample: 1955:01 1993:12
Included observations: 468

Variable   Coefficient   Std. Error   T-Statistic   Prob.
C          9.389975      0.008508     1103.684      0.0000
TIME       0.005931      3.14E-05      188.6541     0.0000

Mean dependent var      10.78072
S.D. dependent var       0.807325
Akaike info criterion   -4.770302
Schwarz criterion       -4.752573
F-statistic             35590.36
Prob(F-statistic)        0.000000
Retail Sales: Exponential Trend Regression
Dependent Variable is RTRR
Sample: 1955:01 1993:12
Included observations: 468
Convergence achieved after 1 iteration
RTRR = C(1)*EXP(C(2)*TIME)

Variable   Coefficient   Std. Error   T-Statistic   Prob.
C(1)       11967.80      177.9598      67.25003     0.0000
C(2)           0.005944    3.77E-05   157.7469      0.0000

R-squared            0.988796    Mean dependent var      65630.56
Adjusted R-squared   0.988772    S.D. dependent var      49889.26
S.E. of regression   5286.406    Akaike info criterion   17.15005
Sum squared resid    1.30E+10    Schwarz criterion       17.16778
Log likelihood      -4675.175    F-statistic             41126.02
Durbin-Watson stat   0.040527    Prob(F-statistic)        0.000000
Model Selection Criteria: Linear, Quadratic and Exponential Trend Models

Model                AIC     SIC
Linear Trend        19.35   19.37
Quadratic Trend     15.83   15.86
Exponential Trend   17.15   17.17
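The trend comparison above can be sketched in code. This is a minimal illustration on simulated data (so the numbers are not those of the retail sales application), using log forms of the criteria, which are monotone transforms of the textbook AIC and SIC; smaller is better, and `trend_criteria` is a hypothetical helper name.

```python
import numpy as np

def trend_criteria(y, degree):
    # Fit a polynomial trend by OLS and return (AIC, SIC) in log form:
    #   AIC = ln(SSR/T) + 2k/T,  SIC = ln(SSR/T) + k ln(T)/T
    T = len(y)
    t = np.arange(1, T + 1, dtype=float)
    X = np.vander(t, degree + 1)                 # columns t^d, ..., t, 1
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    k = degree + 1
    mse = resid @ resid / T
    return np.log(mse) + 2 * k / T, np.log(mse) + k * np.log(T) / T

# Simulated series with a quadratic trend: SIC should prefer degree 2,
# since its harsher penalty punishes the superfluous cubic term.
rng = np.random.default_rng(0)
t = np.arange(1, 201, dtype=float)
y = 5 + 0.3 * t + 0.02 * t**2 + rng.normal(0.0, 5.0, 200)
crits = {d: trend_criteria(y, d) for d in (1, 2, 3)}
best = min(crits, key=lambda d: crits[d][1])     # select by SIC
print(best)
```

Note how the selection mirrors the table: the linear model is heavily penalized for lack of fit, while SIC's stronger degrees-of-freedom penalty guards against overfitting.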
Retail Sales History, 1990.01 - 1993.12 Quadratic Trend Forecast and Realization, 1994.01-1994.12
Retail Sales History, 1990.01 - 1993.12 Linear Trend Forecast and Realization, 1994.01-1994.12
Modeling and Forecasting Seasonality
1. The Nature and Sources of Seasonality
2. Modeling Seasonality
D1 = (1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, ...)
D2 = (0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, ...)
D3 = (0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, ...)
D4 = (0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, ...)
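The dummy variables D1, ..., D4 above can be built programmatically; a minimal sketch for quarterly data (`seasonal_dummies` is an illustrative helper name):

```python
import numpy as np

def seasonal_dummies(T, s):
    # D[t, i] = 1 if observation t falls in season i (s = 4 for quarterly
    # data, s = 12 for monthly); each row contains exactly one 1.
    D = np.zeros((T, s))
    D[np.arange(T), np.arange(T) % s] = 1.0
    return D

D = seasonal_dummies(12, s=4)
print(D[:, 0])   # D1 pattern: 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0
```

Regressing the series on all s dummies (with no intercept, to avoid the dummy-variable trap) estimates one seasonal mean per season.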
Gasoline Sales
Liquor Sales
LS // Dependent Variable is STARTS
Sample: 1946:01 1993:12
Included observations: 576

Variable   Coefficient   Std. Error   t-Statistic   Prob.
D1          86.50417     4.029055     21.47009      0.0000
D2          89.50417     4.029055     22.21468      0.0000
D3         122.8833      4.029055     30.49929      0.0000
D4         142.1687      4.029055     35.28588      0.0000
D5         147.5000      4.029055     36.60908      0.0000
D6         145.9979      4.029055     36.23627      0.0000
D7         139.1125      4.029055     34.52733      0.0000
D8         138.4167      4.029055     34.35462      0.0000
D9         130.5625      4.029055     32.40524      0.0000
D10        134.0917      4.029055     33.28117      0.0000
D11        111.8333      4.029055     27.75671      0.0000
D12         92.15833     4.029055     22.87344      0.0000

Mean dependent var      123.3944
S.D. dependent var       35.21775
Akaike info criterion     6.678878
Schwarz criterion         6.769630
F-statistic              31.93250
Prob(F-statistic)         0.000000
Residual Plot
Characterizing Cycles
1. Covariance Stationary Time Series
Realization
Sample path
Covariance stationarity
2. White Noise
4. Wold's Theorem, the General Linear Process, and Rational Distributed Lags
Wold's Theorem
Let {y_t} be any zero-mean covariance-stationary process. Then:
y_t = B(L) ε_t = Σ_{i=0}^∞ b_i ε_{t−i},
where b_0 = 1 and Σ_{i=0}^∞ b_i² < ∞,
and ε_t ~ WN(0, σ²).
5. Estimation and Inference for the Mean, Autocorrelation and Partial Autocorrelation Functions
Canadian Employment Index: Correlogram
Sample: 1962:1 1993:4
Included observations: 128

Lag   Acorr.   P. Acorr.   Std. Error   Ljung-Box   p-value
 1    0.949     0.949        .088        118.07      0.000
 2    0.877    -0.244        .088        219.66      0.000
 3    0.795    -0.101        .088        303.72      0.000
 4    0.707    -0.070        .088        370.82      0.000
 5    0.617    -0.063        .088        422.27      0.000
 6    0.526    -0.048        .088        460.00      0.000
 7    0.438    -0.033        .088        486.32      0.000
 8    0.351    -0.049        .088        503.41      0.000
 9    0.258    -0.149        .088        512.70      0.000
10    0.163    -0.070        .088        516.43      0.000
11    0.073    -0.011        .088        517.20      0.000
12   -0.005     0.016        .088        517.21      0.000
Canadian Employment Index Sample Autocorrelation and Partial Autocorrelation Functions, With Plus or Minus Two Standard Error Bands
Modeling Cycles: MA, AR, and ARMA Models

The MA(1) process: y_t = ε_t + θ ε_{t−1}, ε_t ~ WN(0, σ²)
If invertible (|θ| < 1), the autoregressive representation is:
y_t = −Σ_{i=1}^∞ (−θ)^i y_{t−i} + ε_t
Moment structure: E(y_t) = 0, var(y_t) = σ²(1 + θ²). Thus ρ(1) = θ/(1 + θ²), and ρ(τ) = 0 for τ > 1.
The AR(1) process: y_t = φ y_{t−1} + ε_t, ε_t ~ WN(0, σ²)
If covariance stationary (|φ| < 1), the moving-average representation is:
y_t = Σ_{i=0}^∞ φ^i ε_{t−i}
Moment structure: E(y_t) = 0, var(y_t) = σ²/(1 − φ²). For τ ≥ 1, ρ(τ) = φ^τ, so the autocorrelations decay geometrically.
Partial autocorrelations: p(1) = φ, p(τ) = 0 for τ > 1.
The ARMA(1,1) process: y_t = φ y_{t−1} + ε_t + θ ε_{t−1}, with an AR representation if invertible and an MA representation if covariance stationary.
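The contrasting autocorrelation patterns of MA and AR processes can be checked by simulation; a minimal sketch with illustrative parameter values (θ = 0.5, φ = 0.7):

```python
import numpy as np

# Simulate an MA(1) and an AR(1) and check the theoretical autocorrelations:
# MA(1): rho(1) = theta/(1 + theta^2), rho(tau) = 0 for tau > 1
# AR(1): rho(tau) = phi^tau (geometric decay)
rng = np.random.default_rng(1)
T, theta, phi = 100_000, 0.5, 0.7
eps = rng.normal(size=T + 1)
ma1 = eps[1:] + theta * eps[:-1]
ar1 = np.zeros(T)
for t in range(1, T):
    ar1[t] = phi * ar1[t - 1] + eps[t + 1]

def acf(y, tau):
    d = y - y.mean()
    return d[tau:] @ d[:-tau] / (d @ d)

print(acf(ma1, 1))   # near theta/(1 + theta^2) = 0.4
print(acf(ma1, 2))   # near 0: the MA(1) autocorrelation cuts off
print(acf(ar1, 2))   # near phi^2 = 0.49: the AR(1) decays geometrically
```

The cutoff-versus-decay contrast is exactly what the sample correlograms exploit when identifying a model.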
LS // Dependent Variable is CANEMP
Sample: 1962:1 1993:4
Included observations: 128
Convergence achieved after 49 iterations

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C          100.5438      0.843322     119.2234      0.0000
MA(1)        1.587641    0.063908      24.84246     0.0000
MA(2)        0.994369    0.089995      11.04917     0.0000
MA(3)       -0.020305    0.046550      -0.436189    0.6635
MA(4)       -0.298387    0.020489     -14.56311     0.0000

Mean dependent var      101.0176
S.D. dependent var        7.499163
Akaike info criterion     2.203073
Schwarz criterion         2.314481
F-statistic             174.1826
Prob(F-statistic)         0.000000

Inverted MA Roots: -.87, -.56+.72i, -.56-.72i
Sample: 1962:1 1993:4
Included observations: 128
Q-statistic probabilities adjusted for 4 ARMA term(s)

Lag   Acorr.   P. Acorr.   Std. Error   Ljung-Box
 1    0.345     0.345        .088        15.614
 2    0.660     0.614        .088        73.089
 3    0.534     0.426        .088       111.01
 4    0.427    -0.042        .088       135.49
 5    0.347    -0.398        .088       151.79
 6    0.484     0.145        .088       183.70
 7    0.121    -0.118        .088       185.71
 8    0.348    -0.048        .088       202.46
 9    0.148    -0.019        .088       205.50
10    0.102    -0.066        .088       206.96
11    0.081    -0.098        .088       207.89
12    0.029    -0.113        .088       208.01
Employment MA(4) Model Residual Sample Autocorrelation and Partial Autocorrelation Functions, With Plus or Minus Two Standard Error Bands
LS // Dependent Variable is CANEMP
Sample: 1962:1 1993:4
Included observations: 128
Convergence achieved after 3 iterations

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C          101.2413      3.399620     29.78017      0.0000
AR(1)        1.438810    0.078487     18.33188      0.0000
AR(2)       -0.476451    0.077902     -6.116042     0.0000

Mean dependent var      101.0176
S.D. dependent var        7.499163
Akaike info criterion     0.761677
Schwarz criterion         0.828522
F-statistic            1643.837
Prob(F-statistic)         0.000000

Inverted AR Roots: .52
Sample: 1962:1 1993:4
Included observations: 128
Q-statistic probabilities adjusted for 2 ARMA term(s)

Lag   Acorr.   P. Acorr.   Std. Error   Ljung-Box   p-value
 1   -0.035    -0.035        .088        0.1606
 2    0.044     0.042        .088        0.4115
 3    0.011     0.014        .088        0.4291      0.512
 4    0.051     0.050        .088        0.7786      0.678
 5    0.002     0.004        .088        0.7790      0.854
 6    0.019     0.015        .088        0.8272      0.935
 7   -0.024    -0.024        .088        0.9036      0.970
 8    0.078     0.072        .088        1.7382      0.942
 9    0.080     0.087        .088        2.6236      0.918
10    0.050     0.050        .088        2.9727      0.936
11   -0.023    -0.027        .088        3.0504      0.962
12   -0.129    -0.148        .088        5.4385      0.860
Employment AR(2) Model Residual Sample Autocorrelation and Partial Autocorrelation Functions, With Plus or Minus Two Standard Error Bands
Employment AIC Values, Various ARMA Models

             AR Order
MA Order      0      1      2      3      4
   0                1.01   .762   .77    .79
   1        2.86    .83    .77    .761   .79
   2        2.32    .79    .78    .77    .77
   3        2.47    .80    .80    .78    .79
   4        2.20    .81    .80    .79    .80
Employment SIC Values, Various ARMA Models

             AR Order
MA Order      0      1      2      3      4
   0                1.05   .83    .86    .90
   1        2.91    .90    .86    .87    .92
   2        2.38    .88    .89    .90    .93
   3        2.56    .91    .92    .94    .97
   4        2.31    .94    .96    .96   1.00
LS // Dependent Variable is CANEMP
Sample: 1962:1 1993:4
Included observations: 128
Convergence achieved after 17 iterations

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C          101.1378      3.538602     28.58130      0.0000
AR(1)        0.500493    0.087503      5.719732     0.0000
AR(2)        0.872194    0.067096     12.99917      0.0000
AR(3)       -0.443355    0.080970     -5.475560     0.0000
MA(1)        0.970952    0.035015     27.72924      0.0000

Mean dependent var      101.0176
S.D. dependent var        7.499163
Akaike info criterion     0.760668
Schwarz criterion         0.872076
F-statistic             836.2912
Prob(F-statistic)         0.000000

Inverted AR Roots: .51, -.94
Sample: 1962:1 1993:4
Included observations: 128
Q-statistic probabilities adjusted for 4 ARMA term(s)

Lag   Acorr.   P. Acorr.   Std. Error   Ljung-Box
 1   -0.032    -0.032        .09         0.1376
 2    0.041     0.040        .09         0.3643
 3    0.014     0.017        .09         0.3904
 4    0.048     0.047        .09         0.6970
 5    0.006     0.007        .09         0.7013
 6    0.013     0.009        .09         0.7246
 7   -0.017    -0.019        .09         0.7650
 8    0.064     0.060        .09         1.3384
 9    0.092     0.097        .09         2.5182
10    0.039     0.040        .09         2.7276
11   -0.016    -0.022        .09         2.7659
12   -0.137    -0.153        .09         5.4415
Employment ARMA(3,1) Model Residual Sample Autocorrelation and Partial Autocorrelation Functions, With Plus or Minus Two Standard Error Bands
Forecasting Cycles
For the general linear process y_{T+h} = Σ_{i=0}^∞ b_i ε_{T+h−i}, the optimal h-step-ahead forecast is y_{T+h,T} = Σ_{i=h}^∞ b_i ε_{T+h−i},
where the corresponding forecast error is e_{T+h,T} = Σ_{i=0}^{h−1} b_i ε_{T+h−i},
and its variance is σ_h² = σ² Σ_{i=0}^{h−1} b_i².
Putting it all Together: A Forecasting Model with Trend, Seasonal and Cyclical Components
The full model: y_t = T_t(θ) + Σ_{i=1}^{s} γ_i D_{it} + ε_t, where T_t(θ) is the trend, the D_{it} are seasonal dummies, and the cyclical disturbance ε_t follows an ARMA(p, q) process, Φ(L) ε_t = Θ(L) v_t, v_t ~ WN(0, σ²).
Point Forecasting
Interval Forecasting
Density Forecasting
Assessing the Stability of Forecasting Models: Recursive Parameter Estimation and Recursive Residuals
At each t, t = k, ..., T−1, compute:
Recursive parameter estimate and forecast: β̂_t (OLS on observations 1, ..., t) and ŷ_{t+1,t} = x′_{t+1} β̂_t
Recursive residual: ê_{t+1,t} = y_{t+1} − x′_{t+1} β̂_t
If all is well: ê_{t+1,t} ~ N(0, σ² r_t), where r_t = 1 + x′_{t+1}(X′_t X_t)^{−1} x_{t+1}
Sequence of 1-step forecast tests:
Standardized recursive residuals: w_{t+1,t} = ê_{t+1,t} / (σ √r_t)
If all is well: w_{t+1,t} ~ iid N(0, 1)
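The recursive-residual calculation can be sketched on simulated data. This is a minimal illustration (`recursive_residuals` is a hypothetical helper; the simulation takes the disturbance standard deviation σ = 1, so no scaling by an estimated σ is shown):

```python
import numpy as np

def recursive_residuals(y, X, k):
    # For t = k, ..., T-1: estimate beta by OLS on the first t observations,
    # forecast y_{t+1}, and scale the forecast error by sqrt(r_t), where
    # r_t = 1 + x'_{t+1} (X'_t X_t)^{-1} x_{t+1}.  Under a stable model with
    # disturbance std dev sigma, the results are iid N(0, sigma^2).
    T = len(y)
    w = []
    for t in range(k, T):
        Xt, yt = X[:t], y[:t]
        XtX_inv = np.linalg.inv(Xt.T @ Xt)
        beta = XtX_inv @ Xt.T @ yt
        x_next = X[t]
        r = 1.0 + x_next @ XtX_inv @ x_next
        w.append((y[t] - x_next @ beta) / np.sqrt(r))
    return np.array(w)

# Stable linear trend model with sigma = 1: recursive residuals should
# behave like iid standard normal draws (no breaks, no drift).
rng = np.random.default_rng(2)
T = 300
X = np.column_stack([np.ones(T), np.arange(T, dtype=float)])
y = 1.0 + 0.5 * X[:, 1] + rng.normal(0.0, 1.0, T)
w = recursive_residuals(y, X, k=10)
print(round(float(w.mean()), 2), round(float(w.std()), 2))
```

A structural break would show up as a run of recursive residuals outside the ±2 standard error bands, which is what the CUSUM plots below look for cumulatively.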
LS // Dependent Variable is LSALES
Sample: 1968:01 1993:12
Included observations: 312

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C          6.237356      0.024496     254.6267      0.0000
TIME       0.007690      0.000336      22.91552     0.0000
TIME2     -1.14E-05      9.74E-07     -11.72695     0.0000

Mean dependent var       7.112383
S.D. dependent var       0.379308
Akaike info criterion   -4.152073
Schwarz criterion       -4.116083
F-statistic             1281.296
Prob(F-statistic)        0.000000
Log Liquor Sales: Quadratic Trend Regression Residual Correlogram

Lag   Acorr.   P. Acorr.   Std. Error   Ljung-Box   p-value
 1    0.117     0.117        .056         4.3158     0.038
 2   -0.149    -0.165        .056        11.365      0.003
 3   -0.106    -0.069        .056        14.943      0.002
 4   -0.014    -0.017        .056        15.007      0.005
 5    0.142     0.125        .056        21.449      0.001
 6    0.041    -0.004        .056        21.979      0.001
 7    0.134     0.175        .056        27.708      0.000
 8   -0.029    -0.046        .056        27.975      0.000
 9   -0.136    -0.080        .056        33.944      0.000
10   -0.205    -0.206        .056        47.611      0.000
11    0.056     0.080        .056        48.632      0.000
12    0.888     0.879        .056       306.26       0.000
13    0.055    -0.507        .056       307.25       0.000
14   -0.187    -0.159        .056       318.79       0.000
15   -0.159    -0.144        .056       327.17       0.000
16   -0.059    -0.002        .056       328.32       0.000
17    0.091    -0.118        .056       331.05       0.000
18   -0.010    -0.055        .056       331.08       0.000
19    0.086    -0.032        .056       333.57       0.000
20   -0.066     0.028        .056       335.03       0.000
21   -0.170     0.044        .056       344.71       0.000
22   -0.231     0.180        .056       362.74       0.000
23    0.028     0.016        .056       363.00       0.000
24    0.811    -0.014        .056       586.50       0.000
25    0.013    -0.128        .056       586.56       0.000
26   -0.221    -0.136        .056       603.26       0.000
27   -0.196    -0.017        .056       616.51       0.000
28   -0.092    -0.079        .056       619.42       0.000
29    0.045    -0.094        .056       620.13       0.000
30   -0.043     0.045        .056       620.77       0.000
31    0.057     0.041        .056       621.89       0.000
32   -0.095    -0.002        .056       625.07       0.000
33   -0.195     0.026        .056       638.38       0.000
34   -0.240     0.088        .056       658.74       0.000
35    0.006    -0.089        .056       658.75       0.000
36    0.765     0.076        .056       866.34       0.000
Log Liquor Sales Quadratic Trend Regression Residual Sample Autocorrelation and Partial Autocorrelation Functions
LS // Dependent Variable is LSALES
Sample: 1968:01 1993:12
Included observations: 312

Variable   Coefficient   Std. Error   t-Statistic   Prob.
TIME       0.007656      0.000123      62.35882     0.0000
TIME2     -1.14E-05      3.56E-07     -32.06823     0.0000
D1         6.147456      0.012340     498.1699      0.0000
D2         6.088653      0.012353     492.8890      0.0000
D3         6.174127      0.012366     499.3008      0.0000
D4         6.175220      0.012378     498.8970      0.0000
D5         6.246086      0.012390     504.1398      0.0000
D6         6.250387      0.012401     504.0194      0.0000
D7         6.295979      0.012412     507.2402      0.0000
D8         6.268043      0.012423     504.5509      0.0000
D9         6.203832      0.012433     498.9630      0.0000
D10        6.229197      0.012444     500.5968      0.0000
D11        6.259770      0.012453     502.6602      0.0000
D12        6.580068      0.012463     527.9819      0.0000

Mean dependent var       7.112383
S.D. dependent var       0.379308
Akaike info criterion   -6.128963
Schwarz criterion       -5.961008
F-statistic             1627.567
Prob(F-statistic)        0.000000
Log Liquor Sales Quadratic Trend Regression with Seasonal Dummies Residual Plot
Log Liquor Sales: Quadratic Trend Regression with Seasonal Dummies, Residual Correlogram

Lag   Acorr.   P. Acorr.   Std. Error   Ljung-Box   p-value
 1    0.700     0.700        .056       154.34      0.000
 2    0.686     0.383        .056       302.86      0.000
 3    0.725     0.369        .056       469.36      0.000
 4    0.569    -0.141        .056       572.36      0.000
 5    0.569     0.017        .056       675.58      0.000
 6    0.577     0.093        .056       782.19      0.000
 7    0.460    -0.078        .056       850.06      0.000
 8    0.480     0.043        .056       924.38      0.000
 9    0.466     0.030        .056       994.46      0.000
10    0.327    -0.188        .056      1029.1       0.000
11    0.364     0.019        .056      1072.1       0.000
12    0.355     0.089        .056      1113.3       0.000
13    0.225    -0.119        .056      1129.9       0.000
14    0.291     0.065        .056      1157.8       0.000
15    0.211    -0.119        .056      1172.4       0.000
16    0.138    -0.031        .056      1178.7       0.000
17    0.195     0.053        .056      1191.4       0.000
18    0.114    -0.027        .056      1195.7       0.000
19    0.055    -0.063        .056      1196.7       0.000
20    0.134     0.089        .056      1202.7       0.000
21    0.062     0.018        .056      1204.0       0.000
22   -0.006    -0.115        .056      1204.0       0.000
23    0.084     0.086        .056      1206.4       0.000
24   -0.039    -0.124        .056      1206.9       0.000
25   -0.063    -0.055        .056      1208.3       0.000
26   -0.016    -0.022        .056      1208.4       0.000
27   -0.143    -0.075        .056      1215.4       0.000
28   -0.135    -0.047        .056      1221.7       0.000
29   -0.124    -0.048        .056      1227.0       0.000
30   -0.189     0.086        .056      1239.5       0.000
31   -0.178    -0.017        .056      1250.5       0.000
32   -0.139     0.073        .056      1257.3       0.000
33   -0.226    -0.049        .056      1275.2       0.000
34   -0.155     0.097        .056      1283.7       0.000
35   -0.142     0.008        .056      1290.8       0.000
36   -0.242    -0.074        .056      1311.6       0.000
Log Liquor Sales Quadratic Trend Regression with Seasonal Dummies Residual Sample Autocorrelation and Partial Autocorrelation Functions
Log Liquor Sales Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances
LS // Dependent Variable is LSALES
Sample: 1968:01 1993:12
Included observations: 312
Convergence achieved after 4 iterations

Variable   Coefficient   Std. Error   t-Statistic   Prob.
TIME       0.008606      0.000981      8.768212     0.0000
TIME2     -1.41E-05      2.53E-06     -5.556103     0.0000
D1         6.073054      0.083922     72.36584      0.0000
D2         6.013822      0.083942     71.64254      0.0000
D3         6.099208      0.083947     72.65524      0.0000
D4         6.101522      0.083934     72.69393      0.0000
D5         6.172528      0.083946     73.52962      0.0000
D6         6.177129      0.083947     73.58364      0.0000
D7         6.223323      0.083939     74.14071      0.0000
D8         6.195681      0.083943     73.80857      0.0000
D9         6.131818      0.083940     73.04993      0.0000
D10        6.157592      0.083934     73.36197      0.0000
D11        6.188480      0.083932     73.73176      0.0000
D12        6.509106      0.083928     77.55624      0.0000
AR(1)      0.268805      0.052909      5.080488     0.0000
AR(2)      0.239688      0.053697      4.463723     0.0000
AR(3)      0.395880      0.053109      7.454150     0.0000

Mean dependent var       7.112383
S.D. dependent var       0.379308
Akaike info criterion   -7.145319
Schwarz criterion       -6.941373
F-statistic             3720.875
Prob(F-statistic)        0.000000

Inverted AR Roots: -.34+.55i, -.34-.55i
Log Liquor Sales Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances Residual Plot
Log Liquor Sales: Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances, Residual Correlogram

Lag   Acorr.   P. Acorr.   Std. Error   Ljung-Box   p-value
 1    0.056     0.056        .056        0.9779     0.323
 2    0.037     0.034        .056        1.4194     0.492
 3    0.024     0.020        .056        1.6032     0.659
 4   -0.084    -0.088        .056        3.8256     0.430
 5   -0.007     0.001        .056        3.8415     0.572
 6    0.065     0.072        .056        5.1985     0.519
 7   -0.041    -0.044        .056        5.7288     0.572
 8    0.069     0.063        .056        7.2828     0.506
 9    0.080     0.074        .056        9.3527     0.405
10   -0.163    -0.169        .056       18.019      0.055
11   -0.009    -0.005        .056       18.045      0.081
12    0.145     0.175        .056       24.938      0.015
13   -0.074    -0.078        .056       26.750      0.013
14    0.149     0.113        .056       34.034      0.002
15   -0.039    -0.060        .056       34.532      0.003
16   -0.089    -0.058        .056       37.126      0.002
17    0.058     0.048        .056       38.262      0.002
18   -0.062    -0.050        .056       39.556      0.002
19   -0.110    -0.074        .056       43.604      0.001
20    0.100     0.056        .056       46.935      0.001
21    0.039     0.042        .056       47.440      0.001
22   -0.122    -0.114        .056       52.501      0.000
23    0.146     0.130        .056       59.729      0.000
24   -0.072    -0.040        .056       61.487      0.000
25    0.006     0.017        .056       61.500      0.000
26    0.148     0.082        .056       69.024      0.000
27   -0.109    -0.067        .056       73.145      0.000
28   -0.029    -0.045        .056       73.436      0.000
29   -0.046    -0.100        .056       74.153      0.000
30   -0.084     0.020        .056       76.620      0.000
31   -0.095    -0.101        .056       79.793      0.000
32    0.051     0.012        .056       80.710      0.000
33   -0.114    -0.061        .056       85.266      0.000
34    0.024     0.002        .056       85.468      0.000
35    0.043    -0.010        .056       86.116      0.000
36   -0.229    -0.140        .056      104.75       0.000
Log Liquor Sales Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances Residual Sample Autocorrelation and Partial Autocorrelation Functions
Log Liquor Sales Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances Residual Histogram and Normality Test
Log Liquor Sales Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances Recursive Residuals and Two Standard Error Bands
Log Liquor Sales Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances Recursive Parameter Estimates
Log Liquor Sales Quadratic Trend Regression with Seasonal Dummies and AR(3) Disturbances CUSUM Analysis
Forecasting with Regression Models Conditional Forecasting Models and Scenario Analysis
Density forecast:
- Scenario analysis, contingency analysis
- No "forecasting the RHS variables" problem
- Forecasting the RHS variables problem
- Could fit a model to x (e.g., an autoregressive model)
- Preferably, regress y on lagged x
- No problem in trend and seasonal models
Generalize to the distributed-lag regression: y_t = β_0 + Σ_{i=0}^{N} δ_i x_{t−i} + ε_t,
subject to restrictions on the lag weights δ_i
- Lag weights constrained to lie on a low-order polynomial
- Additional constraints can be imposed, such as:
- Smooth lag distribution
- Parsimonious
Equivalently,
- Lags of x and y included
- Important to allow for lags of y, one way or another
Another way: the transfer function model and various special cases
Name Model Restrictions
Transfer Function Standard Distributed Lag Rational Distributed Lag Univariate AR Univariate MA Univariate ARMA
, or C(L)=1, D(L)=B(L)
B(L)=1
B(L)=C(L)=1
- Estimation by OLS
- Order selection by information criteria
- Impulse-response functions, variance decompositions, predictive causality
- Forecasts via Wold's chain rule
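Wold's chain rule mentioned above can be sketched for an autoregression: build the 1-step-ahead forecast, then feed forecasts back in as stand-ins for unrealized future values. A minimal illustration (`chain_rule_forecast` is a hypothetical helper, not a library function):

```python
def chain_rule_forecast(history, phi, c, h):
    # Chain rule of forecasting for an AR(p): construct the 1-step forecast,
    # then build longer-horizon forecasts recursively, substituting earlier
    # forecasts for the as-yet-unrealized future values.
    # phi = (phi_1, ..., phi_p), c = intercept, history ends at time T.
    path = list(history)
    for _ in range(h):
        path.append(c + sum(p * path[-i - 1] for i, p in enumerate(phi)))
    return path[len(history):]

# AR(1) with intercept 1 and coefficient 0.5 (unconditional mean 2):
# starting from y_T = 4, forecasts decay geometrically toward the mean.
print(chain_rule_forecast([4.0], phi=[0.5], c=1.0, h=3))   # [3.0, 2.5, 2.25]
```

In a VAR, the same recursion is applied equation by equation, with each variable's forecasts feeding the other equations.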
Point and Interval Forecasts
Top Panel: Interval Forecasts Don't Acknowledge Parameter Uncertainty
Bottom Panel: Interval Forecasts Do Acknowledge Parameter Uncertainty
Notes to figure: The left scale is starts, and the right scale is completions.
Starts Correlogram

Lag   Acorr.   P. Acorr.   Std. Error   Ljung-Box   p-value
 1    0.937     0.937       0.059        255.24      0.000
 2    0.907     0.244       0.059        495.53      0.000
 3    0.877     0.054       0.059        720.95      0.000
 4    0.838    -0.077       0.059        927.39      0.000
 5    0.795    -0.096       0.059       1113.7       0.000
 6    0.751    -0.058       0.059       1280.9       0.000
 7    0.704    -0.067       0.059       1428.2       0.000
 8    0.650    -0.098       0.059       1554.4       0.000
 9    0.604     0.004       0.059       1663.8       0.000
10    0.544    -0.129       0.059       1752.6       0.000
11    0.496     0.029       0.059       1826.7       0.000
12    0.446    -0.008       0.059       1886.8       0.000
13    0.405     0.076       0.059       1936.8       0.000
14    0.346    -0.144       0.059       1973.3       0.000
15    0.292    -0.079       0.059       1999.4       0.000
16    0.233    -0.111       0.059       2016.1       0.000
17    0.175    -0.050       0.059       2025.6       0.000
18    0.122    -0.018       0.059       2030.2       0.000
19    0.070     0.002       0.059       2031.7       0.000
20    0.019    -0.025       0.059       2031.8       0.000
21   -0.034    -0.032       0.059       2032.2       0.000
22   -0.074     0.036       0.059       2033.9       0.000
23   -0.123    -0.028       0.059       2038.7       0.000
24   -0.167    -0.048       0.059       2047.4       0.000
Completions Correlogram
Lag   Acorr.   P. Acorr.   Std. Error   Ljung-Box   p-value
 1    0.939     0.939       0.059        256.61      0.000
 2    0.920     0.328       0.059        504.05      0.000
 3    0.896     0.066       0.059        739.19      0.000
 4    0.874     0.023       0.059        963.73      0.000
 5    0.834    -0.165       0.059       1168.9       0.000
 6    0.802    -0.067       0.059       1359.2       0.000
 7    0.761    -0.100       0.059       1531.2       0.000
 8    0.721    -0.070       0.059       1686.1       0.000
 9    0.677    -0.055       0.059       1823.2       0.000
10    0.633    -0.047       0.059       1943.7       0.000
11    0.583    -0.080       0.059       2046.3       0.000
12    0.533    -0.073       0.059       2132.2       0.000
13    0.483    -0.038       0.059       2203.2       0.000
14    0.434    -0.020       0.059       2260.6       0.000
15    0.390     0.041       0.059       2307.0       0.000
16    0.337    -0.057       0.059       2341.9       0.000
17    0.290    -0.008       0.059       2367.9       0.000
18    0.234    -0.109       0.059       2384.8       0.000
19    0.181    -0.082       0.059       2395.0       0.000
20    0.128    -0.047       0.059       2400.1       0.000
21    0.068    -0.133       0.059       2401.6       0.000
22    0.020     0.037       0.059       2401.7       0.000
23   -0.038    -0.092       0.059       2402.2       0.000
24   -0.087    -0.003       0.059       2404.6       0.000
Notes to figure: We graph the sample correlation between completions at time t and starts at time t-i, i = 1, 2, ..., 24.
VAR Starts Equation
LS // Dependent Variable is STARTS
Sample(adjusted): 1968:05 1991:12
Included observations: 284 after adjusting endpoints

Variable     Coefficient   Std. Error   t-Statistic   Prob.
C             0.146871     0.044235      3.320264     0.0010
STARTS(-1)    0.659939     0.061242     10.77587      0.0000
STARTS(-2)    0.229632     0.072724      3.157587     0.0018
STARTS(-3)    0.142859     0.072655      1.966281     0.0503
STARTS(-4)    0.007806     0.066032      0.118217     0.9060
COMPS(-1)     0.031611     0.102712      0.307759     0.7585
COMPS(-2)    -0.120781     0.103847     -1.163069     0.2458
COMPS(-3)    -0.020601     0.100946     -0.204078     0.8384
COMPS(-4)    -0.027404     0.094569     -0.289779     0.7722

Mean dependent var       1.574771
S.D. dependent var       0.382362
Akaike info criterion   -4.122118
Schwarz criterion       -4.006482
F-statistic            294.7796
Prob(F-statistic)        0.000000
VAR Starts Equation Residual Correlogram

Lag   Acorr.   P. Acorr.   Std. Error   Ljung-Box   p-value
 1    0.001     0.001       0.059        0.0004      0.985
 2    0.003     0.003       0.059        0.0029      0.999
 3    0.006     0.006       0.059        0.0119      1.000
 4    0.023     0.023       0.059        0.1650      0.997
 5   -0.013    -0.013       0.059        0.2108      0.999
 6    0.022     0.021       0.059        0.3463      0.999
 7    0.038     0.038       0.059        0.7646      0.998
 8   -0.048    -0.048       0.059        1.4362      0.994
 9    0.056     0.056       0.059        2.3528      0.985
10   -0.114    -0.116       0.059        6.1868      0.799
11   -0.038    -0.038       0.059        6.6096      0.830
12   -0.030    -0.028       0.059        6.8763      0.866
13    0.192     0.193       0.059       17.947       0.160
14    0.014     0.021       0.059       18.010       0.206
15    0.063     0.067       0.059       19.199       0.205
16   -0.006    -0.015       0.059       19.208       0.258
17   -0.039    -0.035       0.059       19.664       0.292
18   -0.029    -0.043       0.059       19.927       0.337
19   -0.010    -0.009       0.059       19.959       0.397
20    0.010    -0.014       0.059       19.993       0.458
21   -0.057    -0.047       0.059       21.003       0.459
22    0.045     0.018       0.059       21.644       0.481
23   -0.038     0.011       0.059       22.088       0.515
24   -0.149    -0.141       0.059       29.064       0.218
VAR Completions Equation
LS // Dependent Variable is COMPS
Sample(adjusted): 1968:05 1991:12
Included observations: 284 after adjusting endpoints

Variable     Coefficient   Std. Error   t-Statistic   Prob.
C             0.045347     0.025794      1.758045     0.0799
STARTS(-1)    0.074724     0.035711      2.092461     0.0373
STARTS(-2)    0.040047     0.042406      0.944377     0.3458
STARTS(-3)    0.047145     0.042366      1.112805     0.2668
STARTS(-4)    0.082331     0.038504      2.138238     0.0334
COMPS(-1)     0.236774     0.059893      3.953313     0.0001
COMPS(-2)     0.206172     0.060554      3.404742     0.0008
COMPS(-3)     0.120998     0.058863      2.055593     0.0408
COMPS(-4)     0.156729     0.055144      2.842160     0.0048

Mean dependent var       1.547958
S.D. dependent var       0.286689
Akaike info criterion   -5.200872
Schwarz criterion       -5.085236
F-statistic            509.8375
Prob(F-statistic)        0.000000
VAR Completions Equation Residual Correlogram

Lag   Acorr.   P. Acorr.   Std. Error   Ljung-Box   p-value
 1   -0.009    -0.009       0.059        0.0238      0.877
 2   -0.035    -0.035       0.059        0.3744      0.829
 3   -0.037    -0.037       0.059        0.7640      0.858
 4   -0.088    -0.090       0.059        3.0059      0.557
 5   -0.105    -0.111       0.059        6.1873      0.288
 6    0.012     0.000       0.059        6.2291      0.398
 7   -0.024    -0.041       0.059        6.4047      0.493
 8    0.041     0.024       0.059        6.9026      0.547
 9    0.048     0.029       0.059        7.5927      0.576
10    0.045     0.037       0.059        8.1918      0.610
11   -0.009    -0.005       0.059        8.2160      0.694
12   -0.050    -0.046       0.059        8.9767      0.705
13   -0.038    -0.024       0.059        9.4057      0.742
14   -0.055    -0.049       0.059       10.318       0.739
15    0.027     0.028       0.059       10.545       0.784
16   -0.005    -0.020       0.059       10.553       0.836
17    0.096     0.082       0.059       13.369       0.711
18    0.011    -0.002       0.059       13.405       0.767
19    0.041     0.040       0.059       13.929       0.788
20    0.046     0.061       0.059       14.569       0.801
21   -0.096    -0.079       0.059       17.402       0.686
22    0.039     0.077       0.059       17.875       0.713
23   -0.113    -0.114       0.059       21.824       0.531
24   -0.136    -0.125       0.059       27.622       0.276
Sample: 1968:01 1991:12   Lags: 4   Obs: 284

Null Hypothesis                   F-Statistic   Probability
STARTS does not Cause COMPS       26.2658       0.00000
COMPS does not Cause STARTS       2.23876       0.06511
with variance
So, four key properties of optimal forecasts:
a. Optimal forecasts are unbiased
b. Optimal forecasts have 1-step-ahead errors that are white noise
c. Optimal forecasts have h-step-ahead errors that are at most MA(h-1)
d. Optimal forecasts have h-step-ahead errors with variances that are non-decreasing in h and that converge to the unconditional variance of the process
- All are easily checked. How?
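These properties can be checked by simulation; here is a minimal sketch in Python (the AR(1) parameter and sample size are illustrative, not from the text), verifying that the 1-step errors are white noise and that the 2-step error is an MA(1) with larger variance:

```python
import numpy as np

# Simulate an AR(1): y_t = phi*y_{t-1} + eps_t (phi chosen for illustration).
rng = np.random.default_rng(0)
phi, T = 0.6, 5000
eps = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t]

# (b) The optimal 1-step forecast is phi*y_{t-1}, so the 1-step error is eps_t,
# which should look like white noise.
err1 = y[1:] - phi * y[:-1]
r1 = np.corrcoef(err1[1:], err1[:-1])[0, 1]  # lag-1 sample autocorrelation

# (c)/(d) The 2-step error is eps_t + phi*eps_{t-1}: an MA(1) with variance
# (1 + phi^2)*sigma^2, larger than the 1-step error variance.
err2 = y[2:] - phi**2 * y[:-2]
```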
Assessing optimality with respect to an information set
Unforecastability principle: The errors from good forecasts are not forecastable!
Regression: regress the forecast error on a constant and on variables in the information set; under optimality, all of the regression coefficients are 0.
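A sketch of this check with simulated data (the variable names and sizes are hypothetical): regress forecast errors on a constant and an information-set variable, and verify the coefficients are near zero.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)   # a variable in the forecaster's information set
e = rng.normal(size=n)   # errors from a forecast that is optimal by construction

# OLS of e on [1, x]; under optimality both coefficients should be ~0.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, e, rcond=None)
```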
Evaluating multiple forecasts: comparing forecast accuracy
Forecast errors, e_{t+h,t} = y_{t+h} - ŷ_{t+h,t}
Forecast percent errors, p_{t+h,t} = 100 · e_{t+h,t} / y_{t+h}
Forecast encompassing
Regression: y_{t+h} = a ŷ^a_{t+h,t} + b ŷ^b_{t+h,t} + ε_{t+h,t}
- If (a, b) = (1, 0), model a forecast-encompasses model b
- If (a, b) = (0, 1), model b forecast-encompasses model a
- Otherwise, neither model encompasses the other
Alternative approach: combining regression, y_{t+h} = β_0 + β_a ŷ^a_{t+h,t} + β_b ŷ^b_{t+h,t} + ε_{t+h,t}
- Equivalent to variance-covariance combination if weights sum to unity and intercept is excluded
- Easy extension to include more than two forecasts
- Time-varying combining weights
- Dynamic combining regressions
- Shrinkage of combining weights toward equality
- Nonlinear combining regressions
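A minimal combining-regression sketch with simulated data (the forecasts fa and fb are hypothetical stand-ins): OLS weights on the two forecasts plus an intercept.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
y = 5.0 + rng.normal(size=n)             # target
fa = y + rng.normal(scale=1.0, size=n)   # forecast a: unbiased, less noisy
fb = y + rng.normal(scale=2.0, size=n)   # forecast b: unbiased, noisier

# Combining regression: y on [1, fa, fb]
X = np.column_stack([np.ones(n), fa, fb])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
combined = X @ b

mse = lambda f: np.mean((y - f) ** 2)
```

In sample, the OLS combination cannot do worse than either forecast alone, because each individual forecast is a special case of the fitted regression.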
Sample: 1/01/1988 7/18/1997   Included observations: 499

Lag   Acorr    P. Acorr   Std. Error   Ljung-Box   p-value
1     0.518    0.518      .045         134.62      0.000
2     0.010   -0.353      .045         134.67      0.000
3    -0.044    0.205      .045         135.65      0.000
4    -0.039   -0.172      .045         136.40      0.000
5     0.025    0.195      .045         136.73      0.000
6     0.057   -0.117      .045         138.36      0.000
Sample: 1/01/1988 7/18/1997   Included observations: 499

Lag   Acorr    P. Acorr   Std. Error   Ljung-Box   p-value
1     0.495    0.495      .045         122.90      0.000
2    -0.027   -0.360      .045         123.26      0.000
3    -0.045    0.229      .045         124.30      0.000
4    -0.056   -0.238      .045         125.87      0.000
5    -0.033    0.191      .045         126.41      0.000
6     0.087   -0.011      .045         130.22      0.000
LS // Dependent Variable is EQ
Sample: 1/01/1988 7/18/1997
Included observations: 499
Convergence achieved after 6 iterations

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C          -0.024770     0.079851     -0.310200     0.7565
MA(1)       0.935393     0.015850     59.01554      0.0000

R-squared                        Mean dependent var     -0.026572
Adjusted R-squared               S.D. dependent var      1.262817
S.E. of regression               Akaike info criterion  -0.159064
Sum squared resid                Schwarz criterion      -0.142180
Log likelihood                   F-statistic             437.8201
Durbin-Watson stat               Prob(F-statistic)       0.000000
Inverted MA Roots
LS // Dependent Variable is EJ
Sample: 1/01/1988 7/18/1997
Included observations: 499
Convergence achieved after 7 iterations

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C           1.026372     0.067191     15.27535      0.0000
MA(1)       0.961524     0.012470     77.10450      0.0000

R-squared                        Mean dependent var      1.023744
Adjusted R-squared               S.D. dependent var      1.063681
S.E. of regression               Akaike info criterion  -0.531226
Sum squared resid                Schwarz criterion      -0.514342
Log likelihood                   F-statistic             465.2721
Durbin-Watson stat               Prob(F-statistic)       0.000000
Inverted MA Roots
LS // Dependent Variable is VOL
Sample: 1/01/1988 7/18/1997
Included observations: 499
Convergence achieved after 10 iterations

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C           2.958191     0.341841     8.653696      0.0000
VOLQ        0.849559     0.016839     50.45317      0.0000
MA(1)       0.912559     0.018638     48.96181      0.0000

R-squared                        Mean dependent var      19.80609
Adjusted R-squared               S.D. dependent var      3.403283
S.E. of regression               Akaike info criterion  -0.304685
Sum squared resid                Schwarz criterion      -0.279358
Log likelihood                   F-statistic             3686.790
Durbin-Watson stat               Prob(F-statistic)       0.000000
Inverted MA Roots
LS // Dependent Variable is VOL
Sample: 1/01/1988 7/18/1997
Included observations: 499
Convergence achieved after 11 iterations

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C           2.592648     0.271740     9.540928      0.0000
VOLJ        0.916576     0.014058     65.20021      0.0000
MA(1)       0.949690     0.014621     64.95242      0.0000

R-squared                        Mean dependent var      19.80609
Adjusted R-squared               S.D. dependent var      3.403283
S.E. of regression               Akaike info criterion  -0.595907
Sum squared resid                Schwarz criterion      -0.570581
Log likelihood                   F-statistic             5016.993
Durbin-Watson stat               Prob(F-statistic)       0.000000
Inverted MA Roots
Loss Differential
Sample: 1/01/1988 7/18/1997   Included observations: 499

Lag   Acorr    P. Acorr   Std. Error   Ljung-Box   p-value
1     0.357    0.357      .045         64.113      0.000
2    -0.069   -0.226      .045         66.519      0.000
3    -0.050    0.074      .045         67.761      0.000
4    -0.044   -0.080      .045         68.746      0.000
5    -0.078   -0.043      .045         71.840      0.000
6     0.017    0.070      .045         71.989      0.000
LS // Dependent Variable is DD
Sample: 1/01/1988 7/18/1997
Included observations: 499
Convergence achieved after 4 iterations

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C          -0.585333     0.204737     -2.858945     0.0044
MA(1)       0.472901     0.039526     11.96433      0.0000

R-squared                        Mean dependent var     -0.584984
Adjusted R-squared               S.D. dependent var      3.416190
S.E. of regression               Akaike info criterion   2.270994
Sum squared resid                Schwarz criterion       2.287878
Log likelihood                   F-statistic             105.2414
Durbin-Watson stat               Prob(F-statistic)       0.000000
Inverted MA Roots
LS // Dependent Variable is VOL
Sample: 1/01/1988 7/18/1997
Included observations: 499
Convergence achieved after 11 iterations

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C           2.181977     0.259774     8.399524      0.0000
VOLQ        0.291577     0.038346     7.603919      0.0000
VOLJ        0.630551     0.039935     15.78944      0.0000
MA(1)       0.951107     0.014174     67.10327      0.0000

R-squared                        Mean dependent var      19.80609
Adjusted R-squared               S.D. dependent var      3.403283
S.E. of regression               Akaike info criterion  -0.702371
Sum squared resid                Schwarz criterion      -0.668603
Log likelihood                   F-statistic             3747.077
Durbin-Watson stat               Prob(F-statistic)       0.000000
Inverted MA Roots
Unit Roots, Stochastic Trends, ARIMA Forecasting Models, and Smoothing 1. Stochastic Trends and Forecasting
Random walk: y_t = y_{t-1} + ε_t, ε_t ~ WN(0, σ²)
ARIMA(p,1,q) model
Φ(L)(1 - L) y_t = Θ(L) ε_t
or
Φ(L) Δy_t = Θ(L) ε_t,
where Δy_t = y_t - y_{t-1}, Φ(L) = 1 - φ_1 L - ... - φ_p L^p, Θ(L) = 1 + θ_1 L + ... + θ_q L^q,
and all the roots of both lag operator polynomials are outside the unit circle.
ARIMA(p,d,q) model
Φ(L)(1 - L)^d y_t = Θ(L) ε_t
or
Φ(L) Δ^d y_t = Θ(L) ε_t,
where Φ(L) = 1 - φ_1 L - ... - φ_p L^p and Θ(L) = 1 + θ_1 L + ... + θ_q L^q,
and all the roots of both lag operator polynomials are outside the unit circle.
Properties of ARIMA(p,1,q) processes
- Appropriately made stationary by differencing
- Shocks have permanent effects -- forecasts don't revert to a mean
- Variance grows without bound as time progresses -- interval forecasts widen without bound as horizon grows
Random walk example
Point forecast: recall that for the AR(1) process, ŷ_{T+h,T} = φ^h y_T; setting φ = 1 gives the random walk point forecast, ŷ_{T+h,T} = y_T for all h.
Interval and density forecasts: recall the error associated with the optimal AR(1) forecast, e_{T+h,T} = Σ_{i=0}^{h-1} φ^i ε_{T+h-i}; setting φ = 1 gives the random walk forecast error, e_{T+h,T} = Σ_{i=0}^{h-1} ε_{T+h-i},
with variance σ_h² = h σ², so the 95% interval forecast is y_T ± 1.96 σ √h.
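The random walk formulas reduce to a few lines of code; a sketch (the numerical values are illustrative):

```python
import numpy as np

y_T, sigma = 4.61, 0.027   # last observation, innovation std. dev. (illustrative)

def rw_forecast(h):
    """Random walk point forecast and 95% interval at horizon h."""
    half = 1.96 * sigma * np.sqrt(h)   # interval half-width grows like sqrt(h)
    return y_T, (y_T - half, y_T + half)

p1, (lo1, hi1) = rw_forecast(1)
p12, (lo12, hi12) = rw_forecast(12)
```

The point forecast is flat at y_T for every horizon, while the interval widens without bound.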
Effects of Unit Roots
- Sample autocorrelation function fails to damp
- Sample partial autocorrelation function is near 1 at displacement 1, and then damps quickly
- Properties of estimators change, e.g., least-squares autoregression with unit roots
True process: y_t = y_{t-1} + ε_t
Estimated model: y_t = φ y_{t-1} + ε_t
Superconsistency: φ̂ converges to 1 at rate T, rather than the usual rate √T
Bias: φ̂ is biased downward (E(φ̂) < 1 in finite samples)
φ̂ has a nonstandard (Dickey-Fuller) limiting distribution
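A small Monte Carlo illustrating the downward bias and the tight concentration of the least-squares estimate under a unit root (sample size and replication count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)
T, reps = 100, 500
phis = np.empty(reps)
for i in range(reps):
    y = np.cumsum(rng.normal(size=T))                 # true process: random walk
    phis[i] = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])    # OLS slope, no intercept
```

Across replications the estimates cluster just below 1: downward bias, but very little dispersion even at T = 100.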
Basic model with nonzero mean under the alternative: (y_t - μ) = φ (y_{t-1} - μ) + ε_t,
which we rewrite as
y_t = α + φ y_{t-1} + ε_t,
where α = μ (1 - φ).
- α vanishes when φ = 1 (the null)
- α is nevertheless present under the alternative, so we include an intercept in the regression; the corresponding Dickey-Fuller distribution applies.
Allowing for deterministic linear trend under the alternative
Basic model: (y_t - a - b·TIME_t) = φ (y_{t-1} - a - b·TIME_{t-1}) + ε_t
or
y_t = α + β·TIME_t + φ y_{t-1} + ε_t,
where α = a(1 - φ) + φ b
and β = b(1 - φ).
- Under the deterministic-trend alternative hypothesis, both the intercept and the trend enter, and so both are included in the regression.
Rewrite in regression form:
Δy_t = α + β·TIME_t + γ y_{t-1} + ε_t,
where γ = φ - 1, and Δy_t = y_t - y_{t-1}.
Unit root: φ = 1,
or equivalently γ = 0.
In the unit root case, the intercept contribution a(1 - φ) vanishes, and the corresponding Dickey-Fuller distribution holds asymptotically.
or, allowing for higher-order autoregressive dynamics (the augmented Dickey-Fuller regression),
y_t = α + β·TIME_t + ρ y_{t-1} + γ_1 Δy_{t-1} + ... + γ_{k-1} Δy_{t-(k-1)} + ε_t,
where Δy_{t-j} = y_{t-j} - y_{t-j-1}
- k-1 augmentation lags have been included
- τ̂, τ̂_μ, and τ̂_τ hold asymptotically under the null
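A hand-rolled sketch of the augmented Dickey-Fuller regression (intercept, trend, and two augmentation lags) on a simulated random walk; under the unit-root null the coefficient on the lagged level should be near zero:

```python
import numpy as np

rng = np.random.default_rng(5)
T = 500
y = np.cumsum(rng.normal(size=T))   # random walk: the null is true
dy = np.diff(y)                     # dy[t] = y[t+1] - y[t]

# Regress dy_t on [1, trend, y_{t-1}, dy_{t-1}, dy_{t-2}]
k = 3                               # k - 1 = 2 augmentation lags
X = np.array([[1.0, t, y[t], dy[t - 1], dy[t - 2]] for t in range(k, T - 1)])
b, *_ = np.linalg.lstsq(X, dy[k:], rcond=None)
gamma = b[2]                        # coefficient on the lagged level
```

In practice one would use the nonstandard Dickey-Fuller critical values for the t-statistic on gamma, not the usual normal ones.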
Simple moving average smoothing
Original data: {y_t}, t = 1, ..., T
Smoothed data: {ȳ_t}
Two-sided moving average: ȳ_t = (2m + 1)^{-1} Σ_{i=-m}^{m} y_{t-i}
One-sided moving average: ȳ_t = (m + 1)^{-1} Σ_{i=0}^{m} y_{t-i}
One-sided weighted moving average: ȳ_t = Σ_{i=0}^{m} w_i y_{t-i}
- Must choose smoothing parameter, m
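A sketch of the two-sided and one-sided moving averages with uniform weights (m = 1 in the checks):

```python
import numpy as np

def two_sided_ma(y, m):
    # ybar_t = average of y_{t-m}, ..., y_{t+m}; defined only for interior t
    return np.convolve(y, np.ones(2 * m + 1) / (2 * m + 1), mode="valid")

def one_sided_ma(y, m):
    # ybar_t = average of y_{t-m}, ..., y_t; usable in real time
    return np.convolve(y, np.ones(m + 1) / (m + 1), mode="valid")

y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
```

The two-sided version loses m observations at each end of the sample; the one-sided version loses them only at the start.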
- Exponential smoothing can construct the optimal estimate of the local level -- and hence the optimal forecast of any future value of y -- on the basis of current and past y
- What if the model is misspecified?
Exponential smoothing algorithm
Observed series: {y_t}, t = 1, ..., T
Smoothed series: {ȳ_t}
Forecasts: ŷ_{T+h,T}
(1) Initialize at t = 1: ȳ_1 = y_1
(2) Update: ȳ_t = α y_t + (1 - α) ȳ_{t-1}, t = 2, ..., T
(3) Forecast: ŷ_{T+h,T} = ȳ_T
- Smoothing parameter α ∈ [0, 1]
- ȳ_t is the estimate of the local level
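The algorithm in code, as a direct transcription of steps (1)-(3):

```python
import numpy as np

def exp_smooth(y, alpha):
    """Exponential smoothing: ybar_1 = y_1, then
    ybar_t = alpha*y_t + (1 - alpha)*ybar_{t-1}."""
    ybar = np.empty(len(y))
    ybar[0] = y[0]
    for t in range(1, len(y)):
        ybar[t] = alpha * y[t] + (1 - alpha) * ybar[t - 1]
    return ybar

def forecast(y, alpha, h):
    # All h-step forecasts equal the last smoothed value
    return exp_smooth(y, alpha)[-1]

y = np.array([1.0, 2.0, 3.0, 4.0])
```

With alpha = 1 the smoothed series reproduces the data exactly; smaller alpha gives smoother output.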
Holt-Winters Smoothing
- Local level and slope model
- Holt-Winters smoothing can construct optimal estimates of the local level and local slope -- and hence the optimal forecast of any future value of y by extrapolating the trend -- on the basis of current and past y
(1) Initialize at t = 2: ȳ_2 = y_2, F_2 = y_2 - y_1
(2) Update:
ȳ_t = α y_t + (1 - α)(ȳ_{t-1} + F_{t-1})
F_t = β (ȳ_t - ȳ_{t-1}) + (1 - β) F_{t-1},
t = 3, 4, ..., T.
(3) Forecast: ŷ_{T+h,T} = ȳ_T + h F_T
- ȳ_t is the estimated level at time t
- F_t is the estimated slope at time t
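The Holt-Winters recursions in code (the initialization convention, level y_2 and slope y_2 - y_1, is an assumption); on an exact linear trend the recursions track the trend and the h-step forecast extrapolates it perfectly:

```python
import numpy as np

def holt_winters(y, alpha, beta, h):
    level, slope = y[1], y[1] - y[0]          # initialize at t = 2
    for t in range(2, len(y)):
        new_level = alpha * y[t] + (1 - alpha) * (level + slope)
        slope = beta * (new_level - level) + (1 - beta) * slope
        level = new_level
    return level + h * slope                  # yhat_{T+h,T} = level_T + h*slope_T

y = np.arange(10.0)   # exact linear trend: y_t = t
```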
Random Walk, Levels Sample Autocorrelation Function (Top Panel) Sample Partial Autocorrelation Function (Bottom Panel)
Random Walk, First Differences Sample Autocorrelation Function (Top Panel) Sample Partial Autocorrelation Function (Bottom Panel)
Log Yen / Dollar Exchange Rate (Top Panel) Change in Log Yen / Dollar Exchange Rate (Bottom Panel)
Log Yen / Dollar Exchange Rate Sample Autocorrelations (Top Panel) Sample Partial Autocorrelations (Bottom Panel)
Log Yen / Dollar Exchange Rate, First Differences Sample Autocorrelations (Top Panel) Sample Partial Autocorrelations (Bottom Panel)
Log Yen / Dollar Rate, Levels
AIC Values, Various ARMA Models

                   MA Order
AR Order      0        1        2        3
    0               -5.171   -5.953   -6.428
    1      -7.171   -7.300   -7.293   -7.287
    2      -7.319   -7.314   -7.320   -7.317
    3      -7.322   -7.323   -7.316   -7.308
Log Yen / Dollar Rate, Levels
SIC Values, Various ARMA Models

                   MA Order
AR Order      0        1        2        3
    0               -5.130   -5.899   -6.360
    1      -7.131   -7.211   -7.225   -7.205
    2      -7.265   -7.246   -7.238   -7.221
    3      -7.253   -7.241   -7.220   -7.199
Log Yen / Dollar Exchange Rate
Best-Fitting Deterministic-Trend Model
LS // Dependent Variable is LYEN
Sample(adjusted): 1973:03 1994:12
Included observations: 262 after adjusting endpoints
Convergence achieved after 3 iterations

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C           5.904705     0.136665     43.20570      0.0000
TIME       -0.004732     0.000781     -6.057722     0.0000
AR(1)       1.305829     0.057587     22.67561      0.0000
AR(2)      -0.334210     0.057656     -5.796676     0.0000

R-squared                        Mean dependent var      5.253984
Adjusted R-squared               S.D. dependent var      0.341563
S.E. of regression               Akaike info criterion  -7.319015
Sum squared resid                Schwarz criterion      -7.264536
Log likelihood                   F-statistic             15461.07
Durbin-Watson stat               Prob(F-statistic)       0.000000
Inverted AR Roots                .96   .35
Log Yen / Dollar Exchange Rate Best-Fitting Deterministic-Trend Model Residual Plot
Log Yen / Dollar Rate History and Forecast AR(2) in Levels with Linear Trend
Log Yen / Dollar Rate History and Long-Horizon Forecast AR(2) in Levels with Linear Trend
Log Yen / Dollar Rate History, Forecast and Realization AR(2) in Levels with Linear Trend
Log Yen / Dollar Exchange Rate
Augmented Dickey-Fuller Unit Root Test
ADF Test Statistic: -2.498863
Augmented Dickey-Fuller Test Equation
LS // Dependent Variable is D(LYEN)
Sample(adjusted): 1973:05 1994:12
Included observations: 260 after adjusting endpoints

Variable          Coefficient   Prob.
LYEN(-1)          -0.029423     0.0131
D(LYEN(-1))        0.362319     0.0000
D(LYEN(-2))       -0.114269     0.0795
D(LYEN(-3))        0.118386     0.0535
C                  0.170875     0.0132
@TREND(1973:01)   -0.000139     0.0088

R-squared             0.142362   Mean dependent var     -0.003749
Adjusted R-squared    0.125479   S.D. dependent var      0.027103
S.E. of regression    0.025345   Akaike info criterion  -7.327517
Sum squared resid     0.163166   Schwarz criterion      -7.245348
Log likelihood        589.6532   F-statistic             8.432417
Durbin-Watson stat    2.010829   Prob(F-statistic)       0.000000
Log Yen / Dollar Rate, Changes
AIC Values, Various ARMA Models

                   MA Order
AR Order      0        1        2        3
    0               -7.298   -7.290   -7.283
    1      -7.308   -7.307   -7.307   -7.302
    2      -7.312   -7.314   -7.307   -7.299
    3      -7.316   -7.309   -7.340   -7.336
Log Yen / Dollar Rate, Changes
SIC Values, Various ARMA Models

                   MA Order
AR Order      0        1        2        3
    0               -7.270   -7.249   -7.228
    1      -7.281   -7.266   -7.252   -7.234
    2      -7.271   -7.259   -7.238   -7.217
    3      -7.261   -7.241   -7.258   -7.240
LS // Dependent Variable is DLYEN
Sample(adjusted): 1973:03 1994:12
Included observations: 262 after adjusting endpoints
Convergence achieved after 3 iterations

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C          -0.003697     0.002350     -1.573440     0.1168
AR(1)       0.321870     0.057767     5.571863      0.0000

R-squared                        Mean dependent var     -0.003888
Adjusted R-squared               S.D. dependent var      0.027227
S.E. of regression               Akaike info criterion  -7.308418
Sum squared resid                Schwarz criterion      -7.281179
Log likelihood                   F-statistic             31.04566
Durbin-Watson stat               Prob(F-statistic)       0.000000
Inverted AR Roots                .32
Log Yen / Dollar Exchange Rate Best-Fitting Stochastic-Trend Model Residual Plot
Log Yen / Dollar Rate History and Forecast AR(1) in Differences with Intercept
Log Yen / Dollar Rate History and Long-Horizon Forecast AR(1) in Differences with Intercept
Log Yen / Dollar Rate History, Forecast and Realization AR(1) in Differences with Intercept
Sample: 1973:01 1994:12
Included observations: 264
Method: Holt-Winters, No Seasonal
Original Series: LYEN
Forecast Series: LYENSM

Parameters:                Alpha    1.000000
                           Beta     0.090000
Sum of Squared Residuals            0.202421
Root Mean Squared Error             0.027690
End of Period Levels:      Mean     4.606969
                           Trend   -0.005193
Log Yen / Dollar Rate History and Long-Horizon Forecast Holt-Winters Smoothing
Log Yen / Dollar Rate History, Forecast and Realization Holt-Winters Smoothing
We'll look at:
Basic structure and properties
Time variation in volatility and prediction-error variance
ARMA representation in squares
GARCH(1,1) and exponential smoothing
Unconditional symmetry and leptokurtosis
Convergence to normality under temporal aggregation
Estimation and testing
Basic Structure and Properties
Standard models (e.g., ARMA):
Unconditional mean: constant
Unconditional variance: constant
Conditional mean: varies
Conditional variance: constant (unfortunately)
k-step-ahead forecast error variance: depends only on k, not on t (again unfortunately)
ARCH(1) process:
ε_t | Ω_{t-1} ~ N(0, σ_t²)
σ_t² = ω + α ε²_{t-1}
Time Variation in Volatility and Prediction Error Variance
Prediction error variance now depends on the information available at time t-1; e.g., the 1-step-ahead prediction error variance is the conditional variance, σ_t² = ω + α ε²_{t-1}.
Thus ε_t² is a noisy indicator of σ_t²: ε_t² = σ_t² + ν_t, where the noise ν_t = ε_t² - σ_t² has conditional mean zero.
GARCH(1,1): σ_t² = ω + α ε²_{t-1} + β σ²_{t-1}
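Simulating the GARCH(1,1) recursion (parameter values are illustrative) shows the two hallmark facts: the levels are nearly serially uncorrelated, while the squares are autocorrelated (volatility clustering):

```python
import numpy as np

rng = np.random.default_rng(6)
omega, alpha, beta, T = 0.05, 0.10, 0.85, 4000
sigma2 = np.empty(T)
eps = np.empty(T)
sigma2[0] = omega / (1 - alpha - beta)       # unconditional variance (= 1 here)
eps[0] = np.sqrt(sigma2[0]) * rng.normal()
for t in range(1, T):
    sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    eps[t] = np.sqrt(sigma2[t]) * rng.normal()

r_levels = np.corrcoef(eps[1:], eps[:-1])[0, 1]        # ~0: eps is white noise
r_squares = np.corrcoef(eps[1:] ** 2, eps[:-1] ** 2)[0, 1]  # > 0: clustering
```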
Unconditional Symmetry and Leptokurtosis
- Volatility clustering produces unconditional leptokurtosis
- Conditional symmetry translates into unconditional symmetry
Unexpected agreement with the facts!
Convergence to Normality under Temporal Aggregation
- Temporal aggregation of covariance stationary GARCH processes produces convergence to normality.
Again, unexpected agreement with the facts!
Testing: likelihood ratio tests
Graphical diagnostics: correlogram of squares, correlogram of squared standardized residuals
Variations on Volatility Models
We will look at:
Asymmetric response and the leverage effect
Exogenous variables
GARCH-M and time-varying risk premia
Asymmetric Response and the Leverage Effect: TARCH and EGARCH
Asymmetric response I: TARCH
Standard GARCH: σ_t² = ω + α ε²_{t-1} + β σ²_{t-1}
TARCH: σ_t² = ω + α ε²_{t-1} + γ ε²_{t-1} D_{t-1} + β σ²_{t-1},
where D_{t-1} = 1 if ε_{t-1} < 0 (bad news), and D_{t-1} = 0 otherwise
Positive return (good news): α effect on volatility
Negative return (bad news): α + γ effect on volatility
γ ≠ 0: asymmetric news response
γ > 0: leverage effect
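A one-step TARCH variance update makes the asymmetry concrete (parameter values are illustrative):

```python
def tarch_var(omega, alpha, gamma, beta, eps_prev, sigma2_prev):
    d = 1.0 if eps_prev < 0 else 0.0   # bad-news indicator D_{t-1}
    return omega + (alpha + gamma * d) * eps_prev ** 2 + beta * sigma2_prev

good = tarch_var(0.05, 0.05, 0.10, 0.85, +1.0, 1.0)  # positive shock
bad = tarch_var(0.05, 0.05, 0.10, 0.85, -1.0, 1.0)   # equal-sized negative shock
```

With gamma > 0, an equal-sized negative shock raises next-period variance by more: the leverage effect.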
Asymmetric response II: EGARCH
ln σ_t² = ω + α |ε_{t-1}/σ_{t-1}| + γ (ε_{t-1}/σ_{t-1}) + β ln σ²_{t-1}
The log specification ensures that the conditional variance is positive.
Volatility is driven by both the size and the sign of shocks.
Leverage effect when γ < 0.
Component GARCH:
(σ_t² - q_t) = α (ε²_{t-1} - q_{t-1}) + β (σ²_{t-1} - q_{t-1}),
where the long-run component is q_t = ω + ρ (q_{t-1} - ω) + φ (ε²_{t-1} - σ²_{t-1})
- Transitory dynamics governed by α and β
- Persistent dynamics governed by ρ
- Equivalent to a nonlinearly restricted GARCH(2,2)
- Exogenous variables and asymmetry can be allowed
Conditional Standard Deviation Extended History and Extended Forecast GARCH(1,1) Model