
ESTIMATION AND FORECASTING TECHNIQUES
OKTO FAYUDHA SUDRAJAD, PHD
Individual Consumer’s Demand

QdX = f(PX, I, PY, T)

QdX = quantity demanded of commodity X by an individual per time period
PX = price per unit of commodity X
I = consumer's income
PY = price of related (substitute or complementary) commodity
T = tastes of the consumer
DEMAND ESTIMATION:
MARKETING RESEARCH
APPROACHES
• Consumer Surveys
• Observational Research
• Consumer Clinics
• Market Experiments
• Virtual Shopping
• Virtual Management
STEPS IN DEMAND ESTIMATION

• Model Specification: Identify Variables
• Collect Data
• Specify Functional Form
• Estimate Function
• Test the Results
FUNCTIONAL FORM
SPECIFICATIONS
Linear Function:

QX = a0 + a1 PX + a2 I + a3 N + a4 PY + e

Power Function:

QX = a (PX^b1)(PY^b2)

Estimation Format:

ln QX = ln a + b1 ln PX + b2 ln PY
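The equivalence between the power function and its log-linear estimation format can be checked numerically. The coefficients and prices below are purely illustrative assumptions, not values from the slides:

```python
import math

# Assumed (hypothetical) power-function demand coefficients and prices
a, b1, b2 = 100.0, -1.2, 0.5
PX, PY = 4.0, 9.0

# Power form: QX = a * PX^b1 * PY^b2
QX = a * PX**b1 * PY**b2

# Log-linear estimation format: ln QX = ln a + b1 ln PX + b2 ln PY
ln_QX = math.log(a) + b1 * math.log(PX) + b2 * math.log(PY)

print(abs(math.log(QX) - ln_QX) < 1e-9)  # the two forms agree
```

Taking logs turns the multiplicative model into one that is linear in the parameters, so it can be estimated by ordinary least squares.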
THE IDENTIFICATION PROBLEM
REGRESSION ANALYSIS
Year  X   Y
 1    10  44
 2     9  40
 3    11  42
 4    12  46
 5    11  48
 6    12  52
 7    13  54
 8    13  58
 9    14  56
10    15  60

(Scatter Diagram)
REGRESSION ANALYSIS

• Regression Line: Line of Best Fit

• Regression Line: Minimizes the sum of the squared vertical deviations (et) of
each point from the regression line.

• Ordinary Least Squares (OLS) Method


REGRESSION ANALYSIS
ORDINARY LEAST SQUARES
(OLS)
Model: Yt = a + bXt + et

Fitted line: Ŷt = â + b̂Xt

Residual: et = Yt − Ŷt
ORDINARY LEAST SQUARES
(OLS)
Objective: Determine the slope and
intercept that minimize the sum of
the squared errors.

Σ(t=1..n) et² = Σ(t=1..n) (Yt − Ŷt)² = Σ(t=1..n) (Yt − â − b̂Xt)²
ORDINARY LEAST SQUARES
(OLS)
Estimation Procedure

b̂ = Σ(t=1..n) (Xt − X̄)(Yt − Ȳ) / Σ(t=1..n) (Xt − X̄)²

â = Ȳ − b̂X̄
ORDINARY LEAST SQUARES
(OLS)
Estimation Example
Time  Xt   Yt   Xt − X̄   Yt − Ȳ   (Xt − X̄)(Yt − Ȳ)   (Xt − X̄)²
  1   10   44     -2       -6            12               4
  2    9   40     -3      -10            30               9
  3   11   42     -1       -8             8               1
  4   12   46      0       -4             0               0
  5   11   48     -1       -2             2               1
  6   12   52      0        2             0               0
  7   13   54      1        4             4               1
  8   13   58      1        8             8               1
  9   14   56      2        6            12               4
 10   15   60      3       10            30               9
Sum  120  500                           106              30
n = 10    ΣXt = 120    ΣYt = 500

X̄ = ΣXt/n = 120/10 = 12    Ȳ = ΣYt/n = 500/10 = 50

Σ(Xt − X̄)² = 30    Σ(Xt − X̄)(Yt − Ȳ) = 106

b̂ = 106/30 = 3.533    â = 50 − (3.533)(12) = 7.60
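The slope and intercept computation above can be sketched in plain Python using the table's data:

```python
# OLS slope and intercept for the ten observations in the example table
X = [10, 9, 11, 12, 11, 12, 13, 13, 14, 15]
Y = [44, 40, 42, 46, 48, 52, 54, 58, 56, 60]

n = len(X)
x_bar = sum(X) / n                 # X-bar = 12
y_bar = sum(Y) / n                 # Y-bar = 50

num = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y))   # 106
den = sum((x - x_bar) ** 2 for x in X)                        # 30

b_hat = num / den                  # 106/30 = 3.533
a_hat = y_bar - b_hat * x_bar      # 50 - (3.533)(12) = 7.60

print(round(b_hat, 3), round(a_hat, 2))  # 3.533 7.6
```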
TESTS OF SIGNIFICANCE

Standard Error of the Slope Estimate

s_b̂ = √[ Σ(Yt − Ŷt)² / ((n − k) Σ(Xt − X̄)²) ] = √[ Σet² / ((n − k) Σ(Xt − X̄)²) ]
TESTS OF SIGNIFICANCE
Example Calculation
Time  Xt   Yt    Ŷt     et = Yt − Ŷt    et² = (Yt − Ŷt)²   (Xt − X̄)²
  1   10   44   42.90       1.10            1.2100             4
  2    9   40   39.37       0.63            0.3969             9
  3   11   42   46.43      -4.43           19.6249             1
  4   12   46   49.96      -3.96           15.6816             0
  5   11   48   46.43       1.57            2.4649             1
  6   12   52   49.96       2.04            4.1616             0
  7   13   54   53.49       0.51            0.2601             1
  8   13   58   53.49       4.51           20.3401             1
  9   14   56   57.02      -1.02            1.0404             4
 10   15   60   60.55      -0.55            0.3025             9
Sum                                        65.4830            30

Σet² = Σ(Yt − Ŷt)² = 65.4830    Σ(Xt − X̄)² = 30

s_b̂ = √[65.4830 / ((10 − 2)(30))] = 0.52
TESTS OF SIGNIFICANCE
Calculation of the t Statistic

t = b̂ / s_b̂ = 3.53 / 0.52 = 6.79

Degrees of Freedom = (n − k) = (10 − 2) = 8

Critical Value at 5% level = 2.306
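The standard error and t statistic for the example can be reproduced in a few lines. Note that small differences from the slide's figures arise because the slide rounds b̂ and â before computing the fitted values:

```python
# Standard error of the slope and t statistic for the example data
X = [10, 9, 11, 12, 11, 12, 13, 13, 14, 15]
Y = [44, 40, 42, 46, 48, 52, 54, 58, 56, 60]
n, k = len(X), 2
b_hat = 106 / 30               # from the OLS example
a_hat = 50 - b_hat * 12        # = 7.60

sse = sum((y - (a_hat + b_hat * x)) ** 2 for x, y in zip(X, Y))  # ~65.47
sxx = sum((x - 12) ** 2 for x in X)                              # 30

s_b = (sse / ((n - k) * sxx)) ** 0.5   # standard error of b-hat ~ 0.52
t = b_hat / s_b                        # ~6.8; exceeds the 5% critical value 2.306
print(round(s_b, 2), round(t, 2))
```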
TESTS OF SIGNIFICANCE
Decomposition of Sum of Squares

Total Variation = Explained Variation + Unexplained Variation

Σ(Yt − Ȳ)² = Σ(Ŷt − Ȳ)² + Σ(Yt − Ŷt)²
TESTS OF SIGNIFICANCE
Coefficient of Determination

R² = Explained Variation / Total Variation = Σ(Ŷt − Ȳ)² / Σ(Yt − Ȳ)²

R² = 373.84 / 440.00 = 0.85
TESTS OF SIGNIFICANCE
Coefficient of Correlation

r = √R², with the sign of b̂

−1 ≤ r ≤ 1

r = √0.85 = 0.92
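The R² decomposition for the same data can be sketched as below; the explained variation comes out near 374.5 with unrounded coefficients (the slide's 373.84 uses b̂ rounded to 3.53):

```python
# Coefficient of determination for the example regression
X = [10, 9, 11, 12, 11, 12, 13, 13, 14, 15]
Y = [44, 40, 42, 46, 48, 52, 54, 58, 56, 60]
b_hat = 106 / 30
a_hat = 50 - b_hat * 12
y_bar = 50

total = sum((y - y_bar) ** 2 for y in Y)                        # 440
explained = sum((a_hat + b_hat * x - y_bar) ** 2 for x in X)    # ~374.5

r2 = explained / total          # ~0.85
r = r2 ** 0.5                   # ~0.92, positive because b-hat > 0
print(round(r2, 2), round(r, 2))
```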
MULTIPLE REGRESSION ANALYSIS

Model: Y = a + b1X1 + b2X2 + ⋯ + bk'Xk'


MULTIPLE REGRESSION ANALYSIS

Adjusted Coefficient of Determination

R̄² = 1 − (1 − R²) (n − 1)/(n − k)
MULTIPLE REGRESSION ANALYSIS

Analysis of Variance and F Statistic

F = [Explained Variation / (k − 1)] / [Unexplained Variation / (n − k)]

F = [R² / (k − 1)] / [(1 − R²) / (n − k)]
PROBLEMS IN REGRESSION
ANALYSIS
• Multicollinearity: Two or more explanatory variables are highly correlated.
• Heteroskedasticity: The variance of the error term is not constant across observations (it often varies with the level of an explanatory variable).
• Autocorrelation: Consecutive error terms are correlated.
DURBIN-WATSON STATISTIC

Test for Autocorrelation


d = Σ(t=2..n) (et − et−1)² / Σ(t=1..n) et²

If d = 2, autocorrelation is absent.
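The Durbin-Watson statistic can be computed on the residuals of the earlier OLS example (reused here as illustrative data):

```python
# Durbin-Watson statistic on the residuals from the OLS example
X = [10, 9, 11, 12, 11, 12, 13, 13, 14, 15]
Y = [44, 40, 42, 46, 48, 52, 54, 58, 56, 60]
b_hat = 106 / 30
a_hat = 50 - b_hat * 12

e = [y - (a_hat + b_hat * x) for x, y in zip(X, Y)]   # residuals

# d = sum of squared successive differences over sum of squared residuals
d = sum((e[t] - e[t - 1]) ** 2 for t in range(1, len(e))) / \
    sum(et ** 2 for et in e)
print(round(d, 2))   # values near 2 suggest no first-order autocorrelation
```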


TIME-SERIES ANALYSIS

• Secular Trend
– Long-Run Increase or Decrease in Data
• Cyclical Fluctuations
– Long-Run Cycles of Expansion and Contraction
• Seasonal Variation
– Regularly Occurring Fluctuations
• Irregular or Random Influences
TREND PROJECTION
• Linear Trend:
St = S0 + b t
b = Growth per time period
• Constant Growth Rate
St = S0 (1 + g)t
g = Growth rate
• Estimation of Growth Rate
ln St = ln S0 + t ln(1 + g)
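The log-linear form above is what makes the growth rate estimable: the slope of ln St against t equals ln(1 + g). A minimal sketch, using an assumed S0 and g rather than any data from the slides:

```python
import math

# Assumed constant-growth series St = S0 * (1 + g)^t
S0, g = 100.0, 0.05
series = [S0 * (1 + g) ** t for t in range(8)]

# With noiseless data, the slope of ln St against t is ln(1 + g),
# so any two points recover it exactly.
slope = (math.log(series[-1]) - math.log(series[0])) / (len(series) - 1)
g_hat = math.exp(slope) - 1
print(round(g_hat, 4))   # recovers 0.05
```

With real (noisy) data, the slope would instead be estimated by regressing ln St on t.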
SEASONAL VARIATION

Ratio to Trend Method


Ratio = Actual / Trend Forecast

Seasonal Adjustment = Average of Ratios for Each Seasonal Period

Adjusted Forecast = Trend Forecast × Seasonal Adjustment
SEASONAL VARIATION
Ratio to Trend Method:
Example Calculation for Quarter 1
Trend Forecast for 1996.1 = 11.90 + (0.394)(17) = 18.60
Seasonally Adjusted Forecast for 1996.1 = (18.60)(0.8869) = 16.50

Year    Trend Forecast   Actual   Ratio
1992.1      12.29         11.00   0.8950
1993.1      13.87         12.00   0.8652
1994.1      15.45         14.00   0.9061
1995.1      17.02         15.00   0.8813

Seasonal Adjustment = 0.8869
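The Quarter 1 calculation above can be reproduced directly from the table:

```python
# Ratio-to-trend seasonal adjustment for Quarter 1 (data from the table)
trend = [12.29, 13.87, 15.45, 17.02]    # trend forecasts, 1992.1-1995.1
actual = [11.00, 12.00, 14.00, 15.00]

ratios = [a / t for a, t in zip(actual, trend)]
seasonal_adj = sum(ratios) / len(ratios)      # average of ratios ~ 0.8869

trend_1996_1 = 11.90 + 0.394 * 17             # ~18.60
adjusted = trend_1996_1 * seasonal_adj        # ~16.5
print(round(seasonal_adj, 4), round(adjusted, 2))
```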
MOVING AVERAGE FORECASTS

Forecast is the average of data from the w periods prior to the forecast data point.

Ft = Σ(i=1..w) At−i / w
THE EFFECT OF THE PARAMETER w

• A smaller w makes the forecast more responsive
• A larger w makes the forecast more stable
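A minimal moving-average forecast, with a small demo series chosen purely for illustration:

```python
# w-period moving-average forecast: mean of the last w actual values
def moving_average_forecast(actuals, w):
    return sum(actuals[-w:]) / w

data = [20, 22, 21, 24, 26]              # assumed demo series
print(moving_average_forecast(data, 3))  # (21 + 24 + 26) / 3
```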
EXPONENTIAL SMOOTHING FORECASTS

Forecast is the weighted average of the forecast and the actual value from the prior period.

Ft+1 = wAt + (1 − w)Ft

0 < w < 1
THE EFFECT OF THE PARAMETER w

• A smaller w makes the forecast more stable
• A larger w makes the forecast more responsive
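The smoothing recursion Ft+1 = wAt + (1 − w)Ft can be sketched as follows; the series and initial forecast are assumed demo values:

```python
# Exponential smoothing: each forecast blends the last actual and last forecast
def exp_smooth(actuals, w, f0):
    forecasts = [f0]
    for a in actuals:
        forecasts.append(w * a + (1 - w) * forecasts[-1])
    return forecasts

data = [20, 22, 21, 24]            # assumed demo series
print(exp_smooth(data, 0.3, 20.0))
```

Raising w toward 1 weights recent actuals more heavily, making the forecasts track the data more closely but react more to noise.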
AUTOREGRESSIVE MODELING
• Used for Forecasting
• Takes Advantage of Autocorrelation
– 1st order: correlation between consecutive values
– 2nd order: correlation between values 2 periods apart
• Autoregressive model of order p (ei is random error):

Yi = b0 + b1Yi−1 + b2Yi−2 + ⋯ + bpYi−p + ei


AUTOREGRESSIVE MODELING
STEPS
• 1. Choose p:
• 2. Form a series of “lag predictor” variables
Yi-1 , Yi-2 , … Yi-p
• 3. Use Excel to run regression model using
all p variables
• 4. Test significance of bp
– If null hypothesis rejected, this model is selected
– If null hypothesis not rejected, decrease p by 1 and repeat
your calculations
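The steps above can be sketched for the simplest case, p = 1: form the lag predictor Yi−1 and regress Yi on it with OLS. The series below is an assumed, noiseless demo constructed so the coefficients are recoverable exactly:

```python
# AR(1) fit: regress Y(i) on its own lag Y(i-1) using OLS formulas
def fit_ar1(series):
    y = series[1:]        # Y(i)
    x = series[:-1]       # lag predictor Y(i-1)
    n = len(y)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
         sum((xi - x_bar) ** 2 for xi in x)
    b0 = y_bar - b1 * x_bar
    return b0, b1

# Assumed demo: an exact AR(1) series Y(i) = 2 + 0.5 * Y(i-1)
s = [1.0]
for _ in range(10):
    s.append(2 + 0.5 * s[-1])

b0, b1 = fit_ar1(s)
print(round(b0, 6), round(b1, 6))   # recovers 2.0 and 0.5
```

For p > 1 the same idea extends to a multiple regression on p lagged variables, as step 3 describes.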
ROOT MEAN SQUARE ERROR

Measures the accuracy of a forecasting method.

RMSE = √[ Σ(At − Ft)² / n ]
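The RMSE calculation is a one-liner; the actuals and forecasts below are assumed demo values:

```python
# Root mean square error: sqrt of the mean squared forecast error
def rmse(actuals, forecasts):
    n = len(actuals)
    return (sum((a - f) ** 2 for a, f in zip(actuals, forecasts)) / n) ** 0.5

print(rmse([10, 12, 14], [11, 12, 13]))  # sqrt((1 + 0 + 1) / 3)
```

A lower RMSE indicates a more accurate forecasting method on the same data.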
