
Introduction to Multiple Regression

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc. Chap 14-1


Learning Objectives

In this chapter, you learn:


n How to develop a multiple regression model
n How to interpret the regression coefficients
n How to determine which independent variables to include in the regression model
n How to determine which independent variables are more important in predicting a dependent variable
n How to use categorical variables in a regression model
n How to predict a categorical dependent variable using logistic regression

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-2


The Multiple Regression
Model

Idea: Examine the linear relationship between
1 dependent variable (Y) and 2 or more independent variables (Xi)

Multiple Regression Model with k Independent Variables:

Yi = β0 + β1X1i + β2X2i + … + βkXki + εi

(β0 = Y-intercept; β1, …, βk = population slopes; εi = random error)

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-3


Multiple Regression Equation

The coefficients of the multiple regression model are
estimated using sample data.

Multiple regression equation with k independent variables:

Ŷi = b0 + b1X1i + b2X2i + … + bkXki

where Ŷi is the estimated (predicted) value of Y, b0 is the
estimated intercept, and b1, …, bk are the estimated slope
coefficients.
In this chapter we will use Excel or Minitab to obtain the
regression slope coefficients and other regression
summary measures.
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-4
Multiple Regression Equation
(continued)
Two variable model: Ŷ = b0 + b1X1 + b2X2
(the fitted values form a plane over the X1 and X2 axes)
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-5
Example:
2 Independent Variables
n A distributor of frozen dessert pies wants to
evaluate factors thought to influence demand

n Dependent variable: Pie sales (units per week)


n Independent variables: Price (in $)
Advertising ($100’s)

n Data are collected for 15 weeks

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-6


Pie Sales Example
Multiple regression equation:
Sales = b0 + b1 (Price) + b2 (Advertising)

Week   Pie Sales   Price ($)   Advertising ($100s)
 1     350         5.50        3.3
 2     460         7.50        3.3
 3     350         8.00        3.0
 4     430         8.00        4.5
 5     350         6.80        3.0
 6     380         7.50        4.0
 7     430         4.50        3.0
 8     470         6.40        3.7
 9     450         7.00        3.5
10     490         5.00        4.0
11     340         7.20        3.5
12     300         7.90        3.2
13     440         5.90        4.0
14     450         5.00        3.5
15     300         7.00        2.7
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-7
Excel Multiple Regression Output
Regression Statistics
Multiple R          0.72213
R Square            0.52148
Adjusted R Square   0.44172
Standard Error      47.46341
Observations        15

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)

ANOVA         df    SS          MS          F         Significance F
Regression     2    29460.027   14730.013   6.53861   0.01201
Residual      12    27033.306    2252.776
Total         14    56493.333

              Coefficients   Standard Error   t Stat     P-value   Lower 95%   Upper 95%
Intercept     306.52619      114.25389         2.68285   0.01993    57.58835   555.46404
Price         -24.97509       10.83213        -2.30565   0.03979   -48.57626    -1.37392
Advertising    74.13096       25.96732         2.85478   0.01449    17.55303   130.70888
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-8
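The textbook obtains this output with Excel or Minitab. As a rough cross-check, here is a minimal Python sketch (assuming numpy and statsmodels are available; not textbook code) that fits the same pie-sales regression by least squares and should reproduce the coefficients and R Square shown above.

# Sketch: fit the pie-sales multiple regression with statsmodels
# (an alternative to the Excel/Minitab output above).
import numpy as np
import statsmodels.api as sm

sales = np.array([350, 460, 350, 430, 350, 380, 430, 470, 450, 490,
                  340, 300, 440, 450, 300])
price = np.array([5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40,
                  7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00])
advertising = np.array([3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7, 3.5,
                        4.0, 3.5, 3.2, 4.0, 3.5, 2.7])

X = sm.add_constant(np.column_stack([price, advertising]))  # intercept, Price, Advertising
model = sm.OLS(sales, X).fit()

print(model.params)    # approx [306.526, -24.975, 74.131]
print(model.rsquared)  # approx 0.52148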


Minitab Multiple Regression Output

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)

The regression equation is


Sales = 307 - 25.0 Price + 74.1 Advertising

Predictor Coef SE Coef T P


Constant 306.50 114.30 2.68 0.020
Price -24.98 10.83 -2.31 0.040
Advertising 74.13 25.97 2.85 0.014

S = 47.4634 R-Sq = 52.1% R-Sq(adj) = 44.2%

Analysis of Variance

Source DF SS MS F P
Regression 2 29460 14730 6.54 0.012
Residual Error 12 27033 2253
Total 14 56493

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-9


The Multiple Regression Equation

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)

where
Sales is in number of pies per week
Price is in $
Advertising is in $100's.

b1 = -24.975: sales will decrease, on average, by 24.975 pies per
week for each $1 increase in selling price, net of the effects of
changes due to advertising.

b2 = 74.131: sales will increase, on average, by 74.131 pies per
week for each $100 increase in advertising, net of the effects of
changes due to price.

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-10


Using The Equation to Make
Predictions
Predict sales for a week in which the selling
price is $5.50 and advertising is $350:

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)
      = 306.526 - 24.975(5.50) + 74.131(3.5)
      = 428.62

Note that Advertising is in $100's, so $350 means that X2 = 3.5.
Predicted sales is 428.62 pies.

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-11


Predictions in Excel using PHStat
n PHStat | regression | multiple regression …

Check the “confidence and prediction interval estimates” box

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-12


Predictions in PHStat
(continued)

The PHStat output shows:
Input values
Predicted Ŷ value
Confidence interval for the mean value of Y, given these X values
Prediction interval for an individual Y value, given these X values
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-13
Predictions in Minitab
Predicted Values for New Observations

New Obs    Fit      SE Fit    95% CI             95% PI
1          428.6    17.2      (391.1, 466.1)     (318.6, 538.6)

Values of Predictors for New Observations

New Obs    Price    Advertising
1          5.50     3.50

Fit is the predicted Ŷ value. The 95% CI is the confidence interval
for the mean value of Y, given these X values; the 95% PI is the
prediction interval for an individual Y value, given these X values.

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-14
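Outside the Excel/Minitab workflow shown in the slides, the same point prediction and both intervals can be obtained from the statsmodels fit in the earlier sketch; the name `model` and the column order (constant, Price, Advertising) are assumptions carried over from that sketch.

# Sketch: point prediction plus 95% confidence and prediction intervals
# for Price = 5.50 and Advertising = 3.5 ($350).
import numpy as np

new_x = np.array([[1.0, 5.50, 3.5]])        # constant, Price, Advertising ($100s)
pred = model.get_prediction(new_x)
frame = pred.summary_frame(alpha=0.05)

print(frame["mean"])                                # approx 428.6 pies
print(frame[["mean_ci_lower", "mean_ci_upper"]])    # approx (391.1, 466.1)
print(frame[["obs_ci_lower", "obs_ci_upper"]])      # approx (318.6, 538.6)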


Coefficient of
Multiple Determination
n Reports the proportion of total variation in Y
explained by all X variables taken together

r² = SSR / SST = regression sum of squares / total sum of squares

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-15


Multiple Coefficient of
Determination In Excel
Regression Statistics
Multiple R          0.72213
R Square            0.52148
Adjusted R Square   0.44172
Standard Error      47.46341
Observations        15

r² = SSR / SST = 29,460.0 / 56,493.3 = 0.52148

52.1% of the variation in pie sales is explained by the variation in
price and advertising.
ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333

Coefficients Standard Error t Stat P-value Lower 95% Upper 95%


Intercept 306.52619 114.25389 2.68285 0.01993 57.58835 555.46404
Price -24.97509 10.83213 -2.30565 0.03979 -48.57626 -1.37392
Advertising 74.13096 25.96732 2.85478 0.01449 17.55303 130.70888

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-16


Multiple Coefficient of
Determination In Minitab

The regression equation is
Sales = 307 - 25.0 Price + 74.1 Advertising

Predictor     Coef      SE Coef    T        P
Constant      306.50    114.30      2.68    0.020
Price         -24.98     10.83     -2.31    0.040
Advertising    74.13     25.97      2.85    0.014

S = 47.4634   R-Sq = 52.1%   R-Sq(adj) = 44.2%

Analysis of Variance

Source           DF    SS       MS       F       P
Regression        2    29460    14730    6.54    0.012
Residual Error   12    27033     2253
Total            14    56493

r² = SSR / SST = 29,460.0 / 56,493.3 = 0.52148

52.1% of the variation in pie sales is explained by the variation in
price and advertising.

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-17


Adjusted r2
n r² never decreases when a new X variable is added to the model
n This can be a disadvantage when comparing models
n What is the net effect of adding a new variable?
n We lose a degree of freedom when a new X variable is added
n Did the new X variable add enough explanatory power to offset the
loss of one degree of freedom?
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-18
Adjusted r2
(continued)
n Shows the proportion of variation in Y explained
by all X variables adjusted for the number of X
variables used
r²adj = 1 - [(1 - r²)((n - 1) / (n - k - 1))]

(where n = sample size, k = number of independent variables)

n Penalizes excessive use of unimportant independent variables
n 0 ≤ adjusted r² ≤ r²
n Useful in comparing among models
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-19
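A quick sanity check of the adjusted r² formula with the pie-sales values (a sketch, not textbook code):

# Adjusted r-squared from the formula above, using r2, n, k from the output.
r2, n, k = 0.52148, 15, 2
r2_adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(round(r2_adj, 4))  # approx 0.4417, matching the 0.44172 in the output up to rounding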
Adjusted r2 in Excel
Regression Statistics
Multiple R          0.72213
R Square            0.52148
Adjusted R Square   0.44172
Standard Error      47.46341
Observations        15

r²adj = 0.44172

44.2% of the variation in pie sales is explained by the variation in
price and advertising, taking into account the sample size and
number of independent variables.
ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333

Coefficients Standard Error t Stat P-value Lower 95% Upper 95%


Intercept 306.52619 114.25389 2.68285 0.01993 57.58835 555.46404
Price -24.97509 10.83213 -2.30565 0.03979 -48.57626 -1.37392
Advertising 74.13096 25.96732 2.85478 0.01449 17.55303 130.70888

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-20


Adjusted r2 in Minitab

The regression equation is
Sales = 307 - 25.0 Price + 74.1 Advertising

Predictor     Coef      SE Coef    T        P
Constant      306.50    114.30      2.68    0.020
Price         -24.98     10.83     -2.31    0.040
Advertising    74.13     25.97      2.85    0.014

S = 47.4634   R-Sq = 52.1%   R-Sq(adj) = 44.2%

Analysis of Variance

Source           DF    SS       MS       F       P
Regression        2    29460    14730    6.54    0.012
Residual Error   12    27033     2253
Total            14    56493

r²adj = 0.44172

44.2% of the variation in pie sales is explained by the variation in
price and advertising, taking into account the sample size and
number of independent variables.

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-21


Is the Model Significant?
n F Test for Overall Significance of the Model
n Shows if there is a linear relationship between all
of the X variables considered together and Y
n Use F-test statistic
n Hypotheses:
H0: β1 = β2 = … = βk = 0 (no linear relationship)
H1: at least one βi ≠ 0 (at least one independent
variable affects Y)

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-22


F Test for Overall Significance
n Test statistic:

FSTAT = MSR / MSE = (SSR / k) / (SSE / (n - k - 1))

where FSTAT has numerator d.f. = k and denominator d.f. = (n - k - 1)

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-23
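A small sketch (assuming scipy is available; not textbook code) that applies this F statistic to the pie-sales ANOVA values and recovers the p-value reported as Significance F:

# Overall F test for the pie-sales model from the ANOVA sums of squares.
from scipy import stats

SSR, SSE, k, n = 29460.027, 27033.306, 2, 15
MSR = SSR / k
MSE = SSE / (n - k - 1)
F = MSR / MSE
p_value = stats.f.sf(F, k, n - k - 1)   # upper-tail p-value with (k, n-k-1) d.f.
print(round(F, 4), round(p_value, 5))   # approx 6.5386 and 0.01201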


F Test for Overall Significance In
Excel
(continued)
Regression Statistics
Multiple R          0.72213
R Square            0.52148
Adjusted R Square   0.44172
Standard Error      47.46341
Observations        15

FSTAT = MSR / MSE = 14,730.0 / 2,252.8 = 6.5386

With 2 and 12 degrees of freedom; the Significance F column gives
the p-value for the F test.

ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333

Coefficients Standard Error t Stat P-value Lower 95% Upper 95%


Intercept 306.52619 114.25389 2.68285 0.01993 57.58835 555.46404
Price -24.97509 10.83213 -2.30565 0.03979 -48.57626 -1.37392
Advertising 74.13096 25.96732 2.85478 0.01449 17.55303 130.70888

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-24


F Test for Overall Significance In
Minitab

The regression equation is
Sales = 307 - 25.0 Price + 74.1 Advertising

Predictor     Coef      SE Coef    T        P
Constant      306.50    114.30      2.68    0.020
Price         -24.98     10.83     -2.31    0.040
Advertising    74.13     25.97      2.85    0.014

S = 47.4634   R-Sq = 52.1%   R-Sq(adj) = 44.2%

Analysis of Variance

Source           DF    SS       MS       F       P
Regression        2    29460    14730    6.54    0.012
Residual Error   12    27033     2253
Total            14    56493

FSTAT = MSR / MSE = 14,730.0 / 2,252.8 = 6.5386

With 2 and 12 degrees of freedom; the P column in the Analysis of
Variance table gives the p-value for the F test.

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-25


F Test for Overall Significance
(continued)

H0: β1 = β2 = 0
H1: β1 and β2 not both zero
α = .05, df1 = 2, df2 = 12

Test statistic: FSTAT = MSR / MSE = 6.5386

Critical value: F0.05 = 3.885

Decision: since the FSTAT test statistic falls in the rejection
region (FSTAT = 6.5386 > F0.05 = 3.885, p-value < .05), reject H0.

Conclusion: there is evidence that at least one independent
variable affects Y.
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-26
Residuals in Multiple Regression
Two variable model: Ŷ = b0 + b1X1 + b2X2

Residual for sample observation i: ei = Yi - Ŷi
(the vertical distance from the observed Yi to the fitted surface
at x1i, x2i)

The best-fit equation is found by minimizing the sum of
squared errors, Σe².
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-27
Multiple Regression Assumptions

Errors (residuals) from the regression model:

ei = Yi - Ŷi

Assumptions:
n The errors are normally distributed

n Errors have a constant variance

n The model errors are independent

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-28


Residual Plots Used
in Multiple Regression
n These residual plots are used in multiple
regression:

n Residuals vs. Ŷi
n Residuals vs. X1i
n Residuals vs. X2i
n Residuals vs. time (if time series data)

Use the residual plots to check for


violations of regression assumptions

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-29


Are Individual Variables
Significant?

n Use t tests of individual variable slopes


n Shows if there is a linear relationship between
the variable Xj and Y holding constant the effects
of other X variables
n Hypotheses:
n H0: βj = 0 (no linear relationship)
n H1: βj ≠ 0 (linear relationship does exist
between Xj and Y)

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-30


Are Individual Variables
Significant?
(continued)

H0: βj = 0 (no linear relationship)


H1: βj ≠ 0 (linear relationship does exist
between Xj and Y)

Test Statistic:

tSTAT = (bj - 0) / Sbj        (df = n - k - 1)

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-31
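The same t statistics and p-values can be recomputed from the coefficients and standard errors in the regression output; a short sketch assuming scipy (not textbook code):

# t tests for the individual slopes in the pie-sales model.
from scipy import stats

n, k = 15, 2
df = n - k - 1
coefs = {"Price": (-24.97509, 10.83213), "Advertising": (74.13096, 25.96732)}
for name, (b, se) in coefs.items():
    t = b / se
    p = 2 * stats.t.sf(abs(t), df)    # two-tailed p-value
    print(name, round(t, 3), round(p, 4))
# Price: t approx -2.306, p approx 0.0398
# Advertising: t approx 2.855, p approx 0.0145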


Are Individual Variables
Significant? Excel Output
(continued)
Regression Statistics
Multiple R          0.72213
R Square            0.52148
Adjusted R Square   0.44172
Standard Error      47.46341
Observations        15

t Stat for Price is tSTAT = -2.306, with p-value .0398
t Stat for Advertising is tSTAT = 2.855, with p-value .0145

ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333

Coefficients Standard Error t Stat P-value Lower 95% Upper 95%


Intercept 306.52619 114.25389 2.68285 0.01993 57.58835 555.46404
Price -24.97509 10.83213 -2.30565 0.03979 -48.57626 -1.37392
Advertising 74.13096 25.96732 2.85478 0.01449 17.55303 130.70888

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-32


Are Individual Variables
Significant? Minitab Output

The regression equation is
Sales = 307 - 25.0 Price + 74.1 Advertising

Predictor     Coef      SE Coef    T        P
Constant      306.50    114.30      2.68    0.020
Price         -24.98     10.83     -2.31    0.040
Advertising    74.13     25.97      2.85    0.014

S = 47.4634   R-Sq = 52.1%   R-Sq(adj) = 44.2%

Analysis of Variance

Source           DF    SS       MS       F       P
Regression        2    29460    14730    6.54    0.012
Residual Error   12    27033     2253
Total            14    56493

t Stat for Price is tSTAT = -2.306, with p-value .0398
t Stat for Advertising is tSTAT = 2.855, with p-value .0145

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-33


Inferences about the Slope:
t Test Example
From the Excel and Minitab output:
For Price, tSTAT = -2.306, with p-value .0398
For Advertising, tSTAT = 2.855, with p-value .0145

H0: βj = 0
H1: βj ≠ 0

d.f. = 15 - 2 - 1 = 12
α = .05
tα/2 = 2.1788 (two-tail critical values are ±2.1788)

The test statistic for each variable falls in the rejection region
(p-values < .05).

Decision: reject H0 for each variable.

Conclusion: there is evidence that both Price and Advertising affect
pie sales at α = .05.
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-34
Confidence Interval Estimate
for the Slope
Confidence interval for the population slope βj:

bj ± tα/2 Sbj        where t has (n - k - 1) d.f.

             Coefficients   Standard Error
Intercept    306.52619      114.25389
Price        -24.97509       10.83213
Advertising   74.13096       25.96732

Here, t has (15 - 2 - 1) = 12 d.f.

Example: form a 95% confidence interval for the effect of changes in
price (X1) on pie sales:

-24.975 ± (2.1788)(10.832)

So the interval is (-48.576, -1.374).
(This interval does not contain zero, so price has a significant effect on sales.)
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-35
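A sketch of the same interval computed from the critical t value (assumes scipy; not textbook code):

# 95% confidence interval for the Price slope.
from scipy import stats

b, se, df = -24.97509, 10.83213, 12
t_crit = stats.t.ppf(0.975, df)            # approx 2.1788
lower, upper = b - t_crit * se, b + t_crit * se
print(round(lower, 3), round(upper, 3))    # approx (-48.576, -1.374)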
Confidence Interval Estimate
for the Slope
(continued)
Confidence interval for the population slope βj

Coefficients Standard Error … Lower 95% Upper 95%


Intercept 306.52619 114.25389 … 57.58835 555.46404
Price -24.97509 10.83213 … -48.57626 -1.37392
Advertising 74.13096 25.96732 … 17.55303 130.70888

Example: Excel output also reports these interval endpoints:


Weekly sales are estimated to be reduced by between 1.37 and
48.58 pies for each $1 increase in the selling price, holding the
effect of advertising constant.

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-36


Testing Portions of the
Multiple Regression Model
n Contribution of a Single Independent Variable Xj

SSR(Xj | all variables except Xj)


= SSR (all variables) – SSR(all variables except Xj)

n Measures the contribution of Xj in explaining the


total variation in Y (SST)

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-37


Testing Portions of the
Multiple Regression Model
(continued)
Contribution of a Single Independent Variable Xj,
assuming all other variables are already included
(consider here a 2-variable model):

SSR(X1 | X2) = SSR(all variables) - SSR(X2)

SSR(all variables) comes from the ANOVA section of the regression
for Ŷ = b0 + b1X1 + b2X2; SSR(X2) comes from the ANOVA section of
the regression for Ŷ = b0 + b2X2.

Measures the contribution of X1 in explaining SST.


Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-38
The Partial F-Test Statistic

n Consider the hypothesis test:


H0: variable Xj does not significantly improve the model after all
other variables are included
H1: variable Xj significantly improves the model after all other
variables are included

n Test using the F-test statistic (with 1 and n - k - 1 d.f.):

FSTAT = SSR(Xj | all variables except Xj) / MSE(all variables)
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-39
Testing Portions of Model:
Example

Example: Frozen dessert pies

Test at the α = .05 level


to determine whether
the price variable
significantly improves
the model given that
advertising is included

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-40


Testing Portions of Model:
Example
(continued)

H0: X1 (price) does not improve the model


with X2 (advertising) included
H1: X1 does improve model

α = .05, df = 1 and 12
F0.05 = 4.75
(For X1 and X2)
ANOVA         df    SS             MS
Regression     2    29460.02687    14730.01343
Residual      12    27033.30647     2252.775539
Total         14    56493.33333

(For X2 only)
ANOVA         df    SS
Regression     1    17484.22249
Residual      13    39009.11085
Total         14    56493.33333

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-41


Testing Portions of Model:
Example
(continued)
(For X1 and X2)
ANOVA         df    SS             MS
Regression     2    29460.02687    14730.01343
Residual      12    27033.30647     2252.775539
Total         14    56493.33333

(For X2 only)
ANOVA         df    SS
Regression     1    17484.22249
Residual      13    39009.11085
Total         14    56493.33333

FSTAT = SSR(X1 | X2) / MSE(all) = (29,460.03 - 17,484.22) / 2,252.78 = 5.316

Conclusion: since FSTAT = 5.316 > F0.05 = 4.75, reject H0;
adding X1 does improve the model.

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-42
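The partial F statistic and its p-value can be recomputed from the two ANOVA tables above; a sketch assuming scipy (not textbook code):

# Partial F test for adding Price (X1) given Advertising (X2).
from scipy import stats

SSR_full, SSR_x2_only = 29460.02687, 17484.22249
MSE_full = 2252.775539
df_error = 12

F_partial = (SSR_full - SSR_x2_only) / MSE_full
p_value = stats.f.sf(F_partial, 1, df_error)
print(round(F_partial, 3), round(p_value, 4))  # approx 5.316 and 0.0398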


Relationship Between Test
Statistics

n The partial F test statistic developed in this section and


the t test statistic are both used to determine the
contribution of an independent variable to a multiple
regression model.
n The hypothesis tests associated with these two
statistics always result in the same decision (that is, the
p-values are identical).

t²a = F1,a

where a = degrees of freedom

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-43


Coefficient of Partial Determination
for k variable model
r²Yj.(all variables except j)
  = SSR(Xj | all variables except j)
    / [SST - SSR(all variables) + SSR(Xj | all variables except j)]

n Measures the proportion of variation in the dependent variable
that is explained by Xj while controlling for (holding constant)
the other independent variables

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-44


Coefficient of Partial
Determination in Excel
n Coefficients of Partial Determination can be
found using Excel:

n PHStat | regression | multiple regression …


n Check the “coefficient of partial determination” box

Regression Analysis
Coefficients of Partial Determination

Intermediate Calculations
SSR(X1,X2) 29460.02687
SST 56493.33333
SSR(X2) 17484.22249 SSR(X1 | X2) 11975.80438
SSR(X1) 11100.43803 SSR(X2 | X1) 18359.58884

Coefficients
r2 Y1.2 0.307000188
r2 Y2.1 0.404459524
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-45
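The two coefficients of partial determination can be checked directly from the sums of squares in the PHStat output above; a short arithmetic sketch (not textbook code):

# Coefficients of partial determination for the pie-sales model.
SST = 56493.33333
SSR_all = 29460.02687
SSR_x1_given_x2 = SSR_all - 17484.22249    # SSR(X1 | X2) = 11975.80438
SSR_x2_given_x1 = SSR_all - 11100.43803    # SSR(X2 | X1) = 18359.58884

r2_y1_2 = SSR_x1_given_x2 / (SST - SSR_all + SSR_x1_given_x2)
r2_y2_1 = SSR_x2_given_x1 / (SST - SSR_all + SSR_x2_given_x1)
print(round(r2_y1_2, 4), round(r2_y2_1, 4))   # approx 0.3070 and 0.4045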
Using Dummy Variables

n A dummy variable is a categorical independent


(qualitative) variable with two levels:
n yes or no, on or off, male or female
n coded as 0 or 1
n Assumes the slopes associated with numerical
independent variables do not change with the
value for the categorical variable
n If more than two levels, the number of dummy
variables needed is (number of levels - 1)

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-46


Qualitative Independent
Variables

n Many variables of interest in business,


economics, and social and biological sciences
are not quantitative but are qualitative.
n Examples of qualitative variables are gender (male, female),
purchase status (purchase, no purchase), and type of firm.
n Qualitative variables can also be used in
multiple regression.
Qualitative Independent Variables

n An economist wished to relate the speed with which a particular
insurance innovation is adopted (y) to the size of the insurance
firm (x1) and the type of firm. The dependent variable is measured
by the number of months elapsed between the time the first firm
adopted the innovation and the time the given firm adopted the
innovation. The first independent variable, size of the firm, is
quantitative and is measured by the amount of the firm's total
assets. The second independent variable, type of firm, is
qualitative and is composed of two classes: stock companies and
mutual companies.
Indicator variables (Dummy
Variables)

n Indicator, or dummy variables are used to


determine the relationship between qualitative
independent variables and a dependent
variable.
n Indicator variables take on the values 0 and 1.
n For the insurance innovation example, where the qualitative
variable has two classes, we might define the indicator variable
x2 as follows:

x2 = 1 if stock company, 0 otherwise
Indicator variables

n A qualitative variable with c classes will be


represented by c-1 indicator variables.
n A regression function with an indicator variable
with two levels (c = 2) will yield two estimated
lines.
Interpretation of Regression
Coefficients

n In our insurance innovation example, the regression model is:

y = β0 + β1x1 + β2x2 + ε

where:
x1 = size of firm
x2 = 1 if stock company, 0 otherwise
Dummy-Variable Example
(with 2 Levels)

Ŷ = b0 + b1X1 + b2 X2
Let:
Y = pie sales
X1 = price
X2 = holiday (X2 = 1 if a holiday occurred during the week)
(X2 = 0 if there was no holiday that week)

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-52


Dummy-Variable Example
(with 2 Levels)
(continued)

Ŷ = b0 + b1X1 + b2 (1) = (b0 + b2 ) + b1X1 Holiday

Ŷ = b0 + b1X1 + b2 (0) = b0 + b1X1 No Holiday

Different intercepts, same slope.

If H0: β2 = 0 is rejected, then “Holiday” has a significant effect
on pie sales.

(In a plot of Y (sales) versus X1 (price), the two fitted lines are
parallel: the Holiday line has intercept b0 + b2 and the No Holiday
line has intercept b0.)

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-53
Interpreting the Dummy Variable
Coefficient (with 2 Levels)
Example: Sales = 300 - 30(Price) + 15(Holiday)

Sales: number of pies sold per week
Price: pie price in $
Holiday: 1 if a holiday occurred during the week, 0 if not

b2 = 15: on average, sales were 15 pies greater in


weeks with a holiday than in weeks without a
holiday, given the same price

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-54


Dummy-Variable Models
(more than 2 Levels)
n The number of dummy variables is one less
than the number of levels
n Example:
Y = house price ; X1 = square feet

n If style of the house is also thought to matter:


Style = ranch, split level, colonial

Three levels, so two dummy


variables are needed

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-55


Dummy-Variable Models
(more than 2 Levels)
(continued)
n Example: Let “colonial” be the default category, and let
X2 and X3 be used for the other two categories:

Y = house price
X1 = square feet
X2 = 1 if ranch, 0 otherwise
X3 = 1 if split level, 0 otherwise

The multiple regression equation is:

Ŷ = b0 + b1X1 + b2 X2 + b3 X3
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-56
Interpreting the Dummy Variable
Coefficients (with 3 Levels)
Consider the regression equation:
Ŷ = 20.43 + 0.045X1 + 23.53X2 + 18.84X3

For a colonial: X2 = X3 = 0
Ŷ = 20.43 + 0.045X1

For a ranch: X2 = 1; X3 = 0
Ŷ = 20.43 + 0.045X1 + 23.53

For a split level: X2 = 0; X3 = 1
Ŷ = 20.43 + 0.045X1 + 18.84

With the same square feet, a ranch will have an estimated average
price of 23.53 thousand dollars more than a colonial, and a split
level will have an estimated average price of 18.84 thousand
dollars more than a colonial.
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-57
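A small illustrative sketch of this encoding; the 2,000-square-foot figure and the helper function are hypothetical, not from the textbook:

# Encode the 3-level style variable with 2 dummies (colonial is the
# baseline) and evaluate the fitted equation above.
def predict_price(square_feet, style):
    x2 = 1 if style == "ranch" else 0         # dummy for ranch
    x3 = 1 if style == "split level" else 0   # dummy for split level
    return 20.43 + 0.045 * square_feet + 23.53 * x2 + 18.84 * x3

for style in ("colonial", "ranch", "split level"):
    print(style, round(predict_price(2000, style), 2))
# colonial 110.43, ranch 133.96, split level 129.27 (thousand dollars)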
Interaction Between
Independent Variables
n Hypothesizes interaction between pairs of X
variables
n Response to one X variable may vary at different
levels of another X variable

n Contains two-way cross product terms

Ŷ = b0 + b1X1 + b2X2 + b3X3
  = b0 + b1X1 + b2X2 + b3(X1X2)
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-58
Effect of Interaction

n Given: Y = β0 + β1X1 + β2 X2 + β3 X1X2 + ε

n Without interaction term, effect of X1 on Y is


measured by β1
n With interaction term, effect of X1 on Y is
measured by β1 + β3 X2
n Effect changes as X2 changes

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-59


Interaction Example
Suppose X2 is a dummy variable and the estimated
regression equation is Ŷ = 1 + 2X1 + 3X2 + 4X1X2

X2 = 1:  Ŷ = 1 + 2X1 + 3(1) + 4X1(1) = 4 + 6X1

X2 = 0:  Ŷ = 1 + 2X1 + 3(0) + 4X1(0) = 1 + 2X1

Slopes are different if the effect of X1 on Y depends on X2 value
Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-60
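A tiny sketch (not textbook code) that evaluates this estimated equation and confirms the two intercepts and slopes:

# Evaluate the interaction equation above at X2 = 0 and X2 = 1.
def y_hat(x1, x2):
    return 1 + 2 * x1 + 3 * x2 + 4 * x1 * x2   # includes the X1*X2 cross product

for x2 in (0, 1):
    slope = y_hat(1.0, x2) - y_hat(0.0, x2)     # change in Y per unit X1
    print("X2 =", x2, "intercept =", y_hat(0.0, x2), "slope =", slope)
# X2 = 0: intercept 1, slope 2;  X2 = 1: intercept 4, slope 6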
Significance of Interaction Term

n Can perform a partial F test for the contribution


of a variable to see if the addition of an
interaction term improves the model

n Multiple interaction terms can be included


n Use a partial F test for the simultaneous contribution
of multiple variables to the model

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-61


Simultaneous Contribution of
Independent Variables
n Use partial F test for the simultaneous
contribution of multiple variables to the model
n Let m variables be an additional set of variables
added simultaneously
n To test the hypothesis that the set of m variables
improves the model:

FSTAT = {[SSR(all) - SSR(all except the new set of m variables)] / m}
        / MSE(all)

(where FSTAT has m and n - k - 1 d.f.)


Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-62
Logistic Regression

n Used when the dependent variable Y is binary


(i.e., Y takes on only two values)
n Examples
n Customer prefers Brand A or Brand B
n Employee chooses to work full-time or part-time
n Loan is delinquent or is not delinquent
n Person voted in last election or did not
n Logistic regression allows you to predict the
probability of a particular categorical response

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-63


Logistic Regression
(continued)

n Logistic regression is based on the odds ratio,


which represents the probability of a success
compared with the probability of failure

Odds ratio = probability of success / (1 - probability of success)

n The logistic regression model is based on the


natural log of this odds ratio

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-64


Logistic Regression
(continued)

Logistic Regression Model:

ln(odds ratio) = β0 + β1X1i + β2X2i + … + βkXki + εi

where k = number of independent variables in the model
      εi = random error in observation i

Logistic Regression Equation:

ln(estimated odds ratio) = b0 + b1X1i + b2X2i + … + bkXki

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-65


Estimated Odds Ratio and
Probability of Success

n Once you have the logistic regression equation, compute the
estimated odds ratio:

Estimated odds ratio = e^(ln(estimated odds ratio))

n The estimated probability of success is

Estimated probability of success
  = estimated odds ratio / (1 + estimated odds ratio)

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-66
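A short illustrative sketch of these two conversions; the fitted coefficients and predictor value are hypothetical, not from the textbook:

# Convert a fitted logistic equation into an estimated odds ratio and
# an estimated probability of success for one observation.
import math

b0, b1 = -2.0, 0.5          # hypothetical fitted coefficients
x1 = 3.0                    # hypothetical value of the single predictor

log_odds = b0 + b1 * x1                  # ln(estimated odds ratio)
odds = math.exp(log_odds)                # estimated odds ratio
p_success = odds / (1 + odds)            # estimated probability of success
print(round(odds, 3), round(p_success, 3))   # approx 0.607 and 0.378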


Chapter Summary
n Developed the multiple regression model
n Tested the significance of the multiple regression model
n Discussed adjusted r2
n Discussed using residual plots to check model
assumptions
n Tested individual regression coefficients
n Tested portions of the regression model
n Used dummy variables
n Evaluated interaction effects
n Discussed logistic regression

Basic Business Statistics, 11e © 2009 Prentice-Hall, Inc.. Chap 14-67
