
Chapter 14

Introduction to Multiple Regression

Copyright © 2017 Pearson Education, Ltd. Chapter 14 - 1

Objectives
In this chapter, you learn:
◼ How to develop a multiple regression model.
◼ How to interpret the regression coefficients.
◼ How to determine which independent variables to
include in the regression model.
◼ How to determine which independent variables are most
important in predicting a dependent variable.
◼ How to use categorical independent variables in a
regression model.
◼ How to use logistic regression to predict a categorical
dependent variable.


The Multiple Regression Model With k
Independent Variables

Idea: Examine the linear relationship between
1 dependent (Y) & 2 or more independent variables (Xi).

Multiple Regression Model with k Independent Variables:

Yi = β0 + β1X1i + β2X2i + … + βkXki + εi

Where:
β0 = Y intercept.
β1 = slope of Y with variable X1, holding X2, X3, …, Xk constant.
β2 = slope of Y with variable X2, holding X1, X3, …, Xk constant.
β3 = slope of Y with variable X3, holding X1, X2, …, Xk constant.
⋮
βk = slope of Y with variable Xk, holding X1, X2, …, Xk-1 constant.
εi = random error in Y for observation i.

Multiple Regression Model With
2 Independent Variables

Yi = β0 + β1X1i + β2X2i + εi
Where:
β0 = Y intercept.
β1 = slope of Y with variable X1, holding X2 constant.
β2 = slope of Y with variable X2, holding X1 constant.
εi = random error in Y for observation i.


Multiple Regression Equation


The coefficients of the multiple regression model are
estimated using sample data.

Multiple regression equation with k independent variables:


Ŷi = b0 + b1X1i + b2X2i + … + bkXki

where Ŷi is the estimated (or predicted) value of Y, b0 is the
estimated intercept, and b1, …, bk are the estimated slope coefficients.

In this chapter we will use Excel to obtain the regression
slope coefficients and other regression summary measures.


Multiple Regression Equation With
Two Independent Variables

Two-variable model:

Ŷ = b0 + b1X1 + b2X2

[Figure: the fitted equation defines a plane over the (X1, X2) axes, with Y on the vertical axis.]
Example:
2 Independent Variables
◼ A distributor of frozen dessert pies wants to
evaluate factors thought to influence demand.

◼ Dependent variable: Pie sales (units per week)


◼ Independent variables: Price (in $)
Advertising ($100’s)

◼ Data are collected for 15 weeks.


Example: Pie Sales


Multiple regression equation:
Sales = b0 + b1(Price) + b2(Advertising)

Week   Pie Sales   Price ($)   Advertising ($100s)
  1       350        5.50            3.3
  2       460        7.50            3.3
  3       350        8.00            3.0
  4       430        8.00            4.5
  5       350        6.80            3.0
  6       380        7.50            4.0
  7       430        4.50            3.0
  8       470        6.40            3.7
  9       450        7.00            3.5
 10       490        5.00            4.0
 11       340        7.20            3.5
 12       300        7.90            3.2
 13       440        5.90            4.0
 14       450        5.00            3.5
 15       300        7.00            2.7


Excel Multiple Regression Output


Regression Statistics
Multiple R           0.72213
R Square             0.52148
Adjusted R Square    0.44172
Standard Error       47.46341
Observations         15

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)

ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333

Coefficients Standard Error t Stat P-value Lower 95% Upper 95%


Intercept 306.52619 114.25389 2.68285 0.01993 57.58835 555.46404
Price -24.97509 10.83213 -2.30565 0.03979 -48.57626 -1.37392
Advertising 74.13096 25.96732 2.85478 0.01449 17.55303 130.70888
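The Excel coefficients above can be reproduced outside Excel. The following is a minimal NumPy sketch (not part of the original slides) that fits the same least-squares model to the pie sales data:

```python
import numpy as np

# Pie sales data from the slides: 15 weeks of (Sales, Price, Advertising in $100s).
sales = np.array([350, 460, 350, 430, 350, 380, 430, 470, 450, 490,
                  340, 300, 440, 450, 300], dtype=float)
price = np.array([5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40, 7.00, 5.00,
                  7.20, 7.90, 5.90, 5.00, 7.00])
adv = np.array([3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7, 3.5, 4.0,
                3.5, 3.2, 4.0, 3.5, 2.7])

# Design matrix: a column of ones for the intercept, then the two predictors.
X = np.column_stack([np.ones(len(sales)), price, adv])

# Ordinary least squares: b minimizes ||X b - sales||².
b, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(b)  # ≈ [306.526, -24.975, 74.131]
```

The three estimates match the Excel Intercept, Price, and Advertising coefficients.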

The Multiple Regression Equation

Sales = 306.526 – 24.975(Price) + 74.131(Advertising)


Where:
Sales is in number of pies per week.
Price is in $.
Advertising is in $100’s.

b1 = -24.975: Sales will decrease, on average, by 24.975 pies
per week for each $1 increase in selling price, net of the
effects of changes due to advertising.

b2 = 74.131: Sales will increase, on average, by 74.131 pies
per week for each $100 increase in advertising, net of the
effects of changes due to price.


Using The Regression Equation
to Make Predictions
Predict sales for a week in which the selling price is
$5.50 and advertising is $350:

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)


= 306.526 - 24.975 (5.50) + 74.131 (3.5)
= 428.6216

Note that Advertising is in $100s, so $350 means that X2 = 3.5.

Predicted sales is 428.6216 pies.
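The prediction above can be checked with a few lines of Python (an illustrative sketch using the rounded slide coefficients, so the last decimal differs slightly from the slide's 428.6216):

```python
# Predict weekly pie sales at Price = $5.50 and Advertising = $350 (X2 = 3.5).
b0, b1, b2 = 306.526, -24.975, 74.131
sales_hat = b0 + b1 * 5.50 + b2 * 3.5
print(round(sales_hat, 3))  # 428.622
```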

The Coefficient of Multiple
Determination, r²

◼ Reports the proportion of total variation in Y
explained by all X variables taken together.

r² = SSR / SST = (regression sum of squares) / (total sum of squares)

Multiple Coefficient of Determination
In Excel
Regression Statistics
Multiple R           0.72213
R Square             0.52148
Adjusted R Square    0.44172
Standard Error       47.46341
Observations         15

r² = SSR / SST = 29,460.027 / 56,493.333 = 0.52148

52.1% of the variation in pie sales
is explained by the variation in
price and advertising.
ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333

Coefficients Standard Error t Stat P-value Lower 95% Upper 95%


Intercept 306.52619 114.25389 2.68285 0.01993 57.58835 555.46404
Price -24.97509 10.83213 -2.30565 0.03979 -48.57626 -1.37392
Advertising 74.13096 25.96732 2.85478 0.01449 17.55303 130.70888
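The ratio can be verified directly from the ANOVA sums of squares (an illustrative sketch, not part of the original slides):

```python
# r² as the ratio of regression to total sum of squares, from the ANOVA table.
ssr, sst = 29460.027, 56493.333
r2 = ssr / sst
print(f"{r2:.5f}")  # 0.52148
```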


Adjusted r2
◼ r² never decreases when a new X variable is
added to the model.
◼ This can be a disadvantage when comparing models.
◼ What is the net effect of adding a new variable?
◼ We lose a degree of freedom when a new X
variable is added.
◼ Did the new X variable add enough explanatory
power to offset the loss of one degree of freedom?


Adjusted r2 (continued)

◼ Shows the proportion of variation in Y explained
by all X variables adjusted for the number of X
variables used:

r²adj = 1 − (1 − r²) [(n − 1) / (n − k − 1)]
(where n = sample size, k = number of independent variables.)

◼ Penalizes excessive use of unimportant independent variables.
◼ Smaller than r².
◼ Useful in comparing among models.
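A quick check of the formula against the pie sales output (an illustrative sketch; sums of squares from the ANOVA table, with n = 15 observations and k = 2 predictors):

```python
# Adjusted r² for the pie sales model.
ssr, sst, n, k = 29460.027, 56493.333, 15, 2
r2 = ssr / sst
r2_adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(f"{r2_adj:.5f}")  # 0.44172
```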
Adjusted r2 in Excel

Regression Statistics
Multiple R           0.72213
R Square             0.52148
Adjusted R Square    0.44172
Standard Error       47.46341
Observations         15

r²adj = 0.44172

44.2% of the variation in pie sales is explained by the
variation in price and advertising, taking into account
the sample size and number of independent variables.

ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333

Coefficients Standard Error t Stat P-value Lower 95% Upper 95%


Intercept 306.52619 114.25389 2.68285 0.01993 57.58835 555.46404
Price -24.97509 10.83213 -2.30565 0.03979 -48.57626 -1.37392
Advertising 74.13096 25.96732 2.85478 0.01449 17.55303 130.70888


Is the Overall Model Significant?


◼ F Test for Overall Significance of the Model.
◼ Shows if there is a linear relationship between all
of the X variables considered together and Y.
◼ Use F-test statistic.
◼ Hypotheses:
H0: β1 = β2 = … = βk = 0 (no linear relationship)
H1: at least one βi ≠ 0 (at least one independent
variable affects Y)


F-Test for Overall Significance

◼ Test statistic:

FSTAT = MSR / MSE = (SSR / k) / (SSE / (n − k − 1))

where FSTAT has numerator d.f. = k and
denominator d.f. = (n − k − 1).

F-Test for Overall Significance In Excel (continued)

Regression Statistics
Multiple R           0.72213
R Square             0.52148
Adjusted R Square    0.44172
Standard Error       47.46341
Observations         15

FSTAT = MSR / MSE = 14,730.013 / 2,252.776 = 6.5386

with 2 and 12 degrees of freedom; the p-value for the
F test is Significance F = 0.01201.

ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333

Coefficients Standard Error t Stat P-value Lower 95% Upper 95%


Intercept 306.52619 114.25389 2.68285 0.01993 57.58835 555.46404
Price -24.97509 10.83213 -2.30565 0.03979 -48.57626 -1.37392
Advertising 74.13096 25.96732 2.85478 0.01449 17.55303 130.70888


F Test for Overall Significance (continued)

H0: β1 = β2 = 0
H1: β1 and β2 not both zero
α = .05, df1 = 2, df2 = 12
Critical value: F0.05 = 3.885

Test statistic: FSTAT = MSR / MSE = 6.5386

Decision: Since the FSTAT test statistic is in the rejection region
(FSTAT = 6.5386 > F0.05 = 3.885, p-value < .05), reject H0.

Conclusion: There is evidence that at least one
independent variable affects Y.
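The F statistic can be recomputed from the ANOVA sums of squares (an illustrative sketch, not part of the original slides):

```python
# Overall F test for the pie sales model: MSR / MSE with (k, n-k-1) = (2, 12) d.f.
ssr, sse, k, n = 29460.027, 27033.306, 2, 15
msr = ssr / k            # mean square regression
mse = sse / (n - k - 1)  # mean square error
f_stat = msr / mse
print(f"{f_stat:.4f}")  # 6.5386
```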

Multiple Regression Assumptions


Errors (residuals) from the regression model:
ei = (Yi − Ŷi)

Assumptions:
◼ The errors are normally distributed.

◼ Errors have a constant variance.

◼ The model errors are independent.

Residual Plots Used
in Multiple Regression
◼ These residual plots are used in multiple
regression:
◼ Residuals vs. Ŷi.
◼ Residuals vs. X1i.
◼ Residuals vs. X2i.
◼ Residuals vs. time (if time series data).

Use the residual plots to check for


violations of regression assumptions.


Residual Plots For The Pie Sales Model


[Figures: residual scatter plots versus predicted pie sales,
versus price, and versus advertising.]

All these plots show little or no pattern, so we can
conclude the multiple regression model is appropriate
for predicting pie sales.


Are Individual Variables Significant?


◼ Use t-tests of individual variable slopes.
◼ Shows if there is a linear relationship between
the variable Xj and Y holding constant the effects
of other X variables.

◼ Hypotheses:
◼ H0: βj = 0 (no linear relationship between Xj and Y.)
◼ H1: βj ≠ 0 (linear relationship does exist between Xj and Y.)

Are Individual Variables Significant? (continued)

H0: βj = 0 (no linear relationship between Xj and Y.)


H1: βj ≠ 0 (linear relationship does exist between Xj and Y.)

Test Statistic:

tSTAT = (bj − 0) / Sbj     (df = n − k − 1)


Are Individual Variables
Significant? Excel Output (continued)

Regression Statistics
Multiple R           0.72213
R Square             0.52148
Adjusted R Square    0.44172
Standard Error       47.46341
Observations         15

t Stat for Price is tSTAT = -2.306, with p-value .0398.
t Stat for Advertising is tSTAT = 2.855, with p-value .0145.

ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333

Coefficients Standard Error t Stat P-value Lower 95% Upper 95%


Intercept 306.52619 114.25389 2.68285 0.01993 57.58835 555.46404
Price -24.97509 10.83213 -2.30565 0.03979 -48.57626 -1.37392
Advertising 74.13096 25.96732 2.85478 0.01449 17.55303 130.70888
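Each t statistic is just the coefficient divided by its standard error. A quick check (an illustrative sketch using the Excel coefficient-table values):

```python
# t statistics for the individual slopes: tSTAT = bj / S_bj.
b_price, se_price = -24.97509, 10.83213
b_adv, se_adv = 74.13096, 25.96732
t_price = b_price / se_price
t_adv = b_adv / se_adv
print(f"{t_price:.3f} {t_adv:.3f}")  # -2.306 2.855
```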


Inferences about the Slope:
t Test Example

From the Excel output:
H0: βj = 0
H1: βj ≠ 0

For Price, tSTAT = -2.306, with p-value .0398.
For Advertising, tSTAT = 2.855, with p-value .0145.

d.f. = 15 − 2 − 1 = 12, α = .05, tα/2 = 2.1788

Decision: The test statistic for each variable falls in the
rejection region (p-values < .05), so reject H0 for each variable.

Conclusion: There is evidence that both Price and
Advertising affect pie sales at α = .05.
Confidence Interval Estimate
for the Slope
Confidence interval for the population slope βj:

bj ± tα/2 Sbj     where t has (n − k − 1) d.f.

              Coefficients   Standard Error
Intercept       306.52619        114.25389
Price           -24.97509         10.83213
Advertising      74.13096         25.96732

Here, t has (15 − 2 − 1) = 12 d.f.

Example: Form a 95% confidence interval for the effect of changes in


price (X1) on pie sales:
-24.975 ± (2.1788)(10.832)
So the interval is (-48.576, -1.374)
(This interval does not contain zero, so price has a significant effect on sales).
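The interval arithmetic can be verified in a few lines (an illustrative sketch, not part of the original slides):

```python
# 95% CI for the Price slope: b1 ± t(0.025, 12 d.f.) * S_b1.
b1, se_b1, t_crit = -24.97509, 10.83213, 2.1788
lower = b1 - t_crit * se_b1
upper = b1 + t_crit * se_b1
print(f"{lower:.3f} {upper:.3f}")  # -48.576 -1.374
```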


Confidence Interval Estimate
for the Slope (continued)

Confidence interval for the population slope βj .


Coefficients Standard Error … Lower 95% Upper 95%
Intercept 306.52619 114.25389 … 57.58835 555.46404
Price -24.97509 10.83213 … -48.57626 -1.37392
Advertising 74.13096 25.96732 … 17.55303 130.70888

Example: Excel output also reports these interval endpoints:

Weekly sales are estimated to be reduced by between 1.37 and
48.58 pies for each increase of $1 in the selling price, holding the
effect of advertising constant.


Testing Portions of the
Multiple Regression Model

◼ Contribution of a Single Independent Variable Xj:

SSR(Xj | all variables except Xj)
= SSR(all variables) − SSR(all variables except Xj)

◼ Measures the contribution of Xj in explaining the total
variation in Y (SST).

Testing Portions of the
Multiple Regression Model (continued)

Contribution of a single independent variable Xj,
assuming all other variables are already included
(consider here a 2-variable model):

SSR(X1 | X2) = SSR(all variables) − SSR(X2)

SSR(all variables) comes from the ANOVA section of the regression for
Ŷ = b0 + b1X1 + b2X2, and SSR(X2) from the ANOVA section of the
regression for Ŷ = b0 + b2X2.

This measures the contribution of X1 in explaining SST.

The Partial F-Test Statistic

◼ Consider the hypothesis test:
H0: variable Xj does not significantly improve the model after all
other variables are included.
H1: variable Xj significantly improves the model after all other
variables are included.

◼ Test using the F-test statistic (with 1 and n − k − 1 d.f.):

FSTAT = SSR(Xj | all variables except j) / MSE


Testing Portions of Model

Example: Frozen dessert pies

Test at the α = .05 level to determine whether the price
variable significantly improves the model given that
advertising is included.

Testing Portions of Model (continued)

Example:
H0: X1 (price) does not improve the model
with X2 (advertising) included.
H1: X1 (price) does improve the model.

α = .05, df = 1 and 12
F0.05 = 4.75

(For X1 and X2)
ANOVA          df            SS              MS
Regression      2     29460.02687     14730.01343
Residual       12     27033.30647      2252.77554
Total          14     56493.33333

(For X2 only)
ANOVA          df            SS
Regression      1     17484.22249
Residual       13     39009.11085
Total          14     56493.33333


Testing Portions of Model (continued)

Example:

(For X1 and X2)
ANOVA          df            SS              MS
Regression      2     29460.02687     14730.01343
Residual       12     27033.30647      2252.77554
Total          14     56493.33333

(For X2 only)
ANOVA          df            SS
Regression      1     17484.22249
Residual       13     39009.11085
Total          14     56493.33333

FSTAT = SSR(X1 | X2) / MSE(all) = (29,460.03 − 17,484.22) / 2,252.78 = 5.316

Conclusion: Since FSTAT = 5.316 > F0.05 = 4.75, reject H0:
adding X1 does improve the model.


Testing Portions of Model (continued)

Example:
H0: X2 (advertising) does not improve the
model with X1 (price) included.
H1: X2 does improve the model.

α = .05, df = 1 and 12
F0.05 = 4.75

(For X1 and X2)
ANOVA          df            SS              MS
Regression      2     29460.02687     14730.01343
Residual       12     27033.30647      2252.77554
Total          14     56493.33333

(For X1 only)
ANOVA          df            SS
Regression      1     11100.43803
Residual       13     45392.89530
Total          14     56493.33333

Testing Portions of Model (continued)

Example:

(For X1 and X2)
ANOVA          df            SS              MS
Regression      2     29460.02687     14730.01343
Residual       12     27033.30647      2252.77554
Total          14     56493.33333

(For X1 only)
ANOVA          df            SS
Regression      1     11100.43803
Residual       13     45392.89530
Total          14     56493.33333

FSTAT = SSR(X2 | X1) / MSE(all) = (29,460.03 − 11,100.44) / 2,252.78 = 8.150

Conclusion: Since FSTAT = 8.150 > F0.05 = 4.75, reject H0:
adding X2 does improve the model.
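Both partial F statistics follow from the ANOVA sums of squares above; a quick check (an illustrative sketch, not part of the original slides):

```python
# Partial F tests for the pie sales model, using the ANOVA sums of squares.
ssr_all, mse_all = 29460.02687, 2252.775539   # full model (X1 and X2)
ssr_x2 = 17484.22249                          # advertising-only model
ssr_x1 = 11100.43803                          # price-only model

f_price = (ssr_all - ssr_x2) / mse_all  # does price add to advertising?
f_adv = (ssr_all - ssr_x1) / mse_all    # does advertising add to price?
print(f"{f_price:.3f} {f_adv:.3f}")  # 5.316 8.150
```

Both exceed F0.05 = 4.75 with 1 and 12 d.f., so each variable improves the model given the other.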


Coefficient of Partial Determination
for a k-Variable Model

r²Yj.(all variables except j)
= SSR(Xj | all variables except j) /
  [SST − SSR(all variables) + SSR(Xj | all variables except j)]

◼ Measures the proportion of variation in the dependent
variable that is explained by Xj while controlling for
(holding constant) the other independent variables.
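Applying the formula to the pie sales model gives the two partial r² values (an illustrative sketch, not part of the original slides; sums of squares from the earlier ANOVA tables):

```python
# Coefficients of partial determination for the pie sales model.
sst, ssr_all = 56493.33333, 29460.02687
ssr_x1_given_x2 = ssr_all - 17484.22249   # contribution of price given advertising
ssr_x2_given_x1 = ssr_all - 11100.43803   # contribution of advertising given price

r2_y1_2 = ssr_x1_given_x2 / (sst - ssr_all + ssr_x1_given_x2)
r2_y2_1 = ssr_x2_given_x1 / (sst - ssr_all + ssr_x2_given_x1)
print(f"{r2_y1_2:.3f} {r2_y2_1:.3f}")  # 0.307 0.404
```

So price explains about 30.7% of the variation in sales not explained by advertising, and advertising about 40.4% of the variation not explained by price.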


Chapter Summary
In this chapter we discussed:
◼ How to develop a multiple regression model.
◼ How to interpret the regression coefficients.
◼ How to determine which independent variables to
include in the regression model.

Week 13 Tutorial
Selected Practice Questions (see Levine et al., 8th edition):
Page 504: 14.1a-b and 14.3a-c.
Page 507: 14.9a-e.
Page 513: 14.25a-b.
Page 518: 14.31a and 14.32a.


Week 13 Tutorial
Additional Practice Questions (Levine et al., 7th edition):

