Multiple Regression
The Multiple Regression Model
Yi = β0 + β1X1i + β2X2i + ··· + βkXki + εi
Multiple Regression Equation
Ŷi = b0 + b1X1i + b2X2i + ··· + bkXki
In this chapter we will use Excel to obtain the regression
slope coefficients and other regression summary
measures.
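For readers working outside Excel, here is a minimal sketch of obtaining the slope coefficients by least squares in Python; the data values and variable names below are made up purely for illustration, not taken from the example in this chapter.

```python
import numpy as np

# Illustrative (made-up) data: each row is one observation of X1 and X2,
# with the corresponding response in y.
X = np.array([
    [5.5, 3.3],
    [7.5, 3.3],
    [8.0, 3.0],
    [8.0, 4.5],
    [6.8, 3.0],
])
y = np.array([350, 460, 350, 430, 350])

# Add a column of ones so the first coefficient is the intercept b0.
X_design = np.column_stack([np.ones(len(y)), X])

# Ordinary least squares: solves for b = (b0, b1, b2).
b, _, _, _ = np.linalg.lstsq(X_design, y, rcond=None)
print("Estimated coefficients b0, b1, b2:", b)
```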
Multiple Regression Equation
(continued)
Two-variable model:

Ŷ = b0 + b1X1 + b2X2

[Figure: the fitted regression plane in (X1, X2, Y) space]
Example:
2 Independent Variables

Pie sales (Y) are modeled as a function of price (X1) and advertising (X2), using data from 15 weeks.

Regression Statistics
Multiple R          0.72213
R Square            0.52148
Adjusted R Square   0.44172
Standard Error     47.46341
Observations       15

ANOVA           df       SS          MS          F        Significance F
Regression       2    29460.027   14730.013   6.53861        0.01201
Residual        12    27033.306    2252.776
Total           14    56493.333

The estimated regression equation:

Sales = 306.526 − 24.975(Price) + 74.131(Advertising)
The corresponding Minitab Analysis of Variance output:

Source            DF     SS      MS      F      P
Regression         2   29460   14730   6.54   0.012
Residual Error    12   27033    2253
Total             14   56493
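Once estimated, the equation can be used to compute a predicted value. A small Python sketch; the 7.50 price and 3.5 advertising inputs are assumed values chosen for illustration, not figures from the example.

```python
# Coefficients from the estimated regression equation above.
b0, b_price, b_advertising = 306.526, -24.975, 74.131

def predicted_sales(price, advertising):
    """Point prediction from the estimated regression equation."""
    return b0 + b_price * price + b_advertising * advertising

# Example inputs (assumed values, not from the text).
print(predicted_sales(price=7.50, advertising=3.5))
```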
Coefficient of Multiple Determination, r²

r² = SSR / SST = 29460.0 / 56493.3 = .52148

52.1% of the variation in pie sales is explained by the variation in price and advertising. (SSR and SST are the Regression and Total sums of squares in the ANOVA output above.)
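The same ratio can be computed directly from the sums of squares; a short Python sketch (the r_squared helper is illustrative, not part of the Excel output):

```python
import numpy as np

def r_squared(y, y_hat):
    """Coefficient of multiple determination: r^2 = SSR / SST."""
    sst = np.sum((y - y.mean()) ** 2)        # total sum of squares
    ssr = np.sum((y_hat - y.mean()) ** 2)    # regression sum of squares
    return ssr / sst

# Using the sums of squares reported in the ANOVA table above:
print(29460.027 / 56493.333)  # ~0.52148
```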
Adjusted r²

Shows the proportion of variation in Y explained by all X variables, adjusted for the number of X variables used:

r²adj = 1 − [(1 − r²)(n − 1) / (n − k − 1)]

(where n = number of observations, k = number of independent variables)
From the Excel output: r²adj = .44172

44.2% of the variation in pie sales is explained by the variation in price and advertising, taking into account the number of independent variables.
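The adjustment can be reproduced from r², n, and k; a quick Python sketch using the values reported above:

```python
def adjusted_r_squared(r2, n, k):
    """Adjusted r^2 = 1 - (1 - r^2) * (n - 1) / (n - k - 1)."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Values from the pie sales output: r^2 = .52148, n = 15 weeks, k = 2 predictors.
print(adjusted_r_squared(0.52148, n=15, k=2))  # ~0.44172
```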
              Coefficients   Standard Error    t Stat     P-value    Lower 95%    Upper 95%
Intercept       306.52619       114.25389     2.68285     0.01993     57.58835    555.46404
Price           -24.97509        10.83213    -2.30565     0.03979    -48.57626     -1.37392
Advertising      74.13096        25.96732     2.85478     0.01449     17.55303    130.70888
Are Individual Variables Significant? (t Test)

H0: βj = 0 (no linear relationship)
H1: βj ≠ 0 (linear relationship does exist between Xj and Y)
Test statistic:

tSTAT = (bj − 0) / Sbj        (df = n − k − 1)
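A minimal Python sketch of this test statistic and its two-tailed p-value, using the Price coefficient and standard error from the output above (scipy is assumed to be available):

```python
from scipy import stats

b_j = -24.97509       # Price coefficient
s_bj = 10.83213       # its standard error
n, k = 15, 2          # observations, independent variables

t_stat = (b_j - 0) / s_bj
df = n - k - 1
p_value = 2 * stats.t.sf(abs(t_stat), df)   # two-tailed p-value

print(t_stat)   # ~ -2.306
print(p_value)  # ~ 0.0398
```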
t Test
(continued)

Each t Stat in the Excel output is the coefficient divided by its standard error (bj / Sbj):

t Stat for Price: tSTAT = −2.306, with p-value .0398
t Stat for Advertising: tSTAT = 2.855, with p-value .0145

Both p-values are less than .05, so H0: βj = 0 is rejected for both Price and Advertising at the 5% level.
Confidence interval estimate for a slope coefficient:

bj ± tα/2 Sbj        where t has (n − k − 1) d.f.
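A short Python sketch of this interval for the Price slope, reproducing the Lower 95% and Upper 95% values in the output above:

```python
from scipy import stats

b_price, s_b = -24.97509, 10.83213   # coefficient and standard error for Price
df = 15 - 2 - 1                      # n - k - 1

t_crit = stats.t.ppf(0.975, df)      # two-sided 95% critical value
lower = b_price - t_crit * s_b
upper = b_price + t_crit * s_b

print(lower, upper)   # ~ (-48.576, -1.374), matching the Excel output
```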
Dummy-Variable Example (with 2 Levels)

Ŷ = b0 + b1X1 + b2X2

Let:
Y = pie sales
X1 = price
X2 = holiday (X2 = 1 if a holiday occurred during the week;
              X2 = 0 if there was no holiday that week)
Dummy-Variable Example (with 2 Levels)
(continued)

Holiday:      Ŷ = b0 + b1X1 + b2(1) = (b0 + b2) + b1X1
No Holiday:   Ŷ = b0 + b1X1 + b2(0) = b0 + b1X1

The two lines have different intercepts (b0 + b2 vs. b0) but the same slope (b1).

[Figure: Y (sales) vs. X1 (Price), two parallel lines with intercepts b0 + b2 and b0]

If H0: β2 = 0 is rejected, then "Holiday" has a significant effect on pie sales.
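A small Python sketch of how the 0/1 holiday dummy enters the design matrix; the weekly data values here are invented for illustration only.

```python
import numpy as np

# Toy data (assumed): price, holiday indicator, and pie sales for a few weeks.
price   = np.array([5.5, 7.5, 8.0, 6.8, 7.0, 5.0])
holiday = np.array([  1,   0,   0,   1,   0,   1])   # 1 = holiday that week
sales   = np.array([460, 350, 340, 450, 360, 470])

# Design matrix with an intercept column, price, and the dummy.
X = np.column_stack([np.ones(len(sales)), price, holiday])
b0, b1, b2 = np.linalg.lstsq(X, sales, rcond=None)[0]

# Same slope b1 for both groups; intercept shifts by b2 in holiday weeks.
print("No-holiday intercept:", b0)
print("Holiday intercept:   ", b0 + b2)
```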
Dummy-Variable Models (more than 2 Levels)

Example:
Y = house price
X1 = square feet

The style of the house (colonial, ranch, or split level) is also thought to affect price.
Example: Let “colonial” be the default category, and
let X2 and X3 be used for the other two categories:
Y = house price
X1 = square feet
X2 = 1 if ranch, 0 otherwise
X3 = 1 if split level, 0 otherwise
Ŷ = b0 + b1X1 + b2X2 + b3X3
Interpreting the Dummy Variable Coefficients (with 3 Levels)

Ŷ = b0 + b1X1 + b2X2 + b3X3

With square feet held constant, b2 is the estimated difference in average price between a ranch and a colonial, and b3 is the estimated difference between a split level and a colonial.
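A brief Python/pandas sketch of this coding, with "colonial" as the default category; the house data below are made-up illustrative values.

```python
import pandas as pd

# Toy data (assumed): house style is a 3-level category.
houses = pd.DataFrame({
    "square_feet": [1800, 2200, 1500, 2000],
    "style": ["colonial", "ranch", "split level", "ranch"],
})

# One dummy per non-default category; "colonial" is the default (all zeros).
dummies = pd.get_dummies(houses["style"])[["ranch", "split level"]].astype(int)
X = pd.concat([houses[["square_feet"]], dummies], axis=1)
print(X)
```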
Interaction Between Independent Variables

Ŷ = b0 + b1X1 + b2X2 + b3X3
  = b0 + b1X1 + b2X2 + b3(X1X2)

where X3 = X1X2 is the interaction term.
When X2 = 0, the model reduces to Ŷ = b0 + b1X1.

Example: suppose the estimated equation is Y = 1 + 2X1 + 3X2 + 4X1X2, where X2 is a dummy variable:

X2 = 1:  Y = 1 + 2X1 + 3(1) + 4X1(1) = 4 + 6X1
X2 = 0:  Y = 1 + 2X1 + 3(0) + 4X1(0) = 1 + 2X1

[Figure: the two lines plotted against X1 over the range 0 to 1.5]

Slopes are different if the effect of X1 on Y depends on the value of X2.
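A tiny Python sketch of the two lines implied by the example equation above:

```python
import numpy as np

def y_hat(x1, x2, b=(1, 2, 3, 4)):
    """Estimated response for the interaction model Y = b0 + b1*X1 + b2*X2 + b3*X1*X2."""
    b0, b1, b2, b3 = b
    return b0 + b1 * x1 + b2 * x2 + b3 * x1 * x2

x1 = np.linspace(0, 1.5, 4)
print(y_hat(x1, x2=0))   # slope 2: the line 1 + 2*X1
print(y_hat(x1, x2=1))   # slope 6: the line 4 + 6*X1
```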