Module 2 - Simple Regression

This document provides an overview of simple linear regression. It defines regression as using an independent variable to predict a dependent variable. Simple linear regression involves using one independent variable (X) to predict a dependent variable (Y) based on a linear relationship between the two. The least squares method is used to calculate the slope and intercept of the regression line that best fits the data by minimizing the sum of squared errors between observed and predicted Y values. An example using house price and square footage data illustrates fitting a simple linear regression model in Excel.

Simple Linear Regression

Sutikno
Department of Statistics
Faculty of Mathematics, Computing and Data Science
Institut Teknologi Sepuluh Nopember, Surabaya
[email protected]
085230203017

Source: Basic Business Statistics, 12th Edition, Pearson Education, Inc., publishing as Prentice Hall
Correlation vs. Regression

• A scatter diagram can be used to show the relationship between two variables
• Correlation analysis is used to measure the strength of the association (linear relationship) between two variables
  – Correlation is concerned only with the strength of the relationship
  – No causal effect is implied by correlation

Department of Statistics, ITS Surabaya Slide-2


Introduction to Regression Analysis

• Regression analysis is used to:
  – Predict the value of a dependent variable based on the value of at least one independent variable
  – Explain the impact of changes in an independent variable on the dependent variable
• Dependent variable: the variable we wish to predict or explain
• Independent variable: the variable used to explain the dependent variable

Department of Statistics, ITS Surabaya Slide-3


Simple Linear Regression Model

• Only one independent variable, X
• The relationship between X and Y is described by a linear function
• Changes in Y are assumed to be caused by changes in X

Department of Statistics, ITS Surabaya Slide-4


Types of Relationships

[Scatter plots of Y vs. X: linear relationships (left column) and curvilinear relationships (right column)]
Department of Statistics, ITS Surabaya Slide-5
Types of Relationships

(continued)
(continued)
[Scatter plots of Y vs. X: strong relationships (left column) and weak relationships (right column)]
Department of Statistics, ITS Surabaya Slide-6
Types of Relationships

(continued)
(continued)
[Scatter plots of Y vs. X: no relationship]
Department of Statistics, ITS Surabaya Slide-7
Example: Anscombe data [scatter plots shown on the original slides]
Correlation Coefficient

$r = \dfrac{n\sum_{i=1}^{n} x_i y_i - \left(\sum_{i=1}^{n} x_i\right)\left(\sum_{i=1}^{n} y_i\right)}{\left[n\sum_{i=1}^{n} x_i^2 - \left(\sum_{i=1}^{n} x_i\right)^2\right]^{1/2}\left[n\sum_{i=1}^{n} y_i^2 - \left(\sum_{i=1}^{n} y_i\right)^2\right]^{1/2}}$

• r = +1: perfect positive correlation
• r = 0: no correlation
• r = -1: perfect negative correlation

Department of Statistics, ITS Surabaya Slide-13
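As a quick check of the formula above, here is a minimal Python/NumPy sketch (Python is an assumption for illustration; the slides themselves use Excel). The sample arrays are hypothetical.

import numpy as np

# Hypothetical sample data for illustration (not from the slides)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

n = len(x)
# Correlation coefficient computed directly from the sums, as in the formula above
num = n * np.sum(x * y) - np.sum(x) * np.sum(y)
den = np.sqrt(n * np.sum(x**2) - np.sum(x)**2) * np.sqrt(n * np.sum(y**2) - np.sum(y)**2)
r = num / den

print(r)                        # close to +1 for this strongly increasing example
print(np.corrcoef(x, y)[0, 1])  # NumPy's built-in value should agree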


Simple Linear Regression Model

The population regression model:

$Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i$

where:
  Y_i = dependent variable
  X_i = independent variable
  β_0 = population Y intercept
  β_1 = population slope coefficient
  ε_i = random error term
The linear component is β_0 + β_1 X_i; the random error component is ε_i.
Department of Statistics, ITS Surabaya Slide-14


Simple Linear Regression Model

(continued)

$Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i$

[Diagram: the population regression line has intercept β_0 and slope β_1; for a given X_i, the observed value of Y lies a random error ε_i away from the predicted value of Y for X_i on the line]
Department of Statistics, ITS Surabaya Slide-15
Simple Linear Regression Equation (Prediction Line)

The simple linear regression equation provides an estimate of the population regression line:

$\hat{Y}_i = b_0 + b_1 X_i$

where:
  Ŷ_i = estimated (or predicted) Y value for observation i
  b_0 = estimate of the regression intercept
  b_1 = estimate of the regression slope
  X_i = value of X for observation i

The individual random error terms e_i have a mean of zero.

Department of Statistics, ITS Surabaya Slide-16


Least Squares Method

• b_0 and b_1 are obtained by finding the values that minimize the sum of the squared differences between Y and Ŷ:

$\min \sum (Y_i - \hat{Y}_i)^2 = \min \sum \left(Y_i - (b_0 + b_1 X_i)\right)^2$
Department of Statistics, ITS Surabaya Slide-17


Least Squares Method

$b_1 = \dfrac{\sum_{i=1}^{n} x_i y_i - n\bar{x}\,\bar{y}}{\sum_{i=1}^{n} x_i^2 - n\bar{x}^2}$

$b_0 = \bar{y} - b_1 \bar{x}$

Department of Statistics, ITS Surabaya Slide-18
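A minimal Python/NumPy sketch of the two formulas above (an illustration, not part of the original slides), applied to the house-price data that appears later in this deck:

import numpy as np

# House-price data used later in these slides (price in $1000s, size in sq. ft.)
y = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255], dtype=float)
x = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700], dtype=float)

n = len(x)
x_bar, y_bar = x.mean(), y.mean()

# Least squares estimates from the formulas above
b1 = (np.sum(x * y) - n * x_bar * y_bar) / (np.sum(x**2) - n * x_bar**2)
b0 = y_bar - b1 * x_bar

print(b1, b0)   # approximately 0.10977 and 98.248, matching the Excel output shown later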


Interpretation of the Slope and the Intercept

• b_0 is the estimated average value of Y when the value of X is zero
• b_1 is the estimated change in the average value of Y as a result of a one-unit change in X

Department of Statistics, ITS Surabaya Slide-19


Simple Linear Regression Example

• A real estate agent wishes to examine the relationship between the selling price of a home and its size (measured in square feet)
• A random sample of 10 houses is selected
  – Dependent variable (Y) = house price in $1000s
  – Independent variable (X) = square feet

Department of Statistics, ITS Surabaya Slide-20


Sample Data for House Price Model

House Price in $1000s (Y)    Square Feet (X)
245 1400
312 1600
279 1700
308 1875
199 1100
219 1550
405 2350
324 2450
319 1425
255 1700

Department of Statistics, ITS Surabaya Slide-21


Graphical Presentation

• House price model: scatter plot


450
400
House Price ($1000s)

350
300
250
200
150
100
50
0
0 500 1000 1500 2000 2500 3000
Square Feet

Department of Statistics, ITS Surabaya Slide-22


Regression Using Excel
• Tools / Data Analysis / Regression

Department of Statistics, ITS Surabaya Slide-23


Excel Output

Regression Statistics:
  Multiple R           0.76211
  R Square             0.58082
  Adjusted R Square    0.52842
  Standard Error       41.33032
  Observations         10

The regression equation is:
  house price = 98.24833 + 0.10977 (square feet)

ANOVA:
              df    SS           MS           F        Significance F
  Regression  1     18934.9348   18934.9348   11.0848  0.01039
  Residual    8     13665.5652   1708.1957
  Total       9     32600.5000

              Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
  Intercept   98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
  Square Feet 0.10977       0.03297         3.32938  0.01039  0.03374    0.18580

Department of Statistics, ITS Surabaya Slide-24
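For readers not using Excel, a sketch of how the same output could be reproduced with Python's statsmodels package (an assumed alternative tool, not used in the slides):

import numpy as np
import statsmodels.api as sm

y = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255], dtype=float)
x = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700], dtype=float)

X = sm.add_constant(x)          # adds the intercept column
model = sm.OLS(y, X).fit()      # ordinary least squares fit

print(model.params)    # intercept ~ 98.248, slope ~ 0.10977
print(model.rsquared)  # ~ 0.58082
print(model.summary()) # coefficient table, F statistic, and standard errors, as in the Excel output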


Graphical Presentation

• House price model: scatter plot and regression line
[Scatter plot with fitted line: intercept = 98.248, slope = 0.10977; House Price ($1000s) vs. Square Feet]

house price = 98.24833 + 0.10977 (square feet)
Department of Statistics, ITS Surabaya Slide-25
Interpretation of the
Intercept, b0

house price  98.24833  0.10977 (square feet)

• b0 is the estimated average value of Y when


the value of X is zero (if X = 0 is in the range
of observed X values)
– Here, no houses had 0 square feet, so b0 =
98.24833 just indicates that, for houses within
the range of sizes observed, $98,248.33 is the
portion of the house price not explained by
Department of Statistics, ITS Surabaya Slide-26
Interpretation of the
Slope Coefficient, b1

house price  98.24833  0.10977 (square feet)

• b1 measures the estimated change in the


average value of Y as a result of a one-unit
change in X
– Here, b1 = .10977 tells us that the average value
of a house increases by .10977($1000) = $109.77,
on average, for each additional one square foot
of size Department of Statistics, ITS Surabaya Slide-27
Predictions using
Regression Analysis

Predict the price for a house with 2000 square feet:

house price = 98.25 + 0.1098 (sq. ft.)
            = 98.25 + 0.1098 (2000)
            = 317.85

The predicted price for a house with 2000 square feet is 317.85 ($1000s) = $317,850
Department of Statistics, ITS Surabaya Slide-28
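The same prediction as a short Python sketch (illustration only, using the rounded coefficients from the slide):

b0, b1 = 98.25, 0.1098             # rounded coefficients from the fitted model

sqft = 2000
predicted_price = b0 + b1 * sqft   # predicted house price in $1000s

print(predicted_price)             # 317.85, i.e. about $317,850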
Interpolation vs. Extrapolation

• When using a regression model for prediction, only predict within the relevant range of the data
[Scatter plot with fitted line: the relevant range for interpolation is the range of observed X values; do not try to extrapolate beyond the range of observed X's]

Department of Statistics, ITS Surabaya Slide-29
Measures of Variation

• Total variation is made up of two parts:

SST = SSR + SSE
(Total Sum of Squares = Regression Sum of Squares + Error Sum of Squares)

$SST = \sum (Y_i - \bar{Y})^2$    $SSR = \sum (\hat{Y}_i - \bar{Y})^2$    $SSE = \sum (Y_i - \hat{Y}_i)^2$

where:
  Ȳ = average value of the dependent variable
  Y_i = observed values of the dependent variable
  Ŷ_i = predicted value of Y for the given X_i value

Department of Statistics, ITS Surabaya Slide-30
Measures of Variation

(continued)

• SST = total sum of squares
  – Measures the variation of the Y_i values around their mean Ȳ
• SSR = regression sum of squares
  – Explained variation attributable to the relationship between X and Y
• SSE = error sum of squares
  – Variation attributable to factors other than the relationship between X and Y

Department of Statistics, ITS Surabaya Slide-31
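A short Python/NumPy sketch (illustrative; the slides compute these quantities in Excel) of the SST = SSR + SSE decomposition for the house-price data, using the fitted coefficients reported earlier:

import numpy as np

y = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255], dtype=float)
x = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700], dtype=float)

b0, b1 = 98.24833, 0.10977           # estimates from the Excel output
y_hat = b0 + b1 * x                  # predicted values

sst = np.sum((y - y.mean())**2)      # total sum of squares
ssr = np.sum((y_hat - y.mean())**2)  # regression (explained) sum of squares
sse = np.sum((y - y_hat)**2)         # error sum of squares

print(sst)         # ~ 32600.5
print(ssr, sse)    # ~ 18934.9 and ~ 13665.6; SSR + SSE ~ SST (up to coefficient rounding)
print(ssr / sst)   # r-squared ~ 0.58, anticipating the coefficient of determination below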


Measures of Variation

(continued)
[Diagram: at a point (X_i, Y_i), SSE = Σ(Y_i - Ŷ_i)² measures the variation of the observed values around the fitted line, SSR = Σ(Ŷ_i - Ȳ)² measures the variation of the fitted values around Ȳ, and SST = Σ(Y_i - Ȳ)² measures the total variation around Ȳ]
Department of Statistics, ITS Surabaya Slide-32
Coefficient of Determination, r2

• The coefficient of determination is the portion of the total variation in the dependent variable that is explained by variation in the independent variable
• The coefficient of determination is also called r-squared and is denoted r²

$r^2 = \dfrac{SSR}{SST} = \dfrac{\text{regression sum of squares}}{\text{total sum of squares}}$

note: $0 \le r^2 \le 1$
Department of Statistics, ITS Surabaya Slide-33
Examples of Approximate
r2 Values

r² = 1
[Scatter plots: all points lie exactly on the regression line]
Perfect linear relationship between X and Y: 100% of the variation in Y is explained by variation in X
Department of Statistics, ITS Surabaya Slide-34
Examples of Approximate
r2 Values

0 < r² < 1
[Scatter plots: points scattered around the regression line]
Weaker linear relationships between X and Y: some but not all of the variation in Y is explained by variation in X
Department of Statistics, ITS Surabaya Slide-35
Examples of Approximate
r2 Values

r² = 0
[Scatter plot: a flat regression line]
No linear relationship between X and Y: the value of Y does not depend on X (none of the variation in Y is explained by variation in X)

Department of Statistics, ITS Surabaya Slide-36


Excel Output

$r^2 = \dfrac{SSR}{SST} = \dfrac{18934.9348}{32600.5000} = 0.58082$

58.08% of the variation in house prices is explained by variation in square feet

Regression Statistics:
  Multiple R           0.76211
  R Square             0.58082
  Adjusted R Square    0.52842
  Standard Error       41.33032
  Observations         10

ANOVA:
              df    SS           MS           F        Significance F
  Regression  1     18934.9348   18934.9348   11.0848  0.01039
  Residual    8     13665.5652   1708.1957
  Total       9     32600.5000

              Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
  Intercept   98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
  Square Feet 0.10977       0.03297         3.32938  0.01039  0.03374    0.18580

Department of Statistics, ITS Surabaya Slide-37


Standard Error of Estimate

• The standard deviation of the variation of observations around the regression line is estimated by

$S_{YX} = \sqrt{\dfrac{SSE}{n-2}} = \sqrt{\dfrac{\sum_{i=1}^{n}(Y_i - \hat{Y}_i)^2}{n-2}}$

where:
  SSE = error sum of squares
  n = sample size
Department of Statistics, ITS Surabaya Slide-38
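A quick check of this formula in Python (illustration only), using the SSE from the Excel ANOVA table:

import numpy as np

sse = 13665.5652                # error sum of squares from the Excel output
n = 10                          # sample size

s_yx = np.sqrt(sse / (n - 2))   # standard error of the estimate
print(s_yx)                     # ~ 41.33, matching "Standard Error" in the regression statistics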


Excel Output

$S_{YX} = 41.33032$ (the "Standard Error" in the regression statistics)

Regression Statistics:
  Multiple R           0.76211
  R Square             0.58082
  Adjusted R Square    0.52842
  Standard Error       41.33032
  Observations         10

ANOVA:
              df    SS           MS           F        Significance F
  Regression  1     18934.9348   18934.9348   11.0848  0.01039
  Residual    8     13665.5652   1708.1957
  Total       9     32600.5000

              Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
  Intercept   98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
  Square Feet 0.10977       0.03297         3.32938  0.01039  0.03374    0.18580

Department of Statistics, ITS Surabaya Slide-39


Comparing Standard Errors

• S_YX is a measure of the variation of observed Y values from the regression line
[Two scatter plots: small S_YX, points close to the line; large S_YX, points widely scattered around the line]
• The magnitude of S_YX should always be judged relative to the size of the Y values in the sample data
  – e.g., S_YX = $41.33K is moderately small relative to house prices in the $200K - $300K range
Department of Statistics, ITS Surabaya Slide-40
Assumptions of Regression

Use the acronym LINE:
• Linearity
  – The underlying relationship between X and Y is linear
• Independence of Errors
  – Error values are statistically independent
• Normality of Error
  – Error values (ε) are normally distributed for any given value of X
• Equal Variance (Homoscedasticity)
  – The probability distribution of the errors has constant variance

Department of Statistics, ITS Surabaya Slide-41


Residual Analysis

ei  Yi  Ŷi
• The residual for observation i, ei, is the difference between
its observed and predicted value
• Check the assumptions of regression by examining the
residuals
– Examine for linearity assumption
– Evaluate independence assumption
– Evaluate normal distribution assumption
– Examine for constant variance for all levels of X (homoscedasticity)

• Graphical Analysis of Residuals


– Can plot residuals vs. X
Department of Statistics, ITS Surabaya Slide-42
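A sketch of the residuals-vs-X plot in Python with matplotlib (assumed tooling; the slides use Excel charts):

import numpy as np
import matplotlib.pyplot as plt

y = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255], dtype=float)
x = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700], dtype=float)

y_hat = 98.24833 + 0.10977 * x    # fitted values from the estimated model
residuals = y - y_hat             # e_i = Y_i - Y_hat_i

# Residuals vs. X: look for curvature (non-linearity) or a changing spread (heteroscedasticity)
plt.scatter(x, residuals)
plt.axhline(0, linestyle="--")
plt.xlabel("Square Feet")
plt.ylabel("Residuals")
plt.show()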
Residual Analysis for Linearity

[Scatter plots and residual plots vs. X: when the relationship is not linear, the residuals show a curved pattern; when it is linear, the residuals scatter randomly around zero]

Department of Statistics, ITS Surabaya Slide-43
Residual Analysis for Independence

[Residual plots: when errors are not independent, the residuals follow a systematic pattern; when independent, the residuals scatter randomly]
Department of Statistics, ITS Surabaya Slide-44


Residual Analysis for Normality

A normal probability plot of the residuals can be used to check for normality:
[Normal probability plot: Percent (0 to 100) vs. Residual (-3 to 3); an approximately straight-line pattern is consistent with normally distributed residuals]
Department of Statistics, ITS Surabaya Slide-45
Residual Analysis for Equal Variance

[Scatter plots and residual plots: non-constant variance shows a residual spread that changes with X; constant variance shows roughly equal spread at all values of X]

Department of Statistics, ITS Surabaya Slide-46
Excel Residual Output

RESIDUAL OUTPUT
  Observation   Predicted House Price   Residuals
  1             251.92316               -6.923162
  2             273.87671               38.12329
  3             284.85348               -5.853484
  4             304.06284               3.937162
  5             218.99284               -19.99284
  6             268.38832               -49.38832
  7             356.20251               48.79749
  8             367.17929               -43.17929
  9             254.66740               64.33264
  10            284.85348               -29.85348

[House Price Model Residual Plot: Residuals (-60 to 80) vs. Square Feet (0 to 3000)]

The residual plot does not appear to violate any regression assumptions
Department of Statistics, ITS Surabaya Slide-47
Measuring Autocorrelation:
The Durbin-Watson Statistic

• Used when data are collected over time to detect whether autocorrelation is present
• Autocorrelation exists if residuals in one time period are related to residuals in another period

Department of Statistics, ITS Surabaya Slide-48


Autocorrelation

• Autocorrelation is correlation of the errors (residuals) over time
[Time (t) residual plot: residuals, roughly -15 to 15, plotted against time]
• Here, the residuals show a cyclic pattern, not a random one; cyclical patterns are a sign of positive autocorrelation
• This violates the regression assumption that residuals are random and independent
Department of Statistics, ITS Surabaya Slide-49
The Durbin-Watson Statistic
• The Durbin-Watson statistic is used to test for autocorrelation
  H0: residuals are not correlated
  H1: positive autocorrelation is present

$D = \dfrac{\sum_{i=2}^{n}(e_i - e_{i-1})^2}{\sum_{i=1}^{n} e_i^2}$

• The possible range is 0 ≤ D ≤ 4
• D should be close to 2 if H0 is true
• D less than 2 may signal positive autocorrelation; D greater than 2 may signal negative autocorrelation
Department of Statistics, ITS Surabaya Slide-50
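A small Python sketch of the Durbin-Watson statistic defined above (illustrative; the residual values below are hypothetical):

import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic: sum of squared successive differences of the
    residuals divided by the sum of squared residuals."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e)**2) / np.sum(e**2)

# Hypothetical residuals with a cyclic pattern (illustration only)
e = np.array([3.0, 2.0, 1.0, -1.0, -2.0, -3.0, -2.0, 1.0, 2.0, 3.0])
print(durbin_watson(e))   # well below 2, suggesting positive autocorrelation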


Testing for Positive Autocorrelation

H0: positive autocorrelation does not exist
H1: positive autocorrelation is present

• Calculate the Durbin-Watson test statistic D
  (the Durbin-Watson statistic can be found using Excel, Minitab, or SPSS)
• Find the values dL and dU from the Durbin-Watson table (for sample size n and number of independent variables k)

Decision rule: reject H0 if D < dL
[Decision scale: reject H0 for D < dL; inconclusive for dL ≤ D ≤ dU; do not reject H0 for dU < D ≤ 2]
Department of Statistics, ITS Surabaya Slide-51
Testing for Positive Autocorrelation

• Suppose we have the following time series data:
[Scatter plot of Sales (0 to 160) vs. Time (0 to 30) with fitted line: y = 30.65 + 4.7038x, R² = 0.8976]
• Is there autocorrelation?
Department of Statistics, ITS Surabaya Slide-52
Testing for Positive Autocorrelation
• Example with n = 25:

Excel/PHStat output:
  Durbin-Watson Calculations
  Sum of Squared Difference of Residuals   3296.18
  Sum of Squared Residuals                 3279.98
  Durbin-Watson Statistic                  1.00494

[Scatter plot of Sales vs. Time with fitted line: y = 30.65 + 4.7038x, R² = 0.8976]

$D = \dfrac{\sum_{i=2}^{n}(e_i - e_{i-1})^2}{\sum_{i=1}^{n} e_i^2} = \dfrac{3296.18}{3279.98} = 1.00494$
Department of Statistics, ITS Surabaya Slide-53


Testing for Positive Autocorrelation
• Here, n = 25 and there is k = 1 independent variable
• Using the Durbin-Watson table, dL = 1.29 and dU = 1.45
• D = 1.00494 < dL = 1.29, so reject H0 and conclude that significant positive autocorrelation exists
• Therefore the linear model is not the appropriate model to forecast sales

Decision: reject H0 since D = 1.00494 < dL
[Decision scale: reject H0 for D < dL = 1.29; inconclusive for 1.29 ≤ D ≤ 1.45; do not reject H0 for 1.45 < D ≤ 2]
Department of Statistics, ITS Surabaya Slide-54
Inferences About the Slope

• The standard error of the regression slope coefficient (b_1) is estimated by

$S_{b_1} = \dfrac{S_{YX}}{\sqrt{SSX}} = \dfrac{S_{YX}}{\sqrt{\sum (X_i - \bar{X})^2}}$

where:
  $S_{b_1}$ = estimate of the standard error of the least squares slope
  $S_{YX} = \sqrt{\dfrac{SSE}{n-2}}$ = standard error of the estimate
Department of Statistics, ITS Surabaya Slide-55


Excel Output

Regression Statistics:
  Multiple R           0.76211
  R Square             0.58082
  Adjusted R Square    0.52842
  Standard Error       41.33032
  Observations         10

$S_{b_1} = 0.03297$

ANOVA:
              df    SS           MS           F        Significance F
  Regression  1     18934.9348   18934.9348   11.0848  0.01039
  Residual    8     13665.5652   1708.1957
  Total       9     32600.5000

              Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
  Intercept   98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
  Square Feet 0.10977       0.03297         3.32938  0.01039  0.03374    0.18580

Department of Statistics, ITS Surabaya Slide-56


Comparing Standard Errors of the Slope

• $S_{b_1}$ is a measure of the variation in the slope of regression lines from different possible samples
[Two scatter plots: small $S_{b_1}$, fitted slopes vary little across samples; large $S_{b_1}$, fitted slopes vary widely]
Department of Statistics, ITS Surabaya Slide-57


Inference about the Slope:
t Test

• t test for a population slope
  – Is there a linear relationship between X and Y?
• Null and alternative hypotheses
  H0: β1 = 0 (no linear relationship)
  H1: β1 ≠ 0 (linear relationship does exist)
• Test statistic

$t = \dfrac{b_1 - \beta_1}{S_{b_1}}$,   d.f. = n - 2

where:
  b_1 = regression slope coefficient
  β_1 = hypothesized slope
  $S_{b_1}$ = standard error of the slope
Department of Statistics, ITS Surabaya Slide-58
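A sketch of the slope t test in Python with SciPy (assumed tooling), using b_1 and its standard error from the Excel output:

from scipy import stats

b1 = 0.10977       # estimated slope (Excel output)
s_b1 = 0.03297     # standard error of the slope
n = 10

t_stat = (b1 - 0) / s_b1                         # test statistic for H0: beta1 = 0
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)  # two-tailed p-value with n-2 d.f.

print(t_stat, p_value)   # ~ 3.329 and ~ 0.0104, matching the Excel output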
Inference about the Slope:
t Test

House price simple linear regression equation:

house price = 98.25 + 0.1098 (sq. ft.)

  House Price in $1000s (Y)   Square Feet (X)
  245    1400
  312    1600
  279    1700
  308    1875
  199    1100
  219    1550
  405    2350
  324    2450
  319    1425
  255    1700

The slope of this model is 0.1098. Does the square footage of the house affect its sales price?
Department of Statistics, ITS Surabaya Slide-59


Inferences about the Slope:
t Test Example

H0: β1 = 0
H1: β1 ≠ 0

From the Excel output:
              Coefficients  Standard Error  t Stat   P-value
  Intercept   98.24833      58.03348        1.69296  0.12892
  Square Feet 0.10977       0.03297         3.32938  0.01039

$t = \dfrac{b_1 - \beta_1}{S_{b_1}} = \dfrac{0.10977 - 0}{0.03297} = 3.32938$
Department of Statistics, ITS Surabaya Slide-60


Inferences about the Slope:
t Test Example

Test statistic: t = 3.329

H0: β1 = 0
H1: β1 ≠ 0

From the Excel output:
              Coefficients  Standard Error  t Stat   P-value
  Intercept   98.24833      58.03348        1.69296  0.12892
  Square Feet 0.10977       0.03297         3.32938  0.01039

d.f. = 10 - 2 = 8; α/2 = 0.025; critical values ±t_{α/2} = ±2.3060

Decision: reject H0, since t = 3.329 > 2.3060
Conclusion: there is sufficient evidence that square footage affects house price
Department of Statistics, ITS Surabaya Slide-61
Inferences about the Slope:
t Test Example

(continued)
P-value = 0.01039

H0: β1 = 0
H1: β1 ≠ 0

From the Excel output:
              Coefficients  Standard Error  t Stat   P-value
  Intercept   98.24833      58.03348        1.69296  0.12892
  Square Feet 0.10977       0.03297         3.32938  0.01039

This is a two-tail test, so the p-value is P(t > 3.329) + P(t < -3.329) = 0.01039 (for 8 d.f.)

Decision: since the p-value < α, reject H0
Conclusion: there is sufficient evidence that square footage affects house price
Department of Statistics, ITS Surabaya Slide-62
F Test for Significance

• F test statistic:

$F = \dfrac{MSR}{MSE}$

where
  $MSR = \dfrac{SSR}{k}$,   $MSE = \dfrac{SSE}{n - k - 1}$

and F follows an F distribution with k numerator and (n - k - 1) denominator degrees of freedom
(k = the number of independent variables in the regression model)
Department of Statistics, ITS Surabaya Slide-63
Excel Output

$F = \dfrac{MSR}{MSE} = \dfrac{18934.9348}{1708.1957} = 11.0848$   with 1 and 8 degrees of freedom; the p-value for the F test is Significance F = 0.01039

Regression Statistics:
  Multiple R           0.76211
  R Square             0.58082
  Adjusted R Square    0.52842
  Standard Error       41.33032
  Observations         10

ANOVA:
              df    SS           MS           F        Significance F
  Regression  1     18934.9348   18934.9348   11.0848  0.01039
  Residual    8     13665.5652   1708.1957
  Total       9     32600.5000

              Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
  Intercept   98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
  Square Feet 0.10977       0.03297         3.32938  0.01039  0.03374    0.18580
Department of Statistics, ITS Surabaya Slide-64


F Test for Significance

H0: β1 = 0
H1: β1 ≠ 0
α = 0.05, df1 = 1, df2 = 8

Test statistic: $F = \dfrac{MSR}{MSE} = 11.08$

Critical value: F_{0.05} = 5.32

Decision: reject H0 at α = 0.05, since F = 11.08 > F_{0.05} = 5.32
Conclusion: there is sufficient evidence that house size affects selling price
Department of Statistics, ITS Surabaya Slide-65
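A sketch of the F test in Python with SciPy (assumed tooling), using MSR and MSE from the ANOVA table:

from scipy import stats

msr = 18934.9348    # mean square regression (Excel ANOVA table)
mse = 1708.1957     # mean square error
df1, df2 = 1, 8

f_stat = msr / mse                       # ~ 11.08
f_crit = stats.f.ppf(0.95, df1, df2)     # critical value F_{0.05} ~ 5.32
p_value = stats.f.sf(f_stat, df1, df2)   # ~ 0.0104, the "Significance F" in the output

print(f_stat, f_crit, p_value)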
Confidence Interval Estimate
for the Slope

Confidence interval estimate of the slope:

$b_1 \pm t_{n-2} S_{b_1}$,   d.f. = n - 2

Excel printout for house prices:
              Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
  Intercept   98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
  Square Feet 0.10977       0.03297         3.32938  0.01039  0.03374    0.18580

At the 95% level of confidence, the confidence interval for the slope is (0.0337, 0.1858)
Department of Statistics, ITS Surabaya Slide-66
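The same interval as a short Python/SciPy sketch (illustration only):

from scipy import stats

b1, s_b1, n = 0.10977, 0.03297, 10

t_crit = stats.t.ppf(0.975, df=n - 2)   # ~ 2.3060 for 95% confidence
lower = b1 - t_crit * s_b1
upper = b1 + t_crit * s_b1

print(lower, upper)   # ~ (0.0337, 0.1858), matching the Excel "Lower/Upper 95%" columns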


Confidence Interval Estimate
for the Slope
(continued)

              Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
  Intercept   98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
  Square Feet 0.10977       0.03297         3.32938  0.01039  0.03374    0.18580

Since the units of the house price variable are $1000s, we are 95% confident that the average impact on sales price is between $33.70 and $185.80 per square foot of house size.

This 95% confidence interval does not include 0.
Conclusion: there is a significant relationship between house price and square feet at the 0.05 level of significance.
Department of Statistics, ITS Surabaya Slide-67


t Test for a Correlation Coefficient

• Hypotheses
  H0: ρ = 0 (no correlation between X and Y)
  H1: ρ ≠ 0 (correlation exists)
• Test statistic (with n - 2 degrees of freedom)

$t = \dfrac{r - \rho}{\sqrt{\dfrac{1 - r^2}{n - 2}}}$

where $r = +\sqrt{r^2}$ if $b_1 > 0$ and $r = -\sqrt{r^2}$ if $b_1 < 0$
Department of Statistics, ITS Surabaya Slide-68
Example: House Prices

Is there evidence of a linear relationship between square feet and house price at the 0.05 level of significance?

H0: ρ = 0 (no correlation)
H1: ρ ≠ 0 (correlation exists)
α = 0.05, d.f. = 10 - 2 = 8

$t = \dfrac{r - \rho}{\sqrt{\dfrac{1 - r^2}{n - 2}}} = \dfrac{0.762 - 0}{\sqrt{\dfrac{1 - 0.762^2}{10 - 2}}} = 3.329$
Department of Statistics, ITS Surabaya Slide-69
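A Python/SciPy sketch of this correlation t test (illustration only):

import numpy as np
from scipy import stats

r, n = 0.762, 10

t_stat = (r - 0) / np.sqrt((1 - r**2) / (n - 2))   # ~ 3.329
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)    # two-tailed p-value

print(t_stat, p_value)   # t ~ 3.33; p < 0.05, so reject H0: rho = 0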
Example: Test Solution

r ρ .762  0 Decision:
t   3.329
1 r 2 1 .762 2 Reject H0

n2 10  2 Conclusion:
There is
d.f. = 10-2 = 8
evidence of a
linear association
a/2=.025 a/2=.025
at the 5% level of
significance
Reject H0 Do not reject H0 Reject H0
-tα/2 0
tα/2
-2.3060 2.3060
3.329
Department of Statistics, ITS Surabaya Slide-70
Estimating Mean Values and Predicting
Individual Values

Goal: form intervals around Ŷ to express uncertainty about the value of Y for a given X_i

[Diagram: the fitted line Ŷ = b_0 + b_1 X_i; at X_i, the confidence interval for the mean of Y is narrower than the prediction interval for an individual Y]

Department of Statistics, ITS Surabaya Slide-71
Confidence Interval for
the Average Y, Given X

Confidence interval estimate for the mean value of Y given a particular X_i:

$\hat{Y} \pm t_{n-2} S_{YX} \sqrt{h_i}$   (confidence interval for $\mu_{Y|X=X_i}$)

The size of the interval varies according to the distance of X_i from the mean X̄:

$h_i = \dfrac{1}{n} + \dfrac{(X_i - \bar{X})^2}{SSX} = \dfrac{1}{n} + \dfrac{(X_i - \bar{X})^2}{\sum (X_i - \bar{X})^2}$
Department of Statistics, ITS Surabaya Slide-72
Prediction Interval for an Individual Y, Given X

Confidence interval estimate for an individual value of Y given a particular X_i:

$\hat{Y} \pm t_{n-2} S_{YX} \sqrt{1 + h_i}$   (prediction interval for $Y_{X=X_i}$)

The extra term under the square root adds to the interval width to reflect the added uncertainty for an individual case.
Department of Statistics, ITS Surabaya Slide-73
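A Python sketch (assumed tooling, not part of the slides) that computes both intervals for the house-price data at X_i = 2000, matching the worked examples on the next slides:

import numpy as np
from scipy import stats

y = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255], dtype=float)
x = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700], dtype=float)
n = len(x)

# Fit by least squares
b1 = (np.sum(x * y) - n * x.mean() * y.mean()) / (np.sum(x**2) - n * x.mean()**2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

s_yx = np.sqrt(np.sum((y - y_hat)**2) / (n - 2))   # standard error of the estimate
t_crit = stats.t.ppf(0.975, df=n - 2)              # 95% critical value

x_i = 2000
h_i = 1 / n + (x_i - x.mean())**2 / np.sum((x - x.mean())**2)
y_pred = b0 + b1 * x_i

ci = (y_pred - t_crit * s_yx * np.sqrt(h_i), y_pred + t_crit * s_yx * np.sqrt(h_i))
pi = (y_pred - t_crit * s_yx * np.sqrt(1 + h_i), y_pred + t_crit * s_yx * np.sqrt(1 + h_i))

print(ci)   # roughly (280.7, 354.9): confidence interval for the mean price
print(pi)   # roughly (215.5, 420.1): wider prediction interval for an individual house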


Estimation of Mean Values: Example

Confidence interval estimate for $\mu_{Y|X=X_i}$

Find the 95% confidence interval for the mean price of 2,000-square-foot houses.

Predicted price: Ŷ_i = 317.85 ($1000s)

$\hat{Y} \pm t_{n-2} S_{YX} \sqrt{\dfrac{1}{n} + \dfrac{(X_i - \bar{X})^2}{\sum (X_i - \bar{X})^2}} = 317.85 \pm 37.12$

The confidence interval endpoints are 280.66 and 354.90, or from $280,660 to $354,900.
Department of Statistics, ITS Surabaya Slide-74
Estimation of Individual Values: Example

Prediction interval estimate for $Y_{X=X_i}$

Find the 95% prediction interval for an individual house with 2,000 square feet.

Predicted price: Ŷ_i = 317.85 ($1000s)

$\hat{Y} \pm t_{n-2} S_{YX} \sqrt{1 + \dfrac{1}{n} + \dfrac{(X_i - \bar{X})^2}{\sum (X_i - \bar{X})^2}} = 317.85 \pm 102.28$

The prediction interval endpoints are 215.50 and 420.07, or from $215,500 to $420,070.
Department of Statistics, ITS Surabaya Slide-75
Finding Confidence and
Prediction Intervals in Excel

• In Excel, use PHStat | regression | simple linear regression …
  – Check the "confidence and prediction interval for X =" box and enter the X value and confidence level desired

Department of Statistics, ITS Surabaya Slide-76


Finding Confidence and
Prediction Intervals in Excel

(continued)
[PHStat screenshot: enter the input X value and confidence level; the output shows the confidence interval estimate for μ_{Y|X=Xi} and the prediction interval estimate for Y_{X=Xi}]
Department of Statistics, ITS Surabaya Slide-77
Pitfalls of Regression Analysis

• Lacking an awareness of the assumptions underlying least-squares regression
• Not knowing how to evaluate the assumptions
• Not knowing the alternatives to least-squares regression if a particular assumption is violated
• Using a regression model without knowledge of the subject matter
• Extrapolating outside the relevant range
Department of Statistics, ITS Surabaya Slide-78
Strategies for Avoiding
the Pitfalls of Regression
• Start with a scatter diagram of X vs. Y to observe a possible relationship
• Perform residual analysis to check the assumptions
  – Plot the residuals vs. X to check for violations of assumptions such as homoscedasticity
  – Use a histogram, stem-and-leaf display, box-and-whisker plot, or normal probability plot of the residuals to uncover possible non-normality

Department of Statistics, ITS Surabaya Slide-79


Strategies for Avoiding
the Pitfalls of Regression
• If there is a violation of any assumption, use alternative methods or models
• If there is no evidence of assumption violation, then test for the significance of the regression coefficients and construct confidence intervals and prediction intervals
• Avoid making predictions or forecasts outside the relevant range

Department of Statistics, ITS Surabaya Slide-80
