Introduction To Linear Regression and Correlation Analysis

The document discusses linear regression and correlation analysis. It defines scatter plots, correlation coefficients, and linear regression models. It explains how to calculate the sample correlation coefficient r and how r indicates the strength and direction of the linear relationship between two variables. The document also describes how to perform linear regression by estimating the regression slope and intercept that best predict the dependent variable from the independent variable.


Chapter 14
Introduction to Linear Regression and Correlation Analysis
Scatter Plots and Correlation

 A scatter plot (or scatter diagram) is used to show the relationship between two variables
 Correlation analysis is used to measure the strength of the association (linear relationship) between two variables
  Only concerned with the strength of the relationship
  No causal effect is implied
Scatter Plot Examples
Linear relationships vs. curvilinear relationships [figure: four scatter plots of y vs. x]
Scatter Plot Examples
(continued)
Strong relationships vs. weak relationships [figure: four scatter plots of y vs. x]
Scatter Plot Examples
(continued)
No relationship [figure: scatter plot of y vs. x]
Correlation Coefficient
(continued)

 The population correlation coefficient ρ (rho) measures the strength of the association between the variables
 The sample correlation coefficient r is an estimate of ρ and is used to measure the strength of the linear relationship in the sample observations
Features of ρ and r
 Unit free
 Range between -1 and 1
 The closer to -1, the stronger the negative linear relationship
 The closer to 1, the stronger the positive linear relationship
 The closer to 0, the weaker the linear relationship
Examples of Approximate
r Values
[Figure: five scatter plots illustrating r = -1, r = -.6, r = 0, r = +.3, and r = +1]
Calculating the
Correlation Coefficient
Sample correlation coefficient:

r = Σ(x − x̄)(y − ȳ) / √[ Σ(x − x̄)² · Σ(y − ȳ)² ]

or the algebraic equivalent:

r = [ nΣxy − (Σx)(Σy) ] / √[ (nΣx² − (Σx)²) · (nΣy² − (Σy)²) ]

where:
r = Sample correlation coefficient
n = Sample size
x = Value of the independent variable
y = Value of the dependent variable
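As a quick numerical check, both forms of r can be computed directly. A minimal Python sketch; the data values below are invented purely for illustration and are not from the text:

```python
# Sample correlation coefficient r, computed two ways.

def corr_deviation_form(xs, ys):
    """r = sum((x - xbar)(y - ybar)) / sqrt(sum((x - xbar)^2) * sum((y - ybar)^2))"""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = (sum((x - xbar) ** 2 for x in xs) *
           sum((y - ybar) ** 2 for y in ys)) ** 0.5
    return num / den

def corr_computational_form(xs, ys):
    """r = (n*Sxy - Sx*Sy) / sqrt((n*Sxx - Sx^2)(n*Syy - Sy^2))"""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx, syy = sum(x * x for x in xs), sum(y * y for y in ys)
    return (n * sxy - sx * sy) / ((n * sxx - sx ** 2) * (n * syy - sy ** 2)) ** 0.5

xs = [1, 2, 3, 4, 5]   # invented toy data
ys = [2, 4, 5, 4, 5]
r1 = corr_deviation_form(xs, ys)
r2 = corr_computational_form(xs, ys)
print(round(r1, 4), round(r2, 4))  # the two forms agree
```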
Significance Test for Correlation
 Hypotheses
H0: ρ = 0 (no correlation)
HA: ρ ≠ 0 (correlation exists)
 Test statistic

t = r / √[ (1 − r²) / (n − 2) ]        (with n − 2 degrees of freedom)
Introduction to Regression Analysis

 Regression analysis is used to:
  Predict the value of a dependent variable based on the value of at least one independent variable
  Explain the impact of changes in an independent variable on the dependent variable

Dependent variable: the variable we wish to explain
Independent variable: the variable used to explain the dependent variable
Simple Linear Regression Model

 Only one independent variable, x
 Relationship between x and y is described by a linear function
 Changes in y are assumed to be caused by changes in x
Types of Regression Models
[Figure: example scatter plots of a positive linear relationship, a negative linear relationship, a relationship that is NOT linear, and no relationship]

Population Linear Regression

The population regression model:

y = β0 + β1x + ε

where:
y = Dependent variable
β0 = Population y intercept
β1 = Population slope coefficient
x = Independent variable
ε = Random error term, or residual

β0 + β1x is the linear component; ε is the random error component.
Linear Regression Assumptions

 Error values (ε) are statistically independent
 Error values are normally distributed for any given value of x
 The probability distribution of the errors is normal
 The probability distribution of the errors has constant variance
 The underlying relationship between the x variable and the y variable is linear
Population Linear Regression
(continued)

y = β0 + β1x + ε

[Figure: regression line with intercept β0 and slope β1; for a given xi, the observed value of y differs from the predicted value of y on the line by the random error εi]
Estimated Regression Model
The sample regression line provides an estimate of
the population regression line

ŷi = b0 + b1x

where:
ŷi = Estimated (or predicted) y value
b0 = Estimate of the regression intercept
b1 = Estimate of the regression slope
x = Independent variable

The individual random error terms ei have a mean of zero.
Least Squares Criterion

 b0 and b1 are obtained by finding the values of b0 and b1 that minimize the sum of the squared residuals:

Σe² = Σ(y − ŷ)² = Σ(y − (b0 + b1x))²
The Least Squares Equation
 The formulas for b1 and b0 are:

b1 = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)²

and the algebraic equivalent:

b1 = [ Σxy − (Σx)(Σy)/n ] / [ Σx² − (Σx)²/n ]

b0 = ȳ − b1x̄
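Both forms of the slope formula give the same b1, and b0 follows from the sample means. A minimal sketch on invented toy data:

```python
# Least squares slope and intercept, computed with both formula forms.
# The toy data below are invented for illustration.

def fit_deviation_form(xs, ys):
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) /
          sum((x - xbar) ** 2 for x in xs))
    return ybar - b1 * xbar, b1

def fit_computational_form(xs, ys):
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    b1 = ((sum(x * y for x, y in zip(xs, ys)) - sx * sy / n) /
          (sum(x * x for x in xs) - sx ** 2 / n))
    return sy / n - b1 * sx / n, b1

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
b0a, b1a = fit_deviation_form(xs, ys)
b0b, b1b = fit_computational_form(xs, ys)
print(round(b0a, 2), round(b1a, 2))  # intercept, slope
```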
Interpretation of the
Slope and the Intercept

 b0 is the estimated average value of y when the value of x is zero
 b1 is the estimated change in the average value of y as a result of a one-unit change in x
Simple Linear Regression Example

 A real estate agent wishes to examine the relationship between the selling price of a home and its size (measured in square feet)
 A random sample of 10 houses is selected
  Dependent variable (y) = house price in $1000s
  Independent variable (x) = square feet
Sample Data for House Price Model
House Price in $1000s (y)    Square Feet (x)
245 1400
312 1600
279 1700
308 1875
199 1100
219 1550
405 2350
324 2450
319 1425
255 1700
Excel Output
Regression Statistics
Multiple R          0.76211
R Square            0.58082
Adjusted R Square   0.52842
Standard Error      41.33032
Observations        10

The regression equation is:

house price = 98.24833 + 0.10977 (square feet)

ANOVA
            df    SS          MS          F        Significance F
Regression   1    18934.9348  18934.9348  11.0848  0.01039
Residual     8    13665.5652  1708.1957
Total        9    32600.5000

             Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
Intercept    98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
Square Feet  0.10977       0.03297         3.32938  0.01039  0.03374    0.18580
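The slope, intercept, and R Square in this output can be reproduced from the sample data with the least squares formulas. A sketch in plain Python (no Excel needed):

```python
# Reproducing the regression output from the 10-house sample in the text.
sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]  # $1000s

n = len(sqft)
xbar, ybar = sum(sqft) / n, sum(price) / n
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(sqft, price))
sxx = sum((x - xbar) ** 2 for x in sqft)

b1 = sxy / sxx          # slope, ~0.10977
b0 = ybar - b1 * xbar   # intercept, ~98.24833

sst = sum((y - ybar) ** 2 for y in price)
ssr = b1 * sxy          # regression (explained) sum of squares
r_squared = ssr / sst   # ~0.58082
print(round(b1, 5), round(b0, 5), round(r_squared, 5))
```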
Graphical Presentation
 House price model: scatter plot and regression line

[Figure: scatter plot of house price ($1000s) vs. square feet with the fitted regression line; intercept = 98.248, slope = 0.10977]

house price = 98.24833 + 0.10977 (square feet)


Interpretation of the
Intercept, b0

house price = 98.24833 + 0.10977 (square feet)

 b0 is the estimated average value of y when the value of x is zero (if x = 0 is in the range of observed x values)
 Here, no houses had 0 square feet, so b0 = 98.24833 just indicates that, for houses within the range of sizes observed, $98,248.33 is the portion of the house price not explained by square feet
Interpretation of the
Slope Coefficient, b1

house price = 98.24833 + 0.10977 (square feet)

 b1 measures the estimated change in the average value of y as a result of a one-unit change in x
 Here, b1 = .10977 tells us that the average value of a house increases by .10977($1000) = $109.77, on average, for each additional square foot of size
Least Squares Regression
Properties
 The sum of the residuals from the least squares regression line is 0: Σ(y − ŷ) = 0
 The sum of the squared residuals, Σ(y − ŷ)², is a minimum
 The simple regression line always passes through the mean of the y variable and the mean of the x variable
 The least squares coefficients are unbiased estimates of β0 and β1
Explained and Unexplained Variation
 Total variation is made up of two parts:

SST = SSE + SSR

Total Sum of Squares:       SST = Σ(y − ȳ)²
Sum of Squares Error:       SSE = Σ(y − ŷ)²
Sum of Squares Regression:  SSR = Σ(ŷ − ȳ)²

where:
ȳ = Average value of the dependent variable
y = Observed values of the dependent variable
ŷ = Estimated value of y for the given x value
Explained and Unexplained Variation
(continued)

 SST = total sum of squares
  Measures the variation of the yi values around their mean ȳ
 SSE = error sum of squares
  Variation attributable to factors other than the relationship between x and y
 SSR = regression sum of squares
  Explained variation attributable to the relationship between x and y
Coefficient of Determination, R2
 The coefficient of determination is the portion of the total variation in the dependent variable that is explained by variation in the independent variable
 The coefficient of determination is also called R-squared and is denoted as R²

R² = SSR / SST        where 0 ≤ R² ≤ 1
Coefficient of Determination, R2
(continued)
Coefficient of determination:

R² = SSR / SST = (sum of squares explained by regression) / (total sum of squares)

Note: In the single independent variable case, the coefficient of determination is

R² = r²

where:
R² = Coefficient of determination
r = Simple correlation coefficient
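That R² = r² in the single-regressor case can be checked numerically on the house price data:

```python
# Verifying R^2 = r^2 for simple regression, using the 10-house data.
sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]

n = len(sqft)
xbar, ybar = sum(sqft) / n, sum(price) / n
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(sqft, price))
sxx = sum((x - xbar) ** 2 for x in sqft)
syy = sum((y - ybar) ** 2 for y in price)

r = sxy / (sxx * syy) ** 0.5         # sample correlation, ~0.76211
r_squared = (sxy / sxx) * sxy / syy  # SSR / SST
print(round(r, 5), round(r ** 2, 5), round(r_squared, 5))
```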
Excel Output
R² = SSR / SST = 18934.9348 / 32600.5000 = 0.58082

58.08% of the variation in house prices is explained by variation in square feet

Regression Statistics
Multiple R          0.76211
R Square            0.58082
Adjusted R Square   0.52842
Standard Error      41.33032
Observations        10

ANOVA
            df    SS          MS          F        Significance F
Regression   1    18934.9348  18934.9348  11.0848  0.01039
Residual     8    13665.5652  1708.1957
Total        9    32600.5000

             Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
Intercept    98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
Square Feet  0.10977       0.03297         3.32938  0.01039  0.03374    0.18580
Standard Error of Estimate
 The standard deviation of the variation of observations around the regression line is estimated by

sε = √[ SSE / (n − k − 1) ]

where:
SSE = Sum of squares error
n = Sample size
k = Number of independent variables in the model
The Standard Deviation of the
Regression Slope
 The standard error of the regression slope coefficient (b1) is estimated by

sb1 = sε / √Σ(x − x̄)²  =  sε / √[ Σx² − (Σx)²/n ]

where:
sb1 = Estimate of the standard error of the least squares slope
sε = √[ SSE / (n − 2) ] = Sample standard error of the estimate
Excel Output
Regression Statistics
Multiple R          0.76211
R Square            0.58082
Adjusted R Square   0.52842
Standard Error      41.33032
Observations        10

sε = 41.33032
sb1 = 0.03297

ANOVA
            df    SS          MS          F        Significance F
Regression   1    18934.9348  18934.9348  11.0848  0.01039
Residual     8    13665.5652  1708.1957
Total        9    32600.5000

             Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
Intercept    98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
Square Feet  0.10977       0.03297         3.32938  0.01039  0.03374    0.18580
Inference about the Slope:
t Test
 t test for a population slope
  Is there a linear relationship between x and y?
 Null and alternative hypotheses
  H0: β1 = 0 (no linear relationship)
  HA: β1 ≠ 0 (linear relationship does exist)
 Test statistic

t = (b1 − β1) / sb1        (d.f. = n − 2)

where:
b1 = Sample regression slope coefficient
β1 = Hypothesized slope
sb1 = Estimator of the standard error of the slope
Inference about the Slope:
t Test
(continued)

Estimated Regression Equation:

house price = 98.25 + 0.1098 (sq.ft.)

House Price in $1000s (y)    Square Feet (x)
245     1400
312     1600
279     1700
308     1875
199     1100
219     1550
405     2350
324     2450
319     1425
255     1700

The slope of this model is 0.1098
Does square footage of the house affect its sales price?
Inferences about the Slope:
t Test Example
H0: β1 = 0
HA: β1 ≠ 0

From Excel output:
             Coefficients  Standard Error  t Stat   P-value
Intercept    98.24833      58.03348        1.69296  0.12892
Square Feet  0.10977       0.03297         3.32938  0.01039

Test statistic: t = b1 / sb1 = 0.10977 / 0.03297 = 3.329

d.f. = 10 − 2 = 8; with α/2 = .025, the two-tailed critical values are ±2.3060

Decision: Reject H0, since t = 3.329 > 2.3060
Conclusion: There is sufficient evidence that square footage affects house price
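The test in this example reduces to a few lines; the numbers below are taken from the Excel output above:

```python
# Slope t test: t = b1 / s_b1, compared to the two-tailed critical value.
b1, s_b1 = 0.10977, 0.03297   # from the regression output
t_stat = b1 / s_b1
t_crit = 2.3060               # t(.025, 8 d.f.)

print(round(t_stat, 3), abs(t_stat) > t_crit)  # reject H0 if True
```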
Regression Analysis for
Description
Confidence Interval Estimate of the Slope:

b1 ± tα/2 · sb1        (d.f. = n − 2)

Excel Printout for House Prices:

             Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
Intercept    98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
Square Feet  0.10977       0.03297         3.32938  0.01039  0.03374    0.18580

At the 95% level of confidence, the confidence interval for the slope is (0.0337, 0.1858)
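The Excel interval can be reproduced directly from b1 ± tα/2 · sb1:

```python
# 95% confidence interval for the regression slope, d.f. = 8.
b1, s_b1 = 0.10977, 0.03297  # from the regression output
t_crit = 2.3060              # t(.025, 8 d.f.)

lower = b1 - t_crit * s_b1
upper = b1 + t_crit * s_b1
print(round(lower, 4), round(upper, 4))  # ~ (0.0337, 0.1858)
```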
Regression Analysis for
Description
             Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
Intercept    98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
Square Feet  0.10977       0.03297         3.32938  0.01039  0.03374    0.18580

Since the units of the house price variable are $1000s, we are 95% confident that the average impact on sales price is between $33.70 and $185.80 per square foot of house size.

This 95% confidence interval does not include 0.
Conclusion: There is a significant relationship between house price and square feet at the .05 level of significance.
Confidence Interval for
the Average y, Given x
Confidence interval estimate for the mean of y given a particular xp

 Size of interval varies according to distance away from the mean, x̄

ŷ ± tα/2 · sε · √[ 1/n + (xp − x̄)² / Σ(x − x̄)² ]
Confidence Interval for
an Individual y, Given x
Confidence interval estimate for an individual value of y given a particular xp

ŷ ± tα/2 · sε · √[ 1 + 1/n + (xp − x̄)² / Σ(x − x̄)² ]

 The extra 1 under the radical adds to the interval width to reflect the added uncertainty for an individual case
Interval Estimates
for Different Values of x

[Figure: fitted line ŷ = b0 + b1x with two bands around it at x = xp: a narrower confidence interval for the mean of y given xp, and a wider prediction interval for an individual y given xp]
Example: House Prices

Estimated Regression Equation:

house price = 98.25 + 0.1098 (sq.ft.)

House Price in $1000s (y)    Square Feet (x)
245     1400
312     1600
279     1700
308     1875
199     1100
219     1550
405     2350
324     2450
319     1425
255     1700

Predict the price for a house with 2000 square feet
Example: House Prices
(continued)
Predict the price for a house with 2000 square feet:

house price = 98.25 + 0.1098 (sq.ft.)
            = 98.25 + 0.1098(2000)
            = 317.85

The predicted price for a house with 2000 square feet is 317.85($1,000s) = $317,850
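The point prediction is just the fitted equation evaluated at xp = 2000, using the rounded coefficients from the slide:

```python
# Point prediction at 2000 square feet, rounded coefficients from the text.
b0, b1 = 98.25, 0.1098
x_p = 2000

house_price = b0 + b1 * x_p   # in $1000s
print(round(house_price, 2))  # 317.85, i.e. $317,850
```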
Estimation of Mean Values:
Example
Confidence Interval Estimate for E(y)|xp
Find the 95% confidence interval for the average price of 2,000 square-foot houses

Predicted price: ŷ = 317.85 ($1,000s)

ŷ ± tα/2 · sε · √[ 1/n + (xp − x̄)² / Σ(x − x̄)² ] = 317.85 ± 37.12

The confidence interval endpoints are 280.66 and 354.90, or from $280,660 to $354,900
Estimation of Individual Values:
Example
Prediction Interval Estimate for y|xp
Find the 95% prediction interval for an individual house with 2,000 square feet

Predicted price: ŷ = 317.85 ($1,000s)

ŷ ± tα/2 · sε · √[ 1 + 1/n + (xp − x̄)² / Σ(x − x̄)² ] = 317.85 ± 102.28

The prediction interval endpoints are 215.50 and 420.07, or from $215,500 to $420,070
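Both half-widths (±37.12 for the mean, ±102.28 for an individual house) can be recomputed from the sample data; the only outside input is the critical value t(.025, 8) = 2.3060:

```python
# Confidence interval for the mean and prediction interval for an
# individual house at x_p = 2000 sq. ft., from the 10-house sample.
sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]

n = len(sqft)
xbar, ybar = sum(sqft) / n, sum(price) / n
sxx = sum((x - xbar) ** 2 for x in sqft)
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(sqft, price)) / sxx
b0 = ybar - b1 * xbar

sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(sqft, price))
s_eps = (sse / (n - 2)) ** 0.5

x_p = 2000
t_crit = 2.3060  # t(.025, 8 d.f.)

half_width_mean  = t_crit * s_eps * (1 / n + (x_p - xbar) ** 2 / sxx) ** 0.5
half_width_indiv = t_crit * s_eps * (1 + 1 / n + (x_p - xbar) ** 2 / sxx) ** 0.5
print(round(half_width_mean, 2), round(half_width_indiv, 2))  # ~37.12, ~102.28
```

As expected, the interval for an individual house is much wider than the interval for the mean at the same xp.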
