
Chapter 13

Simple Linear Regression Analysis

McGraw-Hill/Irwin
The Goal

This chapter discusses methods for

1. Measuring the linear correlation between two variables
2. Describing a linear relationship between two variables with a linear equation
3. Making predictions with a linear regression model
4. Describing the usefulness of a linear regression model
13-2
Different Values of the Correlation
Coefficient

13-3
Measuring the Linear Relationship: The Simple Correlation Coefficient

r = \frac{n \sum_{i=1}^{n} x_i y_i - \left(\sum_{i=1}^{n} x_i\right)\left(\sum_{i=1}^{n} y_i\right)}{\sqrt{n \sum_{i=1}^{n} x_i^2 - \left(\sum_{i=1}^{n} x_i\right)^2}\;\sqrt{n \sum_{i=1}^{n} y_i^2 - \left(\sum_{i=1}^{n} y_i\right)^2}}
The linear correlation coefficient (or simple correlation coefficient) r is a numerical measure of the strength of the linear relationship between two variables representing quantitative data.

13-4
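As a quick illustration, the following Python sketch evaluates this formula directly from the sums. The data are a small hypothetical sample chosen only for the example; the variable names are not from the slides.

import math

# Hypothetical sample data -- illustrative only
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

sum_x, sum_y = sum(x), sum(y)
sum_xy = sum(xi * yi for xi, yi in zip(x, y))
sum_x2 = sum(xi ** 2 for xi in x)
sum_y2 = sum(yi ** 2 for yi in y)

# r = (n*Sum(xy) - Sum(x)*Sum(y)) / (sqrt(n*Sum(x^2) - (Sum(x))^2) * sqrt(n*Sum(y^2) - (Sum(y))^2))
r = (n * sum_xy - sum_x * sum_y) / (
    math.sqrt(n * sum_x2 - sum_x ** 2) * math.sqrt(n * sum_y2 - sum_y ** 2)
)
print(r)  # close to +1, i.e. a strong positive linear relationship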
Interpreting the Correlation Coefficient r

• If r > 0, we say the two variables are positively correlated; if r < 0, we say they are negatively correlated.
• If the absolute value of r is ≥ 0.8, we say the linear relationship is strong.
• If the absolute value of r is below 0.8 but ≥ 0.5, we say the linear relationship is moderate.
• If the absolute value of r is below 0.5, we say the linear relationship is weak.
13-5
Properties of the Correlation
Coefficient r
1. The value of r is always between −1 and +1.

2. The value of r does not change if all values of either variable are converted to a different scale.

3. The value of r is not affected by the choice of x or y. Interchange all x and y values and the value of r will not change.

4. r measures the strength of a linear relationship. It is not designed to measure the strength of a relationship that is not linear.
13-6
The Simple Linear Regression Model
and the Least Squares Point Estimates
• The dependent (or response) variable is the
variable we wish to understand or predict,
denoted by Y
• The independent (or predictor or explanatory)
variable is the variable we will use to
understand or predict the dependent variable,
denoted by X
• Regression analysis is a statistical technique
that uses observed data to relate the dependent
variable to one or more independent variables

13-7
Form of The Simple Linear
Regression Model
• Y = β0 + β1X + ε
• β0 + β1X is the mean value of the dependent
variable Y when the value of the independent
variable is X. The mean is in the form of a
linear function. The mean determines the
overall trend of the relationship between X
and Y.
• β0 is the y-intercept, the mean of Y when X is 0; β1 is the slope, the change in the mean of Y per unit change in X
• ε is an error term that describes the effect on Y of all factors other than X
• ŷ = b0 + b1x, where ŷ is the estimate of the mean value of Y
13-8
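To make the roles of β0, β1, and ε concrete, here is a minimal simulation sketch in Python; the parameter values and the normal error spread are arbitrary assumptions chosen only for illustration.

import random

# Minimal simulation of Y = beta0 + beta1*X + eps (illustrative values only)
beta0, beta1 = 5.0, 2.0                  # assumed intercept and slope
xs = [float(i) for i in range(10)]
ys = [beta0 + beta1 * x + random.gauss(0, 1.0) for x in xs]   # eps ~ N(0, 1)
# The mean of Y at a given x is beta0 + beta1*x; the error term eps scatters
# the observed y values around that line.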
The Least Squares Estimation
Method

13-9
The Simple Linear Regression Model
Illustrated

13-10
The Least Squares Point Estimates

• Estimation/prediction equation
ŷ = b0 + b1x
• Least squares point estimate of the
slope β1
b_1 = \frac{SS_{xy}}{SS_{xx}}

SS_{xy} = \sum (x_i - \bar{x})(y_i - \bar{y}) = \sum x_i y_i - \frac{\left(\sum x_i\right)\left(\sum y_i\right)}{n}

SS_{xx} = \sum (x_i - \bar{x})^2 = \sum x_i^2 - \frac{\left(\sum x_i\right)^2}{n}
13-11
The Least Squares Point Estimates
Continued

• Least squares point estimate of the y-intercept β0

b_0 = \bar{y} - b_1 \bar{x}, \quad \text{where } \bar{y} = \frac{\sum y_i}{n} \text{ and } \bar{x} = \frac{\sum x_i}{n}
13-12
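A short Python sketch of the least squares point estimates, using the SS_xy and SS_xx formulas from the two slides above; the sample data are the same hypothetical values used in the earlier sketch.

# Least squares point estimates b1 = SS_xy / SS_xx and b0 = y_bar - b1*x_bar
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

x_bar = sum(x) / n
y_bar = sum(y) / n

ss_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
ss_xx = sum((xi - x_bar) ** 2 for xi in x)

b1 = ss_xy / ss_xx          # slope estimate
b0 = y_bar - b1 * x_bar     # intercept estimate
print(b0, b1)               # estimated regression line: y-hat = b0 + b1*x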
Testing the Significance of the Slope

• A regression model is not likely to be useful unless there is a significant relationship between x and y
• To test significance, we use the null
hypothesis:

H0: β1 = 0

• Versus the alternative hypothesis:

Ha: β1 ≠ 0

13-13
Testing the Significance of the Slope #2

Alternative: Ha: β1 ≠ 0
Reject H0 if: |t| > t_{α/2}, that is, t > t_{α/2} or t < −t_{α/2}
p-value: twice the area under the t distribution to the right of |t|

t = \frac{b_1}{s_{b_1}}, \quad \text{where } s_{b_1} = \frac{s}{\sqrt{SS_{xx}}}

t, t_{α/2}, and the p-value are based on n − 2 degrees of freedom

13-14
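The sketch below carries out this t test on the hypothetical sample used earlier. It assumes SciPy is available for the t-distribution tail area, and it computes the standard error of the regression as s = sqrt(SSE/(n − 2)), which is the usual definition but is stated here as an assumption since the slides do not define s.

from scipy import stats  # assumes SciPy is available for t-distribution tail areas

# Hypothetical sample (same illustrative data as the earlier sketches)
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
ss_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
ss_xx = sum((xi - x_bar) ** 2 for xi in x)
b1 = ss_xy / ss_xx
b0 = y_bar - b1 * x_bar

# Test H0: beta1 = 0 against Ha: beta1 != 0
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
s = (sse / (n - 2)) ** 0.5        # standard error of the regression (assumed definition)
s_b1 = s / ss_xx ** 0.5           # standard error of the slope estimate
t_stat = b1 / s_b1
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)   # based on n - 2 degrees of freedom
print(t_stat, p_value)            # reject H0 when p_value < alpha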
The Simple Coefficient of
Determination and Correlation
• How useful is a particular regression model?
• One measure of usefulness is the simple
coefficient of determination
• It is represented by the symbol r², because it is actually equal to the square of the (simple) correlation coefficient, which is denoted by r.
• It is interpreted as the percentage of the variation in Y that can be explained by the linear regression line b0 + b1x.

13-15
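As a sketch, r² can be computed either as the square of r or as the ratio of explained to total variation; the example below uses the ratio form on the same hypothetical data as the earlier sketches.

# Simple coefficient of determination r^2 = explained variation / total variation
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar

y_hat = [b0 + b1 * xi for xi in x]
total_variation = sum((yi - y_bar) ** 2 for yi in y)            # total sum of squares
explained_variation = sum((yh - y_bar) ** 2 for yh in y_hat)    # explained sum of squares
r_squared = explained_variation / total_variation
print(r_squared)   # proportion of the variation in y explained by the fitted line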
Prediction
• To estimate the mean value of Y for X = x0, one just needs to plug x0 into the regression line formula and calculate the estimate of Y as b0 + b1x0. We usually denote the estimated mean value of Y from the regression line by ŷ = b0 + b1x0 and call ŷ the fitted value for X = x0.

13-16
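For example, with illustrative estimates b0 ≈ 0.05 and b1 ≈ 1.99 (from the hypothetical sample in the earlier sketches), the fitted value at x0 = 3.5 would be computed as follows.

# Fitted value for a new x0, using illustrative estimates of b0 and b1
b0, b1 = 0.05, 1.99        # hypothetical values from the earlier sketch
x0 = 3.5
y_hat = b0 + b1 * x0       # estimated mean of Y when X = x0 (the fitted value)
print(y_hat)               # approximately 7.0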
