Lecture 07 Regression

Linear Regression

Md. Manowarul Islam


Associate Professor, Dept. of CSE
Jagannath University
Correlation
A correlation is a relationship between two variables. The data can be represented by the ordered pairs (x, y), where x is the independent (or explanatory) variable and y is the dependent (or response) variable.

A scatter plot can be used to determine whether a linear (straight-line) correlation exists between two variables.

Example:

x    1    2    3    4    5
y   –4   –2   –1    0    2

[Scatter plot of the data above.]

Larson & Farber, Elementary Statistics: Picturing the World, 3e
Linear Correlation

[Four scatter plots: negative linear correlation (as x increases, y tends to decrease), positive linear correlation (as x increases, y tends to increase), no correlation, and nonlinear correlation.]
Correlation Coefficient

The correlation coefficient is a measure of the strength and the direction of a linear relationship between two variables. The symbol r represents the sample correlation coefficient. The formula for r is

r = (nΣxy − (Σx)(Σy)) / (√(nΣx² − (Σx)²) · √(nΣy² − (Σy)²))

The range of the correlation coefficient is −1 to 1. If x and y have a strong positive linear correlation, r is close to 1. If x and y have a strong negative linear correlation, r is close to −1. If there is no linear correlation or only a weak linear correlation, r is close to 0.
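As an illustration, the formula can be computed directly from the five sums it contains. A minimal Python sketch (the function name is ours, not from the lecture; the data are from the worked example that follows):

```python
import math

def correlation_coefficient(xs, ys):
    """Sample correlation coefficient r, built from the five sums
    n, Σx, Σy, Σxy, Σx², Σy² exactly as in the formula above."""
    n = len(xs)
    sum_x, sum_y = sum(xs), sum(ys)
    sum_xy = sum(x * y for x, y in zip(xs, ys))
    sum_x2 = sum(x * x for x in xs)
    sum_y2 = sum(y * y for y in ys)
    numerator = n * sum_xy - sum_x * sum_y
    denominator = (math.sqrt(n * sum_x2 - sum_x ** 2)
                   * math.sqrt(n * sum_y2 - sum_y ** 2))
    return numerator / denominator

# Data from the worked example later in this lecture:
r = correlation_coefficient([1, 2, 3, 4, 5], [-3, -1, 0, 1, 2])
print(round(r, 3))  # 0.986
```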
Linear Correlation

[Four scatter plots: strong negative correlation (r = −0.91), strong positive correlation (r = 0.88), weak positive correlation (r = 0.42), and nonlinear correlation (r = 0.07).]
Calculating a Correlation Coefficient

In Words (In Symbols)
1. Find the sum of the x-values. (Σx)
2. Find the sum of the y-values. (Σy)
3. Multiply each x-value by its corresponding y-value and find the sum. (Σxy)
4. Square each x-value and find the sum. (Σx²)
5. Square each y-value and find the sum. (Σy²)
6. Use these five sums to calculate the correlation coefficient:
   r = (nΣxy − (Σx)(Σy)) / (√(nΣx² − (Σx)²) · √(nΣy² − (Σy)²))

Continued.
Correlation Coefficient

Example:
Calculate the correlation coefficient r for the following data.

x    y    xy   x²   y²
1   –3   –3    1    9
2   –1   –2    4    1
3    0    0    9    0
4    1    4   16    1
5    2   10   25    4

Σx = 15, Σy = −1, Σxy = 9, Σx² = 55, Σy² = 15

r = (nΣxy − (Σx)(Σy)) / (√(nΣx² − (Σx)²) · √(nΣy² − (Σy)²))
  = (5(9) − (15)(−1)) / (√(5(55) − 15²) · √(5(15) − (−1)²))
  = 60 / (√50 · √74)
  ≈ 0.986

There is a strong positive linear correlation between x and y.
Correlation Coefficient

Example:
The following data represent the number of hours 12 different students watched television during the weekend and the scores of each student on a test the following Monday.
a.) Display the data in a scatter plot.
b.) Calculate the correlation coefficient r.

Hours, x        0   1   2   3   3   5   5   5   6   7   7  10
Test score, y  96  85  82  74  95  68  76  84  58  65  75  50

Continued.
Correlation Coefficient

Example continued:

Hours, x        0   1   2   3   3   5   5   5   6   7   7  10
Test score, y  96  85  82  74  95  68  76  84  58  65  75  50

[Scatter plot: hours watching TV on the x-axis (0–10) versus test score on the y-axis (0–100); the points trend downward.]

Continued.
Correlation Coefficient

Example continued:

Hours, x          0     1     2     3     3     5     5     5     6     7     7    10
Test score, y    96    85    82    74    95    68    76    84    58    65    75    50
xy                0    85   164   222   285   340   380   420   348   455   525   500
x²                0     1     4     9     9    25    25    25    36    49    49   100
y²             9216  7225  6724  5476  9025  4624  5776  7056  3364  4225  5625  2500

Σx = 54, Σy = 908, Σxy = 3724, Σx² = 332, Σy² = 70836

r = (12(3724) − (54)(908)) / (√(12(332) − 54²) · √(12(70836) − 908²)) ≈ −0.831

There is a strong negative linear correlation. As the number of hours spent watching TV increases, the test scores tend to decrease.
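The same arithmetic, carried out from the five sums in the table (a sketch; the variable names are ours):

```python
import math

# The five sums from the table above
n, sum_x, sum_y = 12, 54, 908
sum_xy, sum_x2, sum_y2 = 3724, 332, 70836

numerator = n * sum_xy - sum_x * sum_y      # 12(3724) - (54)(908) = -4344
denominator = (math.sqrt(n * sum_x2 - sum_x ** 2)
               * math.sqrt(n * sum_y2 - sum_y ** 2))
r = numerator / denominator
print(round(r, 3))  # -0.831
```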
Linear Regression

Residuals
After verifying that the linear correlation between two variables is significant, we next determine the equation of the line that can be used to predict the value of y for a given value of x.

For a given x-value,
d = (observed y-value) − (predicted y-value)

[Figure: a fitted line with data points above and below it; the vertical distances d₁, d₂, d₃ between each observed y-value and the corresponding predicted y-value on the line are marked.]

Each difference dᵢ between the observed y-value and the predicted y-value for a given x-value is called a residual.


Regression Line

A regression line, also called a line of best fit, is the line for which the sum of the squares of the residuals is a minimum.

The Equation of a Regression Line
The equation of a regression line for an independent variable x and a dependent variable y is

ŷ = mx + b

where ŷ is the predicted y-value for a given x-value. The slope m and y-intercept b are given by

m = (nΣxy − (Σx)(Σy)) / (nΣx² − (Σx)²)   and   b = ȳ − mx̄ = Σy/n − m(Σx/n)

where ȳ is the mean of the y-values and x̄ is the mean of the x-values. The regression line always passes through (x̄, ȳ).
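The slope and intercept formulas translate directly into code. A minimal sketch (the function name is illustrative, not from the lecture; tested on the small data set used in the examples):

```python
def regression_line(xs, ys):
    """Least-squares slope m and intercept b for the line y-hat = m*x + b."""
    n = len(xs)
    sum_x, sum_y = sum(xs), sum(ys)
    sum_xy = sum(x * y for x, y in zip(xs, ys))
    sum_x2 = sum(x * x for x in xs)
    m = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    b = sum_y / n - m * sum_x / n   # b = y-bar - m * x-bar
    return m, b

m, b = regression_line([1, 2, 3, 4, 5], [-3, -1, 0, 1, 2])
print(round(m, 1), round(b, 1))  # 1.2 -3.8
```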


Regression Line

Example:
Find the equation of the regression line for the following data.

x    y    xy   x²   y²
1   –3   –3    1    9
2   –1   –2    4    1
3    0    0    9    0
4    1    4   16    1
5    2   10   25    4

Σx = 15, Σy = −1, Σxy = 9, Σx² = 55, Σy² = 15

m = (nΣxy − (Σx)(Σy)) / (nΣx² − (Σx)²) = (5(9) − (15)(−1)) / (5(55) − 15²) = 60/50 = 1.2

Continued.
Regression Line

Example continued:

b = ȳ − mx̄ = (−1)/5 − (1.2)(15/5) = −3.8

The equation of the regression line is ŷ = 1.2x − 3.8.

[Figure: scatter plot of the data with the regression line; the line passes through (x̄, ȳ) = (3, −1/5).]


Regression Line

Example:
The following data represent the number of hours 12 different students watched television during the weekend and the scores of each student on a test the following Monday.
a.) Find the equation of the regression line.
b.) Use the equation to find the expected test score for a student who watches 9 hours of TV.

Hours, x          0     1     2     3     3     5     5     5     6     7     7    10
Test score, y    96    85    82    74    95    68    76    84    58    65    75    50
xy                0    85   164   222   285   340   380   420   348   455   525   500
x²                0     1     4     9     9    25    25    25    36    49    49   100
y²             9216  7225  6724  5476  9025  4624  5776  7056  3364  4225  5625  2500

Σx = 54, Σy = 908, Σxy = 3724, Σx² = 332, Σy² = 70836
Regression Line

Example continued:

m = (nΣxy − (Σx)(Σy)) / (nΣx² − (Σx)²) = (12(3724) − (54)(908)) / (12(332) − 54²) ≈ −4.067

b = ȳ − mx̄ = 908/12 − (−4.067)(54/12) ≈ 93.97

The equation of the regression line is ŷ = −4.07x + 93.97.

[Figure: scatter plot of hours watching TV versus test score with the regression line; the line passes through (x̄, ȳ) = (4.5, 75.7).]

Continued.
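The same slope and intercept can be computed from the five sums in the table (a sketch; the variable names are ours):

```python
# The five sums from the table in this example
n, sum_x, sum_y, sum_xy, sum_x2 = 12, 54, 908, 3724, 332

m = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)   # -4344 / 1068
b = sum_y / n - m * sum_x / n                                  # y-bar - m * x-bar
print(round(m, 2), round(b, 2))  # -4.07 93.97
```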
Regression Line

Example continued:
Using the equation ŷ = −4.07x + 93.97, we can predict the test score for a student who watches 9 hours of TV.

ŷ = −4.07x + 93.97 = −4.07(9) + 93.97 = 57.34

A student who watches 9 hours of TV over the weekend can expect to score about 57.34 on Monday's test.


Multiple Regression Equation

In many instances, a better prediction can be found for a dependent (response) variable by using more than one independent (explanatory) variable. For example, a more accurate prediction of Monday's test grade from the previous section might be made by considering the number of other classes a student is taking as well as the student's previous knowledge of the test material.

A multiple regression equation has the form

ŷ = b + m1x1 + m2x2 + m3x3 + … + mkxk

where x1, x2, x3, …, xk are independent variables, b is the y-intercept, and y is the dependent variable.

* Because the mathematics associated with this concept is complicated, technology is generally used to calculate the multiple regression equation.
Predicting y-Values

After finding the equation of the multiple regression line, you can use the equation to predict y-values over the range of the data.

Example:
The following multiple regression equation can be used to predict the annual U.S. rice yield (in pounds).

ŷ = 859 + 5.76x1 + 3.82x2

where x1 is the number of acres planted (in thousands) and x2 is the number of acres harvested (in thousands). (Source: U.S. National Agricultural Statistics Service)

a.) Predict the annual rice yield when x1 = 2758 and x2 = 2714.
b.) Predict the annual rice yield when x1 = 3581 and x2 = 3021.

Continued.
Predicting y-Values

Example continued:

a.) ŷ = 859 + 5.76x1 + 3.82x2
       = 859 + 5.76(2758) + 3.82(2714)
       = 27,112.56
The predicted annual rice yield is 27,112.56 pounds.

b.) ŷ = 859 + 5.76x1 + 3.82x2
       = 859 + 5.76(3581) + 3.82(3021)
       = 33,025.78
The predicted annual rice yield is 33,025.78 pounds.
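The two predictions above can be reproduced with a small helper (the function name is ours; the coefficients come from the example's equation):

```python
def predict_rice_yield(x1, x2):
    """Evaluate y-hat = 859 + 5.76*x1 + 3.82*x2, where x1 is acres planted
    and x2 is acres harvested (both in thousands)."""
    return 859 + 5.76 * x1 + 3.82 * x2

print(round(predict_rice_yield(2758, 2714), 2))  # 27112.56
print(round(predict_rice_yield(3581, 3021), 2))  # 33025.78
```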
