
Correlation and Regression
(Larson & Farber, Elementary Statistics: Picturing the World, 3e)

Correlation
A correlation is a relationship between two variables. The data
can be represented by the ordered pairs (x, y) where x is the
independent (or explanatory) variable, and y is the dependent
(or response) variable.
A scatter plot can be used to determine whether a linear (straight line) correlation exists between two variables.

Example:
x:  1   2   3   4   5
y: –4  –2  –1   0   2

[Scatter plot of the example data]
Linear Correlation
[Four scatter plots illustrating types of correlation:]
Negative linear correlation: as x increases, y tends to decrease.
Positive linear correlation: as x increases, y tends to increase.
No correlation.
Nonlinear correlation.
Correlation Coefficient
The correlation coefficient is a measure of the strength and the
direction of a linear relationship between two variables. The
symbol r represents the sample correlation coefficient. The
formula for r is
r = \frac{n\sum xy - (\sum x)(\sum y)}{\sqrt{n\sum x^2 - (\sum x)^2}\,\sqrt{n\sum y^2 - (\sum y)^2}}
The range of the correlation coefficient is –1 to 1. If x and y have a strong positive linear correlation, r is close to 1. If x and y have a strong negative linear correlation, r is close to –1. If there is no linear correlation or a weak linear correlation, r is close to 0.

Linear Correlation
[Four scatter plots with sample correlation coefficients:]
r = –0.91: strong negative correlation
r = 0.88: strong positive correlation
r = 0.42: weak positive correlation
r = 0.07: nonlinear correlation
Calculating a Correlation Coefficient

1. Find the sum of the x-values: Σx.
2. Find the sum of the y-values: Σy.
3. Multiply each x-value by its corresponding y-value and find the sum: Σxy.
4. Square each x-value and find the sum: Σx².
5. Square each y-value and find the sum: Σy².
6. Use these five sums to calculate the correlation coefficient:

r = \frac{n\sum xy - (\sum x)(\sum y)}{\sqrt{n\sum x^2 - (\sum x)^2}\,\sqrt{n\sum y^2 - (\sum y)^2}}
Correlation Coefficient
Example:
Calculate the correlation coefficient r for the following data.
x     y     xy    x²    y²
1     –3    –3     1     9
2     –1    –2     4     1
3      0     0     9     0
4      1     4    16     1
5      2    10    25     4

Σx = 15,  Σy = –1,  Σxy = 9,  Σx² = 55,  Σy² = 15

r = \frac{n\sum xy - (\sum x)(\sum y)}{\sqrt{n\sum x^2 - (\sum x)^2}\,\sqrt{n\sum y^2 - (\sum y)^2}} = \frac{5(9) - (15)(-1)}{\sqrt{5(55) - 15^2}\,\sqrt{5(15) - (-1)^2}} = \frac{60}{\sqrt{50}\,\sqrt{74}} \approx 0.986

There is a strong positive linear correlation between x and y.
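The calculation above is easy to check with a short script. The following is a minimal Python sketch of the five-sums formula for r; the function name pearson_r and the use of plain lists are illustrative choices, not part of the original slides.

```python
from math import sqrt

def pearson_r(x, y):
    """Sample correlation coefficient from the five sums used on the slides."""
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi ** 2 for xi in x)
    sum_y2 = sum(yi ** 2 for yi in y)
    numerator = n * sum_xy - sum_x * sum_y
    denominator = sqrt(n * sum_x2 - sum_x ** 2) * sqrt(n * sum_y2 - sum_y ** 2)
    return numerator / denominator

# Example data from the slide: r should be about 0.986.
x = [1, 2, 3, 4, 5]
y = [-3, -1, 0, 1, 2]
print(round(pearson_r(x, y), 3))  # 0.986
```

Applied to the hours/test-score data introduced on the following slides, the same function returns approximately –0.831.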
Correlation Coefficient

Example:
The following data represents the number of hours 12 different students watched television during the weekend and the scores of each student who took a test the following Monday. Calculate the correlation coefficient r and determine the strength of the correlation.

Hours, x:       0   1   2   3   3   5   5   5   6   7   7  10
Test score, y: 96  85  82  74  95  68  76  84  58  65  75  50
Correlation Coefficient
Example continued:

Hours, x:       0   1   2   3   3   5   5   5   6   7   7  10
Test score, y: 96  85  82  74  95  68  76  84  58  65  75  50

[Scatter plot of the data: hours watching TV on the x-axis (0–10), test score on the y-axis (0–100)]
Correlation Coefficient
Example continued:

Hours, x:       0     1     2     3     3     5     5     5     6     7     7    10
Test score, y: 96    85    82    74    95    68    76    84    58    65    75    50
xy:             0    85   164   222   285   340   380   420   348   455   525   500
x²:             0     1     4     9     9    25    25    25    36    49    49   100
y²:          9216  7225  6724  5476  9025  4624  5776  7056  3364  4225  5625  2500

Σx = 54,  Σy = 908,  Σxy = 3724,  Σx² = 332,  Σy² = 70836

r = \frac{n\sum xy - (\sum x)(\sum y)}{\sqrt{n\sum x^2 - (\sum x)^2}\,\sqrt{n\sum y^2 - (\sum y)^2}} = \frac{12(3724) - (54)(908)}{\sqrt{12(332) - 54^2}\,\sqrt{12(70836) - 908^2}} \approx -0.831

There is a strong negative linear correlation. As the number of hours spent watching TV increases, the test scores tend to decrease.
Testing a Population Correlation Coefficient ρ


Review: Determine the strength of the linear correlation given the following correlation coefficients r.

1.) r = -0.775      3.) r = 0.455
2.) r = 0.651       4.) r = -0.333


Testing a Population Correlation Coefficient
Once the sample correlation coefficient r has been calculated, we need
to determine whether there is enough evidence to decide that the
population correlation coefficient ρ is significant at a specified level of
significance.
One way to determine this is to use Table 11 in Appendix B.
If |r| is greater than the critical value, there is enough evidence to
decide that the correlation coefficient ρ is significant.

n     α = 0.05   α = 0.01
4      0.950      0.990
5      0.878      0.959
6      0.811      0.917
7      0.754      0.875

For a sample of size n = 6, ρ is significant at the 5% significance level if |r| > 0.811.


Testing a Population Correlation Coefficient
Using Table 11 to Test the Correlation Coefficient ρ

1. Determine the number of pairs of data in the sample. (Determine n.)
2. Specify the level of significance. (Identify α.)
3. Find the critical value. (Use Table 11 in Appendix B.)
4. Decide whether the correlation is significant. (If |r| > critical value, the correlation is significant. Otherwise, there is not enough evidence to conclude that the correlation is significant.)
5. Interpret the decision in the context of the original claim.
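The lookup procedure above can be sketched in a few lines of Python. The dictionary below contains only the Table 11 rows reproduced on these slides, and the names CRITICAL_VALUES and is_significant are illustrative, not part of the text.

```python
# Critical values for the Pearson correlation coefficient (Appendix B, Table 11),
# limited to the rows shown on these slides: {n: (value for alpha=0.05, value for alpha=0.01)}.
CRITICAL_VALUES = {
    4: (0.950, 0.990),
    5: (0.878, 0.959),
    6: (0.811, 0.917),
    7: (0.754, 0.875),
    10: (0.632, 0.765),
    11: (0.602, 0.735),
    12: (0.576, 0.708),
    13: (0.553, 0.684),
}

def is_significant(r, n, alpha=0.05):
    """Return True if |r| exceeds the Table 11 critical value for n and alpha."""
    col = {0.05: 0, 0.01: 1}[alpha]
    return abs(r) > CRITICAL_VALUES[n][col]

print(is_significant(0.85, n=6, alpha=0.05))     # True, since 0.85 > 0.811
print(is_significant(-0.831, n=12, alpha=0.01))  # True, since 0.831 > 0.708
```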


Example:
The following data represent the ages (in years) of 8 different persons and their corresponding weights (in kilograms). Calculate the correlation coefficient r, determine the strength of the correlation, and test whether the relationship is significant at the 5% level of significance.

Age, x (years):        12  13  14  16  16  17  17  18
Weight, y (kilograms): 40  42  38  35  45  51  48  48


The table below shows the heights of a father and his eldest son, in inches. Calculate the correlation coefficient r and determine the strength of the correlation. Is the relationship significant at the 1% level of significance?

Height of father, x:      69  72  68  67  68  63  68  66  70  72  65  60
Height of eldest son, y:  71  75  69  69  65  60  66  63  68  70  60  58
Testing a Population Correlation Coefficient

Example:
The following data represents the number of hours 12 different students watched television during the weekend and the scores of each student who took a test the following Monday. The correlation coefficient is r ≈ –0.831.

Hours, x:       0   1   2   3   3   5   5   5   6   7   7  10
Test score, y: 96  85  82  74  95  68  76  84  58  65  75  50

Is the correlation coefficient significant at α = 0.01?

Example continued:
r ≈ –0.831,  n = 12,  α = 0.01

Appendix B, Table 11:
n     α = 0.05   α = 0.01
4      0.950      0.990
5      0.878      0.959
6      0.811      0.917
…
10     0.632      0.765
11     0.602      0.735
12     0.576      0.708
13     0.553      0.684

For n = 12 and α = 0.01, the critical value is 0.708, and |r| ≈ 0.831 > 0.708.

Because |r| is greater than the critical value, the population correlation is significant: there is enough evidence at the 1% level of significance to conclude that there is a significant linear correlation between the number of hours of television watched during the weekend and the scores of each student who took a test the following Monday.
Hypothesis Testing for ρ

The t-Test for the Correlation Coefficient

A t-test can be used to test whether the correlation between two variables is significant. The test statistic is r, and the standardized test statistic

t = \frac{r}{\sigma_r} = \frac{r}{\sqrt{\dfrac{1 - r^2}{n - 2}}}

follows a t-distribution with n – 2 degrees of freedom.

In this text, only two-tailed hypothesis tests for ρ are considered.


Hypothesis Testing for ρ
Using the t-Test for the Correlation Coefficient ρ

1. State the null and alternative hypotheses. (State H0 and Ha.)
2. Specify the level of significance. (Identify α.)
3. Identify the degrees of freedom. (d.f. = n – 2)
4. Determine the critical value(s) and rejection region(s). (Use Table 5 in Appendix B.)


Hypothesis Testing for ρ
Using the t-Test for the Correlation Coefficient ρ

5. Find the standardized test statistic: t = \dfrac{r}{\sqrt{\dfrac{1 - r^2}{n - 2}}}.
6. Make a decision to reject or fail to reject the null hypothesis. (If t is in the rejection region, reject H0; otherwise, fail to reject H0.)
7. Interpret the decision in the context of the original claim.


Hypothesis Testing for ρ
Example:
The following data represents the number of hours 12 different students watched television during the weekend and the scores of each student who took a test the following Monday. The correlation coefficient is r ≈ –0.831.

Hours, x:       0   1   2   3   3   5   5   5   6   7   7  10
Test score, y: 96  85  82  74  95  68  76  84  58  65  75  50

Test the significance of this correlation coefficient at α = 0.01.
Hypothesis Testing for ρ
Example continued:
H0: ρ = 0 (no correlation)     Ha: ρ ≠ 0 (significant correlation)

The level of significance is α = 0.01.
The degrees of freedom are d.f. = 12 – 2 = 10.
The critical values are –t0 = –3.169 and t0 = 3.169.
The standardized test statistic is

t = \frac{r}{\sqrt{\dfrac{1 - r^2}{n - 2}}} = \frac{-0.831}{\sqrt{\dfrac{1 - (-0.831)^2}{12 - 2}}} \approx -4.72.

The test statistic falls in the rejection region (t < –3.169), so H0 is rejected.

At the 1% level of significance, there is enough evidence to conclude that there is a significant linear correlation between the number of hours of TV watched over the weekend and the test scores on Monday morning.
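The same decision can be reproduced numerically. The sketch below assumes SciPy is available for the t critical value; the function name correlation_t_test is an illustrative choice, not part of the slides.

```python
from math import sqrt
from scipy.stats import t as t_dist

def correlation_t_test(r, n, alpha):
    """Two-tailed t-test for rho = 0, using the standardized statistic from the slides."""
    df = n - 2
    t_stat = r / sqrt((1 - r ** 2) / df)    # standardized test statistic
    t_crit = t_dist.ppf(1 - alpha / 2, df)  # two-tailed critical value from the t-distribution
    reject = abs(t_stat) > t_crit
    return t_stat, t_crit, reject

t_stat, t_crit, reject = correlation_t_test(r=-0.831, n=12, alpha=0.01)
print(round(t_stat, 2), round(t_crit, 3), reject)  # -4.72 3.169 True
```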
Correlation and Causation
The fact that two variables are strongly correlated does not in
itself imply a cause-and-effect relationship between the variables.

If there is a significant correlation between two variables, you should consider the following possibilities.

1. Is there a direct cause-and-effect relationship between the variables? Does x cause y?
2. Is there a reverse cause-and-effect relationship between the variables? Does y cause x?
3. Is it possible that the relationship between the variables can be caused by a third variable or by a combination of several other variables?
4. Is it possible that the relationship between the two variables may be a coincidence?


§ 9.2
Linear Regression
Residuals
After verifying that the linear correlation between two variables is significant, the next step is to determine the equation of the line that can be used to predict the value of y for a given value of x.

For a given x-value,
d = (observed y-value) – (predicted y-value).

[Scatter plot with a fitted line, showing the vertical distances d1, d2, d3 between observed y-values and predicted y-values]

Each di represents the difference between the observed y-value and the predicted y-value for a given x-value on the line. These differences are called residuals.
Regression Line
A regression line, also called a line of best fit, is the line for
which the sum of the squares of the residuals is a minimum.

The Equation of a Regression Line

The equation of a regression line for an independent variable x and a dependent variable y is

ŷ = mx + b

where ŷ is the predicted y-value for a given x-value. The slope m and y-intercept b are given by

m = \frac{n\sum xy - (\sum x)(\sum y)}{n\sum x^2 - (\sum x)^2}   and   b = \bar{y} - m\bar{x} = \frac{\sum y}{n} - m\,\frac{\sum x}{n}

where \bar{y} is the mean of the y-values and \bar{x} is the mean of the x-values. The regression line always passes through the point (\bar{x}, \bar{y}).


Regression Line
Example:
Find the equation of the regression line.
x     y     xy    x²    y²
1     –3    –3     1     9
2     –1    –2     4     1
3      0     0     9     0
4      1     4    16     1
5      2    10    25     4

Σx = 15,  Σy = –1,  Σxy = 9,  Σx² = 55,  Σy² = 15

m = \frac{n\sum xy - (\sum x)(\sum y)}{n\sum x^2 - (\sum x)^2} = \frac{5(9) - (15)(-1)}{5(55) - 15^2} = \frac{60}{50} = 1.2

b = \bar{y} - m\bar{x} = \frac{-1}{5} - (1.2)\frac{15}{5} = -3.8

The equation of the regression line is ŷ = 1.2x – 3.8.

[Scatter plot of the data with the regression line, which passes through the point (\bar{x}, \bar{y}) = (3, –1/5)]
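A minimal Python sketch of the slope and intercept formulas above (the function name regression_line is an illustrative choice, not part of the slides):

```python
def regression_line(x, y):
    """Return the slope m and intercept b of the least-squares line y-hat = m*x + b."""
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi ** 2 for xi in x)
    m = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    b = sum_y / n - m * sum_x / n  # b = (mean of y) - m * (mean of x)
    return m, b

m, b = regression_line([1, 2, 3, 4, 5], [-3, -1, 0, 1, 2])
print(round(m, 2), round(b, 2))  # 1.2 -3.8
```

Applied to the hours/test-score data used earlier, the same function gives m ≈ –4.067 and b ≈ 93.97, the line used on the following slides.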


Regression Line
Example:
The following data represents the number of hours 12 different students watched television during the weekend and the scores of each student who took a test the following Monday.

a.) Find the equation of the regression line.
b.) Use the equation to find the expected test score for a student who watches 9 hours of TV.

Hours, x:       0     1     2     3     3     5     5     5     6     7     7    10
Test score, y: 96    85    82    74    95    68    76    84    58    65    75    50
xy:             0    85   164   222   285   340   380   420   348   455   525   500
x²:             0     1     4     9     9    25    25    25    36    49    49   100
y²:          9216  7225  6724  5476  9025  4624  5776  7056  3364  4225  5625  2500

Σx = 54,  Σy = 908,  Σxy = 3724,  Σx² = 332,  Σy² = 70836
Regression Line
Example continued:

m = \frac{n\sum xy - (\sum x)(\sum y)}{n\sum x^2 - (\sum x)^2} = \frac{12(3724) - (54)(908)}{12(332) - 54^2} \approx -4.067

b = \bar{y} - m\bar{x} = \frac{908}{12} - (-4.067)\frac{54}{12} \approx 93.97

The equation of the regression line is ŷ = –4.07x + 93.97.

[Scatter plot of hours watching TV (x) versus test score (y) with the regression line, which passes through the point (\bar{x}, \bar{y}) = (54/12, 908/12) ≈ (4.5, 75.7)]
Regression Line
Example continued:
Using the equation ŷ = –4.07x + 93.97, we can predict the test
score for a student who watches 9 hours of TV.

ŷ = –4.07x + 93.97
= –4.07(9) + 93.97
= 57.34

A student who watches 9 hours of TV over the weekend can expect to receive about a 57.34 on Monday’s test.
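As a quick check of this prediction, the short sketch below recomputes the regression line from the raw data in plain Python (no external libraries); the small difference from 57.34 comes from the slide rounding the slope to –4.07.

```python
hours = [0, 1, 2, 3, 3, 5, 5, 5, 6, 7, 7, 10]
scores = [96, 85, 82, 74, 95, 68, 76, 84, 58, 65, 75, 50]

n = len(hours)
sum_x, sum_y = sum(hours), sum(scores)
sum_xy = sum(x * y for x, y in zip(hours, scores))
sum_x2 = sum(x ** 2 for x in hours)

m = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)  # about -4.067
b = sum_y / n - m * sum_x / n                                 # about 93.97

print(round(m * 9 + b, 2))  # about 57.36 (the slide's 57.34 uses m rounded to -4.07)
```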


§ 9.3
Measures of Regression
and Prediction Intervals
Variation About a Regression Line
To find the total variation, you must first calculate the total
deviation, the explained deviation, and the unexplained
deviation.
Total deviation = yi – ȳ
Explained deviation = ŷi – ȳ
Unexplained deviation = yi – ŷi

[Diagram: for a data point (xi, yi), the total deviation yi – ȳ is split into the explained deviation ŷi – ȳ and the unexplained deviation yi – ŷi, measured about the regression line and the mean ȳ]
Variation About a Regression Line
The total variation about a regression line is the sum of the squares of the differences between the y-value of each ordered pair and the mean of y.

Total variation = \sum (y_i - \bar{y})^2

The explained variation is the sum of the squares of the differences between each predicted y-value and the mean of y.

Explained variation = \sum (\hat{y}_i - \bar{y})^2

The unexplained variation is the sum of the squares of the differences between the y-value of each ordered pair and each corresponding predicted y-value.

Unexplained variation = \sum (y_i - \hat{y}_i)^2

Total variation = Explained variation + Unexplained variation


Coefficient of Determination
The coefficient of determination r² is the ratio of the explained variation to the total variation. That is,

r^2 = \frac{\text{Explained variation}}{\text{Total variation}}

Example:
The correlation coefficient for the data that represents the number of hours students watched television and the test scores of each student is r ≈ –0.831. Find the coefficient of determination.

r² = (–0.831)² ≈ 0.691

About 69.1% of the variation in the test scores can be explained by the variation in the hours of TV watched. About 30.9% of the variation is unexplained.
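The definition can also be checked directly from the data: compute the explained and total variation about the regression line and take their ratio. The sketch below is a plain-Python illustration; the variable names are my own.

```python
hours = [0, 1, 2, 3, 3, 5, 5, 5, 6, 7, 7, 10]
scores = [96, 85, 82, 74, 95, 68, 76, 84, 58, 65, 75, 50]

n = len(hours)
x_bar, y_bar = sum(hours) / n, sum(scores) / n

# Least-squares line, using the same formulas as in the regression section.
m = (n * sum(x * y for x, y in zip(hours, scores)) - sum(hours) * sum(scores)) / (
    n * sum(x ** 2 for x in hours) - sum(hours) ** 2)
b = y_bar - m * x_bar
predicted = [m * x + b for x in hours]

explained = sum((yh - y_bar) ** 2 for yh in predicted)  # explained variation
total = sum((y - y_bar) ** 2 for y in scores)           # total variation

print(round(explained / total, 3))  # about 0.691, matching r**2 = (-0.831)**2
```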


The Standard Error of Estimate
When a ŷ-value is predicted from an x-value, the prediction is a point estimate. An interval can also be constructed.

The standard error of estimate s_e is the standard deviation of the observed yi-values about the predicted ŷ-value for a given xi-value. It is given by

s_e = \sqrt{\frac{\sum (y_i - \hat{y}_i)^2}{n - 2}}

where n is the number of ordered pairs in the data set.

The closer the observed y-values are to the predicted y-values, the smaller the standard error of estimate will be.
The Standard Error of Estimate
Finding the Standard Error of Estimate

1. Make a table that includes the column headings xi, yi, ŷi, (yi – ŷi), and (yi – ŷi)².
2. Use the regression equation to calculate the predicted y-values: ŷi = m·xi + b.
3. Calculate the sum of the squares of the differences between each observed y-value and the corresponding predicted y-value: Σ(yi – ŷi)².
4. Find the standard error of estimate:

s_e = \sqrt{\frac{\sum (y_i - \hat{y}_i)^2}{n - 2}}


The Standard Error of Estimate
Example:
The regression equation for the following data is ŷ = 1.2x – 3.8. Find the standard error of estimate.

xi    yi    ŷi     (yi – ŷi)²
1     –3    –2.6   0.16
2     –1    –1.4   0.16
3      0    –0.2   0.04
4      1     1.0   0
5      2     2.2   0.04

Σ(yi – ŷi)² = 0.4  (the unexplained variation)

s_e = \sqrt{\frac{\sum (y_i - \hat{y}_i)^2}{n - 2}} = \sqrt{\frac{0.4}{5 - 2}} \approx 0.365

The standard deviation of the observed y-values about the predicted ŷ-value for a given x-value is about 0.365.
The Standard Error of Estimate
Example:
The regression equation for the data that represents the number of hours 12 different students watched television during the weekend and the scores of each student who took a test the following Monday is ŷ = –4.07x + 93.97. Find the standard error of estimate.

Hours, xi:        0      1      2      3      3      5
Test score, yi:  96     85     82     74     95     68
ŷi:           93.97   89.9  85.83  81.76  81.76  73.62
(yi – ŷi)²:    4.12  24.01  14.67  60.22  175.3  31.58

Hours, xi:        5      5      6      7      7     10
Test score, yi:  76     84     58     65     75     50
ŷi:           73.62  73.62  69.55  65.48  65.48  53.27
(yi – ŷi)²:    5.66 107.74  133.4   0.23  90.63  10.69

Σ(yi – ŷi)² = 658.25  (the unexplained variation)

s_e = \sqrt{\frac{\sum (y_i - \hat{y}_i)^2}{n - 2}} = \sqrt{\frac{658.25}{12 - 2}} \approx 8.11

The standard deviation of the student test scores for a specific number of hours of TV watched is about 8.11.
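A short script reproduces this value. The sketch below uses the rounded regression line ŷ = –4.07x + 93.97 from the slides; the function name standard_error_of_estimate is an illustrative choice.

```python
from math import sqrt

def standard_error_of_estimate(x, y, m, b):
    """s_e = sqrt( sum((y_i - y_hat_i)^2) / (n - 2) ) for the line y_hat = m*x + b."""
    residual_sq_sum = sum((yi - (m * xi + b)) ** 2 for xi, yi in zip(x, y))
    return sqrt(residual_sq_sum / (len(x) - 2))

hours = [0, 1, 2, 3, 3, 5, 5, 5, 6, 7, 7, 10]
scores = [96, 85, 82, 74, 95, 68, 76, 84, 58, 65, 75, 50]

print(round(standard_error_of_estimate(hours, scores, m=-4.07, b=93.97), 2))  # about 8.11
```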


Prediction Intervals
Two variables have a bivariate normal distribution if for any fixed value of x, the corresponding values of y are normally distributed, and for any fixed value of y, the corresponding x-values are normally distributed.

A prediction interval can be constructed for the true value of y.

Given a linear regression equation ŷ = mx + b and x₀, a specific value of x, a c-prediction interval for y is

ŷ – E < y < ŷ + E

where

E = t_c s_e \sqrt{1 + \frac{1}{n} + \frac{n(x_0 - \bar{x})^2}{n\sum x^2 - (\sum x)^2}}.

The point estimate is ŷ and the margin of error is E. The probability that the prediction interval contains y is c.
Prediction Intervals
Construct a Prediction Interval for y for a Specific Value of x

1. Identify the number of ordered pairs in the data set, n, and the degrees of freedom: d.f. = n – 2.
2. Use the regression equation and the given x-value x₀ to find the point estimate ŷ = m·x₀ + b.
3. Find the critical value t_c that corresponds to the given level of confidence c. (Use Table 5 in Appendix B.)
4. Find the standard error of estimate s_e = \sqrt{\dfrac{\sum (y_i - \hat{y}_i)^2}{n - 2}}.
5. Find the margin of error E = t_c s_e \sqrt{1 + \dfrac{1}{n} + \dfrac{n(x_0 - \bar{x})^2}{n\sum x^2 - (\sum x)^2}}.
6. Find the left and right endpoints and form the prediction interval: ŷ – E < y < ŷ + E.


Prediction Intervals
Example:
The following data represents the number of hours 12 different students watched television during the weekend and the scores of each student who took a test the following Monday.

Hours, x:       0   1   2   3   3   5   5   5   6   7   7  10
Test score, y: 96  85  82  74  95  68  76  84  58  65  75  50

ŷ = –4.07x + 93.97,  s_e ≈ 8.11

Construct a 95% prediction interval for the test scores when 4 hours of TV are watched.
Prediction Intervals
Example continued:
Construct a 95% prediction interval for the test scores when the number of hours of TV watched is 4.

There are n – 2 = 12 – 2 = 10 degrees of freedom.
The point estimate is ŷ = –4.07x + 93.97 = –4.07(4) + 93.97 = 77.69.
The critical value is t_c = 2.228, and s_e ≈ 8.11.
The margin of error is

E = t_c s_e \sqrt{1 + \frac{1}{n} + \frac{n(x_0 - \bar{x})^2}{n\sum x^2 - (\sum x)^2}} = (2.228)(8.11)\sqrt{1 + \frac{1}{12} + \frac{12(4 - 4.5)^2}{12(332) - 54^2}} \approx 18.83.

ŷ – E < y < ŷ + E
77.69 – 18.83 ≈ 58.86          77.69 + 18.83 ≈ 96.52

You can be 95% confident that when a student watches 4 hours of TV over the weekend, the student’s test grade will be between about 58.86 and 96.52.
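The interval can be reproduced with a short script. The sketch below assumes SciPy is available for the t critical value, uses the rounded values ŷ = 77.69 and s_e ≈ 8.11 from the slides, and the function name prediction_interval is an illustrative choice.

```python
from math import sqrt
from scipy.stats import t as t_dist

def prediction_interval(x, y_hat, s_e, x0, c=0.95):
    """c-prediction interval for y at x0: y_hat +/- t_c * s_e * sqrt(1 + 1/n + n(x0 - x_bar)^2 / (n*sum(x^2) - (sum x)^2))."""
    n = len(x)
    x_bar = sum(x) / n
    t_c = t_dist.ppf((1 + c) / 2, n - 2)  # two-tailed critical value
    E = t_c * s_e * sqrt(1 + 1 / n + n * (x0 - x_bar) ** 2 / (n * sum(xi ** 2 for xi in x) - sum(x) ** 2))
    return y_hat - E, y_hat + E

hours = [0, 1, 2, 3, 3, 5, 5, 5, 6, 7, 7, 10]
lower, upper = prediction_interval(hours, y_hat=77.69, s_e=8.11, x0=4)
print(round(lower, 2), round(upper, 2))  # about 58.86 and 96.52
```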
§ 9.4
Multiple Regression
Multiple Regression Equation
In many instances, a better prediction can be found for a dependent
(response) variable by using more than one independent (explanatory)
variable.
For example, a more accurate prediction of Monday’s test grade from the
previous section might be made by considering the number of other
classes a student is taking as well as the student’s previous knowledge of
the test material.
A multiple regression equation has the form
ŷ = b + m1x1 + m2x2 + m3x3 + … + mkxk
where x1, x2, x3,…, xk are independent variables, b is the y-
intercept, and y is the dependent variable.
* Because the mathematics associated with this concept is complicated,
technology is generally used to calculate the multiple regression equation.



Predicting y-Values
After finding the equation of the multiple regression line, you can use
the equation to predict y-values over the range of the data.
Example:
The following multiple regression equation can be used to predict the
annual U.S. rice yield (in pounds).
ŷ = 859 + 5.76x1 + 3.82x2
where x1 is the number of acres planted (in thousands), and x2 is the number
of acres harvested (in thousands). (Source: U.S. National
Agricultural Statistics Service)
a.) Predict the annual rice yield when x1 = 2758, and x2 = 2714.
b.) Predict the annual rice yield when x1 = 3581, and x2 = 3021.
Predicting y-Values
Example continued:

a.) ŷ = 859 + 5.76x1 + 3.82x2


= 859 + 5.76(2758) + 3.82(2714)
= 27,112.56
The predicted annual rice yield is 27,112.56 pounds.

b.) ŷ = 859 + 5.76x1 + 3.82x2


= 859 + 5.76(3581) + 3.82(3021)
= 33,025.78
The predicted annual rice yield is 33,025.78 pounds.

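The two predictions are easy to verify in a couple of lines of Python; the function name rice_yield is an illustrative choice, not part of the slides.

```python
def rice_yield(x1, x2):
    """Predicted annual U.S. rice yield (pounds) from the multiple regression equation on the slide."""
    return 859 + 5.76 * x1 + 3.82 * x2

print(round(rice_yield(2758, 2714), 2))  # 27112.56
print(round(rice_yield(3581, 3021), 2))  # 33025.78
```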
