Chap13 - Simple Linear Regression

Chapter 13 of 'Statistics for Business and Economics' focuses on simple regression analysis, covering key concepts such as correlation coefficients, hypothesis testing for correlation, and the simple linear regression model. It explains how to interpret regression equations, calculate confidence intervals, and use regression for prediction. Additionally, the chapter discusses the assumptions of linear regression and provides a practical example using real estate data.


Statistics for Business and Economics
6th Edition

Chapter 13
Simple Regression

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Chapter Goals

After completing this chapter, you should be able to:
 Explain the correlation coefficient and perform a hypothesis test for zero population correlation
 Explain the simple linear regression model
 Obtain and interpret the simple linear regression equation for a set of data
 Describe R² as a measure of explanatory power of the regression model

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Chapter Goals
(continued)

After completing this chapter, you should be able to:
 Explain measures of variation and determine whether the independent variable is significant
 Calculate and interpret confidence intervals for the regression coefficients
 Use a regression equation for prediction
 Form forecast intervals around an estimated Y value for a given X
 Use graphical analysis to recognize potential problems in regression analysis
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
* Population: paired values drawn from population X and population Y:
  (x1, y1), (x2, y2), ..., (xk, yk), ...
  Population regression equation: y = β0 + β1·x
  Parameters: β0 and β1
* Random sample: (x1, y1), (x2, y2), (x3, y3), ..., (xn, yn)
  Sample regression equation: ŷ = b0 + b1·x
  Statistics: b0 (y-intercept) and b1 (slope)
* Test:
  1) H0: β1 = β1,0
     Test statistic t0 = (b1 − β1,0) / se(b1). If |t0| > t(α/2; n−2), then reject H0.
Correlation Analysis
 Correlation analysis is used to measure
strength of the association (linear relationship)
between two variables
 Correlation is only concerned with strength of the
relationship
 No causal effect is implied with correlation
 Correlation was first presented in Chapter 3

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Sample: (x1, y1), (x2, y2), (x3, y3), ..., (xn, yn)

Correlation Analysis
 The population correlation coefficient is denoted ρ (the Greek letter rho)
 The sample correlation coefficient is

    r = s_xy / (s_x · s_y)

where

    s_xy = Σ(xi − x̄)(yi − ȳ) / (n − 1)
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
Hypothesis Test for Correlation

 To test the null hypothesis of no linear association,

    H0: ρ = 0

  the test statistic follows the Student's t distribution with (n − 2) degrees of freedom:

    t = r·√(n − 2) / √(1 − r²)
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
Decision Rules
Hypothesis Test for Correlation
(Example critical value: t(0.025; 8) = 2.3060)

 Lower-tail test:  H0: ρ ≥ 0 vs. H1: ρ < 0.   Reject H0 if t < −t(n−2, α)
 Upper-tail test:  H0: ρ ≤ 0 vs. H1: ρ > 0.   Reject H0 if t > t(n−2, α)
 Two-tail test:    H0: ρ = 0 vs. H1: ρ ≠ 0.   Reject H0 if t < −t(n−2, α/2) or t > t(n−2, α/2)

where t = r·√(n − 2) / √(1 − r²) has n − 2 degrees of freedom
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
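The correlation test above is easy to check numerically. Below is a minimal Python sketch (not part of the textbook, which works in Excel) that computes r and the two-tail test statistic using the house-price data introduced in the example later in the chapter; numpy and scipy are assumed to be available.

```python
# Minimal sketch of the correlation t test; data are the chapter's
# house-price example (10 houses). numpy/scipy assumed available.
import numpy as np
from scipy import stats

sqft  = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700])
price = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255])

n = len(sqft)
r = np.corrcoef(sqft, price)[0, 1]               # sample correlation coefficient
t_stat = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)  # test statistic with (n - 2) d.f.
t_crit = stats.t.ppf(1 - 0.05 / 2, df=n - 2)     # two-tail critical value, alpha = .05

# Reject H0: rho = 0 when |t_stat| > t_crit (here roughly 3.33 > 2.31)
print(round(r, 4), round(t_stat, 3), round(t_crit, 3))
```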
Introduction to
Regression Analysis
 Regression analysis is used to:
 Predict the value of a dependent variable based on
the value of at least one independent variable
 Explain the impact of changes in an independent
variable on the dependent variable
Dependent variable: the variable we wish to explain
(also called the endogenous variable)
Independent variable: the variable used to explain
the dependent variable
(also called the exogenous variable)

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Linear Regression Model

 The relationship between X and Y is described by a linear function
 Changes in Y are assumed to be caused by changes in X
 Linear regression population equation model:

    Yi = β0 + β1·xi + εi

 where β0 and β1 are the population model coefficients and ε is a random error term.
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
Simple Linear Regression
Model
The population regression model is

    Yi = β0 + β1·Xi + εi

where Yi is the dependent variable, β0 is the population Y intercept, β1 is the population slope coefficient, Xi is the independent variable, and εi is the random error term. β0 + β1·Xi is the linear component; εi is the random error component.
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Simple Linear Regression
Model
(continued)

Y Yi β0  β1Xi  ε i
Observed Value
of Y for Xi

εi Slope = β1
Predicted Value Random Error
of Y for Xi
for this Xi value

Intercept = β0

Xi X
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
Simple Linear Regression
Equation
The simple linear regression equation provides an estimate of the population regression line:

    ŷi = b0 + b1·xi

where ŷi is the estimated (or predicted) y value for observation i, b0 is the estimate of the regression intercept, b1 is the estimate of the regression slope, and xi is the value of x for observation i.

The individual random error terms ei have a mean of zero:

    ei = (yi − ŷi) = yi − (b0 + b1·xi)
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Least Squares Estimators

 b0 and b1 are obtained by finding the values of b0 and b1 that minimize the sum of the squared differences between y and ŷ:

    min SSE = min Σ ei² = min Σ (yi − ŷi)² = min Σ [yi − (b0 + b1·xi)]²

 Differential calculus is used to obtain the coefficient estimators b0 and b1 that minimize SSE
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
Least Squares Estimators
(continued)

 The slope coefficient estimator is

    b1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)²  =  r_xy · (sY / sX)

 And the constant or y-intercept is

    b0 = ȳ − b1·x̄

 The regression line always goes through the mean point (x̄, ȳ)
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
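As a complement to the formulas above, here is a minimal NumPy sketch (an assumption of this write-up, not the textbook's Excel routine) that implements the least-squares estimators directly; the function name is illustrative.

```python
# Minimal sketch of the least-squares estimators b1 and b0 from the slide above.
import numpy as np

def least_squares(x, y):
    """Return (b0, b1) that minimize the sum of squared residuals."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    return b0, b1
```

The b1 line uses the Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)² form; the equivalent r_xy·(sY/sX) form gives the same value.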
Finding the Least Squares
Equation

 The coefficients b0 and b1, and other regression results in this chapter, will be found using a computer
 Hand calculations are tedious
 Statistical routines are built into Excel
 Other statistical analysis software can be used

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Linear Regression Model
Assumptions

 The true relationship form is linear (Y is a linear function of X, plus random error)
 The error terms, εi, are independent of the x values
 The error terms are random variables with mean 0 and constant variance, σ² (the constant variance property is called homoscedasticity):

    E[εi] = 0 and E[εi²] = σ² for i = 1, ..., n

 The random error terms, εi, are not correlated with one another, so that

    E[εi·εj] = 0 for all i ≠ j
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Interpretation of the
Slope and the Intercept

 b0 is the estimated average value of y when the value of x is zero (if x = 0 is in the range of observed x values)

 b1 is the estimated change in the average value of y as a result of a one-unit change in x

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Simple Linear Regression
Example
 A real estate agent wishes to examine the
relationship between the selling price of a home
and its size (measured in square feet)

 A random sample of 10 houses is selected


 Dependent variable (Y) = house price in $1000s
 Independent variable (X) = square feet

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Sample Data for House Price
Model
House Price in $1000s (Y)    Square Feet (X)
245                          1400
312                          1600
279                          1700
308                          1875
199                          1100
219                          1550
405                          2350
324                          2450
319                          1425
255                          1700

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Graphical Presentation

 House price model: scatter plot


[Scatter plot: House Price ($1000s) on the vertical axis versus Square Feet (0-3000) on the horizontal axis for the 10 sampled houses.]

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Regression Using Excel
 Tools / Data Analysis / Regression

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Excel Output
Regression Statistics
Multiple R           0.76211
R Square             0.58082
Adjusted R Square    0.52842
Standard Error      41.33032
Observations        10

The regression equation is:

    house price = 98.24833 + 0.10977 (square feet)

ANOVA
              df        SS           MS          F        Significance F
Regression     1   18934.9348   18934.9348   11.0848     0.01039
Residual       8   13665.5652    1708.1957
Total          9   32600.5000

              Coefficients   Standard Error   t Stat    P-value    Lower 95%    Upper 95%
Intercept       98.24833       58.03348      1.69296    0.12892    -35.57720    232.07386
Square Feet      0.10977        0.03297      3.32938    0.01039      0.03374      0.18580

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
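The same fit can be reproduced outside Excel. A sketch using scipy.stats.linregress (an assumption of this write-up; any least-squares routine would do) on the ten sample houses should approximately match the output above.

```python
# Sketch: reproduce the regression output above with scipy.stats.linregress.
from scipy import stats

sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]

res = stats.linregress(sqft, price)
print(res.intercept)    # should be close to 98.24833
print(res.slope)        # should be close to 0.10977
print(res.rvalue ** 2)  # should be close to 0.58082 (R Square)
print(res.pvalue)       # should be close to 0.01039 (two-tail p-value for the slope)
```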


Graphical Presentation

 House price model: scatter plot and regression line

[Scatter plot of House Price ($1000s) versus Square Feet with the fitted regression line; intercept = 98.248, slope = 0.10977.]

    house price = 98.24833 + 0.10977 (square feet)

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Interpretation of the
Intercept, b0

house price 98.24833  0.10977 (square feet)

 b0 is the estimated average value of Y when the


value of X is zero (if X = 0 is in the range of
observed X values)
 Here, no houses had 0 square feet, so b0 = 98.24833
just indicates that, for houses within the range of
sizes observed, $98,248.33 is the portion of the
house price not explained by square feet

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Interpretation of the
Slope Coefficient, b1

house price 98.24833  0.10977 (square feet)

 b1 measures the estimated change in the


average value of Y as a result of a one-
unit change in X
 Here, b1 = .10977 tells us that the average value of a
house increases by .10977($1000) = $109.77, on
average, for each additional one square foot of size

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Measures of Variation

 Total variation is made up of two parts:

    SST = SSR + SSE

where SST is the total sum of squares, SSR is the regression sum of squares, and SSE is the error sum of squares:

    SST = Σ(yi − ȳ)²      SSR = Σ(ŷi − ȳ)²      SSE = Σ(yi − ŷi)²

where:
    ȳ  = average value of the dependent variable
    yi = observed values of the dependent variable
    ŷi = predicted value of y for the given xi value
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
Measures of Variation
(continued)

 SST = total sum of squares


 Measures the variation of the yi values around their
mean, y
 SSR = regression sum of squares
 Explained variation attributable to the linear
relationship between x and y
 SSE = error sum of squares
 Variation attributable to factors other than the linear
relationship between x and y

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Measures of Variation
(continued)
[Diagram: for an observation (xi, yi), SSE measures (yi − ŷi)², SSR measures (ŷi − ȳ)², and SST measures (yi − ȳ)², shown as vertical distances between the data point, the regression line, and the mean ȳ.]
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
Coefficient of Determination, R2
 The coefficient of determination is the portion
of the total variation in the dependent variable
that is explained by variation in the
independent variable
 The coefficient of determination is also called
R-squared and is denoted as R2
    R² = SSR / SST = regression sum of squares / total sum of squares

    note: 0 ≤ R² ≤ 1
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
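A small sketch of this decomposition follows (function and variable names are illustrative; b0 and b1 are assumed to have been estimated by least squares as above).

```python
# Sketch of SST = SSR + SSE and R^2 = SSR / SST for a fitted simple regression.
import numpy as np

def variation_measures(x, y, b0, b1):
    x, y = np.asarray(x, float), np.asarray(y, float)
    y_hat = b0 + b1 * x
    sst = np.sum((y - y.mean()) ** 2)       # total sum of squares
    ssr = np.sum((y_hat - y.mean()) ** 2)   # regression (explained) sum of squares
    sse = np.sum((y - y_hat) ** 2)          # error sum of squares
    return sst, ssr, sse, ssr / sst         # last element is R^2
```

For the house-price example this should return values close to SST = 32600.5, SSR = 18934.9, SSE = 13665.6, and R² = 0.581, matching the Excel output.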
Examples of Approximate
r2 Values
r² = 1

 Perfect linear relationship between X and Y: 100% of the variation in Y is explained by variation in X

[Scatter plots of Y versus X in which all points lie exactly on the regression line.]

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Examples of Approximate
r2 Values
0 < r² < 1

 Weaker linear relationships between X and Y: some but not all of the variation in Y is explained by variation in X

[Scatter plots of Y versus X in which the points scatter around, but do not lie exactly on, the regression line.]
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
Examples of Approximate
r2 Values

r² = 0

 No linear relationship between X and Y: the value of Y does not depend on X (none of the variation in Y is explained by variation in X)

[Scatter plot of Y versus X showing no linear pattern.]

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Excel Output
Regression Statistics
Multiple R           0.76211
R Square             0.58082
Adjusted R Square    0.52842
Standard Error      41.33032
Observations        10

    R² = SSR / SST = 18934.9348 / 32600.5000 = 0.58082

58.08% of the variation in house prices is explained by variation in square feet

ANOVA
              df        SS           MS          F        Significance F
Regression     1   18934.9348   18934.9348   11.0848     0.01039
Residual       8   13665.5652    1708.1957
Total          9   32600.5000

(Coefficient table as in the Excel output shown earlier)

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Correlation and R2

 The coefficient of determination, R², for a simple regression is equal to the simple correlation squared:

    R² = r²xy

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Estimation of Model
Error Variance
 An estimator for the variance of the population model error is

    σ̂² = s²e = SSE / (n − 2) = Σ e²i / (n − 2)

 Division by n − 2 instead of n − 1 is because the simple regression model uses two estimated parameters, b0 and b1, instead of one

 se = √(s²e) is called the standard error of the estimate

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Excel Output
Regression Statistics
Multiple R           0.76211
R Square             0.58082
Adjusted R Square    0.52842
Standard Error      41.33032
Observations        10

    se = 41.33032

(ANOVA and coefficient tables as in the Excel output shown earlier)

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Comparing Standard Errors
se is a measure of the variation of observed y values from the regression line

[Two scatter plots: one with points tightly clustered about the regression line (small se) and one with points widely scattered about the line (large se).]

 The magnitude of se should always be judged relative to the size of the y values in the sample data

 e.g., se = $41.33K is moderately small relative to house prices in the $200K - $300K range
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
Inferences About the
Regression Model

 The variance of the regression slope coefficient (b1) is estimated by

    s²b1 = s²e / Σ(xi − x̄)² = s²e / [(n − 1)·s²x]

where:
    sb1 = estimate of the standard error of the least squares slope
    se = √(SSE / (n − 2)) = standard error of the estimate
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
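A sketch of both standard errors (the estimate's and the slope's), following the two formulas above; the function and variable names are illustrative assumptions, not textbook notation.

```python
# Sketch: standard error of the estimate (s_e) and of the slope (s_b1).
import numpy as np

def standard_errors(x, y, b0, b1):
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    sse = np.sum((y - (b0 + b1 * x)) ** 2)
    s_e = np.sqrt(sse / (n - 2))                        # standard error of the estimate
    s_b1 = s_e / np.sqrt(np.sum((x - x.mean()) ** 2))   # standard error of the slope
    return s_e, s_b1
```

For the house-price data this should give approximately s_e = 41.33 and s_b1 = 0.033, matching the Excel output.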
Excel Output
Regression Statistics
Multiple R           0.76211
R Square             0.58082
Adjusted R Square    0.52842
Standard Error      41.33032
Observations        10

              Coefficients   Standard Error   t Stat    P-value
Intercept       98.24833       58.03348      1.69296    0.12892
Square Feet      0.10977        0.03297      3.32938    0.01039

    sb1 = 0.03297

(ANOVA table and 95% confidence limits as in the Excel output shown earlier)

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Comparing Standard Errors of
the Slope
sb1 is a measure of the variation in the slope of regression lines from different possible samples

[Two panels: fitted regression lines from repeated samples with little spread in slope (small sb1) versus wide spread in slope (large sb1).]

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Inference about the Slope:
t Test
 t test for a population slope
 Is there a linear relationship between X and Y?
 Null and alternative hypotheses:
    H0: β1 = 0 (no linear relationship)
    H1: β1 ≠ 0 (linear relationship does exist)
 Test statistic:

    t = (b1 − β1) / sb1,   d.f. = n − 2

    Reject H0 if |t| > t(α/2; n−2)

where:
    b1 = regression slope coefficient
    β1 = hypothesized slope
    sb1 = standard error of the slope
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
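A minimal sketch of this t test using scipy.stats for the critical value and p-value (an illustrative sketch, not the textbook's Excel procedure).

```python
# Sketch of the two-tail slope t test, H0: beta1 = 0.
from scipy import stats

def slope_t_test(b1, s_b1, n, alpha=0.05):
    t_stat = (b1 - 0) / s_b1                        # hypothesized slope is 0
    t_crit = stats.t.ppf(1 - alpha / 2, df=n - 2)   # two-tail critical value
    p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)
    return t_stat, t_crit, p_value                  # reject H0 if |t_stat| > t_crit

# For the house-price example this should give roughly t = 3.33, p = 0.010:
print(slope_t_test(0.10977, 0.03297, 10))
```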
Inference about the Slope:
t Test
(continued)

House Price in $1000s (y)    Square Feet (x)
245                          1400
312                          1600
279                          1700
308                          1875
199                          1100
219                          1550
405                          2350
324                          2450
319                          1425
255                          1700

Estimated regression equation:

    house price = 98.25 + 0.1098 (sq.ft.)

The slope of this model is 0.1098.
Does square footage of the house affect its sales price?

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Inferences about the Slope:
t Test Example

H0: β1 = 0
H1: β1 ≠ 0

From Excel output (the Square Feet row gives b1 and sb1):
              Coefficients   Standard Error   t Stat    P-value
Intercept       98.24833       58.03348      1.69296    0.12892
Square Feet      0.10977        0.03297      3.32938    0.01039

    t = (b1 − β1) / sb1 = (0.10977 − 0) / 0.03297 = 3.32938

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Inferences about the Slope:
t Test Example
(continued)
Test statistic: t = 3.329

H0: β1 = 0
H1: β1 ≠ 0
d.f. = 10 − 2 = 8
t(8, .025) = 2.3060

From Excel output:
              Coefficients   Standard Error   t Stat    P-value
Intercept       98.24833       58.03348      1.69296    0.12892
Square Feet      0.10977        0.03297      3.32938    0.01039

Since t = 3.329 > t(8, .025) = 2.3060, the test statistic falls in the upper rejection region (α/2 = .025 in each tail).

Decision: Reject H0
Conclusion: There is sufficient evidence that square footage affects house price
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
Inferences about the Slope:
t Test Example
(continued)
P-value = 0.01039

H0: β1 = 0
H1: β1 ≠ 0

From Excel output:
              Coefficients   Standard Error   t Stat    P-value
Intercept       98.24833       58.03348      1.69296    0.12892
Square Feet      0.10977        0.03297      3.32938    0.01039

This is a two-tail test, so the p-value is P(t > 3.329) + P(t < −3.329) = 0.01039 (for 8 d.f.)

Decision: P-value < α, so reject H0
Conclusion: There is sufficient evidence that square footage affects house price
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
Confidence Interval Estimate
for the Slope
Confidence interval estimate of the slope:

    b1 − t(n−2, α/2)·sb1 < β1 < b1 + t(n−2, α/2)·sb1,   d.f. = n − 2

Excel printout for house prices:

              Coefficients   Standard Error   t Stat    P-value    Lower 95%    Upper 95%
Intercept       98.24833       58.03348      1.69296    0.12892    -35.57720    232.07386
Square Feet      0.10977        0.03297      3.32938    0.01039      0.03374      0.18580

At the 95% level of confidence, the confidence interval for the slope is (0.0337, 0.1858)

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
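A small sketch of the interval calculation using the reported b1 and sb1 (scipy is assumed for the critical value).

```python
# Sketch: 95% confidence interval for the slope from the reported estimates.
from scipy import stats

b1, s_b1, n = 0.10977, 0.03297, 10
t_crit = stats.t.ppf(0.975, df=n - 2)            # about 2.306 for 8 d.f.
ci = (b1 - t_crit * s_b1, b1 + t_crit * s_b1)    # should be about (0.0337, 0.1858)
print(ci)
```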


Confidence Interval Estimate
for the Slope
(continued)

              Coefficients   Standard Error   t Stat    P-value    Lower 95%    Upper 95%
Intercept       98.24833       58.03348      1.69296    0.12892    -35.57720    232.07386
Square Feet      0.10977        0.03297      3.32938    0.01039      0.03374      0.18580

 Since the house price variable is measured in $1000s, we are 95% confident that the average impact on sales price is between $33.70 and $185.80 per square foot of house size

 This 95% confidence interval does not include 0.
 Conclusion: There is a significant relationship between house price and square feet at the .05 level of significance

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


F-Test for Significance
 F test statistic:

    F = MSR / MSE

where

    MSR = SSR / k
    MSE = SSE / (n − k − 1)

and F follows an F distribution with k numerator and (n − k − 1) denominator degrees of freedom
(k = the number of independent variables in the regression model)

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
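A sketch of the F statistic and its p-value from the ANOVA quantities (scipy is assumed; k = 1 for simple regression).

```python
# Sketch of the F test for overall regression significance.
from scipy import stats

def f_test(ssr, sse, n, k=1):
    msr = ssr / k                  # mean square regression
    mse = sse / (n - k - 1)        # mean square error
    f_stat = msr / mse
    p_value = stats.f.sf(f_stat, k, n - k - 1)
    return f_stat, p_value

# House-price example: should give roughly F = 11.08, p = 0.010
print(f_test(18934.9348, 13665.5652, 10))
```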


Excel Output
Regression Statistics
Multiple R           0.76211
R Square             0.58082
Adjusted R Square    0.52842
Standard Error      41.33032
Observations        10

    F = MSR / MSE = 18934.9348 / 1708.1957 = 11.0848

with 1 and 8 degrees of freedom; the p-value for the F test is Significance F = 0.01039

ANOVA
              df        SS           MS          F        Significance F
Regression     1   18934.9348   18934.9348   11.0848     0.01039
Residual       8   13665.5652    1708.1957
Total          9   32600.5000

(Coefficient table as in the Excel output shown earlier)

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


F-Test for Significance
(continued)

H0: β1 = 0
H1: β1 ≠ 0
α = .05
df1 = 1, df2 = 8

Test statistic:
    F = MSR / MSE = 11.08

Critical value:
    F(.05; 1, 8) = 5.32

Since F = 11.08 > 5.32, the test statistic falls in the rejection region.

Decision: Reject H0 at α = 0.05
Conclusion: There is sufficient evidence that house size affects selling price
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
Prediction

 The regression equation can be used to predict a value for y, given a particular x
 For a specified value, xn+1, the predicted value is

    ŷn+1 = b0 + b1·xn+1

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
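A one-line sketch of point prediction with the fitted house-price equation (rounded coefficient values taken from the chapter's example).

```python
# Sketch: predicted price (in $1000s) for a 2000-square-foot house.
b0, b1 = 98.25, 0.1098       # rounded coefficients from the example
x_new = 2000
y_hat = b0 + b1 * x_new
print(y_hat)                 # 317.85, i.e. about $317,850
```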


Predictions Using
Regression Analysis
Predict the price for a house with 2000 square feet:

    house price = 98.25 + 0.1098 (sq.ft.)
                = 98.25 + 0.1098(2000)
                = 317.85

The predicted price for a house with 2000 square feet is 317.85 ($1,000s) = $317,850
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
Chapter Summary
 Introduced the linear regression model
 Reviewed correlation and the assumptions of
linear regression
 Discussed estimating the simple linear
regression coefficients
 Described measures of variation
 Described inference about the slope
 Addressed estimation of mean values and
prediction of individual values
Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.
23. Can you use movie critics' opinions to forecast box office receipts on the opening weekend? The following data, stored in Tomatometer, indicate the Tomatometer rating (the percentage of professional critic reviews that are positive) and the receipts per theater ($thousands) on the weekend a movie opened, for ten movies:

a) Use the least-squares method to compute the regression coefficients b0 and b1. Determine the coefficient of determination, r².
b) Test the null hypothesis of no linear association, H0: ρ = 0, using α = 0.05.
c) Using α = 0.05, test H0: β1 = 0 (no linear relationship) against H1: β1 ≠ 0 (a linear relationship does exist).
d) Test H0: β0 = 0 against H1: β0 ≠ 0, using α = 0.05.
24. Management of a soft-drink bottling company has the business objective of developing a method for allocating delivery costs to customers. Although one cost clearly relates to travel time within a particular route, another variable cost reflects the time required to unload the cases of soft drink at the delivery point. To begin, management decided to develop a regression model to predict delivery time based on the number of cases delivered. A sample of 20 deliveries within a territory was selected. The delivery times and the number of cases delivered were organized in the following table and stored in

a) Use the least-squares method to compute the regression coefficients b0 and b1. Determine the coefficient of determination, r².
b) Test the null hypothesis of no linear association, H0: ρ = 0, using α = 0.05.
c) Using α = 0.05, test H0: β1 = 0 (no linear relationship) against H1: β1 ≠ 0 (a linear relationship does exist).
d) Test H0: β0 = 0 against H1: β0 ≠ 0, using α = 0.05.
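The data tables for exercises 23 and 24 are not reproduced here, but the workflow for parts (a)-(d) is the same in any regression tool. The sketch below uses statsmodels (an assumption, not required by the exercises) and, purely as a stand-in, the chapter's house-price arrays; replace them with the Tomatometer or delivery data once loaded.

```python
# Sketch of the exercise workflow: fit the model, then read off b0, b1, r^2,
# and the t tests for beta0 and beta1. The arrays below are only a stand-in.
import numpy as np
import statsmodels.api as sm
from scipy import stats

x = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700])
y = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255])

model = sm.OLS(y, sm.add_constant(x)).fit()
print(model.params)      # (a) b0 and b1
print(model.rsquared)    # (a) coefficient of determination r^2
print(model.tvalues)     # (c), (d) t statistics for beta0 and beta1
print(model.pvalues)     # compare each p-value with alpha = 0.05

# (b) test of no linear association, H0: rho = 0
r, p = stats.pearsonr(x, y)
print(r, p)
```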
