CH 11 Test
True/False Questions
2. The more variables that are added to the regression model, the better the model will fit
the data.
3. In multiple regression there is no need to consider the F-test; only the t-tests are
important.
4. Using multiple regression to regress five independent variables to predict y will give
the same result as five separate regressions of y versus each independent variable.
5. The degrees of freedom for error in the multiple regression model are: n - (k + 1).
8. The multiple regression model assumptions for e, the error term, are the same as for
the simple linear regression model.
9. When k = 2 variables, the graph of the model is a plane in three dimensions, rather
than a straight line as in the case when k = 1.
10. The symbol for the y-intercept in the multiple regression model is b0.
12. The F-test of the ANOVA used in multiple regression is not equivalent to the t-test for
the significance of the slope parameter used in simple linear regression.
13. In testing for the existence of a linear relationship between y and any of the k variables,
if the p-value for this test is 0.0001, then the null hypothesis is rejected.
15. The Mean Square Error, or MSE, is a biased estimator for the variance of the
population of errors, e, denoted by σ2.
16. The square root of the MSE is the standard error of the estimate.
17. The adjusted multiple coefficient of determination always increases as new variables
are added to the model, just as R2 does.
18. The adjusted multiple coefficient of determination is the R2 with both SSE and SST
divided by their degrees of freedom.
19. When carrying out individual t-tests on each of the variables in the multiple regression
model, each test is independent of each other test.
20. When independent variables are correlated with each other, multicollinearity is
present.
21. Multicollinearity may cause the signs of some estimated regression parameters to be
the opposite of what we would expect.
22. When using qualitative variables, we use an indicator variable for each of the r
variables in the model.
23. If the adjusted coefficient of determination decreases when a term is added to the
multiple regression model, then that term should be retained in the model.
25. When powers of the variables are used in the model, a linear model is no longer
appropriate.
26. Inflated values of variances and standard errors of regression coefficient estimators
may be a sign of multicollinearity.
27. Removing collinear variables from a regression model is the easiest way of solving
problems of multicollinearity.
28. The test to check for first-order autocorrelation is the Durbin-Watson test.
31. In a multiple regression analysis with n = 15 and k = 14, the computer reports R2 =
0.9999.
A) this is an excellent regression
B) this is a very good regression
C) this is an average regression
D) this is not a good regression
E) not enough information to determine
33. In a multiple regression analysis, MSE = 20, n = 54, k = 3, and SST(total) = 2,000.
What is the R2 of the regression?
A) 0.0
B) 0.01
C) 0.99
D) 1.00
E) 0.50
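A minimal Python sketch of the arithmetic behind this question, assuming the standard identities SSE = MSE(n - (k + 1)) and R2 = 1 - SSE/SST (all values from the question):

    # Recover R-squared from MSE, n, k, and SST
    MSE, n, k, SST = 20, 54, 3, 2000
    SSE = MSE * (n - (k + 1))   # 20 * 50 = 1000
    R2 = 1 - SSE / SST          # 1 - 1000/2000
    print(R2)                   # 0.5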
34. The regression surface (hyperplane) of the linear regression of y on five independent
variables has dimension:
A) 2
B) 3
C) 4
D) 5
E) none of the above
35. Suppose that in a multiple regression the F-ratio is significant, but none of the t-ratios
is significant. This means that:
A) multicollinearity may be present
B) autocorrelation may be present
C) the regression is good
D) a nonlinear model would be a better fit
E) none of the above
37. How many degrees of freedom for error are associated with a multiple regression
model with k independent variables?
A) n - (k + 1)
B) n - k
C) n - 1
D) n - k + 1
E) none of the above
38. The F ratio used to test for the existence of a linear relationship between the
dependent variable and any of the independent variables is:
A) MSE/(n - (k + 1))
B) MSR/MSE
C) MSR/MST
D) MSE/MSR
E) none of the above
39. When the null hypothesis, H0: b1 = b2 = b3 = 0, is rejected, the interpretation should
be:
A) there is no linear relationship between y and any of the three independent variables
B) there is a regression relationship between y and at least one of the three
independent variables
C) all three independent variables have a slope of zero
D) all three independent variables have equal slopes
E) there is a regression relationship between y and all three independent variables
Given the estimated regression equation y = 5 + 10x1 + 20x2, with R2 = 0.90, sb1 = 3.2, and sb2 = 5.5:
Calculate the t-test statistic to test whether x1 contributes information to the prediction
of y.
A) 0.32
B) 3.636
C) 3.125
D) 2.8125
E) 11.11
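A minimal Python sketch of the computation this question calls for, assuming the usual t-statistic t = b1/sb1 for the coefficient of x1 (values taken from the question):

    # t-statistic for the coefficient of x1
    b1, s_b1 = 10, 3.2
    t = b1 / s_b1
    print(round(t, 4))   # 3.125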
The following data gives the monthly sales (in thousands of dollars) for different advertising
expenditures (also in thousands of dollars) and sales commission percentages.
42. What amount of sales would this model predict for advertising expenditures of 25,000
and sales commission of 8%?
A) $564,318
B) $30,273.6
C) $561,734
D) $72,880
E) none of the above
43. Write the null and alternative hypotheses to test whether or not advertising
expenditures and sales commissions can be used to predict sales.
A) H0: b1 = b2 = 0; H1: at least one coefficient is not zero
B) H0: b1 ≠ b2 = 0; H1: b1 = b2
C) H0: b1 and b2 are not equal to zero; H1: b1 = b2 = 0
D) H0: b1 ≥ b2; H1: b1 < b2
E) none of the above
47. The degrees of freedom for error for this regression model are:
A) 2
B) 5
C) 24
D) 8
E) none of the above
Eight students are selected randomly and their present graduate GPA is compared to their
undergraduate GPA and scores on standardized tests.
Present GPA:    3.89  3.03  3.34  3.85  3.93  3.06  3.69  3.91
Undergr. GPA:   3.77  2.75  3.11  3.75  4.00  2.92  3.70  3.88
Std. Scores:     700   460   550   690   720   420   670   670

Analysis of Variance
Source        DF        SS        MS        F      P
Regression     2   1.02751   0.51375   170.77  0.000
Error          5   0.01504   0.00301
Total          7   1.04255
48. Write the regression equation, letting undergraduate GPA be variable 1 and standard
scores be variable 2.
A) y = 0.4775 x1 + 0.0013392x2
B) y = 0.2059 + 0.1630x1 + 0.0006693x2
C) none of the others is correct
D) y = 1.1066 + 0.4775x1 + 0.0013392x2
E) not enough information given
49. At the 5% level of significance, are undergraduate GPA and standard scores
significant?
A) both are significant
B) neither is significant
C) only undergraduate GPA is significant
D) only standard scores are significant
E) not enough information to determine
51. What is the relationship between R2 and the adjusted R2 for this regression model?
A) both are exactly the same
B) the adjusted R2 is larger than R2
C) the adjusted R2 is smaller than R2
D) both are very small
E) none of the above
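The comparison for this model can be checked directly from the ANOVA table above; a brief Python sketch, assuming R2 = SSR/SST and adjusted R2 = 1 - (SSE/(n - (k + 1)))/(SST/(n - 1)) with n = 8 and k = 2:

    # R-squared and adjusted R-squared from the ANOVA table
    SSR, SSE, SST = 1.02751, 0.01504, 1.04255
    n, k = 8, 2
    R2 = SSR / SST                                         # about 0.9856
    adj_R2 = 1 - (SSE / (n - (k + 1))) / (SST / (n - 1))   # about 0.9798
    print(round(R2, 4), round(adj_R2, 4))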
54. Correlation of the values of variables with values of the same variables lagged one or
more time periods back is called:
A) multicollinearity
B) a transformation
C) autocorrelation
D) variance inflation
E) interaction
63. A multiple regression model with two independent variables exhibits a highly
significant F-ratio, but each variable's individual t-statistic is insignificant. The most
likely cause of such a situation is ____________
A) Heteroskedasticity
B) Homoskedasticity
C) Multicollinearity
D) Non-independence of residuals
E) Non-normality of residuals
65. In previous studies a market researcher has observed a significant positive relationship
between advertising expenditures and sales for a firm's 40 territories. In a recent
iteration of her study, however, she added to her model an additional independent
variable, one that tends to be highly correlated with advertising expenditures. Which
of the following is NOT a possible result of this addition?
A) Advertising expenditures will exhibit a negative relationship with sales
B) The relationship between advertising expenditures and sales will be insignificant
C) The model's overall F-ratio will increase
D) The model's coefficient of multiple determination (R2) will decrease
E) All of the above are possible results of this addition
In a multiple regression study based on 24 observations, the following results were observed:
Coefficient   Estimate   Standard Error
b0               35.21             4.53
b1               23.02            12.02
b2               13.05             6.26
b3               -2.45             0.38
66. If α is set at 0.05, what is the critical value for the test statistic in a test of H0: bi = 0 in
this situation?
A) Z = 1.96
B) Z = 1.645
C) t = 2.086
D) t = 2.064
E) t = 1.725
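A quick Python check of the critical value, assuming a two-tailed t-test with n - (k + 1) = 24 - 4 = 20 degrees of freedom and using scipy (an assumed library choice) for the quantile:

    from scipy import stats

    n, k, alpha = 24, 3, 0.05
    df = n - (k + 1)                        # 20
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    print(round(t_crit, 3))                 # about 2.086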
67. Assuming that there are no problems with multicollinearity, which of the independent
variables appear to be useful or important in explaining the dependent variable?
A) Variable 1
B) Variable 2
C) Variable 3
D) Variable 1 and Variable 2
E) Variable 2 and Variable 3
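The t-ratios follow directly from the coefficient table above; a short Python sketch, assuming t = estimate / standard error for each slope, to be compared against the critical value from the previous question:

    # t-ratios for the three slope estimates (values from the table above)
    estimates  = {"b1": 23.02, "b2": 13.05, "b3": -2.45}
    std_errors = {"b1": 12.02, "b2":  6.26, "b3":  0.38}
    for name in estimates:
        print(name, round(estimates[name] / std_errors[name], 3))
    # b1 ~ 1.915, b2 ~ 2.085, b3 ~ -6.447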
68. Assuming that all variables were determined to be useful, what would the predicted
value of Y be if X1 = 3, X2 = 16 and X3 = 4.8?
A) 301.31
B) 266.10
C) 191.62
D) 142.57
E) 138.04
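A minimal Python sketch of the point prediction, assuming the fitted equation y-hat = b0 + b1*x1 + b2*x2 + b3*x3 with the estimates from the coefficient table above:

    # Predicted value at x1 = 3, x2 = 16, x3 = 4.8
    b0, b1, b2, b3 = 35.21, 23.02, 13.05, -2.45
    x1, x2, x3 = 3, 16, 4.8
    y_hat = b0 + b1 * x1 + b2 * x2 + b3 * x3
    print(round(y_hat, 2))   # 301.31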
A multiple regression analysis has been done on 24 observations, with the following partial
ANOVA table resulting:
Source        d.f.        SS
Regression     **      78.312
Error          20          **
Total          **     148.028
69. How many independent variables were included in this regression analysis?
A) 2
B) 3
C) 4
D) 5
E) 6
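The missing entries of the partial table can be filled in from the usual degrees-of-freedom identities; a brief Python sketch, assuming total d.f. = n - 1 and regression d.f. = k:

    # Degrees of freedom and the missing sum of squares (n = 24, error d.f. = 20)
    n, df_error = 24, 20
    df_total = n - 1            # 23
    k = df_total - df_error     # 3 independent variables
    SSR, SST = 78.312, 148.028
    SSE = SST - SSR             # 69.716
    print(k, round(SSE, 3))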
Answer: Not really, since adjusted R2 falls to 0.958. In essence, the additional variable
adds little or no predictive power.
A multiple regression analysis has been done on 24 observations, with the following partial
ANOVA table resulting:
Source        d.f.         SS         MS
Regression      3    1921.543         **
Error          **          **   168.0514
Total          **          **
Answer: Yes. With α = 0.05, the critical value for the F-ratio with 3 numerator d.f. and
20 denominator d.f. is 3.10. The observed F-ratio is 3.81142. There appears to be a
significant regression relationship between the independent variables and the
dependent variable.
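The F-ratio quoted in this answer can be reproduced from the partial table above; a short Python sketch, assuming MSR = SSR/k and F = MSR/MSE, with scipy (an assumed library choice) used for the critical value:

    from scipy import stats

    SSR, k = 1921.543, 3
    MSE, df_error = 168.0514, 20
    MSR = SSR / k                              # about 640.514
    F = MSR / MSE                              # about 3.811
    F_crit = stats.f.ppf(0.95, k, df_error)    # about 3.10
    print(round(F, 4), round(F_crit, 2))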
73. How effective is the regression model developed in this situation at predicting the
dependent variable?
Answer: Despite its significant F-ratio, the regression model developed is not very
effective at explaining variability in the dependent variable, since adjusted R2 is only
0.268.
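The adjusted R2 of 0.268 quoted in this answer can be reproduced from the same partial table; a minimal Python sketch, assuming SST = SSR + SSE and adjusted R2 = 1 - (1 - R2)(n - 1)/(n - (k + 1)):

    # R-squared and adjusted R-squared for the 24-observation model above
    SSR, k, MSE, df_error, n = 1921.543, 3, 168.0514, 20, 24
    SSE = MSE * df_error                                # 3361.028
    SST = SSR + SSE                                     # 5282.571
    R2 = SSR / SST                                      # about 0.364
    adj_R2 = 1 - (1 - R2) * (n - 1) / (n - (k + 1))     # about 0.268
    print(round(R2, 3), round(adj_R2, 3))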