Machine Learning Test Regression

The document contains a set of multiple-choice questions (MCQs), multiple-select questions (MSQs), and numerical answer type (NAT) questions related to machine learning topics such as simple and multiple linear regression, bias, variance, and hypothesis testing. Each question includes options and the correct answer. The questions are designed to test knowledge and understanding of fundamental concepts in regression analysis.

Uploaded by

Dr. R. Gowri CIT

MACHINE LEARNING

TEST QUESTIONS – II
Topics: Simple Linear Regression, Multiple Linear Regression, Bias, Variance, Trade Off,
Hypothesis Testing

MCQ
1. Given a simple linear regression model Y = 2X + 3 + e, where e represents the error term, what is
the predicted value of Y when X = 4?
A) 8
B) 11
C) 10
D) 5
Answer: B) 11
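This prediction can be checked with a quick sketch (the error term e is dropped because its expected value is 0):

```python
def predict(x):
    # Y = 2X + 3; the error term e is ignored since its expected value is 0
    return 2 * x + 3

print(predict(4))  # 11
```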

2. For a simple linear regression model fitted by ordinary least squares (with an intercept term),
what is the sum of the residuals (errors)?
A) 0
B) Equal to the mean of the dependent variable
C) Equal to the sum of the predicted values
D) Equal to the variance of the dependent variable
Answer: A) 0

3. In a multiple linear regression model Y = beta_0 + beta_1X_1 + beta_2X_2, suppose beta_0 = 2,
beta_1 = 3, and beta_2 = -1. What is the predicted value of Y when X_1 = 1 and X_2 = 2?
A) 4
B) 3
C) 2
D) 5
Answer: B) 3
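The computation in question 3 is just a dot product of the coefficient vector with (1, X_1, X_2):

```python
beta = [2, 3, -1]   # beta_0, beta_1, beta_2 from the question
x = [1, 1, 2]       # leading 1 multiplies the intercept, then X_1 = 1, X_2 = 2

y_hat = sum(b * xi for b, xi in zip(beta, x))
print(y_hat)  # 2 + 3 - 2 = 3
```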

4. Given the following multiple linear regression output:
Y = 5 + 1.5X_1 + 2.3X_2 + e
Calculate the predicted value of Y when X_1 = 2 and X_2 = 3.
A) 16.4
B) 14.9
C) 12.4
D) 10.4
Answer: B) 14.9

5. For a simple linear regression model, the null hypothesis H_0 in hypothesis testing states:
A) beta_1 = 0
B) beta_1 != 0
C) beta_0 = 0
D) beta_0 != 0
Answer: A) beta_1 = 0

6. If the p-value for the t-test of a regression coefficient is 0.03, what can we conclude at the 5%
significance level?
A) Reject the null hypothesis
B) Fail to reject the null hypothesis
C) The coefficient is not significant
D) The model is not valid
Answer: A) Reject the null hypothesis

7. In the context of bias and variance, which of the following scenarios typically results in high bias?
A) Using a very complex model
B) Using a very simple model
C) Using a large amount of training data
D) Using a high-dimensional dataset
Answer: B) Using a very simple model

8. Which of the following scenarios typically results in high variance?
A) Using a very complex model
B) Using a very simple model
C) Using a small training dataset
D) Using a low-dimensional dataset
Answer: A) Using a very complex model

9. Suppose you have a dataset and you apply a linear regression model and a polynomial regression
model to it. You observe that the linear model has a higher training error but a lower test error
compared to the polynomial model. This indicates that:
A) The linear model has high bias and low variance
B) The linear model has low bias and high variance
C) The polynomial model has high bias and low variance
D) The polynomial model has low bias and high variance
Answer: D) The polynomial model has low bias and high variance

10. Given a dataset, if you increase the complexity of your model (e.g., by adding more features),
what typically happens to bias and variance?
A) Both bias and variance increase
B) Bias increases and variance decreases
C) Bias decreases and variance increases
D) Both bias and variance decrease
Answer: C) Bias decreases and variance increases

MSQ
1. Which of the following are assumptions of simple linear regression?
A) Linearity
B) Homoscedasticity
C) Independence of errors
D) Multicollinearity
Answer: A) Linearity, B) Homoscedasticity, C) Independence of errors

2. In a simple linear regression Y = beta_0 + beta_1X + e, which of the following statements are true?
A) beta_0 is the intercept
B) beta_1 is the slope
C) e represents the residual error
D) X is the dependent variable
Answer: A) beta_0 is the intercept, B) beta_1 is the slope, C) e represents the residual error

3. In multiple linear regression, which of the following are true regarding the coefficients?
A) Each coefficient represents the change in the dependent variable for a one-unit change in the
corresponding independent variable, holding other variables constant.
B) Multicollinearity can make the coefficients unstable.
C) Coefficients are estimated using the method of maximum likelihood.
D) Coefficients are estimated using the method of least squares.
Answer: A) Each coefficient represents the change in the dependent variable for a one-unit change
in the corresponding independent variable, holding other variables constant, B) Multicollinearity can
make the coefficients unstable, D) Coefficients are estimated using the method of least squares

4. Which of the following can indicate the presence of multicollinearity in a multiple linear regression
model?
A) High variance inflation factor (VIF) values
B) Large changes in the estimated coefficients when adding or removing a variable
C) High correlation between independent variables
D) High R-squared value
Answer: A) High variance inflation factor (VIF) values, B) Large changes in the estimated
coefficients when adding or removing a variable, C) High correlation between independent variables

5. Which of the following are steps in hypothesis testing for regression coefficients?
A) Formulate the null and alternative hypotheses
B) Calculate the test statistic
C) Determine the p-value or critical value
D) Make a decision to reject or not reject the null hypothesis
Answer: A) Formulate the null and alternative hypotheses, B) Calculate the test statistic, C)
Determine the p-value or critical value, D) Make a decision to reject or not reject the null hypothesis

6. In the context of hypothesis testing, a Type I error occurs when:
A) The null hypothesis is true, but we reject it
B) The null hypothesis is false, but we fail to reject it
C) We incorrectly reject the null hypothesis
D) We incorrectly fail to reject the null hypothesis
Answer: A) The null hypothesis is true, but we reject it, C) We incorrectly reject the null hypothesis

7. High bias in a model typically results in:
A) Overfitting
B) Underfitting
C) Poor performance on training data
D) Poor performance on test data
Answer: B) Underfitting, C) Poor performance on training data, D) Poor performance on test data

8. High variance in a model typically results in:
A) Overfitting
B) Underfitting
C) Sensitivity to small fluctuations in the training data
D) Better generalization to new data
Answer: A) Overfitting, C) Sensitivity to small fluctuations in the training data

9. Which of the following strategies can help to reduce overfitting in a regression model?
A) Use cross-validation
B) Increase the complexity of the model
C) Use regularization techniques such as Lasso or Ridge
D) Reduce the size of the training dataset
Answer: A) Use cross-validation, C) Use regularization techniques such as Lasso or Ridge

10. In the context of the bias-variance tradeoff, which of the following statements are true?
A) Increasing model complexity decreases bias but increases variance
B) Decreasing model complexity increases bias but decreases variance
C) The optimal model is one that balances bias and variance
D) The bias-variance tradeoff is irrelevant for model selection
Answer: A) Increasing model complexity decreases bias but increases variance, B) Decreasing
model complexity increases bias but decreases variance, C) The optimal model is one that balances
bias and variance

NAT
1. In a simple linear regression model Y = 5X + 2, ___________ is the predicted value of Y when X =
3
Answer: 17

2. Given the simple linear regression equation Y = 3X + 7, and a data point (X, Y) = (4, 19), the
residual error for this data point is ______.
Answer: 0 (the predicted value is 3(4) + 7 = 19, so the residual is 19 - 19 = 0)

3. In a multiple linear regression model Y = 2 + 3X_1 - 4X_2, _______ is the predicted value of Y
when X_1 = 2 and X_2 = 1.
Answer: 4

4. Suppose in a multiple linear regression model Y = 1 + 2X_1 + 3X_2 + 4X_3, the values X_1 = 1,
X_2 = 2, and X_3 = 3 are given. _________ is the predicted value of Y.
Answer: 21 (1 + 2 + 6 + 12)

5. In a simple linear regression model, if the t-statistic for a coefficient beta_1 is 2.5 and the standard
error is 0.4, _______ is the value of beta_1.
Answer: 1

6. For a regression model, if the p-value for the F-test is 0.03, at ________ significance level (in
percentage) we reject the null hypothesis.
Answer: 5

7. If the mean squared error (MSE) of a model is 25 and the variance of the model is 16, ______ is
the squared bias (assume the irreducible error is 0).
Answer: 9
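Under the decomposition MSE = bias^2 + variance + irreducible error (with the irreducible error taken as 0 here), the arithmetic is:

```python
mse = 25
variance = 16
irreducible = 0          # assumed zero, as in the question

bias_squared = mse - variance - irreducible
print(bias_squared)  # 9
```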

8. In a dataset, the actual values are [3, 5, 7, 9], and the predicted values from a linear regression
model are [2.5, 5.5, 6.5, 9.5]. _________ is the mean squared error (MSE).
Answer: 0.25
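The MSE computation can be checked directly:

```python
actual = [3, 5, 7, 9]
predicted = [2.5, 5.5, 6.5, 9.5]

# Mean of the squared differences; every error here is +/- 0.5
mse = sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
print(mse)  # 0.25
```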

9. A regression model has a bias of 2 and a variance of 3. __________ is the total expected error if
the irreducible error is 1. (Total error = bias^2 + variance + irreducible error.)
Answer: 8

10. In a regression model, the sum of squared errors (SSE) is 50 and the total sum of squares (SST) is
100. _________ is the R-squared value of the model.
Answer: 0.5
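R-squared follows directly from its definition, R^2 = 1 - SSE/SST:

```python
sse = 50
sst = 100

r_squared = 1 - sse / sst
print(r_squared)  # 0.5
```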
