Linear Regression

1. What is Linear Regression primarily used for?

A) Classification

B) Clustering

C) Regression

D) Dimensionality Reduction

Answer: C) Regression

2. Which of the following is the correct formula for Simple Linear Regression?

A) y=mx+b

B) y=mx−b

C) x=my+b

D) x=my−b

Answer: A) y=mx+b

3. In linear regression, what does the term "residual" mean?

A) Coefficient of the variable

B) Estimated regression coefficients

C) Difference between observed and predicted value

D) None of the above

Answer: C) Difference between observed and predicted value
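For illustration, here is a minimal sketch of computing residuals for a handful of points; the observed and predicted values below are assumed numbers, not taken from any dataset referenced in these questions.

```python
import numpy as np

# Illustrative observed values and model predictions (assumed numbers)
observed = np.array([3.1, 4.9, 7.2])
predicted = np.array([3.0, 5.0, 7.0])

# Residual = observed value minus predicted value
residuals = observed - predicted
print(residuals)  # [ 0.1 -0.1  0.2]
```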

4. Which method is used to find the best-fitting line in Simple Linear Regression?

A) Gradient Descent

B) Mean Squared Error

C) Least Squares Method

D) Maximum Likelihood Estimation

Answer: C) Least Squares Method
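As a rough sketch of the least squares method, the slope m and intercept b that minimize the sum of squared residuals have a closed-form solution; the toy data below is assumed purely for illustration.

```python
import numpy as np

# Toy data (assumed values): y is roughly 2x + 1 plus noise
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([3.1, 4.9, 7.2, 9.0, 10.8])

# Closed-form least-squares estimates of slope m and intercept b
m = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b = y.mean() - m * x.mean()
print(f"slope m = {m:.3f}, intercept b = {b:.3f}")
```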


5. What does R-squared represent in the context of linear regression?

A) Slope of the line

B) Proportion of the variance in the dependent variable that is predictable from the independent
variables

C) Sum of squared residuals

D) Mean of the dependent variable

Answer: B) Proportion of the variance in the dependent variable that is predictable from the
independent variables
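A minimal sketch of how R-squared can be computed from residuals, assuming toy observed and predicted arrays (the numbers are illustrative):

```python
import numpy as np

y = np.array([3.1, 4.9, 7.2, 9.0, 10.8])       # observed values (assumed)
y_pred = np.array([3.0, 5.0, 7.0, 9.0, 11.0])  # model predictions (assumed)

ss_res = np.sum((y - y_pred) ** 2)    # sum of squared residuals
ss_tot = np.sum((y - y.mean()) ** 2)  # total variation around the mean
r_squared = 1 - ss_res / ss_tot
print(f"R^2 = {r_squared:.4f}")
```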

6. In the equation y=mx+b, what does 'm' represent?

A) Intercept

B) Slope

C) Residual

D) Variance

Answer: B) Slope

7. Which assumption ensures that the model's errors are normally distributed?

A) Homoscedasticity

B) Multicollinearity

C) Normality of Error Terms

D) Independence of errors

Answer: C) Normality of Error Terms

8. What will be the value of R-squared if all the predicted values exactly match the observed values?

A) 0

B) 1

C) 0.5

D) It can be any value

Answer: B) 1

9. Which of the following metrics can be used to evaluate a linear regression model?

A) Accuracy

B) Precision

C) Mean Squared Error

D) F1-Score

Answer: C) Mean Squared Error
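Mean Squared Error is simply the average of the squared residuals. A small sketch with assumed values:

```python
import numpy as np

y_true = np.array([3.1, 4.9, 7.2, 9.0, 10.8])   # assumed observed values
y_pred = np.array([3.0, 5.0, 7.0, 9.0, 11.0])   # assumed predictions

mse = np.mean((y_true - y_pred) ** 2)  # average squared residual
print(f"MSE = {mse:.4f}")
```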

10. What happens if the linear regression model is trained with multicollinear features?

A) Model Training becomes faster

B) Coefficients become unstable

C) Residuals become zero

D) R-squared value becomes negative

Answer: B) Coefficients become unstable

11. Which of the following is not a type of linear regression?

A) Simple Linear Regression

B) Multiple Linear Regression

C) Polynomial Linear Regression

D) K-Nearest Neighbors Regression

Answer: D) K-Nearest Neighbors Regression

12. In multiple linear regression, what does the equation look like?

A) y=m₁x₁+m₂x₂+…+mₙxₙ+b

B) y=m₁x₁−m₂x₂+…+mₙxₙ+b

C) y=m₁x₁+m₂x₂+…+mₙxₙ−b

D) y=m₁⋅m₂⋅…⋅mₙ+b

Answer: A) y=m₁x₁+m₂x₂+…+mₙxₙ+b
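A minimal sketch of fitting a multiple linear regression, assuming scikit-learn is available; the two-feature data below is synthetic and purely illustrative:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data: target is roughly 2*x1 + 3*x2 + 1
X = np.array([[1, 2], [2, 1], [3, 4], [4, 3], [5, 5]], dtype=float)
y = 2 * X[:, 0] + 3 * X[:, 1] + 1

model = LinearRegression().fit(X, y)
print("slopes m1..mn:", model.coef_)     # one coefficient per feature
print("intercept b:", model.intercept_)
```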

13. Which of the following graphs best represents the residuals in a well-fitted linear regression
model?

A) U-Shaped Pattern

B) Horizontal Line

C) Horizontal Random Scatter

D) Exponential Growth

Answer: C) Horizontal Random Scatter

14. Ridge Regression is a technique for dealing with what problem in linear regression?

A) Underfitting

B) Overfitting

C) High Bias

D) High Variance

Answer: B) Overfitting

15. In Ridge Regression, what happens as the regularization parameter approaches infinity?

A) Coefficients approach infinity

B) Coefficients approach zero

C) R-squared approaches one


D) R-squared approaches zero

Answer: B) Coefficients approach zero
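The shrinking effect of the L2 penalty can be seen by refitting with increasingly large regularization parameters. A sketch assuming scikit-learn and synthetic data:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=100)

# As alpha (the regularization strength) grows, coefficients shrink toward zero
for alpha in [0.01, 1.0, 100.0, 1e6]:
    coefs = Ridge(alpha=alpha).fit(X, y).coef_
    print(f"alpha={alpha:>10}: {np.round(coefs, 4)}")
```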

16. What is the primary difference between Ridge Regression and Lasso Regression?

A) Ridge uses L1 regularization, Lasso uses L2

B) Ridge uses L2 regularization, Lasso uses L1

C) Both use L1 regularization

D) Both use L2 regularization

Answer: B) Ridge uses L2 regularization, Lasso uses L1

17. In Linear Regression, what is the consequence of heteroscedasticity?

A) Inefficient Estimators

B) Biased Estimators

C) Inconsistent Estimators

D) Unbiased Estimators

Answer: A) Inefficient Estimators

18. Which of the following is not an assumption of Linear Regression?

A) Linearity

B) Homoscedasticity

C) Independence of observations

D) All variables must be normally distributed

Answer: D) All variables must be normally distributed

19. If you plot the residuals against the predicted values, what pattern would you expect to see for a
well-fitted linear regression model?

A) A clear linear pattern

B) A clear exponential pattern

C) A clear quadratic pattern

D) No clear pattern

Answer: D) No clear pattern


20. In a well-fitted linear regression model, the residuals will have a mean of:

A) 1

B) 0

C) -1

D) 10

Answer: B) 0
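This can be checked directly: when a line with an intercept is fitted by least squares, the residuals average to (numerically) zero. A short sketch with assumed toy data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([3.1, 4.9, 7.2, 9.0, 10.8])

# Fit a degree-1 polynomial (a line) by least squares
m, b = np.polyfit(x, y, 1)
residuals = y - (m * x + b)
print(f"mean residual = {residuals.mean():.2e}")  # effectively zero
```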

21. What is a potential issue with having outliers in a dataset when performing linear regression?

A) They can unduly influence the slope and intercept

B) They make the residuals non-normal

C) They decrease the R-squared value

D) They increase the model's accuracy

Answer: A) They can unduly influence the slope and intercept

22. How can you check the linearity assumption in linear regression?

A) By plotting residuals against predicted values

B) By checking the R-squared value

C) By performing Ridge Regression

D) By plotting the dependent variable against the independent variable

Answer: D) By plotting the dependent variable against the independent variable

23. In the context of linear regression, what is collinearity?

A) The phenomenon where predictor variables are in a linear relationship with each other

B) The linear relationship between predictor and response variable

C) When residuals are in a straight line

D) When all predictor variables have a strong linear relationship with the response variable

Answer: A) The phenomenon where predictor variables are in a linear relationship with each other

24. How can you detect multicollinearity?

A) Low R-squared value


B) Visualizations like scatter plots

C) Variance Inflation Factor (VIF)

D) High Mean Squared Error

Answer: C) Variance Inflation Factor (VIF)
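The Variance Inflation Factor for a predictor is 1 / (1 − R²), where R² comes from regressing that predictor on all the others. A sketch assuming scikit-learn, with one deliberately collinear feature:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.1, size=200)  # nearly collinear with x1
x3 = rng.normal(size=200)
X = np.column_stack([x1, x2, x3])

# VIF(x_j) = 1 / (1 - R^2) from regressing x_j on the remaining predictors
for j in range(X.shape[1]):
    others = np.delete(X, j, axis=1)
    r2 = LinearRegression().fit(others, X[:, j]).score(others, X[:, j])
    print(f"VIF(x{j + 1}) = {1.0 / (1.0 - r2):.2f}")  # values well above 10 flag multicollinearity
```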

25. Which of the following techniques can be used to address multicollinearity?

A) Increasing the dataset size

B) Removing one of the correlated predictors

C) Adding more predictors

D) Using a non-linear regression model

Answer: B) Removing one of the correlated predictors

26. The coefficients in a Linear Regression model are estimated using:

A) Maximum Likelihood Estimation

B) Gradient Descent

C) Both A and B

D) Neither A nor B

Answer: C) Both A and B

27. Elastic Net regression is a combination of:

A) Ridge Regression and Logistic Regression

B) Lasso Regression and Polynomial Regression

C) Ridge Regression and Lasso Regression

D) Lasso Regression and Logistic Regression

Answer: C) Ridge Regression and Lasso Regression

28. Which of the following is true for the coefficients of variables in Lasso Regression?

A) They can be zero

B) They are always positive

C) They are always negative

D) They are never zero


Answer: A) They can be zero
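The L1 penalty can drive coefficients of irrelevant features exactly to zero, which is why Lasso is often used for feature selection. A sketch assuming scikit-learn and synthetic data where only two of five features matter:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=200)  # only x1, x2 matter

lasso = Lasso(alpha=0.1).fit(X, y)
print(np.round(lasso.coef_, 3))  # expect exact zeros for the three unused features
```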

29. A high bias in a linear regression model indicates:

A) The model fits the training data very well but performs poorly on unseen data

B) The model doesn't fit the training data well and also performs poorly on unseen data

C) The model fits both the training and test data perfectly

D) The model's complexity is too high

Answer: B) The model doesn't fit the training data well and also performs poorly on unseen data

30. Adjusted R-squared:

A) Increases as more predictor variables are added

B) Always decreases as more predictor variables are added

C) Can either increase or decrease as more predictor variables are added, depending on their
relevance

D) Remains constant irrespective of the number of predictor variables

Answer: C) Can either increase or decrease as more predictor variables are added, depending on
their relevance
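Adjusted R-squared penalizes the number of predictors p relative to the sample size n: adjusted R² = 1 − (1 − R²)(n − 1)/(n − p − 1). A small sketch with assumed values showing how adding a weak predictor can lower it even though plain R² rises slightly:

```python
def adjusted_r_squared(r2: float, n: int, p: int) -> float:
    """Adjusted R^2 for n samples and p predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Assumed example: adding a fourth, barely useful predictor raises R^2 a little...
print(round(adjusted_r_squared(0.850, n=50, p=3), 4))  # 0.8402
print(round(adjusted_r_squared(0.851, n=50, p=4), 4))  # ...but adjusted R^2 drops: 0.8378
```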

These questions cover basic to intermediate concepts in Linear Regression.
