Machine Learning MCQ (Multiple Choice Questions)
Here are 1000 MCQs on Machine Learning (Chapterwise).
1. What is Machine learning?
a) The selective acquisition of knowledge through the use of computer programs
b) The selective acquisition of knowledge through the use of manual programs
c) The autonomous acquisition of knowledge through the use of computer programs
d) The autonomous acquisition of knowledge through the use of manual programs
Answer: c
Explanation: Machine learning is the autonomous acquisition of knowledge through the
use of computer programs.
2. K-Nearest Neighbors (KNN) is classified as what type of machine learning algorithm?
a) Instance-based learning
b) Parametric learning
c) Non-parametric learning
d) Model-based learning
Answer: a
Explanation: KNN doesn’t build a parametric model of the data. Instead, it directly
classifies new data points based on the k nearest points in the training data.
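To make the instance-based idea concrete, here is a minimal sketch using scikit-learn and a small made-up dataset; it is only an illustration, and the coordinates and labels are hypothetical.

```python
# Minimal sketch of instance-based learning with KNN (hypothetical toy data).
from sklearn.neighbors import KNeighborsClassifier

X_train = [[1.0, 1.1], [1.2, 0.9], [8.0, 8.2], [7.9, 8.1]]  # instances are stored as-is
y_train = [0, 0, 1, 1]

knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)          # "training" essentially just stores the instances
print(knn.predict([[7.5, 8.0]]))   # classified by its 3 nearest stored points -> [1]
```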
3. Which of the following is not a supervised machine learning algorithm?
a) K-means
b) Naïve Bayes
c) SVM for classification problems
d) Decision tree
Answer: a
Explanation: Decision tree, SVM (Support Vector Machines) for classification problems, and Naïve Bayes are examples of supervised machine learning algorithms. K-means is an example of an unsupervised machine learning algorithm.
4. What’s the key benefit of using deep learning for tasks like recognizing images?
a) They need less training data than other methods.
b) They’re easier to explain and understand than other models.
c) They can learn complex details from the data on their own.
d) They work faster and are more efficient computationally.
Answer: c
Explanation: Deep learning is great at figuring out intricate details from data, especially
in tasks like recognizing images.
5. Which algorithm is best suited for a binary classification problem?
a) K-nearest Neighbors
b) Decision Trees
c) Random Forest
d) Linear Regression
Answer: b
Explanation: Decision Trees are versatile and can be used for classification problems,
particularly for binary classification, where the output is divided into two classes.
6. What is the key difference between supervised and unsupervised learning?
a) Supervised learning requires labeled data, while unsupervised learning does not.
b) Supervised learning predicts labels, while unsupervised learning discovers patterns.
c) Supervised learning is used for classification, while unsupervised learning is used for
regression.
d) Supervised learning is always more accurate than unsupervised learning.
Answer: a
Explanation: The presence or absence of labeled data in the training set distinguishes
supervised and unsupervised learning approaches.
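As an illustration of this distinction, the following sketch (scikit-learn, made-up one-dimensional data) fits a supervised model that requires labels and an unsupervised model that does not; the numbers are purely hypothetical.

```python
# Supervised vs. unsupervised: labels y are needed only in the first case.
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X = [[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]]
y = [0, 0, 0, 1, 1, 1]                       # labels available -> supervised learning

clf = LogisticRegression().fit(X, y)         # learns to predict the given labels
km = KMeans(n_clusters=2, n_init=10).fit(X)  # no labels -> discovers 2 clusters itself
print(clf.predict([[4.8]]), km.labels_)
```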
7. Which type of machine learning algorithm falls under the category of “unsupervised
learning”?
a) Linear Regression
b) K-means Clustering
c) Decision Trees
d) Random Forest
Answer: b
Explanation: K-means Clustering is an example of unsupervised learning used for
clustering unlabeled data based on similarities.
8. Which of the following statements is false about AdaBoost?
a) It is particularly prone to overfitting on noisy datasets
b) Complexity of the weak learner is important in AdaBoost
c) It is generally more prone to overfitting
d) It improves classification accuracy
Answer: c
Explanation: AdaBoost is generally not more prone to overfitting; it is in fact less prone to overfitting than many methods, although it can overfit on noisy datasets. If very simple weak learners are used, the algorithm is much less prone to overfitting and it improves classification accuracy, so the complexity of the weak learner is important in AdaBoost.
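The role of weak-learner complexity can be illustrated with a short scikit-learn sketch that uses depth-1 decision stumps as the weak learners; the dataset is synthetic, and the estimator keyword assumes scikit-learn 1.2 or later (older releases call it base_estimator).

```python
# Sketch: AdaBoost built on simple weak learners (depth-1 decision stumps).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=0)
stump = DecisionTreeClassifier(max_depth=1)                 # deliberately simple weak learner
ada = AdaBoostClassifier(estimator=stump, n_estimators=50, random_state=0)
print(ada.fit(X, y).score(X, y))                            # boosted training accuracy
```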
9. Which one of the following models is a generative model used in machine learning?
a) Support vector machines
b) Naïve Bayes
c) Logistic Regression
d) Linear Regression
Answer: b
Explanation: Naïve Bayes is a generative model used in machine learning. Linear Regression, Logistic Regression, and Support Vector Machines are discriminative models used in machine learning.
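One way to see the generative/discriminative contrast is that Gaussian naïve Bayes estimates per-class feature distributions, while logistic regression only learns a decision boundary. The sketch below, with made-up Gaussian data, simply prints what each fitted model has learned.

```python
# Generative vs. discriminative: what the fitted models actually store.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

gnb = GaussianNB().fit(X, y)
print(gnb.theta_)                            # class-conditional feature means (generative)
print(LogisticRegression().fit(X, y).coef_)  # just the weights of a decision boundary
```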
10. An artificially intelligent car decreases its speed based on its distance from the car in
front of it. Which algorithm is used?
a) Naïve-Bayes
b) Decision Tree
c) Linear Regression
d) Logistic Regression
Answer: c
Explanation: The output is numerical; it determines the speed of the car, so this is not a classification problem. Decision tree, naïve Bayes, and logistic regression are all classification algorithms. Linear regression, on the other hand, outputs numerical values based on the input, so it can be used here.
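A minimal sketch of this setup, with made-up distance/speed pairs, fits a linear regression that maps the distance to the car in front to a numeric target speed.

```python
# Sketch: speed is a numeric output, so a regression model fits the task.
from sklearn.linear_model import LinearRegression

distance_m = [[5], [10], [20], [40], [80]]   # distance to the car in front (made-up values)
speed_kmh = [10, 20, 35, 60, 90]             # corresponding target speeds (made-up values)

reg = LinearRegression().fit(distance_m, speed_kmh)
print(reg.predict([[30]]))                   # predicted speed at a 30 m gap
```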
11. Which of the following statements is false about Ensemble learning?
a) It is a supervised learning algorithm
b) It is an unsupervised learning algorithm
c) More random algorithms can be used to produce a stronger ensemble
d) Ensembles can be shown to have more flexibility in the functions they can represent
Answer: b
Explanation: Ensemble learning is not an unsupervised learning algorithm. It is a supervised learning algorithm that combines several machine learning techniques into one predictive model in order to decrease variance and bias. It can be trained and then used to make predictions. And ensembles can be shown to have more flexibility in the functions they can represent.
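As a small illustration of combining several supervised models into one predictive model, the sketch below builds a simple voting ensemble with scikit-learn on a synthetic dataset; it is only one of many possible combination schemes.

```python
# Sketch: an ensemble combines several trained models into a single predictor.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=0)
ensemble = VotingClassifier([
    ("lr", LogisticRegression(max_iter=1000)),
    ("nb", GaussianNB()),
    ("dt", DecisionTreeClassifier(max_depth=3)),
])
print(ensemble.fit(X, y).score(X, y))        # majority vote of the three base models
```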
12. Which of the following statements is true about stochastic gradient descent?
a) It processes one training example per iteration
b) It is not preferred, if the number of training examples is large
c) It processes all the training examples for each iteration of gradient descent
d) It is computationally very expensive, if the number of training examples is large
Answer: a
Explanation: Stochastic gradient descent processes one training example per iteration; that is, it updates the weight vector based on one data point at a time. The other three statements describe batch gradient descent.
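The per-example versus per-epoch update can be shown with a small NumPy sketch for a one-parameter least-squares problem; the data are synthetic and the learning rate is arbitrary.

```python
# Sketch: stochastic vs. batch gradient descent for a single-weight least-squares fit.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 100)
y = 3.0 * x + rng.normal(0, 0.1, 100)   # true slope is about 3
lr = 0.1

w_sgd = 0.0
for xi, yi in zip(x, y):                # SGD: update after every single example
    w_sgd += lr * (yi - w_sgd * xi) * xi

w_batch = 0.0
for _ in range(100):                    # batch GD: use all examples for each update
    w_batch += lr * np.mean((y - w_batch * x) * x)

print(w_sgd, w_batch)                   # both estimates approach ~3
```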
13. A decision tree uses the inductive learning approach of machine learning.
a) False
b) True
Answer: b
Explanation: A decision tree uses the inductive machine learning approach. Inductive learning enables the system to recognize patterns and regularities in previous knowledge or training data and to extract general rules from them. A decision tree is considered an inductive learning task because it uses particular facts to draw more generalized conclusions.
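To see the "particular facts to general rules" idea in practice, the sketch below fits a shallow scikit-learn decision tree on the Iris dataset and prints the induced if/then rules.

```python
# Sketch: a decision tree induces general if/then rules from specific training examples.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree))   # the generalized rules learned from the individual facts
```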
14. What elements describe the Candidate-Elimination algorithm?
a) depends on the dataset
b) just a set of candidate hypotheses
c) just a set of instances
d) set of instances, set of candidate hypotheses
Answer: d
Explanation: A set of instances is required, and a set of candidate hypotheses is given. These are applied to the training data, and the list of hypotheses consistent with the training data is output in accordance with the Candidate-Elimination algorithm.
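A heavily simplified sketch of one half of the idea is shown below: it updates only the specific boundary S over conjunctive hypotheses (essentially a Find-S step), whereas the full Candidate-Elimination algorithm also maintains the general boundary G. The attribute values are made up.

```python
# Simplified sketch: updating the specific boundary S over conjunctive hypotheses.
# The full Candidate-Elimination algorithm also maintains a general boundary G.
instances = [
    (("sunny", "warm", "normal"), True),    # made-up training instances (attributes, label)
    (("sunny", "warm", "high"), True),
    (("rainy", "cold", "high"), False),
]

S = None                                    # most specific hypothesis seen so far
for attrs, label in instances:
    if not label:
        continue                            # S is generalized only on positive examples
    if S is None:
        S = list(attrs)
    else:
        S = [s if s == a else "?" for s, a in zip(S, attrs)]

print(S)   # -> ['sunny', 'warm', '?']
```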
15. Which of the following statements is not true about boosting?
a) It mainly increases the bias and the variance
b) It tries to generate complementary base-learners by training the next learner on the
mistakes of the previous learners
c) It is a technique for solving two-class classification problems
d) It uses the mechanism of increasing the weights of misclassified data in preceding
classifiers
Answer: a
Explanation: Boosting does not increase the bias and the variance; it mainly reduces them. It is a technique for solving two-class classification problems, and it tries to generate complementary base-learners by training the next learner (by increasing the weights) on the mistakes (misclassified data) of the previous learners.
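The reweighting mechanism can be sketched in a few lines: misclassified points get larger weights so that the next weak learner concentrates on the previous learner's mistakes. This is a simplified illustration on synthetic data, not the exact AdaBoost weight formula.

```python
# Sketch of the boosting idea: up-weight the mistakes of the previous weak learner.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=0)
weights = np.full(len(y), 1.0 / len(y))     # start with uniform example weights

for _ in range(5):
    stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=weights)
    wrong = stump.predict(X) != y           # mistakes of the current weak learner
    weights[wrong] *= 2.0                   # simplified up-weighting of misclassified data
    weights /= weights.sum()                # renormalize so weights stay a distribution
```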
Chapterwise Multiple Choice Questions on Machine Learning
Our 1000+ MCQs focus on all topics of the Machine Learning
subject, covering 100+ topics. This will help you to prepare for
exams, contests, online tests, quizzes, viva-voce, interviews, and
certifications. You can practice these MCQs chapter by chapter
starting from the 1st chapter or you can jump to any chapter of
your choice.
1. Formal Learning Model
2. Version Spaces
3. VC-Dimension
4. Linear Regression
5. Multivariate Linear Regression
6. Logistic Regression
7. Ensemble Learning
8. Stochastic Gradient Descent
9. Kernels
10. Support Vector Machines
11. Decision Trees
12. Nearest Neighbor
13. Naive-Bayes Algorithm
14. Machine Learning – Neural Networks
1. MCQ on Formal Learning Model
The section contains multiple choice questions and answers on statistical learning framework, empirical minimization framework and PAC learning.
Statistical Learning Framework
Empirical Minimization Framework
PAC Learning
2. Machine Learning MCQ on Version Spaces
The section contains questions and answers on version spaces, find-s algorithm and
candidate elimination algorithm.
Version Spaces
Find-S Algorithm
Candidate Elimination Algorithm
3. Machine Learning Multiple Choice Questions on VC-Dimension
The section contains Machine Learning MCQs on VC-dimension and the Fundamental
Theorem of PAC Learning.
VC-Dimension
VC-Dimension – Set 2
Fundamental Theorem of PAC Learning
4. MCQ on Linear Regression
The section contains Machine Learning multiple choice questions and answers on linear
regression in machine learning, linear regression cost functions, and gradient descent.
Linear Regression in Machine Learning
Linear Regression – Cost Function
Linear Regression – Gradient Descent
5. Machine Learning MCQ on Multivariate Linear Regression
The section contains Machine Learning questions and answers on multivariate linear regression, gradient descent for multiple variables, and polynomial regression.
Multivariate Linear Regression
Gradient Descent for Multiple Variables
Polynomial Regression in Machine Learning
6. Machine Learning Multiple Choice Questions on Logistic Regression
This section features MCQs on logistic regression, hypothesis representation, decision
boundary, cost function and gradient descent, logistic regression for multiple classification,
and advanced optimization.
Logistic Regression
Hypothesis Representation
Logistic Regression – Decision Boundary
Logistic Regression – Multiple Classification
Logistic Regression – Cost Function and Gradient Descent
Logistic Regression – Advanced Optimization
7. MCQ on Ensemble Learning
The section contains multiple choice questions and answers on ensemble learning,
covering error-correcting output codes, model combination schemes, boosting weak
learnability, the AdaBoost algorithm, and stacking.
Ensemble Learning
Ensemble Learning – Model Combination Schemes
Error Correcting Output Codes
Boosting Weak Learnability
Adaboost Algorithm
Stacking
8. Machine Learning MCQ on Stochastic Gradient Descent
The section contains questions and answers on optimization algorithms, specifically focusing on Stochastic Gradient Descent (SGD), its variants, the standard Gradient Descent Algorithm, and Subgradient Descent.
Gradient Descent Algorithm
Subgradient Descent
Stochastic Gradient Descent
Stochastic Gradient Descent – Set 2
SGD Variants
9. Machine Learning Multiple Choice Questions on Kernels
The section contains Machine Learning MCQs on kernels and kernel trick.
Kernels
Kernel Trick
10. MCQ on Support Vector Machines
The section contains multiple choice questions and answers on support vector machines
(SVMs), covering key concepts like the large margin intuition, margins and hard/soft SVMs,
norm regularization, optimality conditions and support vectors, and finally, implementing
soft SVMs using Stochastic Gradient Descent (SGD).
Support Vector Machines
Large Margin Intuition
Margin and Hard SVM
Soft SVM and Norm Regularization
Optimality Conditions and Support Vectors
Implementing Soft SVM with SGD
11. Machine Learning MCQ on Decision Trees
The section contains questions and answers on decision trees, covering core concepts
such as decision tree pruning, inductive bias, classification trees, regression trees, and the
powerful Random Forest algorithm.
Decision Trees
Decision Trees – Gain Measure
Regression Trees
Decision Tree Pruning
Decision Tree Pruning – Set 2
Decision Trees – Threshold Based Splitting Rules
Decision Trees – Inductive Bias
Classification Tree Implementation
Random Forest Algorithm
12. Machine Learning Multiple Choice Questions on Nearest Neighbor
The section contains MCQs on K-Nearest Neighbor Algorithm and Nearest Neighbor
Analysis.
K-Nearest Neighbor Algorithm
Nearest Neighbor Analysis
13. MCQ on Naive-Bayes Algorithm
The section contains multiple choice questions and answers on Naive-Bayes Algorithm.
Naive-Bayes Algorithm
14. Neural Networks in Machine Learning
The section contains multiple choice questions and answers on nonlinear hypothesis,
neurons and the brain, model representation, multiclass classification, cost function,
gradient checking, and random initialization.
Backpropagation Algorithm
Backpropagation Algorithm – 2
Backpropagation Algorithm – 3
Non-Linear Hypothesis
Neurons and the Brain
Model Representation
Multiclass Classification
Cost Function
Gradient Checking
Random Initialization
If you would like to learn "Machine Learning" thoroughly, you should attempt to work on
the complete set of 1000+ MCQs - multiple choice questions and answers mentioned
above. It will immensely help anyone trying to crack an exam or an interview.
Wish you the best in your endeavor to learn and master Machine Learning!
If you find a mistake in question / option / answer, kindly take a screenshot and
email to [email protected]