Mid 2
1. What are neuron models? Discuss the biological neuron, the artificial neuron, and the
mathematical model.
2. Explain Multi-Layer Perceptron (MLP) networks and the Error-Backpropagation
Algorithm.
3. What is a Classification Decision Tree? Explain with an example of a classification
decision tree.
4. Explain Fisher’s Linear Discriminant and Thresholding for Classification.
5. Discuss the strengths and weaknesses of the decision-tree approach.
6. For a given example data set, construct an ID3 decision tree and show the calculations.
7. Define the Perceptron. Explain the Perceptron Algorithm.
8. Explain the Linear Maximal Margin Classifier for Linearly Separable Data.
9. Explain Linear Discriminant Functions for Binary Classification.
10. Discuss Kernel-Induced Feature Spaces.
11. Explain Regression by Support Vector Machines.
12. What is Pruning the Tree? Explain.
30. In _______, some measure of approximation error is typically used instead of the margin
between an optimal separating hyperplane and support vectors, which was used in the design
of SV classifiers.
General:
31. In machine learning, when might SVMs be preferred over neural networks? a. When
interpretability is crucial b. When working with images c. When the dataset is large d. Both a
and c
Answer: d.
32. What is a disadvantage of neural networks compared to SVMs? a. They are less prone
to overfitting b. They require more data c. They are simpler to interpret d. They are faster to
train
Answer: b
33. Which of the following is a common use case for SVMs? a. Image classification b.
Natural language processing c. Anomaly detection d. Speech recognition
Answer: c.
34. What is the purpose of cross-validation in machine learning? a. To train a model on
multiple datasets b. To optimize hyperparameters c. To validate model performance on a
single dataset d. To replace the test set
Answer: b
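Illustrative sketch (assumes scikit-learn; the data set and the candidate C values are placeholders, not from the question): 5-fold cross-validation used to choose a hyperparameter.

from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
param_grid = {"C": [0.1, 1, 10]}          # candidate hyperparameter values

# GridSearchCV runs 5-fold cross-validation for every candidate C and
# keeps the value with the best mean validation score.
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)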
35. Which algorithm is sensitive to outliers in the training data? a. SVM b. Neural
network c. Both d. Neither
Answer: a.
36. What is the activation function commonly used in the output layer of a binary
classification neural network? a. ReLU b. Sigmoid c. Tanh d. Softmax
Answer: b.
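Illustrative sketch (NumPy; the logit values are invented): the sigmoid squashes a logit into (0, 1), so the output can be read as the probability of the positive class.

import numpy as np

def sigmoid(z):
    # Maps any real-valued logit into the interval (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

logits = np.array([-2.0, 0.0, 3.0])   # example pre-activation outputs
print(sigmoid(logits))                # approx. [0.12, 0.50, 0.95]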
37. What does the term "overfitting" mean in the context of machine learning? a. The
model is too simple and cannot capture patterns in the data. b. The model performs well on
the training set but poorly on new data. c. The model is too complex and fits the noise in the
training data. d. The model is unable to converge during training.
Answer: c.
38. What is the primary advantage of using a radial basis function (RBF) kernel in SVM?
a. It is computationally efficient. b. It allows the model to handle non-linear relationships. c.
It reduces the risk of overfitting. d. It simplifies the interpretability of the model.
Answer: b.
39. Which of the following statements is true about neural networks? a. They are always
interpretable. b. They require fewer computational resources compared to SVMs. c. They
automatically learn hierarchical representations from data. d. They are not suitable for tasks
with non-linear relationships.
Answer: c.
40. In neural networks, what is the purpose of the activation function? a. To introduce
non-linearity b. To reduce the number of nodes c. To control the learning rate d. To
determine the kernel type
Answer: a.
41. What is the primary disadvantage of using a linear kernel in SVM? a. It cannot handle
non-linear relationships. b. It is computationally expensive. c. It is prone to overfitting. d. It
requires more training data.
Answer: a.
42. Which hyperparameter in SVM determines the trade-off between achieving a smooth
decision boundary and classifying training points correctly? a. C b. Gamma c. Kernel d.
Margin
Answer: a. C
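Illustrative sketch (scikit-learn; the toy data and the two C values are arbitrary): a small C tolerates more margin violations, while a large C tries to classify every training point correctly.

from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)

soft = SVC(kernel="linear", C=0.01).fit(X, y)   # smoother boundary, wider margin
hard = SVC(kernel="linear", C=100).fit(X, y)    # fits training points more tightly

# The softer model typically keeps more points inside the margin,
# i.e. it ends up with more support vectors.
print("support vectors, C=0.01:", len(soft.support_))
print("support vectors, C=100 :", len(hard.support_))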
43. In neural networks, what is the purpose of the validation set during training? a. To
train the model b. To fine-tune hyperparameters c. To evaluate model performance on unseen
data d. To test the model's accuracy
Answer: b
44. What is the primary advantage of using a non-linear activation function in a neural
network? a. It simplifies the model architecture. b. It allows the network to learn complex
patterns. c. It reduces the risk of underfitting. d. It speeds up the training process.
Answer: b.
45. What is the purpose of regularization in machine learning models? a. To increase
model complexity b. To decrease the learning rate c. To penalize overly complex models d.
To reduce the number of iterations
Answer: c.
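Illustrative sketch (scikit-learn; the data and the alpha value are invented): an L2 penalty keeps the learned weights, and therefore the model, simpler.

import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
y = X[:, 0] + 0.1 * rng.normal(size=50)    # only the first feature matters

plain = LinearRegression().fit(X, y)
ridge = Ridge(alpha=10.0).fit(X, y)        # alpha scales the L2 penalty

# The regularized model ends up with a smaller weight vector.
print("weight norm, no penalty :", np.linalg.norm(plain.coef_))
print("weight norm, L2 penalty :", np.linalg.norm(ridge.coef_))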
46. Which of the following is a characteristic of a well-regularized machine learning
model? a. High training accuracy, low test accuracy b. Low training accuracy, low test
accuracy c. High training accuracy, high test accuracy d. Low training accuracy, high test
accuracy
Answer: c. High training accuracy, high test accuracy
47. In a neural network, what is the purpose of the dropout layer? a. To randomly remove
nodes during training b. To increase the number of hidden layers c. To reduce the learning
rate d. To enforce weight constraints
Answer: a. To randomly remove nodes during training
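Illustrative sketch (PyTorch; the layer sizes and dropout rate are arbitrary): a dropout layer zeroes random activations while training and is switched off at evaluation time.

import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # randomly zeroes half of the activations during training
    nn.Linear(64, 1),
    nn.Sigmoid(),
)

x = torch.randn(4, 20)
model.train()            # dropout active: different nodes dropped on each pass
print(model(x).flatten())
model.eval()             # dropout disabled: deterministic output
print(model(x).flatten())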
48. Which of the following is a disadvantage of using a polynomial kernel in SVM? a. It
is computationally expensive. b. It cannot handle non-linear relationships. c. It is prone to
overfitting. d. It may lead to high-dimensional feature spaces.
Answer: d. It may lead to high-dimensional feature spaces.
49. What is the role of the bias term in a neural network? a. It controls the learning rate. b.
It shifts the decision boundary. c. It reduces the number of hidden layers. d. It increases
model complexity.
Answer: b. It shifts the decision boundary.
50. In SVM, what does the term "soft margin" refer to? a. A margin that is too narrow b.
A margin that allows for some misclassification c. A margin that is too wide d. A margin that
is fixed and cannot be adjusted
Answer: b. A margin that allows for some misclassification
51. What is the purpose of the rectified linear unit (ReLU) activation function in neural
networks? a. To introduce non-linearity b. To enforce weight constraints c. To control the
learning rate d. To reduce the risk of overfitting
Answer: a. To introduce non-linearity
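Illustrative sketch (NumPy; the inputs are invented): ReLU clips negatives to zero, which is what introduces the non-linearity.

import numpy as np

def relu(x):
    # Positive values pass through unchanged; negative values become zero.
    return np.maximum(0.0, x)

print(relu(np.array([-3.0, -0.5, 0.0, 2.0])))   # [0. 0. 0. 2.]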
52. Which of the following is a characteristic of underfitting in machine learning models?
a. High training accuracy, high test accuracy b. Low training accuracy, high test accuracy c.
High training accuracy, low test accuracy d. Low training accuracy, low test accuracy
Answer: d. Low training accuracy, low test accuracy
53. In SVM, what does the term "kernel trick" refer to? a. A technique to reduce the
dimensionality of the feature space b. A method to handle non-linear relationships by
implicitly mapping data to a higher-dimensional space c. A strategy to minimize the margin
between classes d. A way to speed up the training process
Answer: b. A method to handle non-linear relationships by implicitly mapping data to a
higher-dimensional space
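Illustrative sketch (NumPy; gamma and the two points are arbitrary): the RBF kernel returns an inner product in the induced feature space without ever constructing that space.

import numpy as np

def rbf_kernel(x1, x2, gamma=0.5):
    # k(x1, x2) = exp(-gamma * ||x1 - x2||^2), equal to <phi(x1), phi(x2)>
    # for an implicit, never-materialized feature map phi.
    return np.exp(-gamma * np.sum((x1 - x2) ** 2))

a = np.array([1.0, 2.0])
b = np.array([2.0, 0.5])
print(rbf_kernel(a, b))   # similarity measured in the induced space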
54. Which of the following is true about the bias-variance tradeoff in machine learning? a.
High bias leads to overfitting. b. High variance leads to underfitting. c. Both high bias and
high variance are desirable. d. It is the trade-off between model complexity and
generalization.
Answer: d. It is the trade-off between model complexity and generalization.
55. What is the purpose of cross-entropy loss in neural networks? a. To minimize the
mean squared error b. To maximize the margin between classes c. To penalize the model for
incorrect predictions d. To speed up the training process
Answer: c. To penalize the model for incorrect predictions
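Illustrative sketch (NumPy; the labels and predicted probabilities are invented): cross-entropy grows sharply when the model is confidently wrong.

import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Clipping avoids log(0); the loss is large when the probability
    # assigned to the true class is small.
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

y_true = np.array([1, 0, 1])
good   = np.array([0.9, 0.1, 0.8])   # mostly correct -> small loss
bad    = np.array([0.2, 0.9, 0.3])   # confidently wrong -> large loss
print(binary_cross_entropy(y_true, good))
print(binary_cross_entropy(y_true, bad))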
56. Which of the following is a common approach to prevent overfitting in neural
networks? a. Increasing the learning rate b. Adding more hidden layers c. Adding dropout
layers d. Removing the activation function
Answer: c. Adding dropout layers
57. What is the purpose of the hinge loss function in SVM? a. To minimize classification
error b. To maximize margin between classes c. To reduce the learning rate d. To enforce
weight constraints
Answer: b. To maximize margin between classes
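Illustrative sketch (NumPy; the labels and decision-function scores are invented): hinge loss is zero only for points on the correct side of the margin, so minimizing it pushes points out of the margin.

import numpy as np

def hinge_loss(y, scores):
    # y holds +1/-1 labels; the loss is zero only when y * score >= 1,
    # i.e. the point lies outside the margin on the correct side.
    return np.mean(np.maximum(0.0, 1.0 - y * scores))

y      = np.array([+1, -1, +1])
scores = np.array([2.0, -0.5, -1.0])   # decision-function values w.x + b
print(hinge_loss(y, scores))           # only the margin violations contribute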
58. What is the main advantage of using a Gaussian radial basis function (RBF) kernel in
SVM? a. It reduces the risk of overfitting. b. It allows the model to handle non-linear
relationships. c. It simplifies the interpretability of the model. d. It speeds up the training
process.
Answer: b. It allows the model to handle non-linear relationships.
59. Which of the following is a common activation function in the hidden layers of a
neural network? a. Sigmoid b. ReLU c. Softmax d. Tanh
Answer: b. ReLU
60. In SVM, how does the regularization parameter C affect the decision boundary? a.
Higher C values lead to a smoother decision boundary. b. Higher C values lead to a more
complex decision boundary. c. Lower C values lead to a wider margin. d. Lower C values
lead to a narrower margin.
Answer: b. Higher C values lead to a more complex decision boundary.
63. _______ helps you know what data is best to use to train and test your model. ( )
A. Domain knowledge B. Declarative knowledge
C. Procedural knowledge D. Integrated knowledge
64. Which model is used in Computational Learning Theory? ( )
A. PAC model B. Linear model C. Logical model D. Dtree model
77. Degree of polynomial = 2; Training error – Low; Bias – Low; Test error – Low;
Variance – Low. ( )
A. Over Fitting B. Generalized Model
78. Degree of polynomial = 3 ( )
79. The hypothesis space defines the _______ of all possible models that can be learned by
the algorithm. ( )
A. Set B. Model C. Value D. Process
80. Which kind of algorithm is able to convert weak learners into strong learners? ( )
A. Bagging B. AdaBoost C. Boosting D. XGBoost