Machine Learning - XLSX - TF Questions
Each entry below gives the question (Q), the answer with its explanation (A), and, where noted in the sheet, the video chapter.
Q: In AdaBoost, weights are uniformly initialized.
A: True. Each of the n sample weights is initialized to 1/n (equivalently, all weights are set to 1 and then normalized).
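A minimal sketch in plain NumPy (not any specific library's API) of the uniform initialization and the sequential reweighting that follows it; `X_train`, `y_train`, `n_rounds`, and the helper `fit_weak_learner` are hypothetical placeholders:

```python
import numpy as np

n = len(y_train)                      # y_train in {-1, +1}, assumed given
w = np.full(n, 1.0 / n)               # uniform initialization: each weight = 1/n

for t in range(n_rounds):             # n_rounds: assumed hyperparameter
    stump = fit_weak_learner(X_train, y_train, sample_weight=w)  # hypothetical helper
    pred = stump.predict(X_train)
    err = np.sum(w[pred != y_train])          # weighted training error (assumes 0 < err < 1)
    alpha = 0.5 * np.log((1 - err) / err)     # weight of this weak learner
    w *= np.exp(-alpha * y_train * pred)      # up-weight misclassified points
    w /= w.sum()                              # renormalize so the weights sum to 1
```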
Q: Categorical data should be normalized before training a k-NN classifier.
A: False. It is the quantitative (numerical) data that should be normalized; scaling is not meaningful for categorical attributes.
Chapter: k-NN
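A short sketch of this point, assuming scikit-learn: only the numerical columns are scaled before fitting k-NN, while already-encoded categorical indicator columns are left untouched (`X_num`, `X_cat`, and `y` are assumed given):

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.neighbors import KNeighborsClassifier

X_num_scaled = MinMaxScaler().fit_transform(X_num)   # scale numerical features to [0, 1]
X = np.hstack([X_num_scaled, X_cat])                 # categorical 0/1 columns stay as-is
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
```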
Q: The error of a 1-NN classifier on the training set is 0.
A: True. Trivially true, since d(x, x) = 0 holds for a metric, so every training point is its own nearest neighbor (assuming there are no equal data points with different class labels).
Chapter: k-NN
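This can be checked directly with scikit-learn on a dataset without conflicting duplicates, e.g. iris:

```python
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
clf = KNeighborsClassifier(n_neighbors=1).fit(X, y)
print(clf.score(X, y))   # 1.0: each training point is its own nearest neighbor
```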
Q: One-vs-all is an approach to solve multi-class problems for decision trees (DTs).
A: False. One-vs-all is not needed: decision trees can handle multi-class classification problems out of the box.
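For contrast, a sketch assuming scikit-learn: the decision tree takes the 3-class problem directly, whereas the one-vs-all (one-vs-rest) wrapper is the pattern used to lift an otherwise binary learner to multi-class (LinearSVC is used here only to illustrate the wrapper):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import LinearSVC

X, y = load_iris(return_X_y=True)                  # 3 classes
tree = DecisionTreeClassifier().fit(X, y)          # handles multi-class natively
ova = OneVsRestClassifier(LinearSVC()).fit(X, y)   # one-vs-all wrapper around a binary learner
```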
Q: Boosting ensembles can be easily parallelized.
A: False. There are dependencies between the models: each one is trained on data reweighted according to the previous model's errors, so training is inherently sequential.
Q: 1-hot encoding is used to transform numerical into categorical attributes.
A: False. It is the other way around: one-hot encoding transforms categorical/ordinal attributes into numerical 1/0 indicator columns.
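A minimal example with pandas (the toy "color" column is illustrative):

```python
import pandas as pd

df = pd.DataFrame({"color": ["red", "green", "red"]})    # categorical attribute
print(pd.get_dummies(df, columns=["color"], dtype=int))
#    color_green  color_red
# 0            0          1
# 1            1          0
# 2            0          1
```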
Q: The Pearson coefficient has a value range from -1 to 1.
A: True. It is a normalized measure of covariance (the covariance divided by the product of the standard deviations), which bounds it to [-1, 1].
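A quick check with NumPy; x and y here are arbitrary illustrative data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])   # perfectly linear in x
r = np.corrcoef(x, y)[0, 1]          # Pearson r = cov(x, y) / (std(x) * std(y))
print(r)                             # 1.0: perfect positive correlation
```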
Q: "Off-the-shelf" is a transfer-learning technique that uses the output of layers from a deep-learning architecture as input for a shallow model.
A: True.
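A minimal sketch of the off-the-shelf pattern, assuming torchvision is available; `images` (an (N, 3, 224, 224) tensor of preprocessed inputs) and `labels` are assumed given:

```python
import torch
import torchvision.models as models
from sklearn.linear_model import LogisticRegression

# Load a pretrained network and drop its classification head
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()   # network now outputs 512-d feature vectors
backbone.eval()

with torch.no_grad():
    feats = backbone(images)        # deep features, used as-is (no fine-tuning)

# Shallow model trained on top of the frozen deep features
clf = LogisticRegression(max_iter=1000).fit(feats.numpy(), labels)
```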
Q: SVMs search for a decision boundary with the maximum margin.
A: True.
Q: SVMs always find a more optimal decision boundary (hyperplane) than Perceptrons.
A: Disputed (marked both T and F).
Explanation for True: On linearly separable data the SVM searches for the best-fitting hyperplane, while the Perceptron stops at the first separating solution it finds; "optimality" here is defined in terms of the margin between the classes.
Explanation for False: Be careful with the word "always": it depends on the problem at hand and on the kernel used in the SVM.
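The contrast can be seen with scikit-learn; a sketch on roughly separable toy data (a large C approximates a hard margin):

```python
from sklearn.datasets import make_blobs
from sklearn.linear_model import Perceptron
from sklearn.svm import SVC

X, y = make_blobs(n_samples=100, centers=2, random_state=0)  # (near-)separable toy data
p = Perceptron().fit(X, y)                  # stops at the first separating hyperplane
s = SVC(kernel="linear", C=1e6).fit(X, y)   # seeks the maximum-margin hyperplane
```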
Q: In Bayesian networks we assume that attributes are statistically independent given the class.
A: Disputed (marked both T and F).
Explanation for True: It is an assumption that is made, although it might not actually hold (it usually does not); the implementation nevertheless works under that assumption.
Explanation for False: (ChatGPT answer) No; Bayesian networks can represent dependencies between attributes, even given the class. This conditional-independence assumption belongs to Naive Bayes, not to Bayesian networks.
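For reference, the factorization the question alludes to, P(x1, ..., xd | c) = P(x1 | c) · ... · P(xd | c), is the Naive Bayes assumption; in scikit-learn it is what GaussianNB implements:

```python
from sklearn.datasets import load_iris
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
nb = GaussianNB().fit(X, y)   # models each feature independently, given the class
```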
Q: Majority voting is not used when k-NN is applied for linear regression.
A: True. Majority voting is used for classification tasks; for regression, the average of the neighbors' target values is used instead.
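The two aggregation rules side by side, assuming scikit-learn (`X`, `y_class`, and `y_real` are assumed given):

```python
from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor

# Classification: predict the majority class among the k nearest neighbors
clf = KNeighborsClassifier(n_neighbors=5).fit(X, y_class)
# Regression: predict the average target value of the k nearest neighbors
reg = KNeighborsRegressor(n_neighbors=5).fit(X, y_real)
```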