ML Mid-2 Important Questions

The document discusses various machine learning classification techniques, including the Bayes Classifier, Naive Bayes, Support Vector Machines (SVM), and logistic regression. It covers concepts such as multi-class classification, kernel tricks, linear discriminants, perceptron classifiers, and clustering methods like K-Means and fuzzy C-means. Additionally, it addresses the Expectation-Maximization algorithm and agglomerative clustering, highlighting their applications and differences in clustering approaches.

1. How does the Bayes Classifier extend to multi-class classification problems? Illustrate with an example.
2. In what types of problems is Naive Bayes especially effective? Explain.
3. Describe the Naive Bayes Classifier (NBC) in detail. How does it work?

4. Classify X based on the given probabilities. Given in the problem: P(C1) = 0.6, P(C2) = 0.4, P(X|C1) = 0.5, P(X|C2) = 0.8. (A worked solution follows this list.)
5. Explain the kernel trick in SVM. How does it enable non-linear classification? Give examples of commonly used kernels. (See the sketch after this list.)
6. What are linear discriminants in machine learning? Explain their purpose and how they are used in classification tasks, with suitable mathematical formulation and examples.
7. Describe the perceptron classifier. How does it function as a linear classifier, and under what conditions does it converge? (See the sketch after this list.)
8. How is logistic regression different from linear regression? Discuss the use of cost functions and optimization in both models.
9. Explain how linear discriminants are used for classification tasks. Derive the decision boundary for two-class classification.
10. Explain the working of a Support Vector Machine (SVM) for linearly separable data.
11. Derive the logistic regression model for binary classification. Explain how maximum likelihood estimation is used to train it. (See the sketch after this list.)
12. What is a perceptron? Explain the architecture and functioning of a single-layer perceptron used for classification.
13. Explain the Expectation-Maximization (EM) algorithm. Describe the E and M steps in detail. (See the sketch after this list.)
14. Explain agglomerative clustering in detail. What linkage criteria are used, and how do they affect the resulting clusters? (See the sketch after this list.)
15. What is spectral clustering? Discuss its working and applications.
16. What is soft partitioning in clustering? Compare and contrast it with hard partitioning using relevant examples.
17. Discuss the concept of soft clustering. Why is it useful in certain applications? Provide examples where soft clustering is preferred over hard clustering. (See the sketch after this list.)
18. Explain the K-Means clustering algorithm in detail. Include the steps involved and the distance measures used. (See the sketch after this list.)
19. Describe fuzzy C-means clustering.
20. Explain the Rough K-Means clustering algorithm.
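
Worked sketches for selected questions (Python; the data, parameter values, and library choices below are illustrative assumptions, not part of the original question set).

Question 4 is a direct application of Bayes' rule. A minimal sketch of the calculation, using only the numbers stated in the question:

# Bayes' rule: pick the class with the larger posterior P(Ck | X).
p_c1, p_c2 = 0.6, 0.4            # priors P(C1), P(C2)
p_x_c1, p_x_c2 = 0.5, 0.8        # likelihoods P(X|C1), P(X|C2)

evidence = p_x_c1 * p_c1 + p_x_c2 * p_c2     # P(X) = 0.30 + 0.32 = 0.62
post_c1 = p_x_c1 * p_c1 / evidence           # 0.30 / 0.62 ~ 0.484
post_c2 = p_x_c2 * p_c2 / evidence           # 0.32 / 0.62 ~ 0.516
print("classify X as", "C1" if post_c1 > post_c2 else "C2")   # C2

Since 0.8 * 0.4 = 0.32 exceeds 0.5 * 0.6 = 0.30, X is assigned to C2 even though C1 has the larger prior.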
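
For question 5, one way to see the kernel trick is to compare a linear SVM with kernelized SVMs on data that is not linearly separable. A sketch using scikit-learn (dataset and hyperparameters are assumptions made for illustration):

from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two concentric circles: no straight line separates the classes.
X, y = make_circles(n_samples=300, factor=0.4, noise=0.05, random_state=0)

linear_svm = SVC(kernel="linear").fit(X, y)
rbf_svm = SVC(kernel="rbf", gamma=2.0).fit(X, y)    # Gaussian/RBF kernel exp(-gamma * ||x - z||^2)
poly_svm = SVC(kernel="poly", degree=3).fit(X, y)   # polynomial kernel of degree 3

print("linear:", linear_svm.score(X, y))   # low accuracy: data is not linearly separable
print("rbf:   ", rbf_svm.score(X, y))      # near 1.0: kernel gives a non-linear boundary
print("poly:  ", poly_svm.score(X, y))

The kernel computes inner products in an implicit feature space, so a linear separator there becomes a non-linear boundary in the original input space.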
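
For questions 7 and 12, a minimal single-layer perceptron in NumPy. The AND-gate data is an assumption chosen because it is linearly separable, which is exactly the condition under which the perceptron updates are guaranteed to converge:

import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])        # AND function with labels in {-1, +1}

w, b, eta = np.zeros(2), 0.0, 1.0    # weights, bias, learning rate
for epoch in range(20):
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:   # point misclassified (or on the boundary)
            w += eta * yi * xi       # perceptron update rule
            b += eta * yi
            mistakes += 1
    if mistakes == 0:                # a full error-free pass: training has converged
        break

print("w =", w, "b =", b, "epochs used:", epoch + 1)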
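
For questions 8 and 11, logistic regression trained by maximizing the log-likelihood (equivalently, minimizing the cross-entropy cost) with plain gradient ascent. The synthetic data, learning rate, and iteration count are assumptions for illustration:

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)
p_true = 1.0 / (1.0 + np.exp(-(2.0 * x - 0.5)))     # true model: w = 2.0, b = -0.5
y = (rng.random(200) < p_true).astype(float)

w, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * x + b)))          # sigmoid: P(y = 1 | x)
    # gradient of the mean log-likelihood of sum_i [y_i log p_i + (1 - y_i) log(1 - p_i)]
    w += lr * np.mean((y - p) * x)                  # gradient ascent step
    b += lr * np.mean(y - p)

print("estimated w, b:", round(w, 2), round(b, 2))  # should land roughly near 2.0 and -0.5

Unlike linear regression's squared-error cost, this likelihood-based cost has no closed-form solution, which is why iterative optimization is used.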
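
For question 13, the E and M steps for a two-component, one-dimensional Gaussian mixture. The data and initialization are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2.0, 1.0, 150), rng.normal(3.0, 1.0, 100)])

pi = np.array([0.5, 0.5])      # mixing weights
mu = np.array([-1.0, 1.0])     # initial means
var = np.array([1.0, 1.0])     # initial variances

def gauss(x, m, v):
    return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

for _ in range(100):
    # E step: responsibilities r[i, k] = P(component k | point i) under current parameters.
    joint = np.stack([pi[k] * gauss(data, mu[k], var[k]) for k in range(2)], axis=1)
    r = joint / joint.sum(axis=1, keepdims=True)

    # M step: re-estimate weights, means, and variances from the soft assignments.
    nk = r.sum(axis=0)
    pi = nk / len(data)
    mu = (r * data[:, None]).sum(axis=0) / nk
    var = (r * (data[:, None] - mu) ** 2).sum(axis=0) / nk

print("weights:", pi.round(2), "means:", mu.round(2), "variances:", var.round(2))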
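
For question 14, SciPy's hierarchical clustering makes it easy to swap linkage criteria (single, complete, average, Ward) and compare the resulting flat clusters. The synthetic blobs are an assumption for illustration:

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(6, 1, (30, 2))])   # two 2-D blobs

for method in ("single", "complete", "average", "ward"):
    Z = linkage(X, method=method)                      # bottom-up merge tree under this linkage
    labels = fcluster(Z, t=2, criterion="maxclust")    # cut the tree into 2 flat clusters
    print(method, "-> cluster sizes", np.bincount(labels)[1:])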
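
For questions 16 and 17, the contrast between hard and soft partitioning shows up clearly on overlapping data: K-Means commits each point to one cluster, while a Gaussian mixture returns membership degrees. scikit-learn is used here purely as an illustrative assumption:

import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Two overlapping 1-D groups, so points near the middle genuinely belong "a bit to each".
X = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(2.5, 1.0, 100)]).reshape(-1, 1)

hard = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)            # one label per point
soft = GaussianMixture(n_components=2, random_state=0).fit(X).predict_proba(X)   # membership degrees

i = int(np.argmin(np.abs(X[:, 0] - 1.25)))            # a point near the overlap region
print("hard label:", hard[i])
print("soft memberships:", soft[i].round(2))          # e.g. roughly [0.5 0.5] for such a point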
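
For question 18, a minimal K-Means loop in NumPy with the usual Euclidean distance; the data and the simple random initialization are assumptions for illustration:

import numpy as np

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
k = 2

centroids = X[rng.choice(len(X), size=k, replace=False)]      # step 1: pick k points as centroids

for _ in range(100):
    # step 2: assign every point to its nearest centroid (Euclidean distance)
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # step 3: move each centroid to the mean of the points assigned to it
    new_centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    if np.allclose(new_centroids, centroids):                 # step 4: stop once nothing moves
        break
    centroids = new_centroids

print("final centroids:\n", centroids.round(2))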
