Module 1 and 2 Question Bank

1. Explain variance and covariance. Prove that the variance of a variable is the difference between the second-order expected value, E[X^2], and the square of the first-order expected value, (E[X])^2, for a Gaussian distribution.
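
For reference, this identity holds for any distribution with a finite second moment, not only the Gaussian; a short derivation:

\mathrm{Var}(X) = E\big[(X - E[X])^2\big] = E[X^2] - 2E[X]\,E[X] + (E[X])^2 = E[X^2] - (E[X])^2,

and analogously \mathrm{Cov}(X, Y) = E[XY] - E[X]\,E[Y].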

2. List the conditions on covariance that indicate how two variables are related.

3. Explain types of Machine Learning in detail with examples.


4. Explain the entire Machine Learning process with examples.

Example: In house price prediction, the ML process would include collecting features like
square footage, number of rooms, etc., training a model on historical data, and predicting
future prices.
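
A minimal sketch of this workflow in Python, assuming scikit-learn is available; the feature values, prices, and split ratio below are hypothetical:

from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# 1. Collect data: hypothetical historical sales (square footage, rooms) -> price
X = [[1400, 3], [1600, 3], [1700, 4], [1875, 4], [1100, 2], [1550, 3]]
y = [245000, 312000, 279000, 308000, 199000, 219000]

# 2. Split into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)

# 3. Train a model on the historical data
model = LinearRegression().fit(X_train, y_train)

# 4. Evaluate, then predict the price of an unseen house
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
print("Predicted price:", model.predict([[1500, 3]])[0])
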
5. Derive Bayes' rule using probability and statistics for Machine Learning.
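
For reference, the standard derivation starts from the product rule applied in both orders:

P(A, B) = P(A \mid B)\,P(B) = P(B \mid A)\,P(A)
\;\Rightarrow\;
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)},
\qquad
P(B) = \sum_{A} P(B \mid A)\,P(A) \text{ (by the sum rule).}
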
6. Discuss how probability theory is applied in Machine Learning and explain the importance of probability distributions.

7. Derive the Sum and Product Rule using Probability Theory.
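
For reference, in the discrete case the two rules read:

\text{Sum rule: } p(X) = \sum_{Y} p(X, Y),
\qquad
\text{Product rule: } p(X, Y) = p(Y \mid X)\,p(X).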

8. Explain the concept of decision theory in detail with the advantages and limitations of
decision trees.
9. Describe the process of hypothesis testing and its significance in evaluating Machine
Learning models.

10. Write about applications in both regression and classification tasks.

11. Explain linear models for regression in detail.
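
For reference, a linear regression model and its least-squares solution (the normal equations) can be written as:

y(\mathbf{x}, \mathbf{w}) = w_0 + \sum_{j=1}^{d} w_j x_j = \mathbf{w}^\top \mathbf{x} \ (\text{with } x_0 = 1),
\qquad
\mathbf{w}^{*} = (\mathbf{X}^\top \mathbf{X})^{-1} \mathbf{X}^\top \mathbf{y}.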


12. Describe the decision tree learning algorithm.

13. Explain the architecture of a multi-layer perceptron (MLP) and its application in supervised
learning.

14. Discuss the error backpropagation algorithm with an example.
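
A minimal numerical sketch of backpropagation, assuming NumPy, one hidden layer with sigmoid activations, a squared-error loss, and the XOR data as a toy example:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])   # inputs
y = np.array([[0.], [1.], [1.], [0.]])                   # XOR targets (toy example)

W1 = rng.normal(size=(2, 3)); b1 = np.zeros((1, 3))      # input -> hidden
W2 = rng.normal(size=(3, 1)); b2 = np.zeros((1, 1))      # hidden -> output
lr = 0.5

for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass (sigmoid derivative is a * (1 - a))
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient-descent updates
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0, keepdims=True)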

15. Explain Bayesian Learning and compare it with Naïve Bayes.

16. Illustrate the process of turning data into probabilities and how it aids in machine learning tasks.
17. Discuss the concept of ensemble methods and the differences between bagging and boosting.

18. Explain the significance of the kernel function in SVM.
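
For reference, a kernel computes an inner product in a feature space without constructing that space explicitly; a common choice is the RBF kernel:

K(\mathbf{x}, \mathbf{x}') = \phi(\mathbf{x})^\top \phi(\mathbf{x}'),
\qquad
K_{\mathrm{RBF}}(\mathbf{x}, \mathbf{x}') = \exp\!\left(-\frac{\|\mathbf{x} - \mathbf{x}'\|^2}{2\sigma^2}\right).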

19. Explain the purpose of discriminant functions in machine learning.

Numericals

1. Calculate the accuracy, precision, recall, and F1-score based on the given confusion matrix (provided separately in the notes/photo).
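
A minimal sketch of the metric formulas in Python; the confusion-matrix counts below are hypothetical placeholders to be replaced with the values from the given matrix:

TP, FP, FN, TN = 40, 10, 5, 45   # hypothetical counts

accuracy  = (TP + TN) / (TP + TN + FP + FN)
precision = TP / (TP + FP)
recall    = TP / (TP + FN)
f1_score  = 2 * precision * recall / (precision + recall)
print(accuracy, precision, recall, f1_score)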

2. Calculate the expected cost using linear regression.


3. Find the linear regression equation.
4. Calculate the updated weights for each sample after the first round of AdaBoost.
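
For reference, with labels y_i in {-1, +1} and weak learner h_t, one AdaBoost round updates the sample weights as:

\varepsilon_t = \sum_i w_i\,[\,h_t(x_i) \neq y_i\,],
\qquad
\alpha_t = \tfrac{1}{2} \ln\frac{1 - \varepsilon_t}{\varepsilon_t},
\qquad
w_i \leftarrow \frac{w_i \exp(-\alpha_t\, y_i\, h_t(x_i))}{Z_t},

where Z_t normalizes the weights so they sum to one.
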
5. Apply the Naïve Bayes classifier and calculate the probability of a new instance
belonging to a particular class.
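
For reference, under the conditional-independence assumption the posterior for class C_k given features x = (x_1, ..., x_d) is:

P(C_k \mid \mathbf{x}) \propto P(C_k) \prod_{j=1}^{d} P(x_j \mid C_k),

and the predicted class is the one that maximizes this product.
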
6. Calculate the information gain and determine the final structure of the decision tree.
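
For reference, the information gain of an attribute A on a sample set S is computed from the entropy:

H(S) = -\sum_{k} p_k \log_2 p_k,
\qquad
IG(S, A) = H(S) - \sum_{v \in \mathrm{Values}(A)} \frac{|S_v|}{|S|}\, H(S_v).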
