CS8082 Machine Learning Techniques
QUESTION BANK
VII SEMESTER
Regulation – 2017
Academic Year 2021–22 (ODD SEM)
Prepared by
UNIT I - INTRODUCTION
Learning Problems – Perspectives and Issues – Concept Learning – Version Spaces and
Candidate Eliminations – Inductive bias – Decision Tree learning – Representation –
Algorithm – Heuristic Space Search.
PART-A (2 - MARKS)
Q.No. Questions Competence BT Level
1. Why is machine learning important? Remember BTL-1
2. Classify positive and negative examples for the target concept. Apply BTL-3
3. Show the summary of choices in designing the checkers learning program. Apply BTL-3
4. Point out applications of machine learning. Analyze BTL-4
5. Illustrate the basic terms of machine learning. Remember BTL-1
6. Analyze a decision tree for the PlayTennis example. Analyze BTL-4
7. Summarize the various steps in designing a program to learn to play checkers. Evaluate BTL-5
8. Write short notes on concept learning as a search. Remember BTL-1
9. Discuss the various issues in machine learning. Understand BTL-2
10. Describe the four modules of the final design of the checkers learning problem. Remember BTL-1
11. Explain the useful perspective on machine learning. Evaluate BTL-5
12. State the inductive Learning Hypothesis. Remember BTL-1
13. List the algorithms of concept learning. Remember BTL-1
14. Generalize the concept of Biased Hypothesis Space. Create BTL-6
15. Elucidate decision tree learning. Analyze BTL-4
16. Discuss the effect of reduced-error pruning in the decision tree algorithm. Understand BTL-2
17. Develop the instances for the EnjoySport concept learning task. Create BTL-6
18. Examine how the more-general-than partial ordering is used to organize the search for a hypothesis consistent with the observed training examples. Apply BTL-3
19. Express how the three hypotheses h1, h2, h3 from the EnjoySport example are related by the ≥g relation. Understand BTL-2
20. Label the set of instances with an example. Understand BTL-2
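Several questions above concern decision tree learning and pruning. As a worked aid for those answers, here is a minimal sketch of the entropy and information-gain computations that drive ID3-style tree building; the toy dataset and helper names are illustrative, not from the syllabus.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def information_gain(rows, labels, attr_index):
    """Entropy reduction from splitting the examples on one attribute."""
    total = len(labels)
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(subset) / total * entropy(subset)
                    for subset in by_value.values())
    return entropy(labels) - remainder

# Toy split: attribute 0 separates the classes perfectly, attribute 1 not at all.
rows = [('Sunny', 'Weak'), ('Sunny', 'Strong'), ('Rain', 'Weak'), ('Rain', 'Strong')]
labels = ['Yes', 'Yes', 'No', 'No']
print(information_gain(rows, labels, 0))   # 1.0
print(information_gain(rows, labels, 1))   # 0.0
```

ID3 greedily picks, at each node, the attribute with the highest information gain over the examples reaching that node; the classic 9-positive/5-negative PlayTennis set has root entropy ≈ 0.940 bits.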
PART-B (13- MARKS)
1. State the three features required for a well-defined learning problem for the following: Remember BTL-1
(i) A checkers learning problem (4)
(ii) A handwriting recognition learning problem (4)
(iii) A robot driving learning problem (5)
2. Discuss in detail how to design a program to learn to play checkers. (13) Understand BTL-2
3. (i) Describe in detail the rule for estimating training values. (7) Remember BTL-1
(ii) State the final design of the checkers learning system. (6)
4. Point out the useful perspective on machine learning. (13) Apply BTL-3
5. Discuss the Issues in Machine Learning. (13) Understand BTL2
6. (i) Generalize the concept learning task. (7) Create BTL-6
(ii) Compose the inductive learning hypothesis over the training examples. (6)
7. (i) Demonstrate concept learning as search. (7) Remember BTL-1
(ii) Describe the general-to-specific ordering of hypotheses. (6)
8. (i) Illustrate with a diagram the decision tree representation for the PlayTennis concept. (7) Apply BTL-3
(ii) Point out the appropriate problems for decision tree learning. (6)
9. (i) Explain in detail FIND-S: finding a maximally specific hypothesis. (7) Evaluate BTL-5
(ii) Conclude the key properties of the FIND-S algorithm. (6)
10. Conclude the following: Analyze BTL-4
(i) Compact representation for version spaces (7)
(ii) The LIST-THEN-ELIMINATE algorithm (6)
11. Demonstrate the basic decision tree algorithm. (13) Apply BTL3
12. Discuss in detail the Candidate-Elimination algorithm with an example. (13) Understand BTL-2
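Several of the questions above (FIND-S, version spaces, Candidate-Elimination) build on the EnjoySport concept-learning task. Here is a minimal FIND-S sketch over Mitchell's standard EnjoySport training examples; the function name is illustrative, and '?' denotes the "any value" constraint.

```python
def find_s(examples):
    """Return the maximally specific hypothesis consistent with the
    positive examples. '?' matches any attribute value."""
    positives = [x for x, label in examples if label]
    h = list(positives[0])              # start with the first positive example
    for x in positives[1:]:
        for i, value in enumerate(x):
            if h[i] != value:           # generalize only where forced to
                h[i] = '?'
    return h

# EnjoySport training examples: (Sky, AirTemp, Humidity, Wind, Water, Forecast)
data = [
    (('Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'), True),
    (('Sunny', 'Warm', 'High',   'Strong', 'Warm', 'Same'), True),
    (('Rainy', 'Cold', 'High',   'Strong', 'Warm', 'Change'), False),
    (('Sunny', 'Warm', 'High',   'Strong', 'Cool', 'Change'), True),
]
print(find_s(data))   # ['Sunny', 'Warm', '?', 'Strong', '?', '?']
```

Note that FIND-S ignores negative examples entirely; Candidate-Elimination refines this by also maintaining a general boundary that the negatives constrain.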
9. (i) List out the genetic algorithm steps with an example. (8) Remember BTL-1
(ii) Point out the prototypical genetic algorithm. (5)
10. (i) Point out the common operators for genetic algorithms. (7) Apply BTL-3
(ii) State the various crossover operators with diagrams. (6)
11. (i) Define the fitness function. (5) Understand BTL-2
(ii) Examine how a genetic algorithm searches a large space of candidate objects according to a fitness function, with an example. (8)
12. (i) Demonstrate the hypothesis space search of GAs compared with neural network backpropagation. (7) Apply BTL-3
(ii) Illustrate the AddAlternative and DropCondition operators. (6)
13. Discuss in detail population evolution and the schema theorem. (13) Understand BTL-2
14. (i) Label genetic programming and draw the program tree representation in genetic programming. (7) Remember BTL-1
(ii) Describe an example to explain genetic programming. (6)
PART-C (15 - MARKS)
1. Compose the Inductive Bias and generalize the Hidden Layer Representations. (15) Create BTL-6
2. Explain in detail the following: Evaluate BTL-5
(i) Alternative Error Functions (8)
(ii) Alternative Error Minimization Procedures (7)
3. Formulate the models of evolution and learning in genetic algorithms. (15) Create BTL-6
4. Assess parallelizing genetic algorithms with an example. (15) Evaluate BTL-5
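The genetic-algorithm questions above ask for the prototypical GA, its common operators, and the role of the fitness function. A compact sketch, assuming a simple OneMax fitness (count of 1-bits), tournament selection, single-point crossover, and bit-flip mutation; all parameter values and names are illustrative, not prescribed by the syllabus.

```python
import random

def fitness(bits):
    """OneMax: number of 1-bits (a stand-in for a real fitness function)."""
    return sum(bits)

def crossover(a, b, point):
    """Single-point crossover at the given cut point."""
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(bits, rate, rng):
    """Flip each bit independently with the given mutation rate."""
    return [1 - bit if rng.random() < rate else bit for bit in bits]

def evolve(length=20, pop_size=30, generations=50, rate=0.02, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        next_pop = pop[:2]                       # elitism: keep the two best
        while len(next_pop) < pop_size:
            # tournament selection of two parents, size 3
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            c1, c2 = crossover(p1, p2, rng.randrange(1, length))
            next_pop += [mutate(c1, rate, rng), mutate(c2, rate, rng)]
        pop = next_pop[:pop_size]
    return max(pop, key=fitness)

# Demo: evolve a 20-bit string toward all ones.
best = evolve()
```

The same loop structure answers the "prototypical GA" question: evaluate fitness, select, recombine, mutate, and replace the population each generation.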
UNIT-III BAYESIAN AND COMPUTATIONAL LEARNING
Bayes Theorem – Concept Learning – Maximum Likelihood – Minimum Description
Length Principle – Bayes Optimal Classifier – Gibbs Algorithm – Naïve Bayes Classifier –
Bayesian Belief Network – EM Algorithm – Probability Learning – Sample Complexity –
Finite and Infinite Hypothesis Spaces – Mistake Bound Model.
PART-A (2 - MARKS)
1. List the advantages of studying Bayesian learning methods. Remember BTL1
2. Define Bayes Theorem. Remember BTL1
3. Describe Maximum likelihood. Remember BTL1
4. What is Minimum Description Length principle? Remember BTL1
5. Name the Bayes optimal classification. Remember BTL1
6. State about the Gibbs Algorithm. Remember BTL1
7. Give the formulas of basic probability. Understand BTL2
8. Differentiate Bayes theorem and concept learning. Analyze BTL4
9. Explain Bayesian belief networks. Evaluate BTL5
10. Give the formula for the probability density function. Understand BTL2
11. Generalize the probably approximately correct (PAC) learning model. Create BTL6
12. Illustrate the mistake bound model of learning. Apply BTL3
13. Assess the true error. Analyze BTL4
14. Compose sample complexity. Create BTL6
15. Summarize the advantages of EM algorithm. Understand BTL2
16. Deduce ε-exhausting the version space. Evaluate BTL5
17. Describe Brute-Force Map Learning Algorithm. Understand BTL2
18. Explain about the EM algorithm. Analyze BTL4
19. Show the set of three instances shattered by eight hypotheses. Apply BTL3
20. Illustrate the shattering of a set of instances. Understand BTL2
PART-B (13 - MARKS)
1. (i) Examine in detail Bayes theorem with an example. (7) Understand BTL2
(ii) Outline the features of the Bayesian learning method. (6)
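Part-B above asks for Bayes theorem with an example. A minimal sketch of the computation, using the standard rare-disease/lab-test style numbers often paired with this topic; the values and variable names are illustrative.

```python
def posteriors(hyps):
    """hyps: list of (prior P(h), likelihood P(D|h)) pairs.
    Returns the normalized posteriors P(h|D) via Bayes theorem;
    P(D) is the sum of prior * likelihood over all hypotheses."""
    joint = [p * l for p, l in hyps]
    evidence = sum(joint)                # P(D), the normalizer
    return [j / evidence for j in joint]

# Diagnosis example: rare disease, imperfect positive test.
hyps = [(0.008, 0.98),   # (P(cancer),     P(+ | cancer))
        (0.992, 0.03)]   # (P(not cancer), P(+ | not cancer))
p_cancer_given_pos, _ = posteriors(hyps)
print(round(p_cancer_given_pos, 3))   # 0.209
```

Despite the 98% sensitive test, the posterior stays low because the prior P(cancer) = 0.008 dominates; this is the usual point of the worked example.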
PART-A (2 - MARKS)
1. What is explanation based learning? Remember BTL1
2. Demonstrate first-order Horn clauses. Apply BTL3
3. State the learn-one-rule. Remember BTL1
4. Illustrate what is Sequential Covering Algorithm. Apply BTL3
5. Examine the Prolog-EBG. Apply BTL3
6. Describe Inverting resolution. Remember BTL1
7. List out the terminology in Horn clause. Remember BTL1
8. Define a Turing-equivalent programming language. Remember BTL1
9. Explain the reinforcement learning model. Remember BTL1
10. Point out how learned rule sets differ from genetic algorithms. Analyze BTL4
11. Interpret the importance of temporal difference learning. Understand BTL2
12. Discuss the sequential covering algorithm for learning a disjunctive set of rules. Understand BTL2
13. Distinguish FOIL from other rule learning algorithms. Understand BTL2
14. Generalize induction as inverted deduction. Create BTL6
15. Explain inductive logic programming. Evaluate BTL5
16. Analyze the Q learning algorithms. Analyze BTL4
17. Compare Inductive and Analytical Learning Problems Evaluate BTL5
18. Develop the propositional resolution rule, given clauses C1 and C2. Create BTL6
19. Point out the Horn clause. Analyze BTL4
20. Summarize about the FOIL algorithm. Understand BTL2
PART-B (13 - MARKS)
1. Assess learning sets of rules and how they differ from other algorithms. (13) Evaluate BTL5
2. (i) Point out the sequential covering algorithm. (7) Analyze BTL4
(ii) Explain learn-one-rule on an example. (6)
3. Discuss the learning task. (13) Understand BTL2
4. (i) Write in detail the sequential covering algorithm. (7) Remember BTL1
(ii) State the AQ algorithm. (6)
5. Elucidate in detail the basic definitions of first-order logic. (13) Analyze BTL4
6. Illustrate with a diagram the search for rule preconditions as learn-one-rule proceeds from general to specific. (13) Apply BTL3
7. (i) Analyze learning rule sets. (7) Analyze BTL4
(ii) Write some common evaluation functions used in learning rule sets. (6)
8. Demonstrate induction as inverted deduction. (13) Apply BTL3
9. Discuss in detail learning first-order rules. (13) Understand BTL2
10. (i) List the learning sets of first-order rules: FOIL. (7) Remember BTL1
(ii) Memorize the basic FOIL algorithm. (6)
11. (i) Describe learning with perfect domain theories: PROLOG-EBG. (7) Remember BTL1
(ii) Identify a training example for PROLOG-EBG. (6)
12. Summarize the Q-learning model and explain it with a diagram. (13) Understand BTL2
13. (i) Generalize: what is reinforcement learning? (7) Create BTL6
(ii) Compose temporal difference learning. (6)
14. Memorize the analytical learning model with an example. (13) Remember BTL1
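Q.12 and Q.13 above concern Q-learning, reinforcement learning, and temporal difference learning. A minimal tabular Q-learning sketch on a hypothetical 5-state corridor MDP (reward 1 for reaching the rightmost state); all names and parameter values are illustrative, not from the syllabus.

```python
import random

def step(state, action):
    """Deterministic corridor dynamics: states 0..4, reward 1 on reaching
    terminal state 4, zero otherwise."""
    nxt = max(0, state - 1) if action == 'left' else state + 1
    return nxt, (1.0 if nxt == 4 else 0.0), nxt == 4

def greedy(q_s, rng):
    """Greedy action for one state's Q-values, with random tie-breaking."""
    best = max(q_s.values())
    return rng.choice([a for a, v in q_s.items() if v == best])

def q_learning(episodes=1000, alpha=0.5, gamma=0.9, eps=0.3, seed=1):
    rng = random.Random(seed)
    q = {s: {'left': 0.0, 'right': 0.0} for s in range(4)}
    for _ in range(episodes):
        s = 0
        for _ in range(50):                    # cap episode length
            if rng.random() < eps:             # epsilon-greedy exploration
                a = rng.choice(['left', 'right'])
            else:
                a = greedy(q[s], rng)
            nxt, r, done = step(s, a)
            # Q-learning (off-policy TD) update toward r + gamma * max_a' Q(s', a')
            target = r if done else r + gamma * max(q[nxt].values())
            q[s][a] += alpha * (target - q[s][a])
            if done:
                break
            s = nxt
    return q

q = q_learning()
policy = {s: greedy(q[s], random.Random(0)) for s in range(4)}
```

After training, the greedy policy moves right from every state, and the learned values approach the discounted optimum (Q(3, right) → 1, Q(2, right) → γ, and so on), which is the behavior the diagram-based answer should show.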
PART-C (15 - MARKS)
1. Compose the following Horn clauses: (15) Create BTL6
(i) First-Order Horn Clauses
(ii) Basic terminology in Horn clauses
2. Generalize the concept of inverting resolution model. (15) Create BTL6
3. Summarize the merits and demerits of the FOCL algorithm. (15) Evaluate BTL5
4. Assess the temporal difference learning model with an example. (15) Evaluate BTL5