This document is a question bank for the course CS8082 Machine Learning Techniques. The questions are divided into three parts: Part A contains 2-mark questions, Part B contains 13-mark questions, and Part C contains 15-mark questions. They cover concept learning, decision trees, neural networks, genetic algorithms, Bayesian and computational learning, instance-based learning, and advanced topics such as rule learning and reinforcement learning. Some questions ask students to recall, understand, or apply concepts, while others involve higher-order thinking such as analysis, evaluation, or creation.


SRM VALLIAMMAI ENGINEERING COLLEGE

(An Autonomous Institution)


SRM Nagar, Kattankulathur – 603 203

QUESTION BANK

VII SEMESTER
Regulation – 2017
Academic Year 2021–22 (ODD SEM)

CS8082 MACHINE LEARNING TECHNIQUES

(Common to CSE & ECE Departments)

Prepared by

Dr. B. Muthusenthil, Associate Professor / CSE
Mrs. K. Arthi, Assistant Professor / ECE

SUBJECT: CS8082 MACHINE LEARNING TECHNIQUES
SEM / YEAR: VII / IV

UNIT I - INTRODUCTION
Learning Problems – Perspectives and Issues – Concept Learning – Version Spaces and
Candidate Eliminations – Inductive bias – Decision Tree learning – Representation –
Algorithm – Heuristic Space Search.

PART-A (2 - MARKS)
Q.No QUESTIONS Competence BT Level
1. Why is machine learning important? Remember BTL-1
2. Classify positive and negative examples for the target concept. Apply BTL-3
3. Show the summary of choices in designing the checkers learning program. Apply BTL-3
4. Point out applications of machine learning. Analyze BTL-4
5. Illustrate the basic terms of machine learning. Remember BTL-1
6. Analyze a decision tree for the PlayTennis example. Analyze BTL-4
7. Summarize the various steps in designing a program to learn to play checkers. Evaluate BTL-5
8. Write short notes on concept learning as a search. Remember BTL-1
9. Discuss the various issues in machine learning. Understand BTL-2
10. Describe the four modules of the final design in the checkers learning problem. Remember BTL-1
11. Explain the useful perspectives on machine learning. Evaluate BTL-5
12. State the inductive learning hypothesis. Remember BTL-1
13. List the algorithms of concept learning. Remember BTL-1
14. Generalize the concept of a biased hypothesis space. Create BTL-6
15. Elucidate decision tree learning. Analyze BTL-4
16. Discuss the effect of reduced-error pruning in the decision tree algorithm. Understand BTL-2
17. Develop the instances for the EnjoySport concept learning task. Create BTL-6
18. Examine how the more-general-than partial ordering is used to organize the search for a hypothesis consistent with the observed training examples. Apply BTL-3
19. Express how the three hypotheses h1, h2, h3 from the EnjoySport example are related by the >=g relation. Understand BTL-2
20. Label the set of instances with an example. Understand BTL-2
PART-B (13- MARKS)
1. State the three features needed for a well-defined learning problem for each of the following: Remember BTL-1
   (i) A checkers learning problem (4)
   (ii) A handwriting recognition learning problem (4)
   (iii) A robot driving learning problem (5)
2. Discuss in detail how to design a program to learn to play checkers. (13) Understand BTL-2
3. (i) Describe in detail the rule for estimating training values. (7) Remember BTL-1
   (ii) State the final design of the checkers learning system. (6)
4. Point out the useful perspectives on machine learning. (13) Apply BTL-3
5. Discuss the issues in machine learning. (13) Understand BTL-2
6. (i) Generalize the concept learning task. (7) Create BTL-6
   (ii) Compose the inductive learning hypothesis over the training examples. (6)
7. (i) Demonstrate concept learning as search. (7) Remember BTL-1
   (ii) Describe the general-to-specific ordering of hypotheses. (6)
8. (i) Illustrate with a diagram the decision tree representation for the PlayTennis concept. (7) Apply BTL-3
   (ii) Point out the problems appropriate for decision tree learning. (6)
9. (i) Explain in detail FIND-S: finding a maximally specific hypothesis. (7) Evaluate BTL-5
   (ii) Conclude the key properties of the FIND-S algorithm. (6)
10. Conclude the following: Analyze BTL-4
    (i) Compact representation for version spaces (7)
    (ii) The LIST-THEN-ELIMINATE algorithm (6)
11. Demonstrate the basic decision tree algorithm. (13) Apply BTL-3
12. Discuss in detail the Candidate-Elimination algorithm with an example. (13) Understand BTL-2
13. (i) Define inductive bias. (3) Remember BTL-1
    (ii) Write short notes on the biased hypothesis space. (10)
14. (i) Explain in detail an unbiased learner for the EnjoySport learning task. (7) Analyze BTL-4
    (ii) List out the futility of bias-free learning. (6)
PART-C (15- MARKS)
1. Compose what a decision tree is. Draw decision trees to represent the following Boolean functions: (15) Create BTL-6
   (a) A ∧ ¬B
   (b) A ∨ [B ∧ C]
   (c) A XOR B
   (d) [A ∧ B] ∨ [C ∧ D]
2. Give decision trees for the following set of training examples: (15) Create BTL-6
Day Outlook Temperature Humidity Wind Play Tennis
D1 Sunny Hot High Weak No
D2 Sunny Hot High Strong No
D3 Overcast Hot High Weak Yes
D4 Rain Mild High Weak Yes
D5 Rain Cool Normal Weak Yes
D6 Rain Cool Normal Strong No
D7 Overcast Cool Normal Strong Yes
D8 Sunny Mild High Weak No
D9 Sunny Cool Normal Weak Yes
D10 Rain Mild Normal Weak Yes
D11 Sunny Mild Normal Strong Yes
D12 Overcast Mild High Strong Yes
D13 Overcast Hot Normal Weak Yes
D14 Rain Mild High Strong No
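The table in question 2 is the standard PlayTennis dataset. A minimal sketch of how ID3 would pick the root attribute from it, computing entropy and information gain in Python (the column names follow the table; the code itself is illustrative, not part of the question bank):

```python
from collections import Counter
from math import log2

# PlayTennis rows from the table (Day column dropped):
# (Outlook, Temperature, Humidity, Wind, label)
DATA = [
    ("Sunny", "Hot", "High", "Weak", "No"), ("Sunny", "Hot", "High", "Strong", "No"),
    ("Overcast", "Hot", "High", "Weak", "Yes"), ("Rain", "Mild", "High", "Weak", "Yes"),
    ("Rain", "Cool", "Normal", "Weak", "Yes"), ("Rain", "Cool", "Normal", "Strong", "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"), ("Sunny", "Mild", "High", "Weak", "No"),
    ("Sunny", "Cool", "Normal", "Weak", "Yes"), ("Rain", "Mild", "Normal", "Weak", "Yes"),
    ("Sunny", "Mild", "Normal", "Strong", "Yes"), ("Overcast", "Mild", "High", "Strong", "Yes"),
    ("Overcast", "Hot", "Normal", "Weak", "Yes"), ("Rain", "Mild", "High", "Strong", "No"),
]
ATTRS = ["Outlook", "Temperature", "Humidity", "Wind"]

def entropy(rows):
    # Shannon entropy of the class labels in `rows`
    counts = Counter(r[-1] for r in rows)
    total = len(rows)
    return -sum(c / total * log2(c / total) for c in counts.values())

def gain(rows, i):
    # information gain of splitting on attribute column i
    subsets = {}
    for r in rows:
        subsets.setdefault(r[i], []).append(r)
    remainder = sum(len(s) / len(rows) * entropy(s) for s in subsets.values())
    return entropy(rows) - remainder

gains = {a: gain(DATA, i) for i, a in enumerate(ATTRS)}
best = max(gains, key=gains.get)   # ID3 places this attribute at the root
```

On this data the gains come out to roughly Outlook 0.247, Humidity 0.152, Wind 0.048, and Temperature 0.029, so Outlook becomes the root of the tree.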
3. Analyze the following: (15) Evaluate BTL-5
   (i) Will the Candidate-Elimination algorithm converge to the correct hypothesis?
   (ii) What training example should the learner request next?
4. (i) Assess the Candidate-Elimination algorithm. (15) Evaluate BTL-5
   (ii) Explain the Candidate-Elimination algorithm and apply it to obtain the final version space for the training examples:
   Sl.No Sky AirTemp Humidity Wind Water Forecast EnjoySport
   1 Sunny Warm Normal Strong Warm Same Yes
   2 Sunny Warm High Strong Warm Same Yes
   3 Rainy Cold High Strong Warm Change No
   4 Sunny Warm High Strong Cold Change Yes
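The final version space for this question can be checked with a short Candidate-Elimination sketch. It assumes conjunctive hypotheses over the six attributes, and its G-specialization step uses the common simplification of specializing only toward values appearing in S:

```python
# EnjoySport training examples from the table: (attribute tuple, label)
EXAMPLES = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High", "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High", "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High", "Strong", "Cold", "Change"), True),
]

def matches(h, x):
    # a hypothesis matches an instance if every constraint is "?" or equal
    return all(hv in ("?", xv) for hv, xv in zip(h, x))

# seed S with the first positive example, G with the maximally general hypothesis
S = EXAMPLES[0][0]
G = [tuple("?" for _ in range(6))]

for x, positive in EXAMPLES[1:]:
    if positive:
        G = [g for g in G if matches(g, x)]       # drop g inconsistent with x
        S = tuple(sv if sv == xv else "?"         # minimally generalize S
                  for sv, xv in zip(S, x))
    else:
        new_G = []
        for g in G:
            if not matches(g, x):
                new_G.append(g)                   # already excludes x
                continue
            for i in range(6):                    # minimal specializations of g,
                if g[i] == "?" and S[i] not in ("?", x[i]):   # guided by S
                    spec = list(g)
                    spec[i] = S[i]
                    new_G.append(tuple(spec))
        G = new_G
```

Run on the four examples above, this yields S = ('Sunny', 'Warm', '?', 'Strong', '?', '?') and G = {('Sunny', '?', '?', '?', '?', '?'), ('?', 'Warm', '?', '?', '?', '?')}.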

UNIT II - NEURAL NETWORKS AND GENETIC ALGORITHMS
Neural Network Representation – Problems – Perceptrons – Multilayer Networks and Back Propagation Algorithms – Advanced Topics – Genetic Algorithms – Hypothesis Space Search – Genetic Programming – Models of Evolution and Learning.
PART-A (2 - MARKS)
Q.No QUESTIONS Competence BT Level
1. Validate the biological motivation for studying ANNs. Create BTL-6
2. State the concept of an ANN. Remember BTL-1
3. Describe with an example a neural network representation. Remember BTL-1
4. Label the linearly separable sets of examples. Remember BTL-1
5. List out the characteristics for which the backpropagation algorithm is used. Remember BTL-1
6. Compare and contrast gradient descent and the delta rule. Analyze BTL-4
7. Identify the perceptron and state which Boolean functions can be represented by a perceptron. Remember BTL-1
8. Assess the backpropagation algorithm. Evaluate BTL-5
9. Discuss what type of unit we shall use as the basis for constructing a multilayer network. Understand BTL-2
10. Explain why using a perceptron to represent AND, OR, NAND and NOR is important. Analyze BTL-4
11. How is a hypothesis represented in a genetic algorithm? Create BTL-6
12. Describe the genetic algorithm. Understand BTL-2
13. What are the advantages of genetic algorithms? Understand BTL-2
14. Examine the Baldwin effect. Apply BTL-3
15. Distinguish between crossover and mutation. Analyze BTL-4
16. Write short notes on crowding. Remember BTL-1
17. Explain genetic programming. Evaluate BTL-5
18. Illustrate Lamarckian evolution. Apply BTL-3
19. Summarize the schema in GA. Understand BTL-2
20. Show the program tree representation in genetic programming. Apply BTL-3

PART-B (13- MARKS)
1. Analyze the multilayer perceptron model with a neat diagram. (13) Analyze BTL-4
2. Compose the problems for which ANN learning is well suited and write down their characteristics. (13) Create BTL-6
3. (i) Illustrate the diagram for visualizing the hypothesis space. (7) Apply BTL-3
   (ii) Examine the derivation of the gradient descent rule. (6)
4. (i) Summarize the derivation of the backpropagation algorithm. (7) Evaluate BTL-5
   (ii) Explain in detail the gradient descent algorithm. (6)
5. (i) Define perceptrons with a neat diagram. (7) Remember BTL-1
   (ii) Describe the perceptron with an example and draw the decision surface represented by a two-input perceptron. (6)
6. (i) What is the perceptron training rule? (3) Remember BTL-1
   (ii) Demonstrate the backpropagation algorithm. (10)
7. (i) Distinguish between gradient descent and the delta rule. (5) Understand BTL-2
   (ii) Discuss the delta training rule with an example. (8)
8. (i) Explore how hypotheses in GAs are represented by bit strings. (7) Analyze BTL-4
   (ii) Write about IF-THEN rules and why they can be encoded. (6)
9. (i) List out the genetic algorithm steps with an example. (8) Remember BTL-1
   (ii) Point out the prototypical genetic algorithm. (5)
10. (i) Point out the common operators for genetic algorithms. (7) Apply BTL-3
    (ii) State the various crossover operators with diagrams. (6)
11. (i) Define the fitness function. (5) Understand BTL-2
    (ii) Examine how a genetic algorithm searches a large space of candidate objects according to the fitness function, with an example. (8)
12. (i) Demonstrate the hypothesis space search of GAs compared with neural network backpropagation. (7) Apply BTL-3
    (ii) Illustrate what Add-Alternative and Drop-Condition are. (6)
13. Discuss in detail population evolution and the schema theorem. (13) Understand BTL-2
14. (i) Label genetic programming and draw the program tree representation in genetic programming. (7) Remember BTL-1
    (ii) Describe an example to explain genetic programming. (6)
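Several Part-B questions ask for the perceptron training rule, w_i ← w_i + η(t − o)x_i. A minimal sketch on the two-input AND function follows; the learning rate, epoch count, and bias handling are illustrative choices, not prescribed by the syllabus:

```python
# Perceptron training rule w_i <- w_i + eta*(t - o)*x_i on Boolean AND,
# with a constant bias input x0 = 1.
AND_DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
eta = 0.1
w = [0.0, 0.0, 0.0]          # [bias weight, w1, w2]

def output(w, x):
    # threshold unit: fire iff the weighted sum is positive
    s = w[0] + w[1] * x[0] + w[2] * x[1]
    return 1 if s > 0 else 0

for _ in range(20):          # a few epochs suffice for this separable problem
    for x, t in AND_DATA:
        o = output(w, x)
        w[0] += eta * (t - o) * 1
        w[1] += eta * (t - o) * x[0]
        w[2] += eta * (t - o) * x[1]

predictions = [output(w, x) for x, _ in AND_DATA]   # [0, 0, 0, 1]
```

After training, the perceptron outputs 1 only for the input (1, 1), i.e. it has learned AND; XOR, by contrast, is not linearly separable and this rule would never converge on it.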
PART-C (15 -MARKS)
1. Compose the inductive bias and generalize the hidden layer representations. (15) Create BTL-6
2. Explain in detail the following: Evaluate BTL-5
   (i) Alternative error functions (8)
   (ii) Alternative error minimization procedures (7)
3. Formulate the models of evolution and learning in genetic algorithms. (15) Create BTL-6
4. Assess parallelizing genetic algorithms with an example. (15) Evaluate BTL-5
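A prototypical genetic algorithm of the kind asked about in this unit can be sketched on the toy task of maximizing the number of 1-bits in a string, using fitness-proportionate selection, single-point crossover, and bitwise mutation; every size and rate below is an illustrative choice:

```python
import random

random.seed(0)
L, POP, GENS = 20, 30, 60            # string length, population, generations
CROSS_RATE, MUT_RATE = 0.8, 0.02     # illustrative operator rates

def fitness(h):
    return sum(h)                    # toy objective: count the 1-bits

def select(pop):
    # fitness-proportionate (roulette wheel) selection
    total = sum(fitness(h) for h in pop)
    r = random.uniform(0, total)
    acc = 0.0
    for h in pop:
        acc += fitness(h)
        if acc >= r:
            return h
    return pop[-1]

pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(POP)]
best_ever = max(pop, key=fitness)[:]
for _ in range(GENS):
    nxt = []
    while len(nxt) < POP:
        a, b = select(pop), select(pop)
        if random.random() < CROSS_RATE:           # single-point crossover
            p = random.randrange(1, L)
            a, b = a[:p] + b[p:], b[:p] + a[p:]
        nxt += [a[:], b[:]]                        # copy so mutation stays independent
    for h in nxt:                                  # bitwise mutation
        for i in range(L):
            if random.random() < MUT_RATE:
                h[i] = 1 - h[i]
    pop = nxt
    gen_best = max(pop, key=fitness)
    if fitness(gen_best) > fitness(best_ever):
        best_ever = gen_best[:]
```

Tracking `best_ever` across generations shows fitness climbing well above the random-initialization average of L/2.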
UNIT III - BAYESIAN AND COMPUTATIONAL LEARNING
Bayes Theorem – Concept Learning – Maximum Likelihood – Minimum Description Length Principle – Bayes Optimal Classifier – Gibbs Algorithm – Naïve Bayes Classifier – Bayesian Belief Network – EM Algorithm – Probability Learning – Sample Complexity – Finite and Infinite Hypothesis Spaces – Mistake Bound Model.

PART-A (2 - MARKS)
1. List the advantages of studying Bayesian learning methods. Remember BTL-1
2. Define Bayes theorem. Remember BTL-1
3. Describe maximum likelihood. Remember BTL-1
4. What is the minimum description length principle? Remember BTL-1
5. Name the Bayes optimal classification. Remember BTL-1
6. State the Gibbs algorithm. Remember BTL-1
7. Give the formulas of basic probability. Understand BTL-2
8. Differentiate Bayes theorem and concept learning. Analyze BTL-4
9. Explain Bayesian belief networks. Evaluate BTL-5
10. Give the formula for the probability density function. Understand BTL-2
11. Generalize the probably approximately correct (PAC) learning model. Create BTL-6
12. Illustrate the mistake bound model of learning. Apply BTL-3
13. Assess the true error. Analyze BTL-4
14. Compose sample complexity. Create BTL-6
15. Summarize the advantages of the EM algorithm. Understand BTL-2
16. Deduce ε-exhausting the version space. Evaluate BTL-5
17. Describe the brute-force MAP learning algorithm. Understand BTL-2
18. Explain the EM algorithm. Analyze BTL-4
19. Show a set of three instances shattered by eight hypotheses. Apply BTL-3
20. Illustrate the shattering of a set of instances. Understand BTL-2
PART-B (13- MARKS )
1. (i) Examine in detail Bayes theorem with an example. (7) Understand BTL-2
   (ii) Outline the features of Bayesian learning methods. (6)
2. (i) Summarize in detail the relationship between Bayes theorem and concept learning. (7) Evaluate BTL-5
   (ii) Write down brute-force Bayes concept learning. (6)
3. Explain the maximum likelihood algorithm. (13) Analyze BTL-4
4. Illustrate with an example why the Gibbs algorithm is better than the Bayes optimal classifier. (13) Apply BTL-3
5. (i) Infer the minimum description length principle. (7) Understand BTL-2
   (ii) Discuss what we shall conclude from this analysis of the minimum description length principle. (6)
6. (i) Compose the Bayes optimal classifier. (7) Create BTL-6
   (ii) Elaborate on Bayes optimal classification. (6)
7. (i) Analyze the naïve Bayes classifier. (7) Analyze BTL-4
   (ii) Explain the naïve Bayes classifier with an example. (6)
8. (i) Illustrate Bayesian belief networks. (7) Remember BTL-1
   (ii) Describe conditional independence. (6)
9. (i) State the EM algorithm. (7) Remember BTL-1
   (ii) Write short notes on estimating the means of k Gaussians. (6)
10. (i) Examine in detail probability learning. (7) Remember BTL-1
    (ii) Define the error of a hypothesis. (6)
11. Explain in detail PAC learnability. (13) Analyze BTL-4
12. (i) Show the sample complexity for finite hypothesis spaces. (7) Understand BTL-2
    (ii) Discuss the mistake bound model of learning. (6)
13. (i) What is ε-exhausting the version space? (7) Remember BTL-1
    (ii) Write about learning and inconsistent hypotheses. (6)
14. (i) Demonstrate the sample complexity for infinite hypothesis spaces. (7) Apply BTL-3
    (ii) Construct the Vapnik-Chervonenkis dimension. (6)
PART-C(15 -MARKS)
1. Does the patient have cancer, or does he not? A patient takes a lab test and the result comes back positive. The test returns a correct positive result in only 98% of the cases in which the disease is actually present, and a correct negative result in only 97% of the cases in which the disease is not present. Furthermore, 0.008 of the entire population have this cancer. (15) Create BTL-6
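The posterior for this question follows directly from Bayes theorem; a short sketch of the arithmetic:

```python
# Bayes theorem applied to the lab-test problem above.
p_cancer = 0.008
p_pos_given_cancer = 0.98        # sensitivity: P(+ | cancer)
p_neg_given_no_cancer = 0.97     # specificity, so P(+ | no cancer) = 0.03

# total probability of a positive test
p_pos = (p_pos_given_cancer * p_cancer
         + (1 - p_neg_given_no_cancer) * (1 - p_cancer))

# posterior P(cancer | +)
p_cancer_given_pos = p_pos_given_cancer * p_cancer / p_pos
```

P(cancer | +) works out to about 0.21, so even after a positive test the MAP hypothesis is that the patient does not have cancer.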
2. (i) Assess the Bayesian belief network. (15) Evaluate BTL-5
   (ii) Judge the importance of using a Bayesian network to infer values of the target variable.
3. A set of 14 training examples of the target concept PlayTennis is given below, where each day is described by the attributes Outlook, Temperature, Humidity, and Wind. Use the naïve Bayes classifier and the training data from this table to classify the following novel instance: (15) Create BTL-6
   (Outlook = Sunny, Temperature = Cool, Humidity = High, Wind = Strong)
   Day Outlook Temperature Humidity Wind PlayTennis
   D1 Sunny Hot High Weak No
   D2 Sunny Hot High Strong No
   D3 Overcast Hot High Weak Yes
   D4 Rain Mild High Weak Yes
   D5 Rain Cool Normal Weak Yes
   D6 Rain Cool Normal Strong No
   D7 Overcast Cool Normal Strong Yes
   D8 Sunny Mild High Weak No
   D9 Sunny Cool Normal Weak Yes
   D10 Rain Mild Normal Weak Yes
   D11 Sunny Mild Normal Strong Yes
   D12 Overcast Mild High Strong Yes
   D13 Overcast Hot Normal Weak Yes
   D14 Rain Mild High Strong No
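The classification asked for in question 3 can be sketched with relative-frequency estimates and no smoothing (the table rows are reproduced as tuples; the code itself is illustrative):

```python
from collections import Counter

# Rows: (Outlook, Temperature, Humidity, Wind, PlayTennis) from the table
DATA = [
    ("Sunny", "Hot", "High", "Weak", "No"), ("Sunny", "Hot", "High", "Strong", "No"),
    ("Overcast", "Hot", "High", "Weak", "Yes"), ("Rain", "Mild", "High", "Weak", "Yes"),
    ("Rain", "Cool", "Normal", "Weak", "Yes"), ("Rain", "Cool", "Normal", "Strong", "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"), ("Sunny", "Mild", "High", "Weak", "No"),
    ("Sunny", "Cool", "Normal", "Weak", "Yes"), ("Rain", "Mild", "Normal", "Weak", "Yes"),
    ("Sunny", "Mild", "Normal", "Strong", "Yes"), ("Overcast", "Mild", "High", "Strong", "Yes"),
    ("Overcast", "Hot", "Normal", "Weak", "Yes"), ("Rain", "Mild", "High", "Strong", "No"),
]
query = ("Sunny", "Cool", "High", "Strong")   # the novel instance

label_counts = Counter(r[-1] for r in DATA)
scores = {}
for label, n_label in label_counts.items():
    score = n_label / len(DATA)               # prior P(label)
    for i, v in enumerate(query):             # times each P(attr = v | label)
        n_match = sum(1 for r in DATA if r[-1] == label and r[i] == v)
        score *= n_match / n_label
    scores[label] = score

prediction = max(scores, key=scores.get)
```

The unnormalized scores come out near 0.0053 for Yes and 0.0206 for No, so the classifier answers PlayTennis = No for this instance.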
4. (i) Summarize the general statement of the EM algorithm. (15) Evaluate BTL-5
   (ii) Deduce the k-means algorithm.
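For part (ii) of question 4, a minimal one-dimensional k-means (Lloyd's algorithm) sketch; k-means is the hard-assignment counterpart of EM for Gaussian mixtures. The data points and initial centers are illustrative:

```python
# Minimal 1-D k-means (Lloyd's algorithm); data and initial centers illustrative.
points = [1.0, 2.0, 10.0, 11.0]
centers = [1.0, 10.0]              # k = 2 initial guesses

for _ in range(10):                # alternate assignment and mean-update steps
    clusters = [[] for _ in centers]
    for p in points:               # assign each point to its nearest center
        j = min(range(len(centers)), key=lambda j: abs(p - centers[j]))
        clusters[j].append(p)
    # recompute each center as the mean of its cluster (keep it if empty)
    centers = [sum(c) / len(c) if c else centers[j]
               for j, c in enumerate(clusters)]
```

On this data the centers settle at 1.5 and 10.5 after a single iteration.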
UNIT IV - INSTANCE BASED LEARNING
K-Nearest Neighbour Learning – Locally Weighted Regression – Radial Basis Functions – Case Based Learning
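The unit's central algorithm, k-nearest neighbour with Euclidean distance and majority vote, can be sketched as follows (the training points are illustrative):

```python
from collections import Counter
from math import dist               # Euclidean distance (Python 3.8+)

def knn_classify(train, query, k=3):
    """train: list of (point, label) pairs; majority vote among the k nearest."""
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# illustrative 2-D data: two well-separated clusters
train = [((0, 0), "A"), ((0, 1), "A"), ((1, 0), "A"),
         ((5, 5), "B"), ((5, 6), "B"), ((6, 5), "B")]
label = knn_classify(train, (0.5, 0.5), k=3)   # "A"
```

Because k-NN is a lazy method, all work happens at query time; there is no training step beyond storing the examples, which is exactly the lazy-versus-eager contrast several questions below ask about.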
PART-A (2 -MARKS)
1. Define the formula for the distance between two instances. Remember BTL-1
2. Demonstrate the radial basis function network. Apply BTL-3
3. Describe the k-nearest neighbour learning algorithm. Remember BTL-1
4. Illustrate how instance-based learning methods differ from function approximation. Apply BTL-3
5. Explain the k-nearest neighbour algorithm for approximating a discrete-valued function. Analyze BTL-4
6. What is the nature of the hypothesis space H implicitly considered by the k-nearest neighbour algorithm? Remember BTL-1
7. Write about locally weighted regression. Remember BTL-1
8. Identify the distance-weighted nearest neighbour algorithm. Remember BTL-1
9. State the curse of dimensionality. Remember BTL-1
10. Differentiate regression, residual, and kernel function. Analyze BTL-4
11. Give the advantages of instance-based methods. Understand BTL-2
12. Discuss the advantages and disadvantages of locally weighted regression. Understand BTL-2
13. Distinguish between lazy and eager learning. Understand BTL-2
14. Compose the three properties shared by instance-based methods. Create BTL-6
15. Summarize the three lazy learning methods. Evaluate BTL-5
16. Express the Voronoi diagram for k-nearest neighbour. Apply BTL-3
17. Explain radial basis functions. Evaluate BTL-5
18. Compose the formula for locally weighted linear regression. Create BTL-6
19. Analyze the inductive bias of k-nearest neighbour. Analyze BTL-4
20. Distinguish between CADET and k-nearest neighbour. Understand BTL-2
PART-B (13- MARKS )
1. (i) Illustrate the disadvantages of instance-based methods. (7) Apply BTL-3
   (ii) Examine the k-nearest neighbour learning algorithm. (6)
2. Assess in detail the distance-weighted nearest neighbour algorithm. (13) Evaluate BTL-5
3. (i) Generalize locally weighted linear regression. (13) Create BTL-6
   (ii) Illustrate locally weighted linear regression with an example.
4. (i) Elucidate radial basis functions. (7) Analyze BTL-4
   (ii) Describe the two-stage process of RBF networks. (6)
5. (i) Summarize in detail locally weighted regression. (7) Understand BTL-2
   (ii) Discuss the pros and cons of locally weighted regression. (6)
6. Explain the inductive bias of the k-nearest neighbour algorithm with an example. (13) Analyze BTL-4
7. Discuss the generic properties of case-based reasoning systems. (13) Understand BTL-2
8. State a prototypical example of a case-based reasoning system. (13) Remember BTL-1
9. How does lazy learning differ from other learning models? Explain with an example. (13) Remember BTL-1
10. Examine instance-based learning methods. (13) Remember BTL-1
11. (i) Explain in detail eager learning. (7) Analyze BTL-4
    (ii) Point out how eager learning differs from lazy learning. (6)
12. Illustrate several generic properties of case-based reasoning systems. (13) Apply BTL-3
13. Demonstrate the CADET system with an example. (13) Understand BTL-2
14. Describe the advantages and disadvantages of lazy and eager learning. (13) Remember BTL-1
PART-C (15-MARKS)
1. Explain case-based reasoning (CBR). (15) Evaluate BTL-5
2. Compare the differences between lazy and eager learning algorithms. (15) Evaluate BTL-5
3. Formulate the generalized locally weighted regression model. (15) Create BTL-6
4. Compose the error E(xq) to emphasize the fact that the error is now being defined as a function of the query point xq. (15) Create BTL-6
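For question 3, a minimal one-dimensional locally weighted linear regression sketch: each query point xq gets its own line fit, with training points weighted by a Gaussian kernel of their distance to xq. The data and the bandwidth tau are illustrative:

```python
from math import exp

def lwr_predict(xs, ys, xq, tau=1.0):
    """Weighted least-squares line fit around the query xq; tau is kernel width."""
    w = [exp(-(x - xq) ** 2 / (2 * tau ** 2)) for x in xs]
    sw = sum(w)
    xbar = sum(wi * x for wi, x in zip(w, xs)) / sw       # weighted means
    ybar = sum(wi * y for wi, y in zip(w, ys)) / sw
    num = sum(wi * (x - xbar) * (y - ybar) for wi, x, y in zip(w, xs, ys))
    den = sum(wi * (x - xbar) ** 2 for wi, x in zip(w, xs))
    slope = num / den if den else 0.0
    return ybar + slope * (xq - xbar)

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]        # exactly linear data: y = 2x + 1
yq = lwr_predict(xs, ys, 2.5)       # ~6.0
```

Because the sample data are exactly linear, the weighted fit recovers y = 2x + 1 and predicts 6.0 at xq = 2.5 regardless of the bandwidth.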

UNIT V - ADVANCED LEARNING
Learning Sets of Rules – Sequential Covering Algorithm – Learning Rule Set – First Order Rules – Sets of First Order Rules – Induction on Inverted Deduction – Inverting Resolution – Analytical Learning – Perfect Domain Theories – Explanation Based Learning – FOCL Algorithm – Reinforcement Learning – Task – Q-Learning – Temporal Difference Learning
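The Q-learning and temporal-difference questions in this unit can be grounded with a minimal tabular sketch on a toy five-state corridor, where reward 1 is given for reaching the rightmost state; the MDP, learning rate, exploration rate, and episode count are all illustrative:

```python
import random

random.seed(1)
N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]                  # move left / move right
alpha, gamma = 0.5, 0.9
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(s, a):
    # deterministic corridor: clamp to [0, GOAL], reward 1 on reaching GOAL
    s2 = min(max(s + a, 0), GOAL)
    return s2, (1.0 if s2 == GOAL else 0.0)

for _ in range(300):                # episodes with epsilon-greedy exploration
    s = 0
    while s != GOAL:
        if random.random() < 0.2:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2, r = step(s, a)
        # Q-learning update: Q(s,a) <- Q(s,a) + alpha*(r + gamma*max_a' Q(s',a') - Q(s,a))
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS)
                              - Q[(s, a)])
        s = s2

policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(GOAL)}
```

After training, the greedy policy moves right in every nonterminal state, and Q(s, right) approaches gamma raised to the number of remaining steps minus one, as the discounted-return definition predicts.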

PART-A (2 -MARKS)
1. What is explanation-based learning? Remember BTL-1
2. Demonstrate first-order Horn clauses. Apply BTL-3
3. State the learn-one-rule. Remember BTL-1
4. Illustrate what the sequential covering algorithm is. Apply BTL-3
5. Examine Prolog-EBG. Apply BTL-3
6. Describe inverting resolution. Remember BTL-1
7. List out the terminology in Horn clauses. Remember BTL-1
8. Define a Turing-equivalent programming language. Remember BTL-1
9. Explain the reinforcement learning model. Remember BTL-1
10. Point out how learned rule sets differ from the genetic algorithm. Analyze BTL-4
11. Interpret the importance of temporal difference learning. Understand BTL-2
12. Discuss the sequential covering algorithm for learning a disjunctive set of rules. Understand BTL-2
13. Distinguish between FOIL and the other algorithms. Understand BTL-2
14. Generalize induction as inverted deduction. Create BTL-6
15. Explain inductive logic programming. Evaluate BTL-5
16. Analyze the Q-learning algorithm. Analyze BTL-4
17. Compare inductive and analytical learning problems. Evaluate BTL-5
18. Develop the propositional form if clauses C1 and C2 are given. Create BTL-6
19. Point out the Horn clause. Analyze BTL-4
20. Summarize the FOIL algorithm. Understand BTL-2
PART-B (13- MARKS)
1. Assess the learning of sets of rules and how it differs from other algorithms. (13) Evaluate BTL-5
2. (i) Point out the sequential covering algorithm. (7) Analyze BTL-4
   (ii) Explain learn-one-rule on an example. (6)
3. Discuss the learning task. (13) Understand BTL-2
4. (i) Write in detail the sequential covering algorithm. (7) Remember BTL-1
   (ii) State the AQ algorithm. (6)
5. Elucidate in detail the basic definitions of first-order logic. (13) Analyze BTL-4
6. Illustrate the diagram for the search for rule preconditions as learn-one-rule proceeds from general to specific. (13) Apply BTL-3
7. (i) Analyze the learning of rule sets. (7) Analyze BTL-4
   (ii) Write some common evaluation functions used in learning rule sets. (6)
8. Demonstrate induction as inverted deduction. (13) Apply BTL-3
9. Discuss in detail learning first-order rules. (13) Understand BTL-2
10. (i) List the learning of sets of first-order rules: FOIL. (7) Remember BTL-1
    (ii) Memorize the basic FOIL algorithm. (6)
11. (i) Describe learning with perfect domain theories: Prolog-EBG. (7) Remember BTL-1
    (ii) Identify a training example for Prolog-EBG. (6)
12. Summarize the Q-learning model and explain it with a diagram. (13) Understand BTL-2
13. (i) Generalize what reinforcement learning is. (7) Create BTL-6
    (ii) Compose temporal difference learning. (6)
14. Memorize the analytical learning model with an example. (13) Remember BTL-1
PART-C (15- MARKS)
1. Compose the following Horn clauses: (15) Create BTL-6
   (i) First-order Horn clauses
   (ii) Basic terminology in Horn clauses
2. Generalize the concept of the inverting resolution model. (15) Create BTL-6
3. Summarize the merits and demerits of the FOCL algorithm. (15) Evaluate BTL-5
4. Assess the temporal difference learning model with an example. (15) Evaluate BTL-5
