20IST63 - Machine Learning
Find the following:
i) Distinct instances.
ii) Syntactically distinct instances.
9. Define information gain in the decision tree learning algorithm. [CO2,K2]
10. Use the positive and negative training examples for the EnjoySport task. Apply the [CO1,K3]
types of machine learning and predict which type of learning suits the given task.
13. i) Describe the perspectives and issues in machine learning. (5) [CO1,K2]
ii) Explain in detail about concept learning as a task along with its algorithm. (5) [CO1,K2]
14. Apply the decision tree algorithm to compute information gain for the given play tennis [CO2,K3]
dataset.
Bloom’s Taxonomy Level | Remembering (K1) | Understanding (K2) | Applying (K3) | Analyzing (K4) | Evaluating (K5) | Creating (K6)
Percentage             | 10               | 40                 | 50            | -              | -               | -
KONGU ENGINEERING COLLEGE, PERUNDURAI, ERODE – 638 060
Continuous Assessment Test I – February 2023
Roll Number: _________________
Programme   : M.Sc                 Date      : 04-02-2023
Branch      : Software Systems     Time      : 9.15 AM - 10.45 AM
Semester    : VI
Course Code : 20IST63              Duration  : 1 1/2 Hours
Course Name : Machine Learning     Max Marks : 50
PART – A (10*2=20 Marks)
ANSWER ALL QUESTIONS
1. List the four possibilities of selecting the targets in checkers learning problem. [CO1,K1]
2. What is meant by inductive learning hypothesis? [CO1,K1]
3. State the modules used in designing the learning systems. [CO1,K1]
4. Define the specific boundary in candidate elimination algorithm with its theorem [CO1,K1]
representation.
5. A table of attributes and their possible values is given: [CO1,K3]
Attributes Possible Values
Citations some, many, few
Size small, medium, big
Inlibrary always, no
Price affordable, expensive
Editions one, few, many
Find the following:
i) Syntactically distinct hypotheses.
ii) Semantically distinct hypotheses.
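For reference, the counts asked for in Q5 follow the standard argument: in a hypothesis, each attribute may additionally take '?' (any value) or '0' (empty), and every hypothesis containing '0' classifies all instances negative, so they collapse into a single semantic class. A short Python sketch of this counting, with the value counts taken from the table above:

```python
# Counting syntactically and semantically distinct hypotheses for Q5.
# Value counts per attribute come from the table in the question.
from math import prod

values = {"Citations": 3, "Size": 3, "Inlibrary": 2, "Price": 2, "Editions": 3}

# Syntactic: each attribute may also be '?' or '0', hence +2 choices each.
syntactic = prod(n + 2 for n in values.values())

# Semantic: '0' hypotheses all denote the empty concept, so count hypotheses
# built from values plus '?' (+1 each) and add one for the all-negative class.
semantic = 1 + prod(n + 1 for n in values.values())

print(syntactic, semantic)  # 2000 577
```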
6. Differentiate the Find-S algorithm and the Candidate Elimination algorithm. [CO1,K2]
7. Mention the issues in decision tree learning. [CO2,K2]
8. Consider the instance x = <sunny, ?, ?, strong, ?, ?>. Identify the attributes, the [CO1,K3]
general hypothesis and the specific hypothesis.
9. A collection S contains positive and negative examples of some target [CO2,K3]
concept. Calculate the entropy of S relative to this boolean classification,
given 9 positive and 5 negative examples.
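For reference, the entropy asked for in Q9 can be checked with a short Python sketch; the 9-positive/5-negative split is the one stated in the question:

```python
# Entropy of a boolean-labelled collection S with 9 positive and
# 5 negative examples: -(9/14)log2(9/14) - (5/14)log2(5/14).
import math

def entropy(pos, neg):
    """Entropy of a two-class collection with pos/neg example counts."""
    total = pos + neg
    e = 0.0
    for count in (pos, neg):
        if count:  # 0*log(0) is taken as 0
            p = count / total
            e -= p * math.log2(p)
    return e

print(round(entropy(9, 5), 3))  # 0.94
```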
10. The hypotheses hj and hk are provided. Give the more-general-than-or-equal-to [CO1,K3]
relationship between the given hypotheses.
PART - B (3*10=30 Marks)
ANSWER ANY THREE QUESTIONS
11. Demonstrate the steps of the candidate elimination algorithm. Utilize it to determine [CO1,K3]
the final version space for the EnjoySport classification task.
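For reference, the version-space computation in Q11 can be sketched in Python. The EnjoySport examples and attribute domains below are the standard textbook ones (the exam's own data table is not reproduced here), and the code is an illustrative implementation rather than the required step-by-step written answer:

```python
# A minimal sketch of the Candidate Elimination algorithm on the classic
# EnjoySport examples, maintaining the specific boundary S and general
# boundary G of the version space.

DOMAINS = [
    ("Sunny", "Cloudy", "Rainy"),   # Sky
    ("Warm", "Cold"),               # AirTemp
    ("Normal", "High"),             # Humidity
    ("Strong", "Weak"),             # Wind
    ("Warm", "Cool"),               # Water
    ("Same", "Change"),             # Forecast
]

def matches(h, x):
    """True when hypothesis h covers instance x ('?' matches any value)."""
    return all(a in ("?", v) for a, v in zip(h, x))

def more_general(h1, h2):
    """True when h1 is more general than or equal to h2."""
    return all(a == "?" or a == b for a, b in zip(h1, h2))

def candidate_elimination(examples):
    n = len(DOMAINS)
    S = None                 # specific boundary, adopted from first positive
    G = {("?",) * n}         # general boundary, starts maximally general
    for x, positive in examples:
        if positive:
            G = {g for g in G if matches(g, x)}              # prune G
            if S is None:
                S = tuple(x)                                 # first positive
            else:                                            # minimally generalize S
                S = tuple(s if s == v else "?" for s, v in zip(S, x))
        else:
            new_G = set()
            for g in G:
                if not matches(g, x):
                    new_G.add(g)
                    continue
                # minimal specializations of g that exclude the negative x
                for i in range(n):
                    if g[i] != "?":
                        continue
                    for v in DOMAINS[i]:
                        if v != x[i]:
                            h = g[:i] + (v,) + g[i + 1:]
                            if S is None or more_general(h, S):
                                new_G.add(h)
            # keep only the maximally general members of G
            G = {g for g in new_G
                 if not any(g2 != g and more_general(g2, g) for g2 in new_G)}
    return S, G

EXAMPLES = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High", "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High", "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High", "Strong", "Cool", "Change"), True),
]

S, G = candidate_elimination(EXAMPLES)
print("S:", S)   # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
print("G:", sorted(G))
```

With these four examples the boundaries converge to S = <Sunny, Warm, ?, Strong, ?, ?> and G = {<Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>}, matching the standard worked result.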
13. i) Describe the concept of linear and non-linear regression in detail. (5) [CO2,K2]
ii) Explain in detail about concept learning as search along with its algorithm. (5) [CO1,K2]
14. Make use of the decision tree algorithm to compute information gain and find the root [CO2,K3]
node for the given dataset.
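Q14's dataset is not reproduced in this paper. Assuming the standard 14-example play-tennis dataset commonly used with this question, the root-node selection can be sketched in Python:

```python
# Information gain per attribute on the standard play-tennis dataset;
# the attribute with the largest gain becomes the root of the tree.
import math
from collections import Counter

DATA = [  # (Outlook, Temperature, Humidity, Wind, PlayTennis)
    ("Sunny", "Hot", "High", "Weak", "No"), ("Sunny", "Hot", "High", "Strong", "No"),
    ("Overcast", "Hot", "High", "Weak", "Yes"), ("Rain", "Mild", "High", "Weak", "Yes"),
    ("Rain", "Cool", "Normal", "Weak", "Yes"), ("Rain", "Cool", "Normal", "Strong", "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"), ("Sunny", "Mild", "High", "Weak", "No"),
    ("Sunny", "Cool", "Normal", "Weak", "Yes"), ("Rain", "Mild", "Normal", "Weak", "Yes"),
    ("Sunny", "Mild", "Normal", "Strong", "Yes"), ("Overcast", "Mild", "High", "Strong", "Yes"),
    ("Overcast", "Hot", "Normal", "Weak", "Yes"), ("Rain", "Mild", "High", "Strong", "No"),
]
ATTRS = ["Outlook", "Temperature", "Humidity", "Wind"]

def entropy(rows):
    """Entropy of the class label (last column) over the given rows."""
    counts = Counter(r[-1] for r in rows)
    total = len(rows)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def gain(rows, i):
    """Information gain of attribute i: entropy reduction after splitting."""
    total = len(rows)
    split = {}
    for r in rows:
        split.setdefault(r[i], []).append(r)
    return entropy(rows) - sum(len(s) / total * entropy(s) for s in split.values())

gains = {a: gain(DATA, i) for i, a in enumerate(ATTRS)}
root = max(gains, key=gains.get)
print(root)  # Outlook
```

On this dataset Gain(S, Outlook) ≈ 0.247 is the largest, so Outlook is chosen as the root.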
Bloom’s Taxonomy Level | Remembering (K1) | Understanding (K2) | Applying (K3) | Analyzing (K4) | Evaluating (K5) | Creating (K6)
Percentage             | 13.3             | 23.3               | 63.3          | -              | -               | -