20IST63 - Machine Learning

This document contains Continuous Assessment Test papers for the course 20IST63 Machine Learning, with questions covering topics such as decision trees, concept learning algorithms, and linear/non-linear regression. Each test has two parts: Part A with ten short-answer questions worth 20 marks, and Part B worth 30 marks, in which candidates answer any three of four longer questions. The questions span Bloom's taxonomy levels from remembering to applying.

KONGU ENGINEERING COLLEGE, PERUNDURAI, ERODE – 638 060

Continuous Assessment Test I – February 2023


Roll Number: _________________
Programme   : M.Sc                 Date      : 04-02-2023
Branch      : Software Systems     Time      : 9.15 AM - 10.45 AM
Semester    : VI
Course Code : 20IST63              Duration  : 1 1/2 Hours
Course Name : Machine Learning     Max Marks : 50
PART – A (10*2=20 Marks)
ANSWER ALL QUESTIONS
1. A machine is trained for the "handwriting recognition" learning problem. Identify the [CO1,K3]
task T, performance measure P, and training experience E for the system.
2. Give version space representation for Hypothesis and training example. [CO1,K1]
3. Recall the questions that are left unanswered by the Find-S algorithm. [CO1,K1]
4. Utilize the training example "black won the game". Choose the function [CO1,K3]
approximation algorithm for the given data.
5. Draw decision trees to represent the following Boolean functions: [CO2,K3]
i) A ∧ ¬B
ii) A ∨ [B ∧ C]
iii) [A ∧ B] ∨ [C ∧ D]
6. Define general boundary in candidate elimination algorithm with its theorem [CO1,K2]
representation.
7. List the error predictions used in adjusting the weights in the function [CO1,K1]
approximation algorithm of the checkers learning problem.
8. A table with five attributes, along with their possible values, is given: [CO1,K3]

Attribute     Possible Values
Citations     some, many, few
Size          small, medium, big
Inlibrary     always, no
Price         affordable, expensive
Editions      one, few, many

Find:
i) Distinct instances.
ii) Syntactically distinct instances.
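The counts in question 8 can be checked by multiplying the number of values per attribute. A minimal Python sketch, taking the attribute sizes from the table above and reading part ii) in the usual way as the syntactically distinct hypothesis count (each attribute may additionally be "?" or the empty symbol):

```python
# Instance- and hypothesis-space counts for the book-buying attributes.
# Value counts per attribute come from the table above.
from math import prod

values = {"Citations": 3, "Size": 3, "Inlibrary": 2, "Price": 2, "Editions": 3}

distinct_instances = prod(values.values())        # 3*3*2*2*3 = 108

# Reading "syntactically distinct instances" as syntactically distinct
# hypotheses: each attribute may also take '?' or the empty symbol.
syntactic = prod(v + 2 for v in values.values())  # 5*5*4*4*5 = 2000

print(distinct_instances, syntactic)  # 108 2000
```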
9. Define information gain in the decision tree learning algorithm. [CO2,K2]
10. Use the positive and negative training examples for the EnjoySport task. Apply the types [CO1,K3]
of machine learning and predict which type of learning suits the given task.

PART - B (3*10=30 Marks)


ANSWER ANY THREE QUESTIONS
11. Summarize the steps involved in designing the learning problem with suitable [CO1,K2]
design choices.
12. Make use of the poisonous-mushroom concept learning dataset. Apply the Find-S algorithm [CO1,K3]
to the following dataset and also describe the algorithm.

Example Color Toughness Fungus Appearance Poisonous


1 Green Hard No Wrinkled Yes
2 Green Hard Yes Smooth No
3 Brown Soft No Wrinkled No
4 Orange Hard No Wrinkled Yes
5 Green Soft Yes Smooth Yes
6 Green Hard Yes Wrinkled Yes
7 Orange Hard No Wrinkled Yes
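A minimal Find-S sketch applied to the table above (attribute order Color, Toughness, Fungus, Appearance, as in the table). Note that on this particular dataset the hypothesis generalises all the way to the all-"?" hypothesis, since Find-S only looks at positive examples:

```python
# Find-S: keep the maximally specific conjunctive hypothesis that is
# consistent with every positive example; negatives are ignored.
def find_s(examples):
    h = None
    for x, label in examples:
        if label != "Yes":
            continue                      # Find-S ignores negative examples
        if h is None:
            h = list(x)                   # initialise to the first positive
        else:
            h = [hi if hi == xi else "?" for hi, xi in zip(h, x)]
    return h

data = [
    (("Green", "Hard", "No", "Wrinkled"), "Yes"),
    (("Green", "Hard", "Yes", "Smooth"), "No"),
    (("Brown", "Soft", "No", "Wrinkled"), "No"),
    (("Orange", "Hard", "No", "Wrinkled"), "Yes"),
    (("Green", "Soft", "Yes", "Smooth"), "Yes"),
    (("Green", "Hard", "Yes", "Wrinkled"), "Yes"),
    (("Orange", "Hard", "No", "Wrinkled"), "Yes"),
]
print(find_s(data))  # → ['?', '?', '?', '?']
```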

13. i) Describe the perspectives and issues in machine learning. (5) [CO1,K2]
ii)Explain in detail about concept learning as task along with its algorithm. (5) [CO1,K2]
14. Apply decision tree algorithm to compute information gain for the given play tennis [CO2,K3]
dataset.

Day   Outlook   Temperature   Humidity   Wind   PlayTennis
D1 Sunny Hot High Weak No
D2 Sunny Hot High Strong No
D3 Overcast Hot High Weak Yes
D4 Rain Mild High Weak Yes
D5 Rain Cool Normal Weak Yes
D6 Rain Cool Normal Strong No
D7 Overcast Cool Normal Strong Yes
D8 Sunny Mild High Weak No
D9 Sunny Cool Normal Weak Yes
D10 Rain Mild Normal Weak Yes
D11 Sunny Mild Normal Strong Yes
D12 Overcast Mild High Strong Yes
D13 Overcast Hot Normal Weak Yes
D14 Rain Mild High Strong No
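The entropy and information-gain computation for question 14 can be sketched as below, using the Outlook column as an example; the gain for Outlook on this data comes out to roughly 0.247 bits:

```python
# Entropy and information gain for the PlayTennis data above.
from math import log2
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gain(rows, attr_idx, labels):
    total = entropy(labels)
    for value in set(r[attr_idx] for r in rows):
        subset = [l for r, l in zip(rows, labels) if r[attr_idx] == value]
        total -= len(subset) / len(labels) * entropy(subset)
    return total

outlook = ["Sunny", "Sunny", "Overcast", "Rain", "Rain", "Rain", "Overcast",
           "Sunny", "Sunny", "Rain", "Sunny", "Overcast", "Overcast", "Rain"]
play =    ["No", "No", "Yes", "Yes", "Yes", "No", "Yes",
           "No", "Yes", "Yes", "Yes", "Yes", "Yes", "No"]

rows = [(o,) for o in outlook]
print(round(gain(rows, 0, play), 3))  # → 0.247
```

The same `gain` function applied to the other columns lets you rank all four attributes for the root split.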

Bloom's Taxonomy Level    Remembering (K1)    Understanding (K2)    Applying (K3)    Analyzing (K4)    Evaluating (K5)    Creating (K6)
Percentage                10                  40                    50               -                 -                  -
KONGU ENGINEERING COLLEGE, PERUNDURAI, ERODE – 638 060
Continuous Assessment Test I – February 2023
Roll Number: _________________
Programme   : M.Sc                 Date      : 04-02-2023
Branch      : Software Systems     Time      : 9.15 AM - 10.45 AM
Semester    : VI
Course Code : 20IST63              Duration  : 1 1/2 Hours
Course Name : Machine Learning     Max Marks : 50
PART – A (10*2=20 Marks)
ANSWER ALL QUESTIONS
1. List the four possibilities for selecting the target function in the checkers learning problem. [CO1,K1]
2. What is meant by inductive learning hypothesis? [CO1,K1]
3. State the modules used in designing the learning systems. [CO1,K1]
4. Define the specific boundary in candidate elimination algorithm with its theorem [CO1,K1]
representation.
5. A table of attributes and their possible values is given: [CO1,K3]
Attribute     Possible Values
Citations     some, many, few
Size          small, medium, big
Inlibrary     always, no
Price         affordable, expensive
Editions      one, few, many
Find the following:
i) Syntactically distinct hypotheses.
ii) Semantically distinct hypotheses.
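The two counts in question 5 follow from the value counts in the table above; a short sketch (assuming the usual convention that "?" and the empty symbol extend each attribute, and that every hypothesis containing the empty symbol classifies all instances negative and so collapses to one):

```python
# Hypothesis-space counts for attributes with 3, 3, 2, 2, 3 values.
from math import prod

sizes = [3, 3, 2, 2, 3]

# Syntactically distinct: each attribute may also be '?' or the empty symbol.
syntactic = prod(s + 2 for s in sizes)     # 5*5*4*4*5 = 2000
# Semantically distinct: hypotheses with an empty symbol collapse into one.
semantic = 1 + prod(s + 1 for s in sizes)  # 1 + 4*4*3*3*4 = 577

print(syntactic, semantic)  # 2000 577
```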
6. Differentiate the Find-S algorithm and the Candidate Elimination algorithm. [CO1,K2]
7. Mention the issues in decision tree learning. [CO2,K2]
8. Consider the instance x = <Sunny, ?, ?, Strong, ?, ?>. Identify the attributes, the [CO1,K3]
general hypothesis and the specific hypothesis.
9. The collection S contains positive and negative examples of some target [CO2,K3]
concept. Calculate the entropy of S relative to this Boolean classification of 9
positive and 5 negative examples.
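The arithmetic for question 9 is a direct application of the entropy formula, Entropy(S) = -p₊ log₂ p₊ - p₋ log₂ p₋:

```python
# Entropy of a collection with 9 positive and 5 negative examples.
from math import log2

p, n = 9, 5
total = p + n
entropy = -(p / total) * log2(p / total) - (n / total) * log2(n / total)
print(round(entropy, 3))  # → 0.94
```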
10. The hypotheses hj and hk are provided. Give the more-general-than-or-equal-to [CO1,K3]
relation for the given hypotheses.
PART - B (3*10=30 Marks)
ANSWER ANY THREE QUESTIONS
11. Demonstrate the steps of the Candidate Elimination algorithm. Utilize it to determine the [CO1,K3]
final version space for the EnjoySport task.

Example   Sky   Air Temp   Humidity   Wind   Water   Forecast   Enjoy Sport
1 Sunny Warm Normal Strong Warm Same Yes
2 Sunny Warm High Strong Warm Same Yes
3 Rainy Cold High Strong Warm Change No
4 Sunny Warm High Strong Change Change Yes
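A compact Candidate Elimination sketch run on the EnjoySport data above (attribute order as in the table). This version omits some of the full algorithm's pruning checks, such as removing boundary members that are more specific than another member, but it reproduces the textbook boundaries for this dataset:

```python
# Candidate Elimination for conjunctive hypotheses over the data above.
def min_generalize(h, x):
    """Minimally generalise h so that it covers instance x."""
    return tuple(hi if hi == xi else "?" for hi, xi in zip(h, x))

def covers(h, x):
    return all(hi in ("?", xi) for hi, xi in zip(h, x))

data = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High", "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High", "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High", "Strong", "Change", "Change"), True),
]

S = data[0][0]                 # start S at the first positive example
G = [("?",) * 6]               # most general boundary
for x, positive in data:
    if positive:
        S = min_generalize(S, x)
        G = [g for g in G if covers(g, x)]
    else:
        # Minimally specialise each g against the negative example,
        # keeping only specialisations consistent with S.
        new_G = []
        for g in G:
            for i, (gi, si) in enumerate(zip(g, S)):
                if gi == "?" and si != "?" and si != x[i]:
                    new_G.append(g[:i] + (si,) + g[i + 1:])
        G = new_G

print(S)  # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
print(G)  # [('Sunny', '?', '?', '?', '?', '?'), ('?', 'Warm', '?', '?', '?', '?')]
```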
12. Utilize the design of a learning system for the checkers learning problem, along with a [CO1,K3]
summary of the design choices.

13. i)Describe the concept of linear and non-linear regression in detail. (5) [CO2,K2]
ii)Explain in detail about concept learning as search along with its algorithm. (5) [CO1,K2]

14. Make use of decision tree algorithm to compute information gain and find root node with [CO2,K3]
given dataset.

Age Income Student Credit-rating Buys-computer


Youth High No Fair No
Youth High No Excellent No
Middle High No Fair Yes
Senior Medium No Fair Yes
Senior Low Yes Excellent No
Senior Low Yes Excellent Yes
Middle Low Yes Excellent Yes
Youth Medium No Fair No
Youth Low Yes Fair Yes
Senior Medium Yes Fair Yes
Youth Medium Yes Excellent Yes
Middle Medium No Excellent Yes
Middle High Yes Fair Yes
Senior Medium No Excellent No
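The root-node selection in question 14 can be sketched by computing the information gain of each attribute on the table above and taking the largest; on this data Age wins with a gain of roughly 0.247 bits:

```python
# Information gain per attribute for the buys-computer data; the
# attribute with the largest gain becomes the root of the tree.
from math import log2
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gain(rows, i, labels):
    g = entropy(labels)
    for v in set(r[i] for r in rows):
        sub = [l for r, l in zip(rows, labels) if r[i] == v]
        g -= len(sub) / len(labels) * entropy(sub)
    return g

attrs = ["Age", "Income", "Student", "Credit-rating"]
rows = [
    ("Youth", "High", "No", "Fair"), ("Youth", "High", "No", "Excellent"),
    ("Middle", "High", "No", "Fair"), ("Senior", "Medium", "No", "Fair"),
    ("Senior", "Low", "Yes", "Excellent"), ("Senior", "Low", "Yes", "Excellent"),
    ("Middle", "Low", "Yes", "Excellent"), ("Youth", "Medium", "No", "Fair"),
    ("Youth", "Low", "Yes", "Fair"), ("Senior", "Medium", "Yes", "Fair"),
    ("Youth", "Medium", "Yes", "Excellent"), ("Middle", "Medium", "No", "Excellent"),
    ("Middle", "High", "Yes", "Fair"), ("Senior", "Medium", "No", "Excellent"),
]
labels = ["No", "No", "Yes", "Yes", "No", "Yes", "Yes",
          "No", "Yes", "Yes", "Yes", "Yes", "Yes", "No"]

gains = {a: gain(rows, i, labels) for i, a in enumerate(attrs)}
root = max(gains, key=gains.get)
print(root, round(gains[root], 3))  # Age 0.247
```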

Bloom's Taxonomy Level    Remembering (K1)    Understanding (K2)    Applying (K3)    Analyzing (K4)    Evaluating (K5)    Creating (K6)
Percentage                13.3                23.3                  63.3             -                 -                  -
