Course Outcomes for assessment in this IA: CO3, CO4, CO5, CO6
PART – A (20 Marks)
(Marks, Bloom’s Level and CO mapping are indicated against each question.)
1.a) A smart traffic camera is trained on a dataset of various objects to detect the types of vehicles at a signal. Identify the type of machine learning technique. (U, CO5, 1 Mark)
Classification algorithm.
b) Define information gain and formulate the equation. (R, CO3, 1 Mark)
Information gain specifies how much information a particular predictor variable gives about the final outcome; it is used to choose the predictor variable that best splits the data.
IG = E(parent) – weighted average of E(children)
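The definition above can be sketched in a few lines of Python; the toy "outlook"/"play" data below is hypothetical, chosen only so the split is easy to check by hand.

```python
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    counts = {}
    for label in labels:
        counts[label] = counts.get(label, 0) + 1
    return -sum((c / total) * log2(c / total) for c in counts.values())

def information_gain(labels, feature_values):
    """IG = E(parent) - weighted average entropy of the splits."""
    total = len(labels)
    ig = entropy(labels)
    # group the target labels by the value of the predictor variable
    groups = {}
    for value, label in zip(feature_values, labels):
        groups.setdefault(value, []).append(label)
    for subset in groups.values():
        ig -= (len(subset) / total) * entropy(subset)
    return ig

# toy data: "outlook" perfectly separates "play", so IG = E(parent) = 1.0
play    = ["yes", "yes", "no", "no", "yes", "no"]
outlook = ["sunny", "sunny", "rain", "rain", "sunny", "rain"]
print(round(information_gain(play, outlook), 3))
```

Because each outlook value maps to a single class, the weighted child entropy is 0 and the information gain equals the parent entropy.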
c) Sketch the decision tree for the following AND operation. (A, CO5, 2 Marks)
Root node: test A. If A = 0, the leaf outputs 0. If A = 1, test B: if B = 0, output 0; if B = 1, output 1.
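The AND decision tree can be written as nested tests, one per internal node (a sketch, not part of the original answer key):

```python
def and_tree(a: int, b: int) -> int:
    """Decision tree for AND: the root tests a; the a=1 branch tests b."""
    if a == 0:      # left branch: leaf labelled 0
        return 0
    if b == 0:      # right branch: internal node on b, leaf labelled 0
        return 0
    return 1        # leaf labelled 1, reached only when a = 1 and b = 1

for a in (0, 1):
    for b in (0, 1):
        print(a, b, and_tree(a, b))
```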
d) Calculate the precision and specificity from the given confusion matrix. (An, CO6, 2 Marks)

                    Predicted
 Actual        Positive   Negative
 Positive         45         20
 Negative          5         30

Precision = TP / (TP + FP) = 45/50 = 0.9
Specificity = TN / (TN + FP) = 30/35 ≈ 0.857
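The two ratios can be verified directly from the four cells of the matrix:

```python
# Cells of the confusion matrix above (rows are actual, columns predicted).
tp, fn = 45, 20   # actual positive row
fp, tn = 5, 30    # actual negative row

precision   = tp / (tp + fp)   # 45/50
specificity = tn / (tn + fp)   # 30/35

print(round(precision, 3), round(specificity, 3))   # → 0.9 0.857
```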
e(i) State the purpose of constructing a decision tree. Explain the ID3 algorithm for constructing a decision tree. (U, CO3, 4 Marks)
A decision tree is a hierarchical decision-support model that uses a tree-like model of decisions and their possible consequences.
Step-by-step procedure for building a decision tree (ID3):
• Step 1: Calculate the entropy of the predicted (target) attribute.
• Step 2: Compute the IG for all predictor (input) variables.
• Step 3: Select the best attribute (A) based on the highest information gain: the predictor variable that separates the data into different classes most effectively, i.e. the feature that best splits the data.
• Step 4: Assign A as the decision variable for the root node.
• Step 5: For each value of A, build a descendant of the node.
• Step 6: Assign classification labels to the leaf nodes.
• Step 7: If the data is correctly classified, stop.
• Step 8: Else, iterate over the tree: keep changing the position of the predictor attributes in the tree, or change the root node, to get the correct output.
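The steps above can be sketched as a minimal recursive ID3. The helper names and the toy weather rows are hypothetical; entropy and information gain follow the definitions in part b).

```python
from math import log2
from collections import Counter

def entropy(rows, target):
    """Step 1: entropy of the target attribute over a set of rows."""
    counts = Counter(r[target] for r in rows)
    n = len(rows)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def best_attribute(rows, attrs, target):
    """Steps 2-3: pick the attribute with the highest information gain."""
    def ig(a):
        n = len(rows)
        parts = Counter(r[a] for r in rows)
        remainder = sum((c / n) * entropy([r for r in rows if r[a] == v], target)
                        for v, c in parts.items())
        return entropy(rows, target) - remainder
    return max(attrs, key=ig)

def id3(rows, attrs, target):
    labels = {r[target] for r in rows}
    if len(labels) == 1:          # Steps 6-7: pure subset becomes a leaf
        return labels.pop()
    if not attrs:                 # no attributes left: majority label
        return Counter(r[target] for r in rows).most_common(1)[0][0]
    a = best_attribute(rows, attrs, target)
    rest = [x for x in attrs if x != a]
    # Steps 4-5: one descendant subtree per value of the chosen attribute
    return {a: {v: id3([r for r in rows if r[a] == v], rest, target)
                for v in {r[a] for r in rows}}}

# toy weather data (hypothetical)
data = [
    {"outlook": "sunny", "windy": "no",  "play": "yes"},
    {"outlook": "sunny", "windy": "yes", "play": "yes"},
    {"outlook": "rain",  "windy": "no",  "play": "yes"},
    {"outlook": "rain",  "windy": "yes", "play": "no"},
]
print(id3(data, ["outlook", "windy"], "play"))
```

On this data the root splits on outlook; the sunny branch is a pure "yes" leaf, and the rain branch splits again on windy.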
Determine the parent/root node of the decision tree for the following dataset and show all the intermediate steps.

 ID   Height   Weight
  1     185      72
  2     170      56
  3     168      60
  4     179      68
  5     182      72
  6     188      77

E(Label) = -(4/7) log2(4/7) - (3/7) log2(3/7) ≈ 0.985
ii) Sketch a dendrogram for the following data using the agglomerative clustering algorithm. (A, CO5, 4 Marks)

Distance matrix:
     E   A   C   B
 E   0   1   2   2
 A   1   0   2   5
 C   2   2   0   1
 B   2   5   1   0

Merge E and A (distance 1) and C and B (distance 1), then merge the resulting clusters:

      AE  BC
 AE    0   2
 BC    2   0

The dendrogram joins E with A and B with C at height 1, and joins cluster AE with cluster BC at height 2.
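The merge heights above can be checked with a short single-linkage sketch (single linkage is an assumption; it reproduces the 2 in the reduced AE/BC matrix):

```python
# Single-linkage agglomerative clustering on the 4-point distance matrix above.
dist = {
    ("E", "A"): 1, ("E", "C"): 2, ("E", "B"): 2,
    ("A", "C"): 2, ("A", "B"): 5, ("C", "B"): 1,
}

def d(p, q):
    """Symmetric lookup into the pairwise distance table."""
    return 0 if p == q else dist.get((p, q), dist.get((q, p)))

def single_linkage(c1, c2):
    """Cluster distance: minimum pairwise distance between members."""
    return min(d(p, q) for p in c1 for q in c2)

clusters = [("E",), ("A",), ("C",), ("B",)]
merges = []
while len(clusters) > 1:
    # find and merge the closest pair of clusters
    pairs = [(single_linkage(c1, c2), c1, c2)
             for i, c1 in enumerate(clusters)
             for c2 in clusters[i + 1:]]
    height, c1, c2 = min(pairs)
    clusters = [c for c in clusters if c not in (c1, c2)] + [c1 + c2]
    merges.append((height, c1 + c2))

print(merges)
```

The recorded merge heights are 1, 1, 2, matching the dendrogram: two pair merges at height 1, then the final merge of AE and BC at height 2.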