Decision Tree Example
ID3 (Iterative Dichotomiser 3)
This decision tree algorithm uses information gain as its attribute selection measure: for each attribute A,
Gain(S, A) = Entropy(S) - sum over v in Values(A) of (|Sv|/|S|) * Entropy(Sv),
and the attribute with the highest gain is chosen for the split.
Example:
Attribute: Outlook
Values(Outlook) = Sunny, Overcast, Rain
For the entire dataset, S = (9+, 5-), i.e. 9 Yes and 5 No examples:
Entropy(S) = -(9/14) log2(9/14) - (5/14) log2(5/14) = 0.94
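Before moving to the remaining attributes, here is a minimal Python check of this entropy value (an illustrative sketch, not part of the original notes):

from math import log2

def entropy(pos, neg):
    # Binary entropy from positive/negative example counts.
    total = pos + neg
    result = 0.0
    for count in (pos, neg):
        p = count / total
        if p > 0:  # an empty class contributes 0 (lim p->0 of -p log2 p = 0)
            result -= p * log2(p)
    return result

# Entire dataset S = (9+, 5-): 9 Yes, 5 No
print(round(entropy(9, 5), 2))  # prints 0.94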
The remaining attributes are evaluated against this same Entropy(S) = 0.94, since it is computed once on the full dataset (9+, 5-):
Attribute: Temp
Values(Temp) = Hot, Mild, Cool
Attribute: Humidity
Values(Humidity) = High, Normal
Attribute: Wind
Values(Wind) = Strong, Weak
For each attribute, the information gain is then computed from the entropy of its value subsets, as in the sketch below.
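ID3 computes Gain(S, A) for every attribute and splits on the one with the highest gain. The sketch below assumes the per-value (Yes, No) counts of the classic Play Tennis dataset for Outlook, since the notes do not list them; substitute the actual counts from the table being used.

from math import log2

def entropy(counts):
    # Entropy from a tuple of class counts, e.g. (9, 5).
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c > 0)

def info_gain(total_counts, partitions):
    # Gain(S, A) = Entropy(S) - sum(|Sv|/|S| * Entropy(Sv))
    n = sum(total_counts)
    remainder = sum(sum(part) / n * entropy(part) for part in partitions)
    return entropy(total_counts) - remainder

# Assumed (Yes, No) counts per Outlook value, taken from the classic
# Play Tennis data: Sunny (2, 3), Overcast (4, 0), Rain (3, 2)
outlook = [(2, 3), (4, 0), (3, 2)]
print(round(info_gain((9, 5), outlook), 3))  # prints 0.247

Under these assumed counts, Outlook yields the highest gain of the four attributes and would be chosen as the root of the tree.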
Repeat the process for a dataset in which CGPA (with values >=9 and >=8) predicts Job Offer (Yes/No). The Gini index for each value of the CGPA attribute (>=9, >=8) is calculated as
Gini = 1 - sum of pi^2 over the classes (here, 1 - p_yes^2 - p_no^2),
computed within the subset of examples having that CGPA value.
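A sketch of that Gini calculation; the (Yes, No) counts below are hypothetical placeholders, since the CGPA table itself is not reproduced in these notes:

def gini(counts):
    # Gini index = 1 - sum(p_i^2) over the class proportions.
    total = sum(counts)
    return 1 - sum((c / total) ** 2 for c in counts)

# Hypothetical job-offer (Yes, No) counts per CGPA value; replace
# these with the actual counts from the CGPA table.
print(round(gini((4, 1)), 3))  # CGPA >= 9 -> 0.32
print(round(gini((3, 2)), 3))  # CGPA >= 8 -> 0.48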