Decision Tree (ID3 Algorithm): A Numerical Example
Task: make a decision tree that predicts whether tennis will be played on a given day.
WHAT IS A DECISION TREE?
A Decision Tree is a tree where each node represents a feature (attribute), each link (branch) represents a decision (rule), and each leaf represents an outcome.
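The structure just described can be sketched as a small node type. This is a minimal illustration, not any particular library's API; the `Node` and `predict` names and the hand-built branches are assumptions for the example:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Node:
    feature: Optional[str] = None                 # attribute tested at an internal node
    branches: dict = field(default_factory=dict)  # link: feature value -> child node
    outcome: Optional[str] = None                 # class label at a leaf

def predict(node, example):
    # Walk from the root, following the branch that matches each
    # feature value, until a leaf (a node with an outcome) is reached.
    while node.outcome is None:
        node = node.branches[example[node.feature]]
    return node.outcome

# Hand-built fragment of the tennis tree: Outlook at the root,
# Humidity under the Sunny branch.
leaf_yes, leaf_no = Node(outcome="Yes"), Node(outcome="No")
humidity = Node(feature="Humidity",
                branches={"High": leaf_no, "Normal": leaf_yes})
root = Node(feature="Outlook",
            branches={"Overcast": leaf_yes, "Sunny": humidity})
```

Following `Outlook = Sunny` and then `Humidity = High` walks two links and lands on a "No" leaf, matching the node/branch/leaf roles above.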
Algorithms:
- CART: Gini Index
- ID3: Entropy Function and Information Gain
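The two splitting criteria can be written down directly. A minimal sketch, with both functions operating on raw class counts:

```python
import math

def entropy(counts):
    """Entropy used by ID3: -sum(p * log2(p)) over class proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def gini(counts):
    """Gini index used by CART: 1 - sum(p^2) over class proportions."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)
```

For the 9-yes/5-no tennis data, `entropy([9, 5])` is about 0.940 and `gini([9, 5])` is about 0.459; both measures are 0 for a pure subset and maximal for a 50/50 split.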
Step 1: Create a root node
How do we choose the root node? The attribute that best classifies the training data is used at the root of the tree.
Total = 14 (P = 9 "Yes", N = 5 "No")
Calculate Entropy(S):
Entropy(S) = -(9/14) log2(9/14) - (5/14) log2(5/14) = 0.940
For each Attribute: (let's say Outlook)
Calculate the entropy for each value, i.e. for 'Sunny', 'Rainy' and 'Overcast':
Entropy(Outlook='Sunny') = -(2/5) log2(2/5) - (3/5) log2(3/5) = 0.971
Entropy(Outlook='Overcast') = 0 (all 4 examples are "Yes")
Entropy(Outlook='Rainy') = -(3/5) log2(3/5) - (2/5) log2(2/5) = 0.971
Calculate the average information entropy:
I(Outlook) = (5/14)(0.971) + (4/14)(0) + (5/14)(0.971) = 0.693
Calculate the gain for attribute Outlook:
Gain(Outlook) = Entropy(S) - I(Outlook) = 0.940 - 0.693 = 0.247
For each Attribute: (let's say Temperature)
Calculate the entropy for each value, i.e. for 'Hot', 'Mild' and 'Cool'.
Calculate the average information entropy: I(Temperature) = 0.911
Calculate the gain for attribute Temperature:
Gain(Temperature) = 0.940 - 0.911 = 0.029
For each Attribute: (let's say Humidity)
Calculate the entropy for each value, i.e. for 'High' and 'Normal'.
Calculate the average information entropy: I(Humidity) = 0.788
Calculate the gain for attribute Humidity:
Gain(Humidity) = 0.940 - 0.788 = 0.152
For each Attribute: (let's say Windy)
Calculate the entropy for each value, i.e. for 'Strong' and 'Weak'.
Calculate the average information entropy: I(Windy) = 0.892
Calculate the gain for attribute Windy:
Gain(Windy) = 0.940 - 0.892 = 0.048
Pick the attribute with the highest gain. Gain(Outlook) = 0.247 is the largest, so:
Root Node: OUTLOOK
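The four gain computations can be checked together. The per-value (positive, negative) counts below are assumptions taken from the standard 14-day play-tennis table:

```python
import math

def entropy(p, n):
    """Entropy of a subset with p positive and n negative examples."""
    total = p + n
    return -sum((c / total) * math.log2(c / total) for c in (p, n) if c)

# (Yes, No) counts for each value of each attribute -- standard dataset assumed.
splits = {
    "Outlook":     [(2, 3), (4, 0), (3, 2)],  # Sunny, Overcast, Rainy
    "Temperature": [(2, 2), (4, 2), (3, 1)],  # Hot, Mild, Cool
    "Humidity":    [(3, 4), (6, 1)],          # High, Normal
    "Windy":       [(3, 3), (6, 2)],          # Strong, Weak
}

e_s = entropy(9, 5)  # entropy of the full set, 0.940
# Gain = Entropy(S) - average information entropy of the split.
gains = {attr: e_s - sum(((p + n) / 14) * entropy(p, n) for p, n in vals)
         for attr, vals in splits.items()}

print(max(gains, key=gains.get))  # Outlook
```

Outlook wins with a gain of about 0.247, against roughly 0.029 for Temperature, 0.152 for Humidity and 0.048 for Windy.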
Repeat the same procedure for each sub-tree until the tree is complete.
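The whole procedure condenses into a short recursive sketch. The dataset is the standard 14-day play-tennis table (Quinlan's example), and `id3` here is a minimal illustration rather than a production implementation:

```python
import math
from collections import Counter

RAW = [
    ("Sunny", "Hot", "High", "Weak", "No"),      ("Sunny", "Hot", "High", "Strong", "No"),
    ("Overcast", "Hot", "High", "Weak", "Yes"),  ("Rainy", "Mild", "High", "Weak", "Yes"),
    ("Rainy", "Cool", "Normal", "Weak", "Yes"),  ("Rainy", "Cool", "Normal", "Strong", "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"), ("Sunny", "Mild", "High", "Weak", "No"),
    ("Sunny", "Cool", "Normal", "Weak", "Yes"),  ("Rainy", "Mild", "Normal", "Weak", "Yes"),
    ("Sunny", "Mild", "Normal", "Strong", "Yes"), ("Overcast", "Mild", "High", "Strong", "Yes"),
    ("Overcast", "Hot", "Normal", "Weak", "Yes"), ("Rainy", "Mild", "High", "Strong", "No"),
]
KEYS = ["Outlook", "Temperature", "Humidity", "Windy", "Play"]
DATA = [dict(zip(KEYS, row)) for row in RAW]

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def id3(rows, attrs):
    labels = [r["Play"] for r in rows]
    if len(set(labels)) == 1:      # pure subset -> leaf with that class
        return labels[0]
    if not attrs:                  # no attributes left -> majority-class leaf
        return Counter(labels).most_common(1)[0][0]

    def gain(attr):
        avg = 0.0
        for v in {r[attr] for r in rows}:
            sub = [r["Play"] for r in rows if r[attr] == v]
            avg += len(sub) / len(rows) * entropy(sub)
        return entropy(labels) - avg

    best = max(attrs, key=gain)    # attribute with the highest information gain
    rest = [a for a in attrs if a != best]
    # One branch per observed value of the chosen attribute.
    return {best: {v: id3([r for r in rows if r[best] == v], rest)
                   for v in {r[best] for r in rows}}}

TREE = id3(DATA, KEYS[:-1])
print(TREE)
```

The recursion reproduces the worked example: Outlook at the root, a pure "Yes" leaf under Overcast, Humidity under Sunny and Windy under Rainy.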
Outlook = "Sunny": P = 2, N = 3, Total = 5
Entropy(Sunny) = -(2/5) log2(2/5) - (3/5) log2(3/5) = 0.971
For each remaining Attribute: (let's say Humidity)
Calculate the entropy for each value, i.e. for 'High' and 'Normal':
Entropy(Humidity='High') = 0 (all 3 examples are "No"); Entropy(Humidity='Normal') = 0 (both examples are "Yes")
Gain(Humidity) = 0.971 - 0 = 0.971, the highest possible, so Humidity is the next node under Sunny.

Outlook = "Rainy": P = 3, N = 2, Total = 5
Entropy(Rainy) = -(3/5) log2(3/5) - (2/5) log2(2/5) = 0.971
For each remaining Attribute: (let's say Windy)
Calculate the entropy for each value, i.e. for 'Weak' and 'Strong':
Entropy(Windy='Weak') = 0 (all 3 examples are "Yes"); Entropy(Windy='Strong') = 0 (both examples are "No")
Gain(Windy) = 0.971 - 0 = 0.971, so the next node under Rainy is Windy, with branches Weak (Yes) and Strong (No).
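The finished tree can be encoded and exercised directly. The nested-dict layout and the `classify` helper are illustrative choices, not part of the algorithm itself:

```python
# Final tree from the worked example: Outlook at the root,
# Humidity under Sunny, Windy under Rainy, a "Yes" leaf under Overcast.
TREE = {"Outlook": {
    "Overcast": "Yes",
    "Sunny": {"Humidity": {"High": "No", "Normal": "Yes"}},
    "Rainy": {"Windy": {"Weak": "Yes", "Strong": "No"}},
}}

def classify(tree, day):
    # Descend one tested attribute at a time until a leaf label is reached.
    while isinstance(tree, dict):
        attr = next(iter(tree))
        tree = tree[attr][day[attr]]
    return tree

print(classify(TREE, {"Outlook": "Rainy", "Windy": "Strong"}))  # No
```

A rainy, strong-wind day classifies as "No"; any overcast day classifies as "Yes" without consulting further attributes.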
Thank You!