Week 6 Decision Trees
DR. SAEED UR REHMAN
Department of Computer Science,
COMSATS University Islamabad
Wah Campus
Introduction to Machine Learning
CHAPTER 9:
Decision Trees
•For example, if an attribute is color ∈ {red, blue, green}, the node splits into three branches, one for each possible value of the attribute.
•Given that an instance reaches node m, the estimate for the probability of class Ci is p(Ci | x, m) = N_m^i / N_m, where N_m is the number of training instances reaching node m and N_m^i is the number of those that belong to class Ci.
•If the error is not acceptable, the data reaching node m is split further, such that the sum of the errors in the branches is minimized.
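The class-probability estimate and the resulting node error can be sketched as follows. This is a minimal illustration (not code from the slides); the function names are my own, and the error shown is the misclassification error of labeling the node with its majority class:

```python
from collections import Counter

def class_probabilities(labels):
    """Estimate p(Ci | node m) = N_m^i / N_m: the fraction of the
    instances reaching the node that belong to class Ci."""
    n = len(labels)
    counts = Counter(labels)
    return {c: counts[c] / n for c in counts}

def misclassification_error(labels):
    """Error if the node is made a leaf labeled with the majority
    class: 1 - max_i p(Ci | m)."""
    return 1.0 - max(class_probabilities(labels).values())
```

For example, with labels ["yes", "yes", "no"] the estimates are p(yes) = 2/3 and p(no) = 1/3, and the node error is 1/3; if that error is too high, the node is split further.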
•Entropy: H(S) = − Σ_{x ∈ X} p(x) log2 p(x), where
•S — The current (data) set for which entropy is being calculated (changes on every iteration of the ID3 algorithm)
•x — A class from the set of classes in S, e.g., X = { yes, no }
•p(x) — The proportion of the number of elements in class x to the number of elements in set S
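The entropy definition above, using S, x, and p(x) exactly as defined, can be computed with a short sketch (illustrative code, not from the slides):

```python
import math
from collections import Counter

def entropy(S):
    """H(S) = -sum over classes x of p(x) * log2 p(x), where p(x) is
    the proportion of elements of S belonging to class x."""
    n = len(S)
    counts = Counter(S)  # number of elements in each class x
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A 50/50 split such as ["yes", "yes", "no", "no"] gives the maximum entropy of 1 bit, while a pure set gives 0, which is why ID3 picks the attribute whose split reduces entropy the most.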
Step 5: The ID3 algorithm is run recursively on the non-leaf branches until all data is classified.
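The recursion in Step 5 can be sketched end to end. This is a simplified illustration of ID3 under my own assumptions (rows as dicts of discrete attributes, information gain as the split criterion, majority-class leaves), not the exact implementation from the course:

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Gain(S, A) = H(S) - sum over values v of (|S_v|/|S|) * H(S_v)."""
    n = len(labels)
    remainder = 0.0
    for v in set(r[attr] for r in rows):
        sub = [lab for r, lab in zip(rows, labels) if r[attr] == v]
        remainder += len(sub) / n * entropy(sub)
    return entropy(labels) - remainder

def id3(rows, labels, attrs):
    # Leaf: all data classified (pure node) or no attributes left to split on.
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]
    # Split on the attribute with the highest information gain ...
    best = max(attrs, key=lambda a: information_gain(rows, labels, a))
    # ... then recurse on each non-leaf branch (Step 5).
    branches = {}
    for v in set(r[best] for r in rows):
        idx = [i for i, r in enumerate(rows) if r[best] == v]
        branches[v] = id3([rows[i] for i in idx],
                          [labels[i] for i in idx],
                          [a for a in attrs if a != best])
    return (best, branches)
```

On a toy set where color determines the class, a single split suffices: `id3([{"color": "red"}, {"color": "red"}, {"color": "blue"}], ["yes", "yes", "no"], ["color"])` returns a tree that splits on color with a "yes" leaf under red and a "no" leaf under blue.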