Decision Trees ID3

Outline:

• What are Decision Trees?
• Components of a Decision Tree
• ID3
• Entropy, Information Gain
• Example 1
• Example 2
What are Decision Trees?

Decision trees are among the best and most widely used supervised learning methods. These tree-based algorithms enable predictive modeling with high accuracy and good stability, and they are easy to interpret.

Hence, it is important for every analyst to learn these algorithms and apply them when modeling.

Our brain works like a decision tree every time we make a decision. Before we learn how to build a decision tree, here are the terms you must know first:
• Root Node:
This top-level node represents the ultimate objective, or the big decision you are trying to make.

• Internal Node:
Each internal node in a decision tree is a test on an attribute that splits the objects into smaller subsets.
• Branches:
Branches, which stem from the root, represent the different options, or courses of action, that are available when making a particular decision. They are most commonly drawn as arrow lines.

• Leaf Node:
The leaf nodes, which are attached at the end of the branches, represent the possible outcomes of each action.

When training on a dataset to classify a variable, the idea of the decision tree is to divide the data into smaller datasets based on a certain attribute value until the target variables all fall under one category.
While the human brain selects the "splitting attribute" based on experience, a computer splits the dataset based on the maximum information gain. Let's define a simple problem and jump into some calculations to see exactly what this means!
Example 1:

There are two terms one needs to be familiar with in order to define the "best" split:

• Entropy:
ID3 uses entropy to calculate the homogeneity of a sample. If the sample is completely homogeneous, the entropy is zero; if the sample is equally divided, it has an entropy of one.

• Information Gain:
Information gain shows how much the entropy of a set of examples will decrease if a specific attribute is chosen for the split.
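As a minimal sketch of the entropy definition above (the two-class case, using the standard Shannon formula), the following Python function reproduces the boundary cases just described; the 0.940 value that recurs in the worked example is the entropy of a sample with 9 positive and 5 negative examples:

```python
import math

def entropy(pos, neg):
    """Shannon entropy of a two-class sample with `pos` and `neg` examples."""
    total = pos + neg
    e = 0.0
    for count in (pos, neg):
        if count:  # 0 * log2(0) is taken as 0
            p = count / total
            e -= p * math.log2(p)
    return e

print(entropy(10, 0))           # completely homogeneous -> 0.0
print(entropy(7, 7))            # equally divided -> 1.0
print(round(entropy(9, 5), 3))  # -> 0.94 (the 0.940 used in the example)
```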
[Figure: information-gain calculations for each attribute; the entropy of the full training set, E(S) = 0.940, appears in each one.]
Splitting on the Outlook attribute produces the highest information-gain score. That is why Outlook appears in the root node of the tree.

[Figure: the split is repeated on the instance subsets of each Outlook branch; the remaining subsets have entropies 0.971 and 0.917.]
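To see why Outlook wins, the information gain of each attribute can be recomputed directly. A sketch, assuming the classic 14-example "play tennis" dataset commonly used with ID3 (the Temperature attribute is omitted for brevity):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr_index):
    """Entropy reduction obtained by partitioning `rows` on one attribute."""
    remainder = 0.0
    for value in {r[attr_index] for r in rows}:
        subset = [lbl for r, lbl in zip(rows, labels) if r[attr_index] == value]
        remainder += len(subset) / len(rows) * entropy(subset)
    return entropy(labels) - remainder

# Classic "play tennis" data (Quinlan): (Outlook, Humidity, Wind) -> Play
rows = [
    ("Sunny", "High", "Weak"),        ("Sunny", "High", "Strong"),
    ("Overcast", "High", "Weak"),     ("Rain", "High", "Weak"),
    ("Rain", "Normal", "Weak"),       ("Rain", "Normal", "Strong"),
    ("Overcast", "Normal", "Strong"), ("Sunny", "High", "Weak"),
    ("Sunny", "Normal", "Weak"),      ("Rain", "Normal", "Weak"),
    ("Sunny", "Normal", "Strong"),    ("Overcast", "High", "Strong"),
    ("Overcast", "Normal", "Weak"),   ("Rain", "High", "Strong"),
]
labels = ["No", "No", "Yes", "Yes", "Yes", "No", "Yes",
          "No", "Yes", "Yes", "Yes", "Yes", "Yes", "No"]

for i, name in enumerate(["Outlook", "Humidity", "Wind"]):
    print(name, round(info_gain(rows, labels, i), 3))  # Outlook scores highest (~0.247)
```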
ID3 algorithm

ID3 (Iterative Dichotomiser 3) is an algorithm invented by Ross Quinlan that generates a decision tree from a dataset. ID3 is the precursor to the C4.5 algorithm and is typically used in the machine learning and natural language processing domains.
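The recursive procedure described above can be sketched compactly. This is an illustrative implementation, not Quinlan's original code; the attribute names and the nested-dict tree representation are my own choices:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def id3(examples, attrs):
    """examples: list of (attribute-dict, label) pairs; returns a nested-dict tree."""
    labels = [lbl for _, lbl in examples]
    if len(set(labels)) == 1:       # homogeneous sample -> leaf
        return labels[0]
    if not attrs:                   # no attributes left -> majority vote
        return Counter(labels).most_common(1)[0][0]

    def gain(a):                    # information gain of splitting on attribute a
        remainder = 0.0
        for v in {row[a] for row, _ in examples}:
            sub = [lbl for row, lbl in examples if row[a] == v]
            remainder += len(sub) / len(examples) * entropy(sub)
        return entropy(labels) - remainder

    best = max(attrs, key=gain)     # split on the maximum information gain
    rest = [a for a in attrs if a != best]
    return {best: {
        v: id3([(r, l) for r, l in examples if r[best] == v], rest)
        for v in {row[best] for row, _ in examples}
    }}

# The same classic "play tennis" data (Temperature omitted for brevity).
O, H, W = "Outlook", "Humidity", "Wind"
data = [
    ({O: "Sunny",    H: "High",   W: "Weak"},   "No"),
    ({O: "Sunny",    H: "High",   W: "Strong"}, "No"),
    ({O: "Overcast", H: "High",   W: "Weak"},   "Yes"),
    ({O: "Rain",     H: "High",   W: "Weak"},   "Yes"),
    ({O: "Rain",     H: "Normal", W: "Weak"},   "Yes"),
    ({O: "Rain",     H: "Normal", W: "Strong"}, "No"),
    ({O: "Overcast", H: "Normal", W: "Strong"}, "Yes"),
    ({O: "Sunny",    H: "High",   W: "Weak"},   "No"),
    ({O: "Sunny",    H: "Normal", W: "Weak"},   "Yes"),
    ({O: "Rain",     H: "Normal", W: "Weak"},   "Yes"),
    ({O: "Sunny",    H: "Normal", W: "Strong"}, "Yes"),
    ({O: "Overcast", H: "High",   W: "Strong"}, "Yes"),
    ({O: "Overcast", H: "Normal", W: "Weak"},   "Yes"),
    ({O: "Rain",     H: "High",   W: "Strong"}, "No"),
]
tree = id3(data, [O, H, W])
print(tree)  # Outlook ends up at the root, as in the worked example
```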
Example 2:

For the following medical diagnosis data, create a decision tree.