
Decision Tree

Unit 4
Decision Tree

▪ A decision tree is a non-parametric supervised learning algorithm, which is utilized for both classification and regression tasks.
Decision Tree

▪ A decision tree is a hierarchical data structure implementing the divide-and-conquer strategy.
▪ Labeled training samples → simple rules
▪ A decision tree is a hierarchical model for supervised learning
whereby the local region is identified in a sequence of recursive
splits in a smaller number of steps.
▪ Each decision node m implements a test function fm(x) with discrete
outcomes labeling the branches.
▪ Given an input, at each node, a test is applied and one of the
branches is taken depending on the outcome.
Decision Tree

▪ This process starts at the root and is repeated recursively until a leaf node is hit, at which point the value written in the leaf constitutes the output (a minimal traversal sketch follows this list).
▪ A decision tree is also a nonparametric model in the sense that we do not assume any parametric form for the class densities, and the tree structure is not fixed a priori; the tree grows (branches and leaves are added) during learning, depending on the complexity of the problem inherent in the data.
▪ Splitting: all kinds of decision rules may be allowed at the interior nodes.
▪ Tree size is measured by (1) the number of nodes and (2) the complexity of the decision nodes.
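A minimal Python sketch of the recursive traversal described above. The Node class, the predict helper, and the Petal.Width threshold are illustrative assumptions, not something the slides prescribe:

class Node:
    def __init__(self, test=None, branches=None, output=None):
        self.test = test          # fm(x): maps an input x to a branch label
        self.branches = branches  # branch label -> child Node
        self.output = output      # value written in a leaf (None for decision nodes)

def predict(node, x):
    # Start at the root; apply each node's test and follow the chosen
    # branch until a leaf is hit, then return the leaf's value.
    while node.output is None:
        node = node.branches[node.test(x)]
    return node.output

# A tree with a single univariate decision node:
root = Node(test=lambda x: x["Petal.Width"] < 1.75,
            branches={True: Node(output="versicolor"),
                      False: Node(output="virginica")})
print(predict(root, {"Petal.Width": 1.2}))  # -> versicolor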
Univariate Decision Tree

▪ In a univariate tree, in each internal node, the test uses only one of
the input dimensions.
▪ A split is called univariate if it uses only a single variable, otherwise
multivariate.
▪ Example:
“Petal.Width < 1.75” is univariate,
“Petal.Width < 1.75 and Petal.Length < 4.95” is bivariate.
▪ The decision node divides the input space into two:
▪ Lm = {x | xj > wm0} and Rm = {x | xj ≤ wm0}
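As a sketch of how such a univariate split partitions a dataset, assuming the data sits in a NumPy array (the feature index j and threshold wm0 below are arbitrary illustrative values):

import numpy as np

def univariate_split(X, j, w_m0):
    # Lm = {x | x_j > w_m0}, Rm = {x | x_j <= w_m0}
    mask = X[:, j] > w_m0
    return X[mask], X[~mask]

X = np.array([[5.1, 1.4],
              [6.7, 5.0],
              [5.9, 4.2]])
Lm, Rm = univariate_split(X, j=1, w_m0=4.0)  # test on the second feature
print(Lm)  # rows with x_1 > 4.0
print(Rm)  # rows with x_1 <= 4.0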
Univariate Decision Tree
Univariate Classification Decision Tree

▪ The goodness of a split is quantified by an impurity measure.
▪ A split with minimal impurity is desired because it yields the smallest tree.
▪ A split is pure if, after the split, all the examples choosing a branch belong to the same class.
▪ For node m, Nm examples reach m, Nmi of which belong to class Ci; the probability of class Ci at node m is estimated as pmi = Nmi / Nm.
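As a sketch of one common impurity measure (the slides do not fix a particular one, so entropy is assumed here), computed from the class counts at node m:

import numpy as np

def entropy_impurity(counts):
    # I_m = -sum_i p_i * log2(p_i), where p_i = Nmi / Nm is the
    # fraction of the Nm examples at node m that belong to class Ci.
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()            # drop empty classes (0 * log 0 -> 0)
    return float(-(p * np.log2(p)).sum())

print(entropy_impurity([10, 0]))  # 0.0 -> pure node
print(entropy_impurity([5, 5]))   # 1.0 -> maximally impure for two classes

A branch with zero impurity contains examples of a single class, matching the definition of a pure split above.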
