Decision Trees
Decision trees are a supervised learning technique that can be used for both classification and regression problems. They use a tree-like structure with internal nodes representing features, branches representing decision rules, and leaf nodes representing the outcome. Decision trees use decision and leaf nodes to make decisions - decision nodes split the data while leaf nodes are the final outcomes. The decisions are made based on evaluating features of the training data to construct a tree-like model that represents all possible solutions to a problem based on conditions. Decision trees are easy to understand and interpret due to their tree structure that mimics human decision making.
What is Decision Tree?
• Decision Tree is a supervised learning technique that can be used for both classification and regression problems, but it is mostly preferred for solving classification problems. It is a tree-structured classifier, where internal nodes represent the features of a dataset, branches represent the decision rules, and each leaf node represents the outcome.
• A decision tree contains two kinds of node: decision nodes and leaf nodes. Decision nodes are used to make a decision and have multiple branches, whereas leaf nodes are the outputs of those decisions and do not contain any further branches.
• The decisions, or tests, are performed on the basis of the features of the given dataset.
• It is a graphical representation of all the possible solutions to a problem/decision under the given conditions.
• It is called a decision tree because, like a tree, it starts from a root node that expands into further branches, building a tree-like structure.

Why use Decision Trees?
• Decision trees mimic the way humans think when making a decision, so they are easy to understand.
• The logic behind a decision tree can be easily followed because it is displayed as a tree-like structure.
• Every path from the root to a leaf reads as a rule, e.g. IF (Income ≥ 106) AND (Education < 1.5) AND (Family < 2.5) THEN Class = 0 (non-acceptor).

Attribute Selection Measure (ASM)
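The root-to-leaf rule above can be sketched as nested feature tests in plain Python. This is a hand-coded illustration, not a learned tree: the thresholds come from the slide's rule, while the dictionary-based record format and the class label 1 for all other paths are assumptions made for the example.

```python
# Rule from the slide:
# IF (Income >= 106) AND (Education < 1.5) AND (Family < 2.5) THEN Class = 0
# Each `if` is a decision node testing one feature; each `return` is a leaf.

def classify(record):
    """Walk the tree from the root, one decision node at a time."""
    if record["Income"] >= 106:           # root decision node
        if record["Education"] < 1.5:     # internal decision node
            if record["Family"] < 2.5:    # internal decision node
                return 0                  # leaf: non-acceptor
    return 1                              # leaf for every other path (assumed label)

print(classify({"Income": 120, "Education": 1.0, "Family": 2.0}))  # -> 0
print(classify({"Income": 90,  "Education": 1.0, "Family": 2.0}))  # -> 1
```

In a real decision tree these thresholds are not written by hand; they are chosen automatically by an attribute selection measure, as described next.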
• An attribute selection measure scores each candidate attribute so that we can pick the best one for each node of the tree. There are two popular ASM techniques:
• Information Gain
• Gini Index
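Both measures can be computed directly from class-label counts. The sketch below uses a tiny assumed set of toy labels to show that a split which perfectly separates the classes achieves the maximum information gain.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels (0 for a pure node)."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Gini impurity of a list of class labels (0 for a pure node)."""
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

def information_gain(parent, splits):
    """Entropy of the parent minus the weighted entropy of its child splits."""
    n = len(parent)
    return entropy(parent) - sum(len(s) / n * entropy(s) for s in splits)

# Toy labels, 50/50 between two classes (assumed data for illustration).
parent = [0, 0, 1, 1]
print(entropy(parent))                             # -> 1.0 (maximally mixed)
print(gini(parent))                                # -> 0.5
print(information_gain(parent, [[0, 0], [1, 1]]))  # -> 1.0 (perfect split)
```

When growing a tree, the algorithm evaluates every candidate attribute with one of these measures and splits on the attribute that gives the highest information gain (or, equivalently, the lowest weighted Gini impurity).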