
FACULTY OF INFORMATION MANAGEMENT

UNIVERSITI TEKNOLOGI MARA


PUNCAK PERDANA

IMS 555
INDIVIDUAL ASSIGNMENT

TITLE:
DECISION TREE

PREPARED FOR:
SIR SAFAWI BIN ABDUL RAHMAN

PREPARED BY:
NURUL NABILA BINTI IBRAHIM
2016709711

GROUP:
IM245ST1

28TH DECEMBER 2018

TABLE OF CONTENTS

Introduction
Definition
Example
The Use of Decision Tree
Drawing a Decision Tree
References

INTRODUCTION

A decision maker is frequently faced with a sequential decision problem, in which decisions lead to different outcomes depending on chance. When the process involves many sequential decisions, the problem becomes difficult to visualize and to implement. Decision trees are effective graphical tools in such situations: they allow an intuitive understanding of the problem and can aid in decision making. Thus, when facilitating decision making in sequential decision problems, a decision tree is a powerful tool for classifying and predicting outcomes.

Definition

A decision tree is a graphical model describing decisions and their possible outcomes. It consists of nodes that form a rooted tree: a directed tree with one node, called the "root", that has no incoming edges, while every other node has exactly one incoming edge. A node with outgoing edges is called an internal or test node; all other nodes are called leaves (also known as terminal or decision nodes). Each internal node splits the instance space into two or more sub-spaces according to a discrete function of the input attribute values. In the simplest and most frequent case, each test considers a single attribute, so the instance space is partitioned according to that attribute's value. For numeric attributes, the condition refers to a range.

Each leaf is assigned to one class representing the most appropriate target value.
Alternatively, the leaf may hold a probability vector indicating the probability of the target
attribute having a certain value. Instances are classified by navigating them from the root of
the tree down to a leaf, according to the outcome of the tests along the path.
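The classification procedure just described can be sketched in a few lines of Python: the tree is written as nested dictionaries in which each internal node names the attribute it tests and each leaf holds a class, and classification simply walks from the root down to a leaf. The attribute names, values, and classes here are invented for illustration.

```python
# A decision tree as nested dicts: internal nodes test one attribute
# and branch on its value; leaves hold the predicted class.
tree = {
    "attribute": "age",
    "branches": {
        "<=30": {
            "attribute": "gender",
            "branches": {
                "Male": {"leaf": "respond"},
                "Female": {"leaf": "no response"},
            },
        },
        ">30": {"leaf": "no response"},
    },
}

def classify(node, instance):
    """Navigate from the root down to a leaf, following the branch
    that matches the instance's value for each tested attribute."""
    while "leaf" not in node:
        value = instance[node["attribute"]]
        node = node["branches"][value]
    return node["leaf"]

print(classify(tree, {"age": "<=30", "gender": "Male"}))  # respond
```

The path followed by the instance is exactly the sequence of test outcomes described above.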

Example

Figure 1 shows a decision tree that predicts whether a potential customer will respond to a direct mailing. Internal nodes are represented as circles, whereas leaves are denoted as triangles. Note that this decision tree incorporates both nominal and numeric attributes. Given this classifier, the analyst can predict the response of a potential customer (by sorting the customer down the tree) and understand the behavioural characteristics of the entire population of potential customers with regard to direct mailing. Each node is labelled with the attribute it tests, and its branches are labelled with the attribute's corresponding values.

Figure 1: Decision Tree presenting Response to Direct Mailing

In the case of numeric attributes, decision trees can be interpreted as a collection of hyperplanes, each orthogonal to one of the axes. Decision makers usually prefer less complex decision trees, since they may be considered more comprehensible. Furthermore, according to Breiman et al. (1984), tree complexity has a crucial effect on accuracy. Complexity is controlled explicitly by the stopping criteria and the pruning method employed, and is normally measured by one of the following metrics: the total number of nodes, the total number of leaves, the tree depth, or the number of attributes used. Decision tree induction is closely related to rule induction. Each path from the root of a decision tree to one of its leaves can be transformed into a rule simply by conjoining the tests along the path to form the antecedent part and taking the leaf's class prediction as the class value. For example, one of the paths in Figure 1 can be transformed into the rule: "If the customer's age is less than or equal to 30, and the customer's gender is male, then the customer will respond to the mail". The resulting rule set can then be simplified to improve its comprehensibility to a human user, and possibly its accuracy (Quinlan, 1987).
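The path-to-rule transformation can be sketched as a short recursive function over a small decision tree written as nested dictionaries (the attribute names and classes are invented for illustration): the tests along each root-to-leaf path are conjoined into the antecedent, and the leaf's class becomes the consequent.

```python
# A toy tree: internal nodes test one attribute, leaves hold a class.
tree = {
    "attribute": "age",
    "branches": {
        "<=30": {
            "attribute": "gender",
            "branches": {
                "Male": {"leaf": "respond"},
                "Female": {"leaf": "no response"},
            },
        },
        ">30": {"leaf": "no response"},
    },
}

def paths_to_rules(node, conditions=()):
    """Collect every root-to-leaf path as an IF ... THEN ... rule:
    the conjoined tests form the antecedent, the leaf's class the
    consequent."""
    if "leaf" in node:
        return ["IF " + " AND ".join(conditions) + " THEN " + node["leaf"]]
    rules = []
    for value, child in node["branches"].items():
        test = f"{node['attribute']} = {value}"
        rules.extend(paths_to_rules(child, conditions + (test,)))
    return rules

for rule in paths_to_rules(tree):
    print(rule)
```

Each printed rule corresponds to exactly one leaf, so the rule set covers the whole instance space.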

The Use of Decision Tree

Decision Trees are excellent tools for helping you to choose between several courses
of action. They provide a highly effective structure within which you can lay out options and
investigate the possible outcomes of choosing those options. They also help you to form a
balanced picture of the risks and rewards associated with each possible course of action.

Drawing a Decision Tree

You start a decision tree with the decision that you need to make. Draw a small square to represent this towards the left of a large piece of paper. From this box draw lines out towards the right, one for each possible solution, and write that solution along the line. Keep the lines as far apart as possible so that you have room to expand your thoughts.

At the end of each line, consider the result. If the result of taking that decision is uncertain, draw a small circle. If the result is another decision that you need to make, draw another square. Squares represent decisions, and circles represent uncertain outcomes. Write the decision or factor above the square or circle. If the solution at the end of the line is complete, just leave it blank.

Starting from the new decision squares on your diagram, draw out lines representing
the options that you could select. From the circles draw lines representing possible outcomes.
Again, make a brief note on the line saying what it means. Keep on doing this until you have
drawn out as many of the possible outcomes and decisions as you can see leading on from the
original decisions. An example of the sort of thing you will end up with is shown in Figure 2:

Figure 2: Special Instrument Product Decision
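The drawing procedure above can also be sketched as a small data structure: squares (decisions) and circles (uncertain outcomes) become two node types, and walking the structure reproduces the lines drawn out from each node. The labels in this sketch are invented and do not come from Figure 2.

```python
from dataclasses import dataclass, field

@dataclass
class Decision:   # drawn as a square
    label: str
    options: dict = field(default_factory=dict)   # option name -> next node

@dataclass
class Chance:     # drawn as a circle
    label: str
    outcomes: dict = field(default_factory=dict)  # outcome name -> next node

def describe(node, indent=0):
    """Walk the diagram and list each branch, mirroring the lines
    drawn out to the right of each square or circle."""
    pad = " " * indent
    if isinstance(node, Decision):
        lines, children = [f"{pad}[{node.label}]"], node.options
    elif isinstance(node, Chance):
        lines, children = [f"{pad}({node.label})"], node.outcomes
    else:  # a completed end-of-line result
        return [f"{pad}{node}"]
    for name, child in children.items():
        lines.append(f"{pad} --{name}-->")
        lines.extend(describe(child, indent + 2))
    return lines

root = Decision("new product?", {
    "develop": Chance("market reaction", {"good": "profit", "poor": "loss"}),
    "abandon": "no change",
})
print("\n".join(describe(root)))
```

Square brackets mark decision squares and parentheses mark chance circles, matching the drawing convention in the text.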

Decision trees, by their very nature, are simple and intuitive to understand. For example,
a binary classification tree assigns data by dropping a data point (case) down the tree and
moving either left or right through nodes depending on the value of a given variable. The nature
of a binary tree ensures that each case is assigned to a unique terminal node. The value for the
terminal node (the predicted outcome) defines how the case is classified. By following the path
as a case moves down the tree to its terminal node, the decision rule for that case can be read
directly off the tree. Such a rule is simple to understand, as it is nothing more than a sequence
of simple rules strung together.
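The left/right traversal of a binary classification tree described above can be sketched with nested tuples: each internal node holds a variable, a threshold, and two subtrees, and dropping a case down the tree records the simple rules strung together along its path. The variables and thresholds here are invented for illustration.

```python
# internal node = (variable, threshold, left_subtree, right_subtree)
# leaf = predicted class label (a plain string)
tree = ("age", 30,
        ("income", 50000, "respond", "no response"),  # age <= 30
        "no response")                                # age > 30

def drop_down(node, case):
    """Drop a case down the tree: go left when the tested variable is
    at or below the threshold, right otherwise, until a terminal node
    is reached. Returns the predicted class and the rule read off the
    path."""
    path = []
    while isinstance(node, tuple):
        variable, threshold, left, right = node
        if case[variable] <= threshold:
            path.append(f"{variable} <= {threshold}")
            node = left
        else:
            path.append(f"{variable} > {threshold}")
            node = right
    return node, path

label, path = drop_down(tree, {"age": 25, "income": 40000})
print(label, "via", " AND ".join(path))
```

Because every comparison sends the case either left or right, each case reaches exactly one terminal node, as the text notes.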

REFERENCES

Breiman, L., Friedman, J. H., Olshen, R. A., & Stone, C. J. (1984). Classification and regression trees. Belmont, CA: Wadsworth.

Quinlan, J. R. (1987). Simplifying decision trees. International Journal of Man-Machine Studies, 27(3), 221–234.

Segal, M. R. (1988). Regression trees for censored data. Biometrics, 44, 35–47.
