AAM Unit 2
Bayes’ Theorem
P(A | B) is the conditional probability of event A occurring, given that B is true.
P(B | A) is the conditional probability of event B occurring, given that A is true.
P(A) and P(B) are the probabilities of A and B occurring independently of one another.
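Putting these terms together, Bayes' theorem states:
P(A | B) = P(B | A) · P(A) / P(B)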
Example 1 of Bayes' Theorem
Three bags contain 6 red, 4 black; 4 red, 6 black; and 5 red, 5 black balls
respectively. One of the bags is selected at random and a ball is drawn from it. If
the ball drawn is red, find the probability that it was drawn from the first bag.
Solution:
Let E1, E2, E3, and A be the events defined as follows:
E1 = the first bag is chosen, E2 = the second bag is chosen, E3 = the third bag is chosen,
A = ball drawn is red
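Each bag is equally likely to be chosen, so P(E1) = P(E2) = P(E3) = 1/3, and the
chances of drawing a red ball from the bags are P(A | E1) = 6/10, P(A | E2) = 4/10,
and P(A | E3) = 5/10. By Bayes' theorem:
P(E1 | A) = P(A | E1) P(E1) / [P(A | E1) P(E1) + P(A | E2) P(E2) + P(A | E3) P(E3)]
          = (6/30) / (6/30 + 4/30 + 5/30) = 6/15 = 2/5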
Thus, the probability that the red ball was drawn from the first bag is 2/5 or 40%
Example 2 of Bayes' Theorem
Amy has two bags. Bag I has 7 red and 4 blue balls and Bag II has 5 red and 9 blue
balls. Amy draws a ball at random and it turns out to be red. Determine the
probability that the ball was drawn from Bag I.
Solution:
X = Ball is from Bag I, Y = Ball is from Bag II, A = Ball drawn is red
Since each bag is equally likely to be chosen, P(X) = P(Y) = 1/2.
Step 1: Find the probability of drawing a red ball from each bag:
P(A | X) = 7/11 and P(A | Y) = 5/14.
Step 2: Apply Bayes' theorem and interpret the result:
P(X | A) = P(A | X) P(X) / [P(A | X) P(X) + P(A | Y) P(Y)]
         = (7/22) / (7/22 + 5/28) = 98/153 ≈ 0.64
Thus, the probability that the red ball came from Bag I is about 64%.
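As a quick numerical check of both examples, here is a minimal Python sketch; the
helper name posterior and its argument layout are purely illustrative:

```python
from fractions import Fraction as F

def posterior(priors, likelihoods, target):
    """P(hypothesis `target` | evidence) via Bayes' theorem."""
    evidence = sum(p * l for p, l in zip(priors, likelihoods))
    return priors[target] * likelihoods[target] / evidence

# Example 1: three equally likely bags, red-ball likelihoods 6/10, 4/10, 5/10
print(posterior([F(1, 3)] * 3, [F(6, 10), F(4, 10), F(5, 10)], 0))   # 2/5

# Example 2: two equally likely bags, red-ball likelihoods 7/11, 5/14
print(float(posterior([F(1, 2)] * 2, [F(7, 11), F(5, 14)], 0)))      # ~0.64
```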
Applications of Naive Bayes
• Medical diagnosis: This algorithm is used in medical diagnosis and helps to
predict a patient's risk level for certain diseases.
• Weather prediction: This algorithm can be used to predict whether the
weather will be good.
Disadvantage of Naive Bayes
• It assumes that all the attributes are independent of one another, which
rarely happens in real life. This limits the application of the algorithm in
real-world situations.
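As a concrete illustration of how a Naive Bayes classifier is typically used, here
is a minimal sketch with scikit-learn's GaussianNB; the weather-style data below is
made up purely for illustration:

```python
from sklearn.naive_bayes import GaussianNB

# Made-up data: [temperature (°C), humidity (%)] -> "good" / "bad" weather
X = [[30, 40], [28, 45], [25, 50], [15, 90], [12, 85], [10, 95]]
y = ["good", "good", "good", "bad", "bad", "bad"]

model = GaussianNB()  # treats the features as conditionally independent given the class
model.fit(X, y)

print(model.predict([[27, 48]]))        # expected: ['good']
print(model.predict_proba([[27, 48]]))  # posterior class probabilities via Bayes' theorem
```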
Decision Tree
In a decision tree, there are two kinds of nodes: the decision node and the
leaf node.
Decision nodes are used to make a decision and have multiple branches, whereas
leaf nodes are the outputs of those decisions and do not contain any further
branches.
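A minimal Python sketch of these two kinds of nodes; the class and attribute names
are made up for illustration:

```python
from dataclasses import dataclass
from typing import Dict, Union

@dataclass
class LeafNode:
    """Leaf node: holds a final prediction and has no further branches."""
    prediction: str

@dataclass
class DecisionNode:
    """Decision node: tests one attribute and branches on its possible values."""
    attribute: str
    branches: Dict[str, Union["DecisionNode", "LeafNode"]]

# A tiny hand-built tree: decide on the "salary" attribute, then stop at leaves
tree = DecisionNode(
    attribute="salary",
    branches={"high": LeafNode("accept"), "low": LeafNode("decline")},
)
```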
Decision Tree Terminologies
• Root Node: The root node is where the decision tree starts. It represents the
entire dataset, which further gets divided into two or more homogeneous sets.
• Leaf Node: Leaf nodes are the final output nodes; the tree cannot be split
any further once a leaf node is reached.
• Parent/Child node: A node that splits into sub-nodes (such as the root node)
is the parent node, and its sub-nodes are called child nodes.
Decision Tree Working
Step-1: Begin the tree with the root node, say S, which contains the
complete dataset.
Step-2: Find the best attribute in the dataset using an Attribute Selection
Measure (ASM).
Step-3: Divide S into subsets that contain the possible values of the best
attribute.
Step-4: Generate the decision tree node that contains the best attribute.
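To make the attribute selection in Step-2 concrete, here is a minimal Python sketch
that uses information gain (entropy reduction) as the ASM; the function names and
toy data are purely illustrative:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction obtained by splitting the rows on attribute index `attr`."""
    groups = {}
    for row, label in zip(rows, labels):
        groups.setdefault(row[attr], []).append(label)
    children = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return entropy(labels) - children

def best_attribute(rows, labels, attrs):
    """Step-2: choose the attribute with the highest information gain."""
    return max(attrs, key=lambda a: information_gain(rows, labels, a))

# Toy data: (salary_level, commute) -> accept the offer?
rows = [("high", "short"), ("high", "long"), ("low", "short"), ("low", "long")]
labels = ["yes", "yes", "no", "no"]
print(best_attribute(rows, labels, [0, 1]))  # -> 0, i.e. the salary attribute
```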
Example:
Suppose a candidate has a job offer and wants to decide whether to accept it or
not. To solve this problem, the decision tree starts with the root node (the
Salary attribute, selected by the ASM).
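A quick sketch of how such a tree could be fit with scikit-learn's
DecisionTreeClassifier; the encoded features and labels below are made up for
illustration:

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Made-up data: [salary_above_50k (0/1), commute_under_1hr (0/1)] -> decision
X = [[1, 1], [1, 0], [0, 1], [0, 0]]
y = ["accept", "accept", "decline", "decline"]

tree = DecisionTreeClassifier(criterion="entropy").fit(X, y)
print(export_text(tree, feature_names=["salary_above_50k", "commute_under_1hr"]))
# The printed tree splits on salary_above_50k first, mirroring the example above.
```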
Advantages of the Decision Tree
• It is simple to understand, as it follows the same process that a human
follows while making a decision in real life.
• It predicts outputs with good accuracy and runs efficiently even on large
datasets.
Applications of the Decision Tree
• Medicine: With the help of this algorithm, disease trends and risks of
diseases can be identified.
• Land Use: We can identify areas of similar land use with this algorithm.