Bayesian Decision Theory
Introduction
Definition
Basic Decision
Now, we look into the past records of our customer database and note down the number of customers who bought a computer and the number who did not. From these counts we estimate the probability that a customer buys a computer, P(w1), and the probability that a customer does not buy a computer, P(w2).
If P(w1) > P(w2), then the customer will buy a computer (w1).
And, if P(w2) > P(w1), then the customer will not buy a computer (w2).
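As a minimal sketch of this prior-only rule, here is some Python that estimates the two priors from counts and makes the decision; the counts are hypothetical, purely for illustration.

```python
# Prior-only decision: estimate P(w1) and P(w2) from past records alone.
# The counts below are hypothetical, purely for illustration.
bought = 600       # past customers who bought a computer (class w1)
not_bought = 400   # past customers who did not buy (class w2)

total = bought + not_bought
p_w1 = bought / total       # prior P(w1)
p_w2 = not_bought / total   # prior P(w2)

# Every future customer gets the same decision: the class with the larger prior.
decision = "w1 (buys a computer)" if p_w1 > p_w2 else "w2 (does not buy)"
print(f"P(w1) = {p_w1:.2f}, P(w2) = {p_w2:.2f} -> decide {decision}")
```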
But what is the problem with this basic decision method? Most of you might have guessed it: based only on previous records, it gives the same decision for every future customer, which is clearly unreasonable.
So we need something that will help us in making better decisions for future
customers. We do that by introducing some features. Let’s say we add a feature
‘x’ where ‘x’ denotes the age of the customer. Now with this added feature, we
will be able to make better decisions.
1. Prior – P(w1) is the prior probability that w1 is true before the data is observed.
2. Posterior – P(w1 | x) is the posterior probability that w1 is true after the data is observed.
3. Evidence – P(x) is the total probability of the data.
4. Likelihood – P(x | w1) is the probability of observing x when w1 is true, i.e., the information about w1 provided by x.
P(w1 | x) is read as Probability of w1 given x
More Precisely, it is the probability that a customer will buy a computer, given a
specific customer’s age.
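These four quantities are tied together by Bayes' theorem, which is the formula the decision rules below rely on; written out in LaTeX:

```latex
% Bayes' theorem for class w1 given the observed feature x
P(w_1 \mid x) = \frac{P(x \mid w_1)\, P(w_1)}{P(x)},
\qquad
P(x) = P(x \mid w_1)\, P(w_1) + P(x \mid w_2)\, P(w_2)
```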
If P(w1 | x) > P(w2 | x), then the customer will buy a computer (w1).
And, if P(w2 | x) > P(w1 | x), then the customer will not buy a computer (w2).
This decision is more logical and trustworthy: it is based on the features of the new customer as well as the past records, not on the past records alone as in the earlier case.
Now, from the formula, you can see that the denominator P(x) is the same for both classes w1 and w2. So we can drop it and compare only the numerators, which gives an equivalent form of the decision rule:
If P(x | w1)*P(w1) > P(x | w2)*P(w2), then the customer will buy a computer
(w1)
And, if P(x | w2)*P(w2) > P(x | w1)*P(w1), then the customer will not buy a
computer (w2)
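Here is a minimal sketch of this rule in Python. It assumes, purely for illustration, that the age likelihoods P(x | w1) and P(x | w2) are modeled as Gaussians whose parameters were estimated from the past records; all numbers are made up.

```python
from math import exp, pi, sqrt

def gaussian_pdf(x, mean, std):
    """Normal density, used here to model the class-conditional likelihood P(x | w)."""
    return exp(-0.5 * ((x - mean) / std) ** 2) / (std * sqrt(2 * pi))

# Hypothetical priors and age models, illustrative values only.
p_w1, p_w2 = 0.6, 0.4  # P(w1), P(w2) estimated from the past records

def likelihood_w1(age):
    # P(x | w1): hypothetical age distribution of customers who bought a computer
    return gaussian_pdf(age, mean=30, std=8)

def likelihood_w2(age):
    # P(x | w2): hypothetical age distribution of customers who did not buy
    return gaussian_pdf(age, mean=50, std=10)

def decide(age):
    # Compare P(x | w1)*P(w1) with P(x | w2)*P(w2); the evidence P(x) cancels out.
    if likelihood_w1(age) * p_w1 > likelihood_w2(age) * p_w2:
        return "w1 (buys a computer)"
    return "w2 (does not buy)"

for age in (25, 40, 60):
    print(age, "->", decide(age))
```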
We can notice an interesting fact here. If our prior probabilities P(w1) and P(w2) happen to be equal, we can still make a decision based on the likelihoods P(x | w1) and P(x | w2) alone. Similarly, if the likelihoods are equal, we can decide based on the priors P(w1) and P(w2) alone, as the two special cases written out below show.
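Written out, the two special cases of the rule P(x | w1)*P(w1) vs. P(x | w2)*P(w2) are:

```latex
% Equal priors: the decision reduces to comparing likelihoods.
P(w_1) = P(w_2) \;\Rightarrow\; \text{decide } w_1 \text{ if } P(x \mid w_1) > P(x \mid w_2)

% Equal likelihoods: the decision reduces to comparing priors.
P(x \mid w_1) = P(x \mid w_2) \;\Rightarrow\; \text{decide } w_1 \text{ if } P(w_1) > P(w_2)
```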
Risk Calculation
Let us consider that we have some data and we have made a decision according to Bayesian Decision Theory. The decision boundary is the point where the two posteriors are equal:
P(w1 | x) = P(w2 | x)
But, as a plot of the two posteriors would show, there is some non-zero probability of class w2 to the left of the decision boundary, and some non-zero probability of class w1 to the right of it. This overlap of one class into the region assigned to the other is what we call the risk, or probability of error.
To calculate the probability of error for class w1, we need the probability that the true class is w2 in the region to the left of the decision boundary, where we decide w1. Similarly, the probability of error for class w2 is the probability that the true class is w1 in the region to the right of the decision boundary, where we decide w2. In other words, the probability of error when we decide w1 is P(w2 | x), and when we decide w2 it is P(w1 | x).
Let us denote the probability of error for a given feature value x by P(E | x). For each x, this is the posterior probability of whichever class we do not choose, and the total error is obtained by integrating it over all values of x. The result is:
P(E | x) = min[ P(w1 | x), P(w2 | x) ]
P(E) = ∫ P(E | x) P(x) dx
Therefore, the probability of error at each x is the minimum of the two posterior probabilities. We take the minimum because we always decide in favour of the class with the larger posterior, so the smaller posterior is exactly the probability of being wrong.
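To make this concrete, here is a small sketch that numerically approximates the total error P(E) for the hypothetical Gaussian age model used in the earlier sketch; the priors and densities are again made-up, illustrative values.

```python
import numpy as np

# Same illustrative priors and Gaussian age models as in the earlier sketch.
p_w1, p_w2 = 0.6, 0.4

def gaussian_pdf(x, mean, std):
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

x = np.linspace(0, 100, 10_001)            # grid over customer ages
joint_w1 = gaussian_pdf(x, 30, 8) * p_w1   # P(x | w1) * P(w1)
joint_w2 = gaussian_pdf(x, 50, 10) * p_w2  # P(x | w2) * P(w2)

# P(E | x) * P(x) equals the smaller of the two joint densities at each x,
# since the smaller posterior times P(x) is the smaller joint density.
error_density = np.minimum(joint_w1, joint_w2)

# Integrate over x (trapezoidal rule) to get the total probability of error P(E).
total_error = np.trapz(error_density, x)
print(f"Approximate Bayes error P(E) = {total_error:.4f}")
```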
Conclusion