Bayes Decision Theory (Lecture 3)
• Conditional Probability
• Conditional probability applies when the occurrence of one event is wholly or partially
affected by other event(s). Put differently, it is the probability that an event A occurs
given that another event B has already taken place. It is denoted by P(A|B) and
read as “probability of A given B”.
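The definition above can be illustrated with a small counting sketch. The dice events below are a hypothetical example, not from the lecture:

```python
from fractions import Fraction

# Hypothetical example: roll one fair die.
# A = "roll is even", B = "roll is greater than 3".
outcomes = [1, 2, 3, 4, 5, 6]
A = {o for o in outcomes if o % 2 == 0}          # {2, 4, 6}
B = {o for o in outcomes if o > 3}               # {4, 5, 6}

p_B = Fraction(len(B), len(outcomes))            # P(B) = 3/6
p_A_and_B = Fraction(len(A & B), len(outcomes))  # P(A, B) = 2/6

# P(A|B) = P(A, B) / P(B)
p_A_given_B = p_A_and_B / p_B
print(p_A_given_B)  # 2/3
```

Knowing that B occurred shrinks the sample space to {4, 5, 6}, of which two outcomes are even, hence 2/3.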
• Joint Probability
• Joint probability is calculated when we are interested in the occurrence of two
different events simultaneously. It is also often referred to as the probability of the
intersection of two events. It is denoted by P(A, B) and read as “probability of A and
B”.
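Joint probability can likewise be read off a sample space by counting the outcomes where both events hold. The coin-flip setup is an illustrative assumption, not from the lecture:

```python
from itertools import product
from fractions import Fraction

# Hypothetical example: two fair coin flips.
sample_space = list(product("HT", repeat=2))   # 4 equally likely outcomes
A = {s for s in sample_space if s[0] == "H"}   # first flip is heads
B = {s for s in sample_space if s[1] == "H"}   # second flip is heads

# P(A, B): probability that both events occur simultaneously,
# i.e. the probability of the intersection A ∩ B.
p_joint = Fraction(len(A & B), len(sample_space))
print(p_joint)  # 1/4
```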
Derivation of Bayes’ Theorem:
• A model can be viewed as a process, relationship, equation, or
approximation used to understand and describe the data.
• We know from the conditional probability:
• P(A|B) = P(A, B) / P(B)
• => P(A, B) = P(A|B) * P(B) ... (i)
• Similarly,
• P(A, B) = P(B|A) * P(A) ... (ii)
• From equation (i) and (ii):
• P(A|B) * P(B) = P(B|A) * P(A)
• => P(A|B) = [P(B|A) * P(A)] / P(B)
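The derived formula can be exercised numerically. The medical-test numbers below are a hypothetical illustration (not from the lecture); P(B) is expanded with the law of total probability:

```python
# Hypothetical numbers: A = "has disease", B = "test is positive".
p_A = 0.01            # P(A): prior probability of disease
p_B_given_A = 0.95    # P(B|A): test sensitivity
p_B_given_not_A = 0.05  # P(B|not A): false-positive rate

# P(B) via the law of total probability:
# P(B) = P(B|A) * P(A) + P(B|not A) * P(not A)
p_B = p_B_given_A * p_A + p_B_given_not_A * (1 - p_A)

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(round(p_A_given_B, 3))  # 0.161
```

Even with a sensitive test, the posterior stays low because the prior P(A) is small, which is exactly the trade-off Bayes' theorem makes explicit.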
For each input sample, the classifier computes the posterior probability of every class and
assigns the sample to the class with the maximum posterior probability.
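This maximum-a-posteriori rule can be sketched in a few lines. The 1-D Gaussian class-conditionals and all parameter values below are illustrative assumptions, not taken from the lecture; since the evidence P(x) is the same for every class, comparing unnormalized posteriors P(x|C) * P(C) suffices:

```python
import math

# Hypothetical two-class problem: each class C has a prior P(C)
# and a 1-D Gaussian class-conditional density P(x|C).
classes = {
    # label: (prior P(C), mean, std of P(x|C))
    "c1": (0.6, 0.0, 1.0),
    "c2": (0.4, 3.0, 1.0),
}

def gaussian_pdf(x, mean, std):
    z = (x - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2 * math.pi))

def classify(x):
    # Posterior P(C|x) ∝ P(x|C) * P(C); the evidence P(x) cancels
    # when we only need the argmax over classes.
    scores = {c: prior * gaussian_pdf(x, mu, sd)
              for c, (prior, mu, sd) in classes.items()}
    return max(scores, key=scores.get)

print(classify(-0.5))  # c1
print(classify(2.8))   # c2
```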