Bayes' Theorem Intro
Key Components
• Prior Probability (P(A)): The initial probability of the hypothesis before new evidence.
• Likelihood (P(B|A)): The probability of observing the evidence given the hypothesis.
• Posterior Probability (P(A|B)): The updated probability of the hypothesis after considering the evidence.
• Marginal Probability (P(B)): The total probability of the evidence, often computed using the law of total probability.
Simple Example
Suppose a medical test for a disease has 95% sensitivity (i.e., P(Positive|Disease) = 0.95), the disease affects 1% of the population (i.e., P(Disease) = 0.01), and the test has a 5% false positive rate (i.e., P(Positive|No Disease) = 0.05). If a person tests positive, what is the probability they have the disease?
Using Bayes' Theorem:

P(Disease|Positive) = P(Positive|Disease) · P(Disease) / P(Positive)

Calculate P(Positive) using the law of total probability:

P(Positive) = P(Positive|Disease) · P(Disease) + P(Positive|No Disease) · P(No Disease)
            = 0.95 · 0.01 + 0.05 · 0.99
            = 0.0095 + 0.0495 = 0.059

So P(Disease|Positive) = 0.0095 / 0.059 ≈ 0.161. Despite the positive result, there is only about a 16% chance the person has the disease, because the disease itself is rare.
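The calculation above can be sketched in a few lines of Python. The function name `posterior` and its parameter names are illustrative, not from any library; the numbers are the ones given in the example.

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(Disease | Positive) via Bayes' theorem for a binary test."""
    # Law of total probability: P(Positive) over diseased and healthy cases
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

p = posterior(prior=0.01, sensitivity=0.95, false_positive_rate=0.05)
print(round(p, 3))  # about 0.161
```

Varying `prior` in this sketch shows how strongly the base rate drives the result: with a 10% prevalence the same test would give a posterior of roughly 0.68.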
Applications
Bayes’ Theorem is widely used in:
• Statistics: Bayesian inference for updating model parameters.
• Machine Learning: Naive Bayes classifiers for text analysis.
• Medical Diagnostics: Interpreting test results.
• Decision Making: Risk assessment and prediction.
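To make the machine-learning application concrete, here is a minimal toy Naive Bayes text classifier. The documents, labels ("spam"/"ham"), and helper names are made up for illustration; this is a from-scratch sketch, not a library API.

```python
import math
from collections import Counter

# Toy training data: (words, label) pairs, all invented for this sketch.
docs = [
    (["win", "money", "now"], "spam"),
    (["win", "prize"], "spam"),
    (["meeting", "tomorrow"], "ham"),
    (["project", "meeting", "notes"], "ham"),
]

labels = [y for _, y in docs]
priors = {y: n / len(docs) for y, n in Counter(labels).items()}
word_counts = {y: Counter() for y in priors}
for words, y in docs:
    word_counts[y].update(words)
vocab = {w for words, _ in docs for w in words}

def log_likelihood(words, y):
    total = sum(word_counts[y].values())
    # Laplace (add-one) smoothing avoids zero probability for unseen words
    return sum(math.log((word_counts[y][w] + 1) / (total + len(vocab)))
               for w in words)

def classify(words):
    # Pick the label maximizing log P(label) + log P(words | label)
    return max(priors, key=lambda y: math.log(priors[y]) + log_likelihood(words, y))

print(classify(["win", "money"]))  # spam
```

The "naive" assumption is that words are conditionally independent given the label, which lets the likelihood factor into a product of per-word terms.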
Further Reading
Explore topics like Bayesian networks and Monte Carlo methods. Refer to Probability
and Statistics by DeGroot and Schervish for deeper insights.