
Introduction to Bayes’ Theorem

What is Bayes’ Theorem?


Bayes’ Theorem is a fundamental concept in probability theory that describes how to update the probability of a hypothesis based on new evidence. It provides a mathematical framework for conditional probability, which is the probability of an event given that another event has occurred.
Formally, Bayes’ Theorem is expressed as:

    P(A|B) = (P(B|A) · P(A)) / P(B)

where:
• P(A|B): Posterior probability of event A given event B.
• P(B|A): Likelihood of event B given event A.
• P(A): Prior probability of event A.
• P(B): Marginal probability of event B.

Key Components
• Prior Probability (P(A)): The initial probability of the hypothesis before new evidence.
• Likelihood (P(B|A)): The probability of observing the evidence given the hypothesis.
• Posterior Probability (P(A|B)): The updated probability of the hypothesis after considering the evidence.
• Marginal Probability (P(B)): The total probability of the evidence, often computed using the law of total probability.
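The relationship among these components can be sketched as a small Python function. This is a minimal illustration, not part of the original text; the argument names simply mirror the components listed above.

```python
def bayes_posterior(prior: float, likelihood: float, marginal: float) -> float:
    """Posterior P(A|B) = P(B|A) * P(A) / P(B).

    prior      -- P(A), belief in the hypothesis before seeing the evidence
    likelihood -- P(B|A), probability of the evidence given the hypothesis
    marginal   -- P(B), total probability of the evidence
    """
    return likelihood * prior / marginal
```

Note that the marginal P(B) must be supplied (or computed via the law of total probability); it is what normalizes the product of prior and likelihood into a valid probability.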

Simple Example
Suppose a medical test detects a disease with 95% sensitivity (i.e., P(Positive|Disease) = 0.95), the disease affects 1% of the population (i.e., P(Disease) = 0.01), and the test has a 5% false positive rate (i.e., P(Positive|No Disease) = 0.05). If a person tests positive, what is the probability they actually have the disease?
Using Bayes’ Theorem:

    P(Disease|Positive) = (P(Positive|Disease) · P(Disease)) / P(Positive)

Calculate P(Positive) using the law of total probability:

    P(Positive) = P(Positive|Disease) · P(Disease) + P(Positive|No Disease) · P(No Disease)
                = (0.95 · 0.01) + (0.05 · 0.99) = 0.0095 + 0.0495 = 0.059


Then:

    P(Disease|Positive) = (0.95 · 0.01) / 0.059 ≈ 0.161
So, there is approximately a 16.1% chance the person has the disease.
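The arithmetic in this example can be checked with a short Python script; the three input probabilities are taken directly from the problem statement above.

```python
# Inputs from the example: sensitivity, prevalence, false positive rate.
p_disease = 0.01
p_pos_given_disease = 0.95
p_pos_given_no_disease = 0.05

# Law of total probability: P(Positive) sums over both ways to test positive.
p_positive = (p_pos_given_disease * p_disease
              + p_pos_given_no_disease * (1 - p_disease))

# Bayes' Theorem: P(Disease|Positive).
p_disease_given_positive = p_pos_given_disease * p_disease / p_positive

print(round(p_positive, 4))                # 0.059
print(round(p_disease_given_positive, 3))  # 0.161
```

The counterintuitively low posterior (about 16%) comes from the low prevalence: most positive results are false positives drawn from the 99% of people without the disease.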

Applications
Bayes’ Theorem is widely used in:
• Statistics: Bayesian inference for updating model parameters.
• Machine Learning: Naive Bayes classifiers for text analysis.
• Medical Diagnostics: Interpreting test results.
• Decision Making: Risk assessment and prediction.

Further Reading
Explore topics like Bayesian networks and Monte Carlo methods. Refer to Probability
and Statistics by DeGroot and Schervish for deeper insights.
