BAYESIAN INFERENCE & PROBABILITY
Bayes' theorem
● Bayes' theorem is a fundamental concept in probability theory and
statistics, particularly in Bayesian inference. It provides a way to
update our beliefs about the probability of an event in light of new
evidence.

\[ P(A|B) = \frac{P(B|A)\,P(A)}{P(B)} \]

● Where:

● \( P(A|B) \) is the posterior probability of event \( A \) given evidence \( B \).

● \( P(B|A) \) is the likelihood of observing evidence \( B \) given that event \( A \) has occurred.

● \( P(A) \) is the prior probability of event \( A \) before observing evidence \( B \).

● \( P(B) \) is the probability of observing evidence \( B \) (also called the marginal likelihood or evidence).
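As a minimal sketch of how the theorem reads as code (in Python; the function name and the example numbers are illustrative assumptions, not values from the slides):

    def bayes_posterior(prior, likelihood, evidence):
        """Return P(A|B) = P(B|A) * P(A) / P(B)."""
        return likelihood * prior / evidence

    # Hypothetical values: P(A) = 0.3, P(B|A) = 0.8, P(B) = 0.5
    print(bayes_posterior(prior=0.3, likelihood=0.8, evidence=0.5))  # 0.48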
Applications of Bayes' Theorem

● 1. Medical Diagnosis: To update the probability of a disease given a test result.
● 2. Spam Filtering: To determine the probability that an email is spam based on certain features.
● 3. Decision Making: In various fields such as finance, insurance, and quality control.
● 4. Machine Learning: Especially in Bayesian inference methods, which form the basis of various algorithms.
Intuitive Understanding

● Bayes' theorem helps us understand how to adjust our beliefs in light of new evidence. If we have a strong prior belief about something, but new evidence contradicts that belief, Bayes' theorem provides a rational way to reconcile the two.

● In summary, Bayes' theorem is a powerful mathematical tool for updating probabilities based on new evidence, making it essential for many practical applications in statistics and beyond.
EXAMPLE:

● Let's say we want to determine the probability that a patient has a disease (D) given that they tested positive (T) for it. We know:
● \( P(T|D) \): Probability of testing positive if the patient has the disease (true positive rate).
● \( P(D) \): Prior probability of having the disease (prevalence).
● \( P(T) \): Total probability of testing positive.
● Bayes' theorem then gives \( P(D|T) = \frac{P(T|D)\,P(D)}{P(T)} \), as worked through in the sketch below.
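A hedged worked version of this setup in Python; the sensitivity, prevalence, and false-positive rate are assumed numbers chosen only for illustration, and P(T) is expanded with the law of total probability:

    p_t_given_d = 0.95      # P(T|D): true positive rate (assumed)
    p_d = 0.01              # P(D): prevalence (assumed)
    p_t_given_not_d = 0.05  # P(T|not D): false positive rate (assumed)

    # Total probability of a positive test: P(T) = P(T|D)P(D) + P(T|not D)P(not D)
    p_t = p_t_given_d * p_d + p_t_given_not_d * (1 - p_d)

    # Bayes' theorem: P(D|T) = P(T|D) * P(D) / P(T)
    p_d_given_t = p_t_given_d * p_d / p_t
    print(f"P(D|T) = {p_d_given_t:.3f}")  # about 0.161

Even with a fairly accurate test, the low assumed prevalence keeps the posterior probability of disease modest, which is the classic lesson of this example.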
Bayesian inference

Bayesian inference is a method of statistical inference where “Bayes' theorem” is used to update the probability for a hypothesis as more evidence or information becomes available. It's a way to incorporate prior knowledge, through a prior probability distribution, to calculate a posterior probability distribution. This posterior distribution reflects our updated belief about the hypothesis after considering the new evidence.
How Bayesian inference works

Here's a basic overview of how Bayesian inference works:

1. “Prior Probability” ($P(H)$): This is the initial degree of belief in a hypothesis, before
considering the new evidence.

2. “Likelihood” ($P(E|H)$): This is the probability of observing the evidence if the hypothesis is true.

3. “Evidence” ($P(E)$): The probability of the evidence under all possible hypotheses.

4. “Posterior Probability” ($P(H|E)$): The updated probability of the hypothesis given the new evidence.

The relationship between these elements is described by “Bayes' theorem”:

\[ P(H|E) = \frac{P(E|H)\,P(H)}{P(E)} \]

“Prior Probability”

● “Prior Probability”: Consider estimating the probability θ that a coin lands heads up. We start with a prior belief about this probability, represented by a probability distribution. Let's say we have a prior belief that the coin is fair, so we use a Beta(1,1) distribution, which is a uniform distribution between 0 and 1.
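As a small sketch of this prior in Python (assuming SciPy is available; the variable names are illustrative):

    from scipy.stats import beta

    prior = beta(a=1, b=1)  # Beta(1,1) prior on theta
    for theta in (0.1, 0.5, 0.9):
        print(theta, prior.pdf(theta))  # density is 1.0 everywhere on [0, 1], i.e. uniform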
“Likelihood”:

● Likelihood: We then collect data by flipping the coin n times and observing the outcomes. Let's say we flipped the coin 10 times and observed 7 heads and 3 tails. The likelihood function gives the probability of observing these outcomes given a particular value of θ. In this case, the likelihood is given by the binomial distribution:

\[ P(\text{7 heads in 10 flips} \mid \theta) = \binom{10}{7}\,\theta^{7}(1-\theta)^{3} \]
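A short sketch of this likelihood in Python, using only the standard library; the candidate values of θ are arbitrary illustration points:

    from math import comb

    n, k = 10, 7  # 10 flips, 7 heads observed

    def likelihood(theta):
        # Binomial probability of exactly k heads in n flips with heads-probability theta
        return comb(n, k) * theta**k * (1 - theta)**(n - k)

    for theta in (0.5, 0.7, 0.9):
        print(f"theta={theta}: L={likelihood(theta):.4f}")

The likelihood is largest near θ = 0.7, the observed proportion of heads.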
“Posterior Probability”:

● Posterior probability, in Bayesian inference, refers to the updated probability distribution of a parameter or hypothesis after considering observed data or evidence. It represents our updated beliefs about the parameter or hypothesis in light of the observed data, combining prior knowledge or beliefs with the likelihood of the data given the parameter values.

● For the coin example, combining the Beta(1,1) prior with the binomial likelihood for 7 heads and 3 tails gives a Beta(1+7, 1+3) = Beta(8, 4) posterior distribution for θ.
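A hedged sketch of this posterior in Python (assuming SciPy; because the Beta prior is conjugate to the binomial likelihood, the update simply adds the observed counts to the prior parameters):

    from scipy.stats import beta

    heads, tails = 7, 3
    posterior = beta(a=1 + heads, b=1 + tails)  # Beta(8, 4) posterior for theta

    print("Posterior mean:", posterior.mean())                 # 8 / 12, about 0.667
    print("95% credible interval:", posterior.interval(0.95))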
Thank you
