Aiml1 3

Probability refers to the likelihood of an event occurring based on known information. Prior probability is the probability of an event before new data is collected, while posterior probability is calculated using Bayes' theorem and updates the prior probability with new data. Likelihood refers to how well the data fits a hypothesis or model, and conditional probability is the probability of an event given that another event has occurred.

Probability

• Probability is the chance that something will happen, or how likely it is that
an event will occur.
• When we toss a coin in the air, we use the word probability to refer to how
likely it is that the coin will land with the heads side up.
• When we talk about probability, we use a few words that help us
understand the chance of something happening. Probability can be
expressed in the following ways:
• Certain: an event will happen without a doubt
• Likely: the probability of one event is higher than the probability of
another event
• Equal probability: the chance of each event happening is the same

• Impossible: there's no chance of an event happening


What Is Prior Probability?
• Prior probability, in Bayesian statistics, is the probability of an
event before new data is collected.
• This is the best rational assessment of the probability of an
outcome based on the current knowledge before an experiment
is performed.
• The posterior probability is calculated by updating the prior
probability using Bayes' theorem.
• In statistical terms, the prior probability is the basis for posterior
probabilities.
For Understanding
What is Probability?
Probability is a measure of the likelihood that an event will actually occur
based on information or assumptions that are currently known.

The probability of an event is commonly stated as a number between 0
and 1, where 0 indicates impossibility and 1 indicates inevitability.

To determine probability, use the following formula:

Probability = Number of favorable outcomes / Total number of outcomes

For instance, the probability of getting heads when flipping a fair coin is
0.5 because there are two possible outcomes (heads or tails), and each
outcome has an equal likelihood of occurring.
Probability is used to describe the likelihood of events based on
assumptions or to make predictions about the future.
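The formula above can be sketched directly in code; a minimal illustration (the coin and die are standard textbook cases, not taken from these slides):

```python
# Classical probability: favorable outcomes divided by total outcomes
def probability(favorable, total):
    return favorable / total

# Fair coin: 1 favorable outcome (heads) out of 2 possible outcomes
p_heads = probability(1, 2)   # 0.5

# Fair six-sided die: 3 even faces (2, 4, 6) out of 6 possible outcomes
p_even = probability(3, 6)    # 0.5
```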
For Understanding
• The prior probability is the probability assigned to an event before
the arrival of some information that makes it necessary to revise
the assigned probability.
• The revision of the prior is carried out using Bayes' rule.
• The new probability assigned to the event after the revision is
called posterior probability.
What is prior probability in Naive Bayes?
• The probability of each class before any features are observed is
known as the prior probability in the Naive Bayes method.
• Informally: posterior probability = prior probability updated with
new data via Bayes' theorem (not a literal sum).
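The prior-to-posterior update described above can be sketched with Bayes' theorem; the diagnostic-test numbers below are illustrative assumptions, not from the slides:

```python
def bayes_posterior(prior, likelihood, evidence):
    # Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
    return likelihood * prior / evidence

# Illustrative diagnostic test (all numbers assumed):
prior = 0.01          # P(disease) before seeing the test result
sensitivity = 0.95    # P(positive | disease)
false_pos = 0.05      # P(positive | no disease)

# Evidence P(positive) via the law of total probability
evidence = sensitivity * prior + false_pos * (1 - prior)

post = bayes_posterior(prior, sensitivity, evidence)  # ~0.16
```

A positive result raises the probability from 1% to roughly 16%: the prior is revised by the new data, not added to it.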
For Understanding
What is likelihood in Machine Learning, with an example?
• In simple words, as the name suggests, the likelihood is a
function that tells us how well a specific data point fits an
assumed data distribution.
For example:
• Suppose there are two data points in the dataset, and the first lies
closer to the centre of the assumed distribution.
• The likelihood of the first data point is then greater than that of
the second.
• In everyday language, the likelihood of something happening is simply
how likely it is to happen; in statistics, likelihood measures how
well a hypothesis explains the observed data.
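The two-data-point example can be made concrete by evaluating each point under an assumed model; here a standard normal distribution is the assumption:

```python
import math

def gaussian_pdf(x, mean, std):
    # Likelihood of observation x under a Normal(mean, std) model
    coef = 1.0 / (std * math.sqrt(2 * math.pi))
    return coef * math.exp(-((x - mean) ** 2) / (2 * std ** 2))

# Two data points evaluated against an assumed N(0, 1) distribution
l1 = gaussian_pdf(0.1, mean=0.0, std=1.0)  # near the mean: higher likelihood
l2 = gaussian_pdf(3.0, mean=0.0, std=1.0)  # far from the mean: lower likelihood
# l1 > l2: the first point fits the assumed distribution better
```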
Examples of Probability and Likelihood

Example 1 – Coin Toss


• In the context of coin tosses, likelihood and probability represent
different aspects of the same experiment.
• The likelihood refers to the probability of observing a specific
outcome given a particular model or hypothesis.
• On the other hand, probability represents the long-term frequency of
an event occurring over multiple trials.

• The distinction between probability and likelihood is fundamentally
important: probability attaches to possible results; likelihood
attaches to hypotheses.
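This distinction can be demonstrated for coin tosses: fix the observed data and vary the bias hypothesis p, computing the binomial likelihood of each. A sketch, where "7 heads in 10 tosses" is an assumed example:

```python
from math import comb

def binomial_likelihood(p, heads, tosses):
    # Likelihood of the bias hypothesis p, given fixed data (heads, tosses)
    return comb(tosses, heads) * p ** heads * (1 - p) ** (tosses - heads)

# Fixed data: 7 heads in 10 tosses; vary the hypothesis
l_fair = binomial_likelihood(0.5, 7, 10)    # fair-coin hypothesis
l_biased = binomial_likelihood(0.7, 7, 10)  # biased-coin hypothesis
# l_biased > l_fair: the biased-coin hypothesis explains this data better
```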
• What Is a Posterior Probability?
• A posterior probability, in Bayesian statistics, is the revised or
updated probability of an event occurring after taking into
consideration new information.
• The posterior probability is calculated by updating the prior
probability using Bayes' theorem.

• In statistical terms, the posterior probability is the probability of
event A occurring given that event B has occurred.
• What Is Conditional Probability?
• Conditional probability is defined as the likelihood of an event or
outcome occurring, based on the occurrence of a previous
event or outcome.
• Conditional probability refers to the chances that some
outcome occurs given that another event has also
occurred.

• It is often stated as the probability of B given A and is written as
P(B|A), where the probability of B depends on A having occurred.
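The definition P(B|A) = P(A and B) / P(A) can be checked on a small example; the two-dice setup below is an assumed illustration:

```python
def conditional(p_a_and_b, p_a):
    # Conditional probability: P(B|A) = P(A and B) / P(A)
    return p_a_and_b / p_a

# Two fair dice: A = "first die shows 6", B = "the sum is 10"
p_a = 1 / 6            # P(first die is 6)
p_a_and_b = 1 / 36     # P(first is 6 AND second is 4)
p_b_given_a = conditional(p_a_and_b, p_a)  # 1/6
```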
What is the difference between chance and likelihood?

• Probability corresponds to finding the chance of something given a
sample distribution of the data.
• Likelihood refers to finding the best distribution of the data given a
particular value of some feature or some situation in the data.

What is the difference between likelihood and conditional probability?


• For conditional probability, the hypothesis is treated as a given, and
the data are free to vary. For likelihood, the data are treated as a
given, and the hypothesis varies.
Naive Bayes Classifier: Calculation of
Prior, Likelihood, Evidence & Posterior
• Naive Bayes is a probabilistic classifier, a type of supervised
learning, and is based on Bayes' theorem.
• Basically, it's "naive" because it makes assumptions that may
or may not turn out to be correct.
• In other words, the calculation of the probabilities for each
hypothesis is simplified to make it tractable.
• Rather than attempting to model the joint distribution of all
attribute values, the attributes are assumed to be conditionally
independent given the class.
• Maximum a Posteriori (MAP) estimation is a statistical technique that
estimates an unknown quantity (such as a model parameter) by maximizing
its posterior distribution, thereby incorporating prior knowledge or
experience.
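The quantities named in this section's title (prior, likelihood, evidence, posterior) can be computed end to end on a toy categorical dataset. A minimal sketch with assumed data and no Laplace smoothing:

```python
from collections import Counter, defaultdict

# Toy training data (assumed for illustration): (outlook, windy) -> play
X = [("sunny", "no"), ("sunny", "yes"), ("rainy", "no"),
     ("rainy", "yes"), ("sunny", "no")]
y = ["yes", "no", "yes", "no", "yes"]

# Prior: P(class) before any features are observed
class_counts = Counter(y)
priors = {c: n / len(y) for c, n in class_counts.items()}

# Likelihood: P(feature_i = value | class), counted per feature position
cond_counts = defaultdict(Counter)
for features, label in zip(X, y):
    for i, value in enumerate(features):
        cond_counts[label][(i, value)] += 1

def nb_posterior(features):
    # Bayes' theorem with the naive conditional-independence assumption
    scores = {}
    for c in priors:
        score = priors[c]
        for i, value in enumerate(features):
            score *= cond_counts[c][(i, value)] / class_counts[c]
        scores[c] = score
    evidence = sum(scores.values())  # P(features), the normalizing constant
    return {c: s / evidence for c, s in scores.items()}

post = nb_posterior(("sunny", "no"))  # "yes" wins on this toy data
```

Because the "no" class never observed windy = "no", its likelihood collapses to zero here; real implementations add Laplace smoothing to avoid exactly this.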
