Class 4: Naive Bayes Classification

The Naive Bayes algorithm is a supervised machine learning algorithm based on Bayes' Theorem, used mainly for classification problems.

Conditional probability is a fundamental concept in probability theory, statistics, and machine learning that describes the likelihood of an event occurring given that another event has already happened. In other words, it is the probability of an outcome whose likelihood depends on the occurrence of a previous event.

Conditional probability is written as P(A|B), the conditional probability of A given that B has occurred. It can be computed as

P(A|B) = P(A and B) / P(B)
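As a small illustration (the counts below are made up for the example), the following Python sketch estimates P(A|B) from joint counts using P(A|B) = P(A and B) / P(B), where A = "email is spam" and B = "email contains the word 'offer'":

# Hypothetical email counts, used only to illustrate conditional probability.
total_emails = 1000          # total number of emails (assumed)
emails_with_offer = 200      # emails containing the word "offer" (event B)
spam_with_offer = 150        # emails that are spam AND contain "offer" (A and B)

p_b = emails_with_offer / total_emails        # P(B) = 0.20
p_a_and_b = spam_with_offer / total_emails    # P(A and B) = 0.15

p_a_given_b = p_a_and_b / p_b                 # P(A|B) = 0.15 / 0.20
print(round(p_a_given_b, 2))                  # 0.75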
Bayes' Theorem (also known as Bayes' Rule or Bayes' Law) is used to determine the conditional probability of an event A when event B has already occurred, based on prior knowledge of conditions that might be related to that event. It states that

P(A|B) = P(B|A) * P(A) / P(B)
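The following Python sketch applies Bayes' Theorem to a hypothetical diagnostic-test scenario (all numbers are assumed for illustration); the marginal P(B) is expanded using the law of total probability:

# A = "patient has the disease", B = "test result is positive".
p_a = 0.01              # prior: 1% of patients have the disease (assumed)
p_b_given_a = 0.95      # likelihood: test is positive for 95% of sick patients
p_b_given_not_a = 0.05  # false positive rate for healthy patients (assumed)

# Marginal probability of a positive test: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Posterior: probability of disease given a positive test
p_a_given_b = (p_b_given_a * p_a) / p_b
print(round(p_a_given_b, 3))   # about 0.161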
The terms in the formula are:

P(A|B) is the Posterior probability: the probability of hypothesis A given the observed evidence B.

P(B|A) is the Likelihood: the probability of the evidence B given that hypothesis A is true.

P(A) is the Prior probability: the probability of the hypothesis before observing the evidence. It is the initial assessment of the likelihood of an event or outcome before any new data is considered; in simple words, it reflects what we already know from previous knowledge or experience.

P(B) is the Marginal probability: the probability of the evidence.
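As a short illustration of how these pieces come together in practice, here is a minimal Naive Bayes classification sketch using scikit-learn's GaussianNB (scikit-learn is assumed to be available, and the training data is a tiny made-up toy set):

# Minimal Naive Bayes classification sketch with toy data.
from sklearn.naive_bayes import GaussianNB

# Toy training data: two numeric features per sample, binary class labels.
X_train = [[1.0, 2.1], [1.2, 1.9], [3.8, 4.0], [4.1, 3.9]]
y_train = [0, 0, 1, 1]

model = GaussianNB()
model.fit(X_train, y_train)              # learns P(class) and P(feature|class)

# Predict the class of a new sample and inspect the posterior probabilities.
print(model.predict([[1.1, 2.0]]))       # -> [0]
print(model.predict_proba([[1.1, 2.0]])) # posterior P(class | features)

GaussianNB estimates the prior P(class) from the class frequencies in the training data, models each feature's likelihood given the class, and combines them via Bayes' Theorem to produce the posterior probabilities returned by predict_proba.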


