Bayesian modeling provides a principled way to incorporate external information into data analysis using probability distributions. The Bayesian approach defines a prior probability distribution before analyzing data, then uses Bayes' theorem to update beliefs based on new data to derive a posterior probability distribution. Naive Bayes classification is a simple technique that assumes independence between features; it calculates the probability of classes using frequency tables and Bayes' theorem to make predictions.


Bayesian Modeling

Bayesian Modeling-Statistics
• Classical statistics provides methods to analyze data, from simple descriptive measures to complex and sophisticated models. The available data are processed, and conclusions are drawn about a hypothetical population of which the data are assumed to be a representative sample.
• Suppose, for example, we need to guess the outcome of an experiment that consists of tossing a coin. How many biased coins have we ever seen? Probably not many, so we are ready to believe that the coin is fair and that the outcome of the experiment can be either heads or tails with the same probability.
• On the other hand, imagine that someone tells us the coin is forged so that it is more likely to land heads. How can we take this information into account in the analysis of our data?
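The coin question can be made concrete with a discrete Bayes update over two competing hypotheses. The sketch below assumes, purely for illustration, a forged coin that lands heads 70% of the time, an even 50/50 prior over the two hypotheses, and a hypothetical sequence of observed tosses; none of these numbers come from the slides.

```python
# Two competing hypotheses about the coin. The 0.7 bias and the
# 50/50 prior are illustrative assumptions, not from the slides.
priors = {"fair": 0.5, "forged": 0.5}
p_heads = {"fair": 0.5, "forged": 0.7}

def update(beliefs, outcome):
    """One Bayes update after observing 'H' or 'T'."""
    likelihood = {h: p_heads[h] if outcome == "H" else 1 - p_heads[h]
                  for h in beliefs}
    evidence = sum(likelihood[h] * beliefs[h] for h in beliefs)
    return {h: likelihood[h] * beliefs[h] / evidence for h in beliefs}

posterior = dict(priors)
for toss in "HHTH":            # a hypothetical sequence of tosses
    posterior = update(posterior, toss)
print(posterior)
```

After a heads-heavy sequence the posterior shifts toward the "forged" hypothesis, which is exactly how the external information enters the analysis: through the prior and the likelihood, not through the data alone.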
Bayesian Modeling-Statistics
• Bayesian methods provide a principled way to incorporate this external information into the data analysis process. To do so, however, Bayesian methods change the vision of the data analysis process entirely with respect to the classical approach. In a Bayesian approach, the data analysis process starts with a given probability distribution. Because this distribution is given before any data are considered, it is called the prior distribution.
Bayesian Modeling-Statistics
• Bayesian statistics is a mathematical procedure that applies probabilities to statistical problems. It provides the tools to update beliefs in the light of new data.
• Suppose, out of the 4 championship races (F1) between Niki Lauda and James Hunt, Niki won 3 times while James managed only 1.
• So, if you were to bet on the winner of the next race, who would it be?
• I bet you would say Niki Lauda.
• Here's the twist. What if you are told that it rained once when James won and once when Niki won, and it is certain that it will rain on the next race day? Who would you bet your money on now?
• By intuition, it is easy to see that James's chances of winning have increased drastically. But the question is: by how much?
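The numbers in the story are enough to answer "how much" with Bayes' theorem. A small sketch using exact fractions:

```python
from fractions import Fraction

# Counts from the slide: 4 races, Niki won 3 and James won 1;
# it rained in James's only win and in one of Niki's wins.
p_james = Fraction(1, 4)             # prior P(James wins)
p_rain_given_james = Fraction(1, 1)  # it rained in his single win
p_rain = Fraction(2, 4)              # rain occurred in 2 of the 4 races

p_james_given_rain = p_rain_given_james * p_james / p_rain
print(p_james_given_rain)  # 1/2
```

So the rain forecast lifts James from a 1-in-4 chance to an even 1-in-2 bet.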
Bayesian Modeling-Statistics

[Diagram: Bayesian Statistics rests on two building blocks: Conditional Probability and Bayes' Theorem.]
Conditional Probability
• Conditional probability is the probability of an event or outcome occurring given that a previous event or outcome has occurred. It is calculated by multiplying the probability of the preceding event by the updated probability of the succeeding, or conditional, event.
• The probability of occurrence of an event A when another event B related to A has already occurred is known as the conditional probability of A given B. It is denoted P(A|B).
Conditional Probability

[Venn diagram: 100 students in total; 40 students like apples (A), 30 students like oranges (B), and 20 students like both (A ∩ B).]
Conditional Probability

P(B) = 30/100 = 0.3

P(A ∩ B) = 20/100 = 0.2

P(A | B) = P(A ∩ B) / P(B) = 0.2 / 0.3 ≈ 0.67
Bayes Theorem
• Bayes' theorem is a mathematical formula used to determine the conditional probability of events.
• Bayes' theorem describes the probability of an event based on prior knowledge of conditions that might be relevant to the event.
• Invented by Thomas Bayes; published in 1763.
Posterior P(A|B) => Probability of event A being true, given that event B has already occurred.
Likelihood P(B|A) => Probability of the evidence given that the hypothesis is true.
Prior P(A) => Probability of the hypothesis before considering the evidence.
Marginal P(B) => Probability of the evidence/data.
Bayes Theorem
• P(A|B) - Probability of hypothesis A, given evidence or data B.
• P(B|A) - Probability of the data/evidence, given that the hypothesis is true.
• P(A) - Probability of A.
• P(B) - Probability of B.

From the definition of conditional probability:

P(A | B) = P(A ∩ B) / P(B)
P(B | A) = P(B ∩ A) / P(A)

Rearranging both:

P(A | B) · P(B) = P(A ∩ B)
P(B | A) · P(A) = P(B ∩ A)

Since P(A ∩ B) = P(B ∩ A):

P(A | B) · P(B) = P(B | A) · P(A)

P(A | B) = P(B | A) · P(A) / P(B)
Bayes Theorem
• For example, calculate P(King | Face), the posterior probability that a card is a king given that it is a face card:

P(King | Face) = P(Face | King) · P(King) / P(Face)

P(King | Face) = (1) · (4/52) / (12/52) = (1) · (1/13) / (3/13) = 1/3 ≈ 0.33
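The card example checks out exactly with fraction arithmetic:

```python
from fractions import Fraction

# A standard 52-card deck: 4 kings, 12 face cards (J, Q, K in 4 suits).
p_king = Fraction(4, 52)
p_face = Fraction(12, 52)
p_face_given_king = Fraction(1)   # every king is a face card

p_king_given_face = p_face_given_king * p_king / p_face
print(p_king_given_face)  # 1/3
```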
Naïve Bayes
• The Naive Bayes algorithm is a classification technique based on Bayes' theorem, which assumes that the presence of a particular feature in a class is unrelated to the presence of any other feature. There are various applications of this algorithm, including face recognition, NLP problems, medical diagnosis, and many more.
Naive Bayes example:
Below is the training data to which the Naive Bayes algorithm is applied:

[Training data table omitted: weather conditions for 14 days with a Yes/No label for whether the players played.]

Step 1: Make a frequency table of the data.

Step 2: Create a likelihood table by finding probabilities, e.g. Overcast probability = 0.29.

Step 3: Use the Naive Bayes equation to calculate the posterior probability for each class. The class with the highest posterior probability is the outcome of the prediction.
Problem: Players will play if the weather is rainy. Is this statement correct?

You can solve it using the method of posterior probability discussed above.
P(Yes | Rainy) = P(Rainy | Yes) * P(Yes) / P(Rainy)
Here, you have P(Rainy | Yes) = 2/9 = 0.22, P(Rainy) = 5/14 = 0.36, and P(Yes) = 9/14 = 0.64.
Now, P(Yes | Rainy) = 0.22 * 0.64 / 0.36 = 0.39. Since the two posteriors sum to 1, P(No | Rainy) ≈ 0.61 is the higher probability, so the statement is not correct.
Naive Bayes uses the same method to predict the probability of different classes based on various attributes. This algorithm is mostly used in NLP problems such as sentiment analysis and text classification.
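The rainy-day posterior can be computed directly from the counts behind the quoted probabilities (14 days, 9 "Yes" days, 5 rainy days, rain on 2 of the "Yes" days). Note the exact answer is 2/5 = 0.4; the 0.39 above comes from rounding each factor first.

```python
# Counts behind the slide's frequency table: 14 days in total,
# 9 "Yes" days, 5 rainy days, and rain on 2 of the "Yes" days.
total, yes, rainy, rainy_and_yes = 14, 9, 5, 2

p_yes = yes / total
p_rainy = rainy / total
p_rainy_given_yes = rainy_and_yes / yes

p_yes_given_rainy = p_rainy_given_yes * p_yes / p_rainy
p_no_given_rainy = 1 - p_yes_given_rainy   # the two posteriors sum to 1

print(round(p_yes_given_rainy, 2), round(p_no_given_rainy, 2))  # 0.4 0.6
```

Because P(No | Rainy) = 0.6 exceeds P(Yes | Rainy) = 0.4, the classifier predicts "No" on a rainy day.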
