
Naïve Bayes Classifier

Dr. Arijit Dey


Advantages of Naive Bayes Classifier
• Easy to implement and computationally efficient.
• Effective in cases with a large number of features.
• Performs well even with limited training data.
• Handles categorical features well.
• For numerical features, the data is typically assumed to follow a normal (Gaussian) distribution, as sketched below.
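
A minimal sketch of this Gaussian assumption, using scikit-learn's GaussianNB; the feature values and labels below are made-up illustrative data:

# A minimal sketch: GaussianNB fits one normal distribution per feature
# per class. All numbers here are made-up for illustration.
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Two numerical features (say, hours studied and hours slept); 1 = Pass, 0 = Fail.
X = np.array([[8.0, 7.0], [7.5, 6.0], [2.0, 4.0], [1.5, 8.0], [6.0, 5.5]])
y = np.array([1, 1, 0, 0, 1])

model = GaussianNB()
model.fit(X, y)

print(model.theta_)                  # per-class feature means of the fitted Gaussians
print(model.var_)                    # per-class feature variances
print(model.predict([[5.0, 6.0]]))   # classify a new point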
Disadvantages of Naive Bayes Classifier
• Assumes that features are conditionally independent given the class, which rarely holds exactly in real-world data.
• Can be influenced by irrelevant attributes.
• May assign zero probability to feature values never seen during training, leading to poor generalization; Laplace smoothing, sketched after this list, is the standard fix.
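
A minimal sketch of Laplace (add-one) smoothing for one categorical feature; the counts and value names are assumed purely for illustration:

# A minimal sketch of Laplace (add-one) smoothing. The counts are
# made-up; "Rainy" never occurs with class Pass, so the unsmoothed
# estimate would be zero and would zero out the whole product.
from collections import Counter

counts = Counter({"Sunny": 3, "Cloudy": 2})  # feature counts within class Pass
values = ["Sunny", "Cloudy", "Rainy"]        # all possible feature values

def smoothed_prob(value, counts, values, alpha=1.0):
    # P(value | class) = (count + alpha) / (total + alpha * number of values)
    total = sum(counts.values())
    return (counts[value] + alpha) / (total + alpha * len(values))

print(smoothed_prob("Rainy", counts, values))   # 0.125 instead of 0.0
print(smoothed_prob("Sunny", counts, values))   # 0.5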
Applications of Naive Bayes Classifier
• Spam Email Filtering: Classifies emails as spam or non-spam based on features such as word frequencies (see the sketch after this list).
• Text Classification: Used in sentiment analysis, document
categorization, and topic classification.
• Medical Diagnosis: Helps in predicting the likelihood of a disease
based on symptoms.
• Credit Scoring: Evaluates creditworthiness of individuals for loan
approval.
• Weather Prediction: Classifies weather conditions based on various
factors.
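
A minimal spam-filtering sketch in this spirit, using scikit-learn's CountVectorizer and MultinomialNB on a few made-up messages:

# A minimal sketch: Naive Bayes spam filtering over word-count features.
# The messages and labels are made-up for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

messages = ["win a free prize now", "meeting at noon tomorrow",
            "free money click here", "lunch with the project team"]
labels = ["spam", "ham", "spam", "ham"]

vectorizer = CountVectorizer()          # bag-of-words features
X = vectorizer.fit_transform(messages)

model = MultinomialNB()                 # multinomial NB suits count data
model.fit(X, labels)

print(model.predict(vectorizer.transform(["free prize money now"])))  # likely ['spam']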
Consider the following example

Determine whether an object with attributes Confident = Yes and Sick = No will Pass or Fail using Bayesian classification.
The data tuples are described by the attributes Confident, Studied, and Sick. The class-label attribute, Result, has two distinct values, {Pass, Fail}; the training set contains five tuples, three with Result = Pass and two with Result = Fail.
Let C1 correspond to the class Result = Pass and C2 correspond to Result = Fail.
The tuple we wish to classify is X = (Confident = Yes, Sick = No).

Step 1: (Compute the prior probabilities)

By Bayes' theorem, the posterior P(Ci | X) is proportional to P(X | Ci) P(Ci), since P(X) is the same for both classes; so we need to maximize P(X | Ci) P(Ci) for i = 1, 2. The prior probability P(Ci) of each class is computed from the training tuples:
P(Result = Pass) = 3/5 = 0.6
P(Result = Fail) = 2/5 = 0.4

Step 2: (Compute the per-attribute conditional probabilities)

To compute P(X | Ci) for i = 1, 2, we first compute the following conditional probabilities:
P(Confident = Yes | Result = Pass) = 2/3 = 0.6667
P(Confident = Yes | Result = Fail) = 1/2 = 0.5
P(Sick = No | Result = Pass) = 1/3 = 0.3333
P(Sick = No | Result = Fail) = 1/2 = 0.5
Step 3: (Compute the likelihood P(X | Ci))
Assuming the attributes are conditionally independent given the class:
P(X | Result = Pass)
= P(Confident = Yes | Result = Pass) × P(Sick = No | Result = Pass)
= 0.6667 × 0.3333
= 0.2222
P(X | Result = Fail)
= P(Confident = Yes | Result = Fail) × P(Sick = No | Result = Fail)
= 0.5 × 0.5
= 0.25

Step 4: (Predict the class for X)

To find the class Ci that maximizes P(X | Ci) P(Ci), we compute:
P(X | Result = Pass) P(Result = Pass) = 0.2222 × 0.6 = 0.1333
P(X | Result = Fail) P(Result = Fail) = 0.25 × 0.4 = 0.1
Since 0.1333 > 0.1, the naive Bayesian classifier predicts Result = Pass for tuple X.
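
A minimal Python sketch that reproduces this computation directly from the probabilities derived in Steps 1 and 2:

# A minimal sketch reproducing the worked example above.
priors = {"Pass": 3 / 5, "Fail": 2 / 5}            # Step 1
cond = {                                           # Step 2
    "Pass": {"Confident=Yes": 2 / 3, "Sick=No": 1 / 3},
    "Fail": {"Confident=Yes": 1 / 2, "Sick=No": 1 / 2},
}

X = ["Confident=Yes", "Sick=No"]  # the tuple to classify

# Score each class with P(X | Ci) * P(Ci), multiplying the per-attribute
# conditionals under the naive conditional-independence assumption (Step 3).
scores = {}
for c in priors:
    likelihood = 1.0
    for attr in X:
        likelihood *= cond[c][attr]
    scores[c] = likelihood * priors[c]

print(scores)                       # {'Pass': 0.1333..., 'Fail': 0.1}
print(max(scores, key=scores.get))  # Pass (Step 4)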
