Experiment No 6

The document discusses implementing and evaluating the Naive Bayes classifier algorithm. It provides an overview of Naive Bayes classifiers, describes the assumptions and mathematics behind Naive Bayes, and discusses different types of Naive Bayes models and applications.

Uploaded by

Apurva Patil

EXPERIMENT NO. 6
Title: Naïve Bayes’ Classifier
Aim: Implement and evaluate Naïve Bayes’ Classifier algorithm
Outcomes: At the end of the experiment the student should be able to:
1. Understand Naïve Bayes’ Classifier algorithm
2. Implement and evaluate Naïve Bayes’ Classifier algorithm
Contents:
 Naive Bayes Classifiers:
Naive Bayes classifiers are a family of algorithms based on Bayes' Theorem. Despite the
"naive" assumption of feature independence, these classifiers are widely used for their
simplicity and efficiency in machine learning.
 What are Naive Bayes classifiers?
Naive Bayes classifiers are a collection of classification algorithms based on Bayes'
Theorem. It is not a single algorithm but a family of algorithms that all share a common
principle: every pair of features being classified is independent of each other, given the
class label.
One of the simplest and most effective classification algorithms, the Naïve Bayes classifier
aids in the rapid development of machine learning models with fast prediction capabilities.
 Why is it called Naive Bayes?
The “Naive” part of the name indicates the simplifying assumption made by the Naïve Bayes
classifier. The classifier assumes that the features used to describe an observation are
conditionally independent, given the class label. The “Bayes” part of the name refers to
Reverend Thomas Bayes, an 18th-century statistician and theologian who formulated Bayes’
theorem.
 Assumption of Naive Bayes:
The fundamental Naive Bayes assumptions are:
 Feature independence: The features of the data are conditionally independent
of each other, given the class label.
 Continuous features are normally distributed: If a feature is continuous, then
it is assumed to be normally distributed within each class.
 Discrete features have multinomial distributions: If a feature is discrete, then
it is assumed to have a multinomial distribution within each class.
 Features are equally important: All features are assumed to contribute equally
to the prediction of the class label.
 No missing data: The data should not contain any missing values.

 Bayes’ Theorem
Bayes’ Theorem finds the probability of an event occurring given the probability of another
event that has already occurred. Bayes’ theorem is stated mathematically as the following
equation:

P(A|B) = P(B|A) · P(A) / P(B)

where A and B are events and P(B) ≠ 0.


 Basically, we are trying to find the probability of event A, given that event B is
true. Event B is also termed the evidence.
 P(A) is the prior probability of A, i.e. the probability of the event before the
evidence is seen. The evidence is an attribute value of an unknown instance (here,
event B).
 P(B) is the marginal probability, i.e. the probability of the evidence.
 P(A|B) is the posterior probability of A, i.e. the probability of the event after the
evidence is seen.
 P(B|A) is the likelihood, i.e. the probability of observing the evidence given that
the hypothesis (event A) is true.
 Types of Naive Bayes Model:
 There are three types of Naive Bayes Model:
 Gaussian Naive Bayes classifier
 In Gaussian Naive Bayes, continuous values associated with each feature are
assumed to be distributed according to a Gaussian distribution. A Gaussian
distribution is also called a Normal distribution. When plotted, it gives a bell-shaped
curve which is symmetric about the mean of the feature values.
 Multinomial Naive Bayes
 Feature vectors represent the frequencies with which certain events have been
generated by a multinomial distribution. This is the event model typically used for
document classification.
 Bernoulli Naive Bayes
 In the multivariate Bernoulli event model, features are independent booleans (binary
variables) describing inputs. Like the multinomial model, this model is popular for
document classification tasks, where binary term-occurrence features (i.e. whether a
word occurs in a document or not) are used rather than term frequencies (i.e. how
often a word occurs in the document).
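The Gaussian variant above can be sketched from scratch in plain Python; the following is a minimal illustration (not a production implementation) that estimates a class prior and a per-feature mean and variance for each class, then classifies a point by the highest log-posterior. The dataset is a tiny made-up one.

```python
import math
from collections import defaultdict

def fit_gaussian_nb(X, y):
    """Estimate class priors and per-feature mean/variance from training data."""
    groups = defaultdict(list)
    for xi, yi in zip(X, y):
        groups[yi].append(xi)
    model = {}
    for cls, rows in groups.items():
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        # A small variance floor avoids division by zero for constant features.
        variances = [max(sum((v - m) ** 2 for v in col) / n, 1e-9)
                     for col, m in zip(zip(*rows), means)]
        model[cls] = (n / len(X), means, variances)
    return model

def predict_gaussian_nb(model, x):
    """Return the class with the highest log-posterior (up to a shared constant)."""
    best_cls, best_score = None, float("-inf")
    for cls, (prior, means, variances) in model.items():
        score = math.log(prior)
        for v, m, var in zip(x, means, variances):
            # log of the Gaussian density N(v; m, var)
            score += -0.5 * math.log(2 * math.pi * var) - (v - m) ** 2 / (2 * var)
        if score > best_score:
            best_cls, best_score = cls, score
    return best_cls

# Tiny hypothetical dataset: two features per sample, two classes.
X = [[1.0, 2.1], [1.2, 1.9], [0.9, 2.0], [5.0, 8.1], [5.2, 7.9], [4.8, 8.0]]
y = ["a", "a", "a", "b", "b", "b"]
model = fit_gaussian_nb(X, y)
print(predict_gaussian_nb(model, [1.1, 2.0]))  # near the class "a" cluster
print(predict_gaussian_nb(model, [5.1, 8.0]))  # near the class "b" cluster
```

Working in log-probabilities rather than multiplying raw densities is the standard trick to avoid numerical underflow when there are many features.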
 Advantages of Naive Bayes Classifier
 Easy to implement and computationally efficient.
 Effective in cases with a large number of features.
 Performs well even with limited training data.
 Disadvantages of Naive Bayes Classifier
 Assumes that features are independent, which may not always hold in real-world
data.
 Can be influenced by irrelevant attributes.
 May assign zero probability to unseen events, leading to poor generalization.
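The zero-probability problem in the last point is usually mitigated with Laplace (add-one) smoothing: a pseudo-count is added to every feature value, so an event never seen in training gets a small non-zero probability instead of zeroing out the whole product. A minimal sketch with made-up counts:

```python
def smoothed_probability(count, total, n_values, alpha=1.0):
    """Laplace-smoothed estimate of P(feature value | class)."""
    return (count + alpha) / (total + alpha * n_values)

# Hypothetical counts: the word "prize" appeared in 0 of 50 non-spam emails;
# the feature has 2 possible values (word present / word absent).
unsmoothed = 0 / 50                                 # zero wipes out the product
smoothed = smoothed_probability(0, 50, n_values=2)  # small but non-zero
print(unsmoothed, round(smoothed, 4))
```

With alpha = 1 this is classic add-one smoothing; smaller alpha values (Lidstone smoothing) shrink the correction toward the raw frequency estimate.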
 Applications of Naive Bayes Classifier
 Spam Email Filtering: Classifies emails as spam or non-spam based on features.
 Text Classification: Used in sentiment analysis, document categorization, and
topic classification.
 Medical Diagnosis: Helps in predicting the likelihood of a disease based on
symptoms.
 Credit Scoring: Evaluates creditworthiness of individuals for loan approval.
 Weather Prediction: Classifies weather conditions based on various factors.

 Conclusion:
