Experiment No 6
Title: Naïve Bayes’ Classifier
Aim: Implement and evaluate Naïve Bayes’ Classifier algorithm
Outcomes: At the end of the experiment the student should be able to:
1. Understand Naïve Bayes’ Classifier algorithm
2. Implement and evaluate Naïve Bayes’ Classifier algorithm
Contents:
Naive Bayes Classifiers:
Naive Bayes classifiers are a family of algorithms based on Bayes’ Theorem. Despite the “naive”
assumption of feature independence, these classifiers are widely used for their simplicity
and efficiency in machine learning.
What are Naive Bayes classifiers?
Naive Bayes classifiers are a collection of classification algorithms based on Bayes’
Theorem. It is not a single algorithm but a family of algorithms that share a
common principle: every pair of features being classified is assumed to be independent of the
others, given the class label.
One of the simplest and most effective classification algorithms, the Naïve Bayes classifier
aids in the rapid development of machine learning models with fast prediction capabilities. To
see it in practice, consider a small dataset such as the one used in the sketch below.
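A minimal sketch of training and evaluating a Naive Bayes classifier, assuming scikit-learn is
available; the Iris dataset, the GaussianNB variant, and the 80/20 train/test split are
illustrative choices rather than requirements of this experiment.

# Minimal sketch: train and evaluate a Gaussian Naive Bayes classifier.
# Assumes scikit-learn is installed; the Iris dataset and the 80/20 split
# are illustrative choices, not requirements of the experiment.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score, classification_report

# Load a small, well-known dataset with continuous features.
X, y = load_iris(return_X_y=True)

# Hold out 20% of the samples for evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Fit the classifier: it estimates a per-class mean and variance for each feature.
model = GaussianNB()
model.fit(X_train, y_train)

# Evaluate on the held-out data.
y_pred = model.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred))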
Why is it called Naive Bayes?
The “Naive” part of the name indicates the simplifying assumption made by the Naïve Bayes
classifier. The classifier assumes that the features used to describe an observation are
conditionally independent, given the class label. The “Bayes” part of the name refers to
Reverend Thomas Bayes, an 18th-century statistician and theologian who formulated Bayes’
theorem.
Assumption of Naive Bayes:
The fundamental Naive Bayes assumption is that each feature makes an independent and equal
contribution to the outcome:
1. Feature independence: The features of the data are conditionally independent
of each other, given the class label.
2. Continuous features are normally distributed: If a feature is continuous, it
is assumed to be normally distributed within each class (illustrated in the sketch after this list).
3. Discrete features have multinomial distributions: If a feature is discrete, it
is assumed to follow a multinomial distribution within each class.
4. Features are equally important: All features are assumed to contribute equally
to the prediction of the class label.
5. No missing data: The data should not contain any missing values.
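As a rough illustration of the first two assumptions, the following sketch computes
unnormalized posteriors by hand for two hypothetical classes with two continuous features; all
means, variances, and priors are made-up values used only to show how per-feature Gaussian
likelihoods are multiplied under the independence assumption.

# Minimal sketch of the Naive Bayes assumptions for continuous features:
# per-class Gaussian likelihoods per feature, multiplied together under the
# independence assumption. The toy numbers below are hypothetical.
import numpy as np

def gaussian_pdf(x, mean, var):
    # Normal density N(x; mean, var), the assumed per-feature distribution.
    return np.exp(-(x - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

# Per-class, per-feature means and variances (hypothetical values).
stats = {
    "class_0": {"means": np.array([5.0, 3.4]), "vars": np.array([0.12, 0.14]), "prior": 0.5},
    "class_1": {"means": np.array([5.9, 2.8]), "vars": np.array([0.27, 0.10]), "prior": 0.5},
}

x = np.array([5.1, 3.3])  # one observation with two continuous features

posteriors = {}
for label, s in stats.items():
    # Independence assumption: the joint likelihood is the product of per-feature densities.
    likelihood = np.prod(gaussian_pdf(x, s["means"], s["vars"]))
    posteriors[label] = s["prior"] * likelihood  # unnormalized posterior

# Normalize so the posteriors sum to 1, then pick the most probable class.
total = sum(posteriors.values())
for label in posteriors:
    posteriors[label] /= total
print(posteriors, "->", max(posteriors, key=posteriors.get))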
Bayes’ Theorem
Bayes’ Theorem finds the probability of an event occurring given that another event has
already occurred. It is stated mathematically as the following equation:

P(A|B) = P(B|A) × P(A) / P(B), where P(B) ≠ 0

Here P(A|B) is the posterior probability of A given B, P(B|A) is the likelihood of B given A,
P(A) is the prior probability of A, and P(B) is the probability of the evidence B.
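The sketch below plugs hypothetical numbers into the formula for a toy diagnostic-test
scenario; the prevalence, sensitivity, and false-positive rate are assumptions chosen only to
illustrate the calculation.

# Minimal sketch: applying Bayes' Theorem to a toy diagnostic example.
# All numbers (1% prevalence, 95% sensitivity, 10% false-positive rate)
# are hypothetical, chosen only to illustrate the formula.
p_a = 0.01              # P(A): prior probability of the condition
p_b_given_a = 0.95      # P(B|A): probability of a positive test given the condition
p_b_given_not_a = 0.10  # P(B|not A): false-positive rate

# Evidence P(B) via the law of total probability.
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' Theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(f"P(A|B) = {p_a_given_b:.4f}")  # ~0.0876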
Conclusion: