Bayes' Theorem & Naive Bayes Algorithm

For machine learning students

Illustrate the Naive Bayes algorithm with an example.

State Bayes' theorem with an example.

Naive Bayes Algorithm

Naive Bayes is a classification algorithm based on Bayes' theorem, assuming that features
are independent given the class. Despite the "naive" assumption, it often performs well in
real-world situations.

Steps of Naive Bayes:

1. Compute prior probabilities for each class based on the training data.
2. Calculate the likelihood of each feature for a given class.
3. Use Bayes' theorem to compute the posterior probability of each class given the
features of the test data.
4. Classify the test data into the class with the highest posterior probability.
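
To make these four steps concrete, here is a minimal from-scratch sketch in Python for categorical features. The training tuples, the 1e-6 floor for unseen feature values, and the predict helper are hypothetical choices made for this note, not part of any standard library.

from collections import Counter, defaultdict

# Hypothetical training data: (shape, color) -> class label.
training_data = [
    (("Round", "Red"), "Apple"),
    (("Round", "Red"), "Apple"),
    (("Round", "Orange"), "Apple"),
    (("Round", "Orange"), "Orange"),
    (("Oblong", "Orange"), "Orange"),
    (("Round", "Orange"), "Orange"),
]

# Step 1: prior probabilities P(class) from class frequencies.
class_counts = Counter(label for _, label in training_data)
total = len(training_data)
priors = {c: n / total for c, n in class_counts.items()}

# Step 2: likelihood counts P(feature value | class) per feature position.
likelihood_counts = defaultdict(lambda: defaultdict(Counter))
for features, label in training_data:
    for i, value in enumerate(features):
        likelihood_counts[label][i][value] += 1

def likelihood(value, i, label):
    # P(feature i = value | class), with a small floor for unseen values.
    counts = likelihood_counts[label][i]
    return counts[value] / class_counts[label] if counts[value] else 1e-6

# Steps 3 and 4: unnormalised posterior for each class, pick the largest.
def predict(features):
    scores = {}
    for label in priors:
        score = priors[label]
        for i, value in enumerate(features):
            score *= likelihood(value, i, label)
        scores[label] = score
    return max(scores, key=scores.get)

print(predict(("Round", "Red")))      # expected: Apple
print(predict(("Oblong", "Orange")))  # expected: Orange

Real implementations usually work with log-probabilities and proper smoothing (for example Laplace smoothing) rather than the crude floor used in this sketch.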

Bayes' Theorem

Bayes' theorem is a mathematical formula that calculates the conditional probability of an event, given that another event has already occurred. It is also known as Bayes' law or Bayes' rule. For two events A and B with P(B) > 0, it states:

P(A | B) = P(B | A) · P(A) / P(B)

Here P(A) and P(B) are the prior probabilities of the events, P(B | A) is the probability of B given A, and P(A | B) is the posterior probability of A given B.
Example of Bayes' Theorem:
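
As a quick illustration with made-up numbers (the figures below are assumptions chosen only to keep the arithmetic simple): suppose 60% of the fruits in a basket are apples, so P(Apple) = 0.6; 70% of apples are red, so P(Red | Apple) = 0.7; and 50% of all fruits are red, so P(Red) = 0.5. The probability that a randomly picked red fruit is an apple is then

P(Apple | Red) = P(Red | Apple) · P(Apple) / P(Red) = (0.7 × 0.6) / 0.5 = 0.84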

Example of Naive Bayes Algorithm

Problem:

Classify whether a given fruit is an apple or orange based on features:

● Shape (Round, Oblong)
● Color (Red, Orange)
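
For a concrete (hypothetical) training set, suppose there are 10 labelled fruits: 6 apples, of which 5 are Round and 4 are Red, and 4 oranges, of which 2 are Round and 1 is Red. The priors are P(Apple) = 0.6 and P(Orange) = 0.4. To classify a new fruit that is Round and Red, the independence assumption gives the unnormalised posterior scores:

Score(Apple) = P(Apple) × P(Round | Apple) × P(Red | Apple) = 0.6 × 5/6 × 4/6 ≈ 0.33
Score(Orange) = P(Orange) × P(Round | Orange) × P(Red | Orange) = 0.4 × 2/4 × 1/4 = 0.05

Since 0.33 > 0.05, the fruit is classified as an Apple.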
This illustrates how the Naive Bayes algorithm uses Bayes' theorem for classification tasks.

Bayes' Theorem is a fundamental concept in probability theory, enabling the calculation of conditional probabilities.

Naive Bayes uses Bayes' Theorem with the assumption that features are independent, which works well for many classification problems.
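
In symbols, for a class C and feature values x1, …, xn, the independence assumption means the posterior is proportional to P(C) · P(x1 | C) · P(x2 | C) · … · P(xn | C), and the predicted class is the one that maximises this product, exactly as in the fruit example above.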
