Slide 3
Classifiers
COLLECTION OF CLASSIFICATION ALGORITHMS
Principle of Naive Bayes Classifier:
A Naive Bayes classifier is a probabilistic machine learning model
that is used for classification tasks. The crux of the classifier is
based on Bayes' theorem.
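The equation referred to here is Bayes' theorem combined with the naive (conditional independence) assumption; a standard statement, for a class y and features X = (x1, ..., xn), is:

```latex
P(y \mid X) = \frac{P(X \mid y)\, P(y)}{P(X)}
\qquad\Longrightarrow\qquad
P(y \mid x_1, \ldots, x_n) = \frac{P(y) \prod_{i=1}^{n} P(x_i \mid y)}{P(x_1) P(x_2) \cdots P(x_n)}
```

The "naive" part is the assumption that each feature xi is independent of the others given the class y, which is what lets the joint likelihood factor into the product on the right.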
Now, you can obtain the values for each term by looking at the dataset
and substituting them into the equation. For all entries in the dataset,
the denominator does not change; it remains constant. Therefore, the
denominator can be removed and a proportionality can be
introduced.
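Dropping the constant denominator, the relation referred to here becomes the usual proportional form:

```latex
P(y \mid x_1, \ldots, x_n) \;\propto\; P(y) \prod_{i=1}^{n} P(x_i \mid y)
```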
In our case, the class variable (y) has only two outcomes, yes or no.
There could be cases where the classification is multiclass, with more
than two outcomes. In either case, we need to find the class y with
the maximum probability. Using the above function, we can obtain the
class, given the predictors.
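The "function" mentioned here is presumably the maximum a posteriori (argmax) decision rule applied to the proportional form:

```latex
\hat{y} = \underset{y}{\operatorname{argmax}} \; P(y) \prod_{i=1}^{n} P(x_i \mid y)
```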
We need to find P(xi | yj) for each xi in X and yj in y. All these
calculations have been demonstrated in the tables below:
So, in the figure above, we have calculated P(xi | yj) for each xi in X
and yj in y manually in tables 1-4. For example, the probability of
playing golf given that the temperature is cool, i.e. P(temp. = cool |
play golf = Yes) = 3/9.
Also, we need to find the class probabilities P(y), which have been calculated in
table 5. For example, P(play golf = Yes) = 9/14.
So now, we are done with our pre-computations and the classifier is ready!
Let us test it on a new set of features (let us call it today):
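The steps above can be sketched in a few lines of Python. This is a minimal illustration, assuming the classic 14-row "play golf" weather dataset from which the slide's figures (P(temp. = cool | Yes) = 3/9, P(Yes) = 9/14) are derived; the specific feature values chosen for "today" below are illustrative, not taken from the slides.

```python
import math
from collections import Counter, defaultdict

# The classic 14-row "play golf" weather dataset (9 Yes, 5 No),
# consistent with the probabilities quoted in the slides.
data = [
    # (outlook, temperature, humidity, windy, play)
    ("Sunny",    "Hot",  "High",   False, "No"),
    ("Sunny",    "Hot",  "High",   True,  "No"),
    ("Overcast", "Hot",  "High",   False, "Yes"),
    ("Rainy",    "Mild", "High",   False, "Yes"),
    ("Rainy",    "Cool", "Normal", False, "Yes"),
    ("Rainy",    "Cool", "Normal", True,  "No"),
    ("Overcast", "Cool", "Normal", True,  "Yes"),
    ("Sunny",    "Mild", "High",   False, "No"),
    ("Sunny",    "Cool", "Normal", False, "Yes"),
    ("Rainy",    "Mild", "Normal", False, "Yes"),
    ("Sunny",    "Mild", "Normal", True,  "Yes"),
    ("Overcast", "Mild", "High",   True,  "Yes"),
    ("Overcast", "Hot",  "Normal", False, "Yes"),
    ("Rainy",    "Mild", "High",   True,  "No"),
]

# Pre-computation: class counts (table 5) and per-feature
# conditional counts (tables 1-4).
class_counts = Counter(row[-1] for row in data)
feature_counts = defaultdict(Counter)
for *features, label in data:
    for i, value in enumerate(features):
        feature_counts[(i, label)][value] += 1

def prior(label):
    """P(y), e.g. P(play golf = Yes) = 9/14."""
    return class_counts[label] / len(data)

def likelihood(i, value, label):
    """P(x_i | y), e.g. P(temp. = Cool | Yes) = 3/9."""
    return feature_counts[(i, label)][value] / class_counts[label]

def predict(features):
    """Argmax over classes of P(y) * prod_i P(x_i | y)."""
    scores = {
        label: prior(label)
        * math.prod(likelihood(i, v, label) for i, v in enumerate(features))
        for label in class_counts
    }
    return max(scores, key=scores.get)

# A hypothetical "today": sunny, hot, normal humidity, no wind.
today = ("Sunny", "Hot", "Normal", False)
print(predict(today))  # prints "Yes"
```

Note that the denominator P(x1)...P(xn) is never computed: since it is the same for every class, comparing the unnormalised scores is enough to pick the winner.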
Types of Naive Bayes Classifier: