
Topic:
Binary Cross Entropy and
Categorical Cross Entropy

Name – Lakshya Singh
Roll No. – 2021UEA6511
Date – 05/03/2024
Introduction

Entropy is a concept from information theory. It quantifies the amount of uncertainty involved in predicting the outcome of a random variable. In the context of machine learning, we often use entropy to quantify the “impurity” of a set of examples.
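A minimal sketch of this idea in Python (assuming NumPy; the function name is illustrative):

import numpy as np

def entropy(labels):
    # Shannon entropy (in nats) of a 1-D array of class labels.
    _, counts = np.unique(labels, return_counts=True)
    probs = counts / counts.sum()
    return -np.sum(probs * np.log(probs))

# A pure set has zero entropy; a 50/50 split has the maximum for two classes.
print(entropy(np.array([0, 0, 0, 0])))  # 0.0
print(entropy(np.array([0, 1, 0, 1])))  # ln(2) ≈ 0.6931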
Importance of Entropy

In machine learning, entropy is used to quantify the uncertainty or randomness of a set. It is used in various algorithms such as decision trees and clustering, and it forms the basis for more advanced concepts like cross entropy and information gain.
Binary Cross Entropy - Definition

Binary Cross Entropy is a loss function used in binary classification tasks. If ‘p’ is the predicted probability of the positive class and ‘y’ is the actual label (0 or 1), then the binary cross entropy over N examples is defined as:

BCE = -(1/N) Σ_i [ y_i · log(p_i) + (1 - y_i) · log(1 - p_i) ]
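A minimal NumPy sketch of this formula (illustrative, not a specific library’s API):

import numpy as np

def binary_cross_entropy(y, p, eps=1e-12):
    # Mean binary cross entropy; eps keeps log() away from zero.
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))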
Binary Cross Entropy - Intuition

Binary Cross Entropy measures how far our predictions are from the actual values. In other words, it tells us the cost of our wrong predictions: a confident wrong prediction is penalized heavily, while a confident correct prediction incurs almost no loss.
Binary Cross Entropy - Use Cases

Binary Cross Entropy is used in binary classification problems such as spam detection and tumor detection.
Binary Cross Entropy - Example

Let’s say we have a binary classification problem with the following predictions and actual values: Predictions = [0.1, 0.9, 0.6], Actual Values = [0, 1, 1]. The Binary Cross Entropy can be calculated as follows:

BCE = -(1/3) [ log(1 - 0.1) + log(0.9) + log(0.6) ]
    = -(1/3) [ (-0.1054) + (-0.1054) + (-0.5108) ]
    ≈ 0.2405
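Checking the arithmetic with the binary_cross_entropy sketch from the definition slide:

y = np.array([0, 1, 1])
p = np.array([0.1, 0.9, 0.6])
print(binary_cross_entropy(y, p))  # ≈ 0.2405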
Binary Cross Entropy - Properties

Binary Cross Entropy is always non-negative, and it tends towards zero as our predictions get closer to the actual values.
Categorical Cross Entropy - Definition

Categorical Cross Entropy is a loss function used in multi-class classification tasks. If ‘p’ is the vector of predicted class probabilities and ‘y’ is the one-hot encoded actual value, then the categorical cross entropy over N examples is defined as:

CCE = -(1/N) Σ_i Σ_j y_ij · log(p_ij),  j = 1, …, m

where ‘m’ is the number of classes.
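A minimal NumPy sketch of this formula (illustrative; y and p are arrays of shape (N, m)):

import numpy as np

def categorical_cross_entropy(y, p, eps=1e-12):
    # Mean categorical cross entropy over N one-hot rows and m classes.
    p = np.clip(p, eps, 1.0)
    return -np.mean(np.sum(y * np.log(p), axis=1))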


Categorical Cross Entropy - Intuition

Categorical Cross Entropy measures how far our predictions are from the actual values in multi-class classification problems. Since the actual values are one-hot encoded, only the predicted probability of the true class contributes to each example’s loss; it tells us the cost of our wrong predictions.
Categorical Cross Entropy - Use Cases

Categorical Cross Entropy is used in multi-class classification problems such as digit recognition and animal classification.
Categorical Cross Entropy - Example

Let’s say we have a 3-class classification problem with the following predictions and actual values: Predictions = [[0.2, 0.3, 0.5], [0.1, 0.6, 0.3]], Actual Values = [[0, 0, 1], [0, 1, 0]]. The Categorical Cross Entropy can be calculated as follows:

CCE = -(1/2) [ log(0.5) + log(0.6) ]
    = -(1/2) [ (-0.6931) + (-0.5108) ]
    ≈ 0.6020
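Checking the arithmetic with the categorical_cross_entropy sketch from the definition slide:

y = np.array([[0, 0, 1], [0, 1, 0]])
p = np.array([[0.2, 0.3, 0.5], [0.1, 0.6, 0.3]])
print(categorical_cross_entropy(y, p))  # ≈ 0.6020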
Categorical Cross Entropy - Properties

Categorical Cross Entropy is always non-negative, and it tends towards zero as our predictions get closer to the actual one-hot values.
Comparison

Both Binary and Categorical Cross Entropy are used as measures of error in classification problems. Binary Cross Entropy is used for binary classification problems, while Categorical Cross Entropy is used for multi-class classification problems.
Practical Tips

• Be careful with the base of the logarithm used in the entropy calculations. In machine learning, we usually use the natural logarithm.
• Make sure the actual values are one-hot encoded for multi-class classification problems, as in the sketch below.
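A minimal sketch of one-hot encoding integer class labels (assuming NumPy; the helper name is illustrative):

import numpy as np

def one_hot(labels, m):
    # Map integer class labels of shape (N,) to one-hot rows of shape (N, m).
    return np.eye(m)[labels]

print(one_hot(np.array([2, 1]), 3))
# [[0. 0. 1.]
#  [0. 1. 0.]]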
Common Pitfalls

• One common pitfall is thinking that a lower loss always means a better model. While it’s true that we want to minimize the loss, a model with a very low training loss could be overfitting the data.
• Another common pitfall is using Binary Cross Entropy for multi-class classification problems. Make sure to use the right loss function for the right problem.
Summary

• Entropy is a measure of uncertainty or randomness. In machine learning, it’s used to quantify the “impurity” of a set of examples.
• Binary Cross Entropy and Categorical Cross Entropy are cross-entropy loss functions used in binary and multi-class classification problems respectively.
• Both measures tell us how far our predictions are from the actual values, and we aim to minimize them to build a good model.
