
Applied Mathematics for Machine Learning
José David Vega Sánchez
[email protected]
2025
Probability and Statistics
Outline

1. Entropy
2. Entropy in Machine Learning
Entropy
1. Entropy
Intuition

The information in a message can be measured as the number of yes-or-no questions needed to determine it.

Information in a message: 2 sounds per second (slide figure).

Si se al 100% la respuesta de una pregunta no Si se al 50% la respuesta de una pregunta no


aporta información (0 entropía) aporta información (1 bit de información)
1. Entropy
Formal definition

Entropy is a mathematical measure of uncertainty or randomness in a system or dataset. It quantifies how unpredictable the outcomes of a process are. The more uncertain a situation is, the higher the entropy.

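The formula referenced below is presumably the standard Shannon entropy: for a discrete random variable $X$ with outcome probabilities $p(x)$, measured in bits when the logarithm is base 2,

$$H(X) = -\sum_{x} p(x) \log_2 p(x)$$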
Real-Life Examples with Connections to the Formula

1. Flipping a Coin
Each side (heads or tails) has an equal chance of 50%.
Uncertainty is maximum because either outcome is equally likely.
This reflects the binary nature of the coin flip: there are only two equally likely outcomes (50% heads, 50% tails), so 1 bit of information is sufficient to fully resolve the uncertainty.
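Substituting the fair coin into the entropy formula above confirms this:

$$H = -\left(\tfrac{1}{2}\log_2 \tfrac{1}{2} + \tfrac{1}{2}\log_2 \tfrac{1}{2}\right) = 1 \text{ bit}$$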
Entropy in Machine Learning
2. Entropy in Machine Learning
Cross-Entropy Loss

Used in classification tasks to measure the difference between the true labels and the model's predicted probabilities.

Formula:
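Presumably the standard cross-entropy loss is intended here: for a one-hot true label $y$ and predicted class probabilities $\hat{y}$ over $C$ classes,

$$\mathcal{L} = -\sum_{i=1}^{C} y_i \log(\hat{y}_i)$$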

Example:
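A minimal worked case with hypothetical numbers: for $y = (1, 0)$ and $\hat{y} = (0.9, 0.1)$,

$$\mathcal{L} = -\log(0.9) \approx 0.105$$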
Machine Learning: Entropy helps models reduce uncertainty and measure prediction quality.
2. Entropy in Machine Learning
Cross-Entropy Loss in Neural Networks

Cross-Entropy Loss is one of the most common loss functions used in classification tasks in neural networks. It measures the difference between the true label distribution (ground truth) and the predicted probability distribution produced by the model. The goal of cross-entropy loss is to minimize this difference, thereby improving the accuracy of the predictions.

Why Use Cross-Entropy Loss?

In classification problems, the model predicts probabilities for each class. The cross-entropy loss:

Encourages the model to assign high probability to the correct class.
Penalizes predictions where the probability of the correct class is low.

A minimal code sketch follows below.


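A minimal NumPy sketch of this loss (the function name and toy numbers are illustrative assumptions, not from the slides):

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean cross-entropy between one-hot labels and predicted probabilities."""
    y_pred = np.clip(y_pred, eps, 1.0)  # avoid log(0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

# Toy batch: 2 samples, 3 classes
y_true = np.array([[1, 0, 0],
                   [0, 1, 0]])
y_pred = np.array([[0.8, 0.1, 0.1],   # confident and correct -> low loss term
                   [0.2, 0.3, 0.5]])  # correct class gets only 0.3 -> higher loss term
print(cross_entropy(y_true, y_pred))  # ~0.714
```

Averaging over the batch (np.mean) is one common convention; summing over the batch is also used.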
2. Entropy in Machine Learning
Cross-Entropy Loss in Neural Networks (Example)

Binary Classification

For two classes (C=2):


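Presumably the standard binary cross-entropy is intended: for a label $y \in \{0, 1\}$ and predicted probability $\hat{y}$ of the positive class,

$$\mathcal{L} = -\left[\, y \log(\hat{y}) + (1 - y) \log(1 - \hat{y}) \,\right]$$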
2. Entropy in Machine Learning
Cross-Entropy Loss in Neural Networks (Example)

Multi-Class Classification

For three classes (C=3):
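Presumably the formula shown was the cross-entropy sum over the three classes, with one-hot label $y$ and prediction $\hat{y}$:

$$\mathcal{L} = -\sum_{i=1}^{3} y_i \log(\hat{y}_i)$$

For example (hypothetical numbers), $y = (0, 1, 0)$ and $\hat{y} = (0.1, 0.7, 0.2)$ give $\mathcal{L} = -\log(0.7) \approx 0.357$.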

Behavior of Cross-Entropy Loss


Low Loss: When the model predicts a high probability for the correct class.
High Loss: When the model predicts a low probability for the correct class.
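A quick numeric check of this behavior (toy probabilities, not from the slides):

```python
import numpy as np
for p in (0.95, 0.5, 0.05):   # probability assigned to the correct class
    print(p, -np.log(p))      # 0.051, 0.693, 2.996: loss grows as p drops
```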
Thanks
