Entropy
Mathematics for Machine Learning
José David Vega Sánchez
[email protected]
2025
Probability and Statistics
Outline
1. Entropy
2. Entropy in Machine Learning
Entropy
1. Entropy
Intuition
Information of a message: 2 sounds/sec
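A minimal sketch of the idea, assuming the two sounds are equally likely symbols (an assumption, not stated on the slide): each sound then carries log2(2) = 1 bit, so a source emitting 2 sounds per second conveys information at

$$ R = 2\ \tfrac{\text{sounds}}{\text{s}} \times \log_2(2)\ \tfrac{\text{bits}}{\text{sound}} = 2\ \tfrac{\text{bits}}{\text{s}}. $$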
1. Flipping a Coin
Each side (heads or tails) has an equal chance of 50%.
Uncertainty is maximum because either outcome is equally likely.
1. Entropy
Formal definition
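The standard Shannon definition: the entropy of a discrete random variable X is the expected information content of its outcomes,

$$ H(X) = -\sum_{i} p(x_i)\,\log_2 p(x_i). $$

For a fair coin, p(heads) = p(tails) = 1/2, so

$$ H = -\left( \tfrac{1}{2}\log_2 \tfrac{1}{2} + \tfrac{1}{2}\log_2 \tfrac{1}{2} \right) = 1\ \text{bit}, $$

the maximum possible for two outcomes.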
This reflects the binary nature of the coin flip: there are only two equally likely outcomes (50% heads, 50% tails), so 1 bit of information is sufficient to fully resolve the uncertainty.
Entropy in Machine Learning
2. Entropy in Machine Learning
Cross-Entropy Loss
Used in classification tasks to measure the difference between the true labels
and the model’s predicted probabilities.
Formula:
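For C classes, with one-hot true labels y and predicted probabilities ŷ, the standard form of the loss is

$$ L_{CE} = -\sum_{i=1}^{C} y_i \log(\hat{y}_i). $$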
Example:
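A minimal sketch in NumPy (illustrative, not taken from the original slide): computing the cross-entropy between a one-hot label and a model's predicted probabilities.

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Cross-entropy loss: -sum(y_true * log(y_pred))."""
    y_pred = np.clip(y_pred, eps, 1.0)       # clip to avoid log(0)
    return -np.sum(y_true * np.log(y_pred))

y_true = np.array([0.0, 1.0, 0.0])    # one-hot: true class is class 1
y_pred = np.array([0.1, 0.7, 0.2])    # model's predicted probabilities

print(cross_entropy(y_true, y_pred))  # -log(0.7) ≈ 0.357
```

Because y is one-hot, the loss reduces to -log of the probability assigned to the true class: confident correct predictions give a loss near 0, and confident wrong ones are penalized heavily.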
In machine learning, entropy helps models reduce uncertainty and measure prediction quality.
2. Entropy in Machine Learning
Cross-Entropy Loss in Neural Networks
Cross-Entropy Loss is one of the most common loss functions for classification tasks in neural networks. It measures the difference between the true label distribution (ground truth) and the predicted probability distribution produced by the model. Training minimizes this difference, thereby improving the accuracy of the predictions.
Binary Classification
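For a true label y ∈ {0, 1} and a predicted probability ŷ, the standard binary cross-entropy is

$$ L = -\left[\, y \log(\hat{y}) + (1 - y)\log(1 - \hat{y}) \,\right]. $$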
Multi-Class Classification
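For C classes with one-hot labels y and softmax outputs ŷ, the standard multi-class cross-entropy is

$$ L = -\sum_{c=1}^{C} y_c \log(\hat{y}_c). $$

A short sketch of both cases using PyTorch's built-in losses (an illustration of these standard formulas, not code from the original slides):

```python
import torch
import torch.nn as nn

# Binary classification: one logit per example, targets in {0, 1}.
# BCEWithLogitsLoss applies the sigmoid internally for numerical stability.
binary_loss = nn.BCEWithLogitsLoss()
logits = torch.tensor([0.8, -1.2])     # raw model outputs (logits)
targets = torch.tensor([1.0, 0.0])     # true binary labels
print(binary_loss(logits, targets))

# Multi-class classification: one logit per class, targets are class indices.
# CrossEntropyLoss applies the softmax internally.
multi_loss = nn.CrossEntropyLoss()
logits = torch.tensor([[2.0, 0.5, 0.1]])  # batch of 1 example, 3 classes
targets = torch.tensor([0])               # true class index
print(multi_loss(logits, targets))
```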