Lect 6
4. Update Weights:
• Once backpropagation has calculated how much each weight
contributed to the error, the network adjusts each weight in the
direction that reduces the error. This adjustment is typically done
using gradient descent.
• The weights are changed by a small amount, so the network's next
prediction will be closer to the correct answer.
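The update rule described above can be sketched in a few lines. This is a minimal illustration; the function name, weight values, and learning rate are all made up for the example:

```python
# Gradient descent weight update: each weight takes a small step
# against its gradient, scaled by a learning rate (illustrative value).
learning_rate = 0.1

def update_weights(weights, gradients):
    # gradients[i] is dError/dweights[i], as computed by backpropagation
    return [w - learning_rate * g for w, g in zip(weights, gradients)]

weights = [0.5, -0.3]
gradients = [0.2, -0.1]
print(update_weights(weights, gradients))  # each weight nudged to reduce error
```

Note the minus sign: the gradient points in the direction that *increases* the error, so the weight moves the opposite way.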
5. Repeat:
• This process (forward pass, calculate error,
backpropagation, update weights) is repeated
many times, using multiple examples in the
training data.
• With each repetition, the network gets better
at making predictions, because it is gradually
reducing the error through weight
adjustments.
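The repeated cycle can be sketched as a training loop. Here the "network" is a single weight w that should learn the slope of y = 2x; the data, learning rate, and epoch count are illustrative:

```python
# Minimal training loop for one weight: the model predicts y = w * x,
# and we fit w to data generated from y = 2x (illustrative example).
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = 0.0
learning_rate = 0.05

for epoch in range(100):              # 5. Repeat many times
    for x, target in data:
        prediction = w * x            # 1. Forward pass
        error = prediction - target  # 2. Error (derivative of 0.5*error**2)
        gradient = error * x          # 3. Backpropagation (chain rule)
        w -= learning_rate * gradient  # 4. Weight update

print(round(w, 3))  # w approaches 2.0, the true slope
```

Each pass through the loop shrinks the error a little, which is exactly the gradual improvement described above.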
In Summary:
• Forward Pass: The network makes a prediction.
• Error Calculation: The network compares its prediction to the correct
answer and calculates the error.
• Backpropagation: The network works backward to figure out how
much each weight caused the error.
• Weight Update: The weights are adjusted to reduce the error.
• Repeat: This process continues, and over time, the network learns to
make better predictions.
Backpropagation gives the network a way to learn from its mistakes
and improve, layer by layer.
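To see the "layer by layer" part concretely, here is a sketch with two layers of one unit each (no activation function, all values illustrative). The chain rule carries the error from the output layer back into the first layer:

```python
# Backpropagation through a tiny two-layer network:
# hidden = w1 * x, prediction = w2 * hidden (illustrative example).
x, target = 1.5, 3.0
w1, w2 = 0.8, 0.6
learning_rate = 0.1

for _ in range(200):
    hidden = w1 * x               # forward through layer 1
    prediction = w2 * hidden      # forward through layer 2
    error = prediction - target   # derivative of 0.5 * (pred - target)**2

    grad_w2 = error * hidden      # how much w2 contributed to the error
    grad_hidden = error * w2      # error flowing backward into layer 1
    grad_w1 = grad_hidden * x     # how much w1 contributed

    w2 -= learning_rate * grad_w2
    w1 -= learning_rate * grad_w1

print(round(w1 * w2 * x, 3))  # prediction approaches the target 3.0
```

The key line is `grad_hidden = error * w2`: the error signal is passed backward through layer 2 before layer 1's weight is blamed, which is what "working backward, layer by layer" means.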