
Lect 6

This document explains the concept of backpropagation in neural networks, detailing the process of making predictions, calculating errors, and adjusting weights to improve accuracy. It outlines the steps involved, including the forward pass, error calculation, backpropagation, weight updates, and the iterative nature of the training process. Overall, backpropagation is essential for enabling neural networks to learn from their mistakes and enhance their predictive capabilities.


UNIT 2: Neural Networks

• Backpropagation neural networks
• Kohonen neural network
• Learning vector quantization
• Radial basis function neural networks
• Support vector machines
• Backpropagation is the key algorithm used to train neural networks: it lets the network learn from its mistakes and improve its predictions.

Here's a simple way to understand it:
1. Forward Pass (Prediction):
• First, the neural network makes a prediction: it takes an input, processes it through its layers, and produces an output (the prediction).
• At this point, the network doesn't know whether it is right or wrong.
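The forward pass can be sketched in a few lines of Python. The network size (one hidden unit, one output unit), the weight values, and the sigmoid activation here are illustrative assumptions, not details from the lecture:

```python
import math

def sigmoid(z):
    # Squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

x = 0.5            # input value (illustrative)
w1, w2 = 0.8, 1.2  # hidden-layer and output-layer weights (illustrative)

h = sigmoid(w1 * x)       # hidden-layer activation
y_pred = sigmoid(w2 * h)  # the network's output: its prediction
# y_pred is some value between 0 and 1; the network does not yet
# know whether it is close to the correct answer.
```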

2. Calculate the Error:
• After making a prediction, the network compares its output to the actual target value (the correct answer). The difference between the predicted and actual values is the error.
• For example, if the network predicts 8 and the actual value is 10, the error is 2.
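The error from the example above (prediction 8, actual value 10) can be computed directly; the squared-error loss shown alongside it is one common choice of loss function, assumed here for illustration:

```python
y_pred, y_true = 8.0, 10.0

error = y_true - y_pred              # raw error: 10 - 8 = 2
loss = 0.5 * (y_true - y_pred) ** 2  # squared-error loss: 0.5 * 2^2 = 2.0
```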
3. Backpropagation (Error Adjustment):
• Now the network needs to learn how to reduce that error. This is where backpropagation comes in.
• Backpropagation calculates how much each weight (parameter) in the network contributed to the error. It does this by working backward through the network, starting from the output layer (where the error is measured) and moving toward the input layer.
• The goal is to figure out which weights need to change, and by how much, to reduce the error.
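Working backward from the output layer amounts to applying the chain rule of calculus. A minimal sketch for the same illustrative one-hidden-unit network (weights, input, and sigmoid activation are assumptions, not lecture values):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, y_true = 0.5, 1.0   # illustrative input and target
w1, w2 = 0.8, 1.2      # illustrative weights

# Forward pass: activations are stored so the backward pass can reuse them.
h = sigmoid(w1 * x)
y_pred = sigmoid(w2 * h)

# Backward pass: chain rule, from the output layer back toward the input.
dL_dy = y_pred - y_true           # derivative of 0.5*(y_pred - y_true)^2
dy_dz2 = y_pred * (1 - y_pred)    # sigmoid derivative at the output unit
grad_w2 = dL_dy * dy_dz2 * h      # how much w2 contributed to the error

dh_dz1 = h * (1 - h)              # sigmoid derivative at the hidden unit
grad_w1 = dL_dy * dy_dz2 * w2 * dh_dz1 * x  # w1's contribution
```

Since the prediction is below the target here, both gradients come out negative, which tells gradient descent to increase both weights.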

4. Update Weights:
• Once backpropagation has calculated how much each weight contributed to the error, the weights are adjusted to reduce it. This adjustment is typically done using gradient descent.
• Each weight is changed by a small amount, so the network's next prediction will be closer to the correct answer.
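The gradient-descent update rule itself is one line: each weight moves a small step against its gradient. The learning rate and gradient value below are illustrative assumptions:

```python
learning_rate = 0.1      # size of the "small amount" each step
w = 0.8                  # current weight (illustrative)
grad_w = -0.05           # its gradient from backpropagation (illustrative)

w = w - learning_rate * grad_w  # step against the gradient: 0.8 + 0.005
```

A negative gradient means increasing the weight reduces the error, so the update nudges w upward, from 0.8 to 0.805.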
5. Repeat:
• This process (forward pass, calculate error, backpropagation, update weights) is repeated many times, using multiple examples from the training data.
• With each repetition the network gets better at making predictions, because the weight adjustments gradually reduce the error.
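Putting all five steps together gives a complete (if tiny) training loop. Everything here, the one-hidden-unit network, the single training example, the learning rate and epoch count, is an illustrative assumption:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, y_true = 0.5, 1.0   # one training example (illustrative)
w1, w2 = 0.8, 1.2      # initial weights (illustrative)
lr = 0.5               # learning rate (illustrative)

for epoch in range(200):
    # 1. Forward pass: make a prediction.
    h = sigmoid(w1 * x)
    y_pred = sigmoid(w2 * h)
    # 2. Error: how far is the prediction from the target?
    # 3. Backpropagation: each weight's contribution, via the chain rule.
    dz2 = (y_pred - y_true) * y_pred * (1 - y_pred)
    grad_w2 = dz2 * h
    grad_w1 = dz2 * w2 * h * (1 - h) * x
    # 4. Update: nudge each weight against its gradient.
    w2 -= lr * grad_w2
    w1 -= lr * grad_w1
    # 5. Repeat: the loop runs the whole cycle again.

# After many repetitions, the prediction has moved closer to the target.
```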
In Summary:
• Forward Pass: The network makes a prediction.
• Error Calculation: The network compares its prediction to the correct answer and calculates the error.
• Backpropagation: The network works backward to figure out how much each weight contributed to the error.
• Weight Update: The weights are adjusted to reduce the error.
• Repeat: The process continues, and over time the network learns to make better predictions.

Backpropagation is like a way for the network to learn from its mistakes
and improve, layer by layer.
