
CSE4403 3.0 & CSE6002E - Soft Computing
Winter Semester, 2011

Neural Networks

Brief Review
Neural Networks Videos: "The Next Generation of Neural Networks" - Geoff Hinton

Part 3

Neural Networks – Back Propagation Algorithm

Let's do an example with an actual network to see how the process works. We'll just look at one connection initially, between a neuron in the output layer and one in the hidden layer.

[Figure: single-connection learning in a Back Propagation network. Hidden neuron A feeds output neurons B and C through the weights W_AB and W_AC.]

The connection we're interested in is between neuron A (a hidden layer neuron) and neuron B (an output neuron), and it has the weight W_AB. The diagram also shows another connection, between neurons A and C, but we'll return to that later. The algorithm works like this:

1. First apply the inputs to the network and work out the output. Remember, this initial output could be anything, since the initial weights were random numbers.
2. Next work out the error for neuron B. The error is what you want minus what you actually get; in other words:
   Error_B = Output_B (1 - Output_B)(Target_B - Output_B)
   The "Output(1 - Output)" term is needed because of the sigmoid function; if we were only using a threshold neuron the error would simply be (Target - Output).
3. Change the weight. Let W+_AB be the new (trained) weight and W_AB be the initial weight:
   W+_AB = W_AB + (Error_B x Output_A)
   Notice that it is the output of the connecting neuron (neuron A) we use, not B's. We update all the weights in the output layer in this way (see the sketch below).
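
Below is a minimal Python sketch of steps 1-3 for the single A-to-B connection, assuming a sigmoid activation; the input, target, and starting weights are made-up illustrative values, not numbers from the slides:

```python
import math

def sigmoid(x):
    # squashing function assumed by the slides
    return 1.0 / (1.0 + math.exp(-x))

# Made-up illustrative values (not from the slides)
input_value = 0.35    # input feeding hidden neuron A
w_in_A = 0.8          # weight from the input to neuron A
w_AB = 0.4            # weight from hidden neuron A to output neuron B
target_B = 0.5        # desired output of neuron B

# Step 1: forward pass - work out the outputs
output_A = sigmoid(input_value * w_in_A)
output_B = sigmoid(output_A * w_AB)

# Step 2: error for output neuron B (the Output(1 - Output) factor
# comes from the derivative of the sigmoid)
error_B = output_B * (1 - output_B) * (target_B - output_B)

# Step 3: update W_AB using the output of the *connecting* neuron (A)
w_AB_new = w_AB + error_B * output_A
print(output_B, error_B, w_AB_new)
```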

4. Calculate the errors for the hidden layer neurons. Unlike the output layer we can't calculate these directly (because we don't have a Target), so we Back Propagate them from the output layer (hence the name of the algorithm). This is done by taking the errors from the output neurons and running them back through the weights to get the hidden layer errors. For example, if neuron A is connected as shown to B and C, then we take the errors from B and C to generate an error for A:
   Error_A = Output_A (1 - Output_A)(Error_B W_AB + Error_C W_AC)
   Again, the factor "Output(1 - Output)" is present because of the sigmoid squashing function.
5. Having obtained the errors for the hidden layer neurons, proceed as in step 3 to change the hidden layer weights (see the sketch below). By repeating this method we can train a network with any number of layers.
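
A self-contained Python sketch of steps 4 and 5 for hidden neuron A, which feeds output neurons B and C. The output-layer errors are assumed to have been computed already in step 2, and every numeric value is illustrative rather than taken from the slides:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Illustrative values only (not from the slides)
input_value = 0.35
w_in_A = 0.8                     # weight from the input to hidden neuron A
w_AB, w_AC = 0.4, 0.6            # weights from A to output neurons B and C
error_B, error_C = 0.02, -0.01   # output-layer errors already found in step 2

output_A = sigmoid(input_value * w_in_A)   # forward pass through A

# Step 4: back propagate - run the output errors back through W_AB and W_AC
error_A = output_A * (1 - output_A) * (error_B * w_AB + error_C * w_AC)

# Step 5: update the weight feeding A, exactly as in step 3
# (again using the output of the neuron on the sending side, here the input)
w_in_A_new = w_in_A + error_A * input_value
print(error_A, w_in_A_new)
```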

This may well have left some doubt in your mind about the operation, so let's clear that up by explicitly showing all the calculations for a full-sized network with 2 inputs, 3 hidden layer neurons and 2 output neurons, as shown below. W+ represents the new, recalculated weight, whereas W represents the old weight.

[Figure: the full 2-3-2 network with the weight-update equations for every connection (image not preserved).]

Neural Networks – Back Propagation Algorithm - worked example

[Figure: step-by-step numerical calculations for the 2-3-2 example network (images not preserved).]
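
Since the worked-example figures are unavailable here, the following Python sketch reproduces one complete training pass over a 2-3-2 network using the update rules from the slides (no separate learning rate). The inputs, targets and starting weights are made-up numbers chosen for illustration, not the values used in the original figures:

```python
import math

def sigmoid(x):
    # squashing function used throughout the slides
    return 1.0 / (1.0 + math.exp(-x))

# Network size from the slides: 2 inputs, 3 hidden neurons, 2 outputs.
# All numbers below are illustrative assumptions.
inputs  = [0.35, 0.90]
targets = [0.50, 0.10]

# W_ih[i][h]: weight from input i to hidden neuron h
W_ih = [[0.10, 0.40, 0.80],
        [0.20, 0.60, 0.30]]
# W_ho[h][o]: weight from hidden neuron h to output neuron o
W_ho = [[0.30, 0.90],
        [0.50, 0.20],
        [0.70, 0.60]]

# Step 1: forward pass
hidden  = [sigmoid(sum(inputs[i] * W_ih[i][h] for i in range(2))) for h in range(3)]
outputs = [sigmoid(sum(hidden[h] * W_ho[h][o] for h in range(3))) for o in range(2)]

# Step 2: output-layer errors, Error = Out(1 - Out)(Target - Out)
err_out = [outputs[o] * (1 - outputs[o]) * (targets[o] - outputs[o]) for o in range(2)]

# Step 4: hidden-layer errors, back propagated through the old weights W
# (computed before the weights are overwritten with W+)
err_hid = [hidden[h] * (1 - hidden[h]) * sum(err_out[o] * W_ho[h][o] for o in range(2))
           for h in range(3)]

# Steps 3 and 5: every weight gets  W+ = W + Error * Output-of-sending-neuron
for h in range(3):
    for o in range(2):
        W_ho[h][o] += err_out[o] * hidden[h]
for i in range(2):
    for h in range(3):
        W_ih[i][h] += err_hid[h] * inputs[i]

print("outputs:", outputs)
print("updated hidden-to-output weights:", W_ho)
print("updated input-to-hidden weights:", W_ih)
```

Repeating this pass drives the outputs toward the targets; it is simply steps 1-5 above applied to every connection in the network at once.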

Neural Networks – Geoff Hinton’s Lecture

Please view the 59-minute video at
http://www.youtube.com/watch?v=AyzOUbkUf3M

Concluding Remarks

T. T. T.
Put up in a place
where it's easy to see
the cryptic admonishment
T. T. T.
When you feel how
Depressingly slowly you climb,
it's well to remember that
Things Take Time.
