ISC Unit II Topic-3
Concept of Learning
Backpropagation Algorithm
V = [v_{ij}], the weight matrix from the input layer to the hidden layer
W = [w_{jk}], the weight matrix from the hidden layer to the output layer
For simplicity, let us consider the connecting weights as the only
design parameters.
Suppose V and W are the weight matrices to the hidden and
output layers, respectively.
Thus, given a training set of size N, the error surface E can be
represented as

E = \sum_{i=1}^{N} e_i(V, W, I_i)

where I_i is the i-th input pattern in the training set and e_i(\cdot)
denotes the error computation for the i-th input.
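The error-surface definition above can be sketched in code. A minimal Python sketch follows; the per-pattern error used here (a sum of squared output errors, anticipating the definition given later in this section) and the example numbers are illustrative assumptions:

```python
# Sketch: total error E as the sum of per-pattern errors e_i.
# The squared-error form and the example values are assumptions
# for illustration, not the lecture's exact data.

def pattern_error(target, observed):
    # e_i: sum of squared differences for one training pattern
    return sum((t - o) ** 2 for t, o in zip(target, observed))

def total_error(targets, outputs):
    # E = sum over the N training patterns of e_i
    return sum(pattern_error(t, o) for t, o in zip(targets, outputs))

targets = [[1.0, 0.0], [0.0, 1.0]]   # hypothetical target outputs
outputs = [[0.8, 0.1], [0.2, 0.7]]   # hypothetical observed outputs
E = total_error(targets, outputs)
print(round(E, 2))  # 0.18
```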
Now, we will discuss the steepest descent method of computing
the error, given changes in the V and W matrices.
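Steepest descent moves the weights in the direction of the negative error gradient. A minimal sketch, using a hypothetical numerical gradient and a toy two-weight error surface in place of the network's actual E(V, W):

```python
# Sketch of steepest descent on a weight vector.
# The numerical central-difference gradient and the toy quadratic
# error surface are stand-ins for the analytic backpropagation
# derivatives developed later in the lecture.

def numerical_grad(error_fn, w, eps=1e-6):
    # central-difference estimate of dE/dw_j for each weight
    grads = []
    for j in range(len(w)):
        w_plus, w_minus = list(w), list(w)
        w_plus[j] += eps
        w_minus[j] -= eps
        grads.append((error_fn(w_plus) - error_fn(w_minus)) / (2 * eps))
    return grads

def descent_step(error_fn, w, eta=0.1):
    # steepest descent update: w <- w - eta * dE/dw
    g = numerical_grad(error_fn, w)
    return [wj - eta * gj for wj, gj in zip(w, g)]

# Toy error surface E(w) = (w0 - 1)^2 + (w1 + 2)^2, minimised at (1, -2)
E = lambda w: (w[0] - 1) ** 2 + (w[1] + 2) ** 2
w = [0.0, 0.0]
for _ in range(100):
    w = descent_step(E, w)
print(w)  # approaches [1.0, -2.0]
```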
• Let us consider any k-th neuron at the output layer. For an input
pattern I_i \in T_I (the set of training inputs), let the target output
of the k-th neuron be T_{O_k}.
• Then the error e_k of the k-th neuron corresponding to the
input I_i is defined as e_k = (T_{O_k} - O_{O_k})^2
• where OOk denotes the observed output of the k-th neuron.
• For a training session with I_i \in T_I, the error in prediction considering
all output neurons can be given as

• e_i = \sum_{k=1}^{n} e_k = \sum_{k=1}^{n} (T_{O_k} - O_{O_k})^2

• where n denotes the number of neurons at the output layer.
• The total error in prediction for all output neurons can be
determined considering all training sessions <T_I, T_O> as

• E = \sum_{i=1}^{N} e_i = \sum_{\forall <T_I, T_O>} \sum_{k=1}^{n} (T_{O_k} - O_{O_k})^2
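Putting the pieces together, here is a minimal sketch of computing the total error E over a training set: a forward pass through hidden weights V and output weights W, then the double sum over training pairs and output neurons. The sigmoid activation, network sizes, and weight values are illustrative assumptions, not from the lecture:

```python
# Sketch: forward pass through V (input->hidden) and W (hidden->output),
# then E = sum over <T_I, T_O> pairs of sum_k (T_Ok - O_Ok)^2.
# Sigmoid activation and the 2-2-2 layer sizes are assumed for illustration.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(V, W, x):
    # hidden layer: h_j = sigmoid(sum_i v_ij * x_i)
    h = [sigmoid(sum(V[i][j] * x[i] for i in range(len(x))))
         for j in range(len(V[0]))]
    # output layer: o_k = sigmoid(sum_j w_jk * h_j)
    return [sigmoid(sum(W[j][k] * h[j] for j in range(len(h))))
            for k in range(len(W[0]))]

def total_error(V, W, patterns):
    # E: sum of squared output errors over all training pairs
    E = 0.0
    for x, target in patterns:
        out = forward(V, W, x)
        E += sum((t - o) ** 2 for t, o in zip(target, out))
    return E

V = [[0.1, -0.2], [0.4, 0.2]]    # hypothetical input->hidden weights
W = [[0.3, 0.1], [-0.3, 0.4]]    # hypothetical hidden->output weights
patterns = [([0.0, 1.0], [1.0, 0.0]),
            ([1.0, 0.0], [0.0, 1.0])]
print(total_error(V, W, patterns))
```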