Counterpropagation Networks
The counterpropagation network is an example of a hybrid network, one which combines the features of two or more basic network designs. It was proposed by Hecht-Nielsen in 1986. The hidden layer is a Kohonen network with unsupervised learning, and the output layer is a Grossberg (outstar) layer fully connected to the hidden layer. The output layer is trained by the Widrow-Hoff rule. The network allows the output of a pattern rather than a simple category number, and can also be viewed as a bidirectional associative memory.
The figure above shows a unidirectional counterpropagation network used for mapping pattern A of size n to pattern B of size m. The output of the A subsection of the input layer is fanned out to the competitive middle layer. Each neuron in the output layer receives a signal corresponding to the input pattern's category along one connection from the middle layer.
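As an illustration, here is a minimal NumPy sketch of this forward operation. The names W (Kohonen weights, one row per middle-layer node) and V (outstar weights, one column per middle-layer node) are assumptions for the example, not from the original; how these weights are trained is covered below.

import numpy as np

def forward(x, W, V):
    """One pass through a unidirectional counterpropagation network:
    the competitive middle layer picks a single winner, and the output
    layer emits that winner's stored pattern."""
    winner = np.argmax(W @ x)   # winner-take-all competition in the Kohonen layer
    return V[:, winner]         # the winner's outstar weights form the output pattern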
"he B subsection of the input layer has +ero input during actual operation of the network and is used to pro$ide input only during training. "he role of the output layer is to produce the pattern corresponding to the category output by the middle layer. "he output layer uses a super$ised learning procedure. with direct connection from the input layer-s B subsection pro$iding the correct output. "raining is a two-stage procedure. *irst. the #ohonen layer is trained on input patterns. No changes are made in the output layer during this step. /nce the middle layer is trained to correctly categorise all the input patterns. the weights between the input and middle layers are kept fixed and the output layer is trained to produce correct output patterns by ad0usting weights between the middle and output layers. "raining algorithm stage 11 1. 2. 4. 8. :. Apply normalised input $ector x to input A. 3etermine winning node in the #ohonen layer. 5pdate winning node-s weight $ector w&t61' 7 w&t' 6 & x - w ' 9epeat steps 1 through 4 until all $ectors ha$e been processed. 9epeat steps 1 to 8 until all input $ectors ha$e been learned.
"raining algorithm stage 21 1. 2. 4. Apply normalised input $ector x and its corresponding output $ector y. to inputs A and B respecti$ely. 3etermine winning node in the #ohonen layer. 5pdate weights on the connections from the winning node to the output unit 2
)ounterpropagation Network
wi&t61' 8.
wi&t' 6
&yi - wi'
9epeat steps 1 through 4 until all $ectors of all classes map to satisfactory outputs.
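Building on the earlier sketch, stage 2 might look as follows. The learning rate beta and the zero initialisation of V are assumptions; the fixed Kohonen weights and the winner-only update follow the procedure above.

def train_outstar(X, Y, W_koh, beta=0.1, epochs=20):
    """Stage 2: supervised training of the Grossberg (outstar) layer.
    The Kohonen weights W_koh stay fixed; only the middle-to-output
    weights V are adjusted."""
    V = np.zeros((Y.shape[1], W_koh.shape[0]))         # one weight column per middle-layer node
    for _ in range(epochs):
        for x, y in zip(X, Y):
            winner = np.argmax(W_koh @ x)              # category picked by the fixed Kohonen layer
            V[:, winner] += beta * (y - V[:, winner])  # w_i(t+1) = w_i(t) + beta * (y_i - w_i)
    return V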
Bidirectional Counterpropagation Network
A bidirectional counterpropagation network is capable of a two-way mapping. For example, an A pattern input produces a B pattern output, and a B pattern input produces an A pattern output. The figure below illustrates the connections in a bidirectional counterpropagation network. The input and output layers are now of the same size, equal to the sum of the sizes of the A and B patterns. Both the A and B sections have full connections to the middle layer, and one-to-one connections to the corresponding neurons in the output layer. The middle layer receives input from all elements of the input layer and transmits its output to the entire output layer.
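Continuing the sketch, one possible bidirectional variant trains on the concatenated [A|B] pattern; the zero-padding convention at recall time follows the earlier note that an unused subsection has zero input during actual operation. All names and parameters here are illustrative.

def train_bidirectional(A, B, n_hidden, alpha=0.1, beta=0.1, epochs=20):
    """Bidirectional CPN: the middle layer sees the concatenated [A|B]
    pattern; the output layer learns to reproduce the whole concatenation."""
    X = np.hstack([A, B])
    X = X / np.linalg.norm(X, axis=1, keepdims=True)   # normalise the joint patterns
    W = train_kohonen(X, n_hidden, alpha, epochs)
    V = train_outstar(X, X, W, beta, epochs)           # target is the full [A|B] pattern
    return W, V

def recall_forward(a, W, V, m):
    """A-to-B mapping: zero the B section of the input, read B off the output."""
    return forward(np.concatenate([a, np.zeros(m)]), W, V)[-m:]

def recall_backward(b, W, V, n):
    """B-to-A mapping: zero the A section of the input, read A off the output."""
    return forward(np.concatenate([np.zeros(n), b]), W, V)[:n]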
As a pattern associator, the counterpropagation network has an advantage over other networks such as backpropagation in its ability to perform inverse mapping.
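For example, the sketch above might be exercised in both directions as follows (hypothetical data; bipolar patterns are used so that every joint pattern has a nonzero norm).

# Associate five bipolar 4-element A patterns with 3-element B patterns,
# then recall in both directions.
rng = np.random.default_rng(1)
A = rng.choice([-1.0, 1.0], size=(5, 4))
B = rng.choice([-1.0, 1.0], size=(5, 3))
W, V = train_bidirectional(A, B, n_hidden=5)
b_hat = recall_forward(A[0], W, V, m=3)   # forward: A pattern in, B estimate out
a_hat = recall_backward(B[0], W, V, n=4)  # inverse: B pattern in, A estimate out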
Possible drawbacks of counterpropagation networks
Training a counterpropagation network presents the same difficulties as training a Kohonen network. Counterpropagation networks also tend to be larger than backpropagation networks: if a certain number of mappings is to be learned, the middle layer must have at least that many neurons.
References:
Robert Hecht-Nielsen, "Counterpropagation networks," Applied Optics, 26(23): 4979-4984, Dec. 1987.
Robert Hecht-Nielsen, Neurocomputing, Addison-Wesley, Reading, MA, 1990.