Unsupervised Learning: Part III Counter Propagation Network


Klinkhachorn:CpE520

Classification of ANN Paradigms


Semi-Supervised Counter Propagation


Counterpropagation Networks
Multilayer networks based on a combination of input, clustering, and output layers. Introduced by R. Hecht-Nielsen, 1987.
Applications: data compression, function approximation, and pattern association.

Fundamentals of Neural Networks, L. Fausett, Prentice Hall, 1994


Robert Hecht-Nielsen
Adjunct Professor, Electrical & Computer Engineering An authority on neural networks, he introduced the first comprehensive theory of the mammalian cerebral cortex and thalamus in 2002. His research revolves around scientific testing, elaboration, and extension of this theory. Professor Hecht-Nielsen is an expert on brain theory, associative memory neural networks and Perceptron theory. His theory of thalamocortex is currently being promulgated and integrated into research worldwide. Robert Hecht-Nielsen has been adjunct professor at UCSD since 1986. He teaches the popular ECE 270 three-quarter graduate course Neurocomputing, which focuses on the basic constructs of his theory of thalamocortex and their applications. He is a member of the UCSD Institute for Neural Computation and is a founder of the UCSD Graduate Program in Computational Neurobiology. An IEEE Fellow, he has received the IEEE Neural Networks Pioneer Award and the ECE Graduate Teaching Award. He received his Ph.D. in Mathematics from Arizona State University in 1974.

http://www.jacobsschool.ucsd.edu/FacBios/findprofile.pl?fmp_recid=89


Counter Propagation Training


Two stages:

Unsupervised: input vectors are clustered (similar to a SOM, but without a neighborhood).

Supervised: weights from the cluster units to the output units are adapted to produce the desired response (both stages are sketched below).

Fundamentals of Neural Networks, L. Fausett, Prentice Hall, 1994
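
The two stages can be summarized in a minimal NumPy sketch, assuming a forward-only net with cluster weights W (one row per Kohonen unit) and cluster-to-output weights V. The function names, learning rates, and the use of nearest-weight-vector winner selection (equivalent to maximum dot product when vectors are normalized) are illustrative choices, not taken from the slides:

```python
import numpy as np

def train_cluster_layer(X, W, lr=0.1, epochs=20):
    """Stage 1 (unsupervised): competitive learning in the cluster (Kohonen) layer.
    The nearest cluster unit wins each input and is pulled toward it
    (a SOM-style update, but with no neighborhood)."""
    for _ in range(epochs):
        for x in X:
            c = np.argmin(np.linalg.norm(W - x, axis=1))  # winning cluster unit
            W[c] += lr * (x - W[c])                        # move the winner toward the input
    return W

def train_output_layer(X, Y, W, V, lr=0.1, epochs=20):
    """Stage 2 (supervised): with the cluster weights fixed, adapt the weights from
    each winning cluster unit to the output units toward the desired response."""
    for _ in range(epochs):
        for x, y in zip(X, Y):
            c = np.argmin(np.linalg.norm(W - x, axis=1))
            V[c] += lr * (y - V[c])                        # outstar-style update for the winner
    return V
```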


Counter Propagation Nets


Two types:

Full counterpropagation: an efficient method to represent a large number of vector pairs by adaptively constructing a lookup table. Produces an approximation of the input-to-output relationship (heteroassociative).

Forward-only counterpropagation: a simplified version of full counterpropagation. Produces a mapping from input to output.

Fundamentals of Neural Networks, L. Fausett, Prentice Hall, 1994


Full Counterpropagation Nets

Fundamentals of Neural Networks, L. Fausett, Prentice Hall, 1994


Full CP Nets: Cluster Layer

Fundamentals of Neural Networks, L. Fausett, Prentice Hall, 1994


Full CP Nets: Classification Layer

Fundamentals of Neural Networks, L. Fausett, Prentice Hall, 1994


Forward only Counter Propagation


Counter Propagation Operation


1. Present an input to the network.
2. Calculate the output of all neurons in the Kohonen layer.
3. Determine the winner (the neuron with maximum output).
4. Set the output of the winner to 1 (and all others to 0).
5. Calculate the output vector.
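
A minimal sketch of this recall procedure, assuming NumPy arrays W (cluster-layer weights) and V (cluster-to-output weights) from an already trained net; the function name cp_recall is hypothetical:

```python
import numpy as np

def cp_recall(x, W, V):
    """Recall (forward) pass of a trained counterpropagation net.
    W: cluster-layer weights, one row per Kohonen unit.
    V: weights from the cluster units to the output units, one row per Kohonen unit."""
    net = W @ x                 # output of each Kohonen-layer neuron (dot product)
    c = np.argmax(net)          # winner: the neuron with maximum output
    z = np.zeros(W.shape[0])
    z[c] = 1.0                  # winner's output set to 1, all others to 0
    return z @ V                # output vector = the winner's outgoing weights
```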

Counter Propagation Training


Present the input vector x.
Determine the winner c in the competitive layer.
Adapt the winner's weights:
Wci(t+1) = Wci(t) + a*(xi - Wci(t))
Normalize the weights going to the winner (divide each weight by the magnitude of the weight vector).
Adapt the weights of the output layer:
Vji(t+1) = Vji(t) + b*zi*(Yj - Y'j)   if i = c
Vji(t+1) = Vji(t)                     if i != c
where zi is the output of cluster unit i (1 for the winner, 0 otherwise), Yj is the desired output, and Y'j is the network's current output.
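
A sketch of one such training step, using the same W and V arrays as the recall sketch above; the helper name cp_train_step and the default learning rates a and b are illustrative:

```python
import numpy as np

def cp_train_step(x, y_target, W, V, a=0.1, b=0.1):
    """One counterpropagation training step; updates W (cluster layer) and V (output layer) in place."""
    # Determine the winner c in the competitive layer.
    c = np.argmax(W @ x)
    # Adapt the winner's weights:  Wc(t+1) = Wc(t) + a*(x - Wc(t))
    W[c] += a * (x - W[c])
    # Normalize the weights going to the winner (divide by the vector's magnitude).
    W[c] /= np.linalg.norm(W[c])
    # Cluster-layer output z is 1 for the winner and 0 elsewhere, so the term
    # b*zi*(Yj - Y'j) changes only the winner's outgoing weights.
    z = np.zeros(W.shape[0])
    z[c] = 1.0
    y_actual = z @ V                     # current network output Y'
    V[c] += b * (y_target - y_actual)
    return W, V
```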

Counter Propagation - Example

Y = 1/X

Fundamentals of Neural Networks, L. Fausett, Prentice Hall, 1994
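
As a rough illustration in the spirit of this example, the sketch below trains a small forward-only counterpropagation net to approximate Y = 1/X on [0.1, 10]; the number of cluster units, learning rates, and sampling range are arbitrary choices, not the values used in the original example:

```python
import numpy as np

rng = np.random.default_rng(0)

n_clusters = 20
X = rng.uniform(0.1, 10.0, size=500)       # training inputs
Y = 1.0 / X                                # desired responses: y = 1/x

# One scalar weight per cluster unit (the input is one-dimensional),
# plus one scalar output weight per cluster unit.
W = rng.uniform(0.1, 10.0, size=n_clusters)
V = np.zeros(n_clusters)

a, b = 0.1, 0.1
for epoch in range(50):
    for x, y in zip(X, Y):
        c = np.argmin(np.abs(W - x))       # nearest cluster unit wins
        W[c] += a * (x - W[c])             # pull the winner toward the input
        V[c] += b * (y - V[c])             # pull its output weight toward 1/x

# Recall: the output for a new x is the output weight of the winning unit.
x_test = 2.0
print(V[np.argmin(np.abs(W - x_test))])    # prints a value near 0.5
```

Because only the winning unit responds, the learned mapping is a piecewise-constant approximation over the cluster units, which matches the "adaptively constructed lookup table" view of counterpropagation described earlier.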


Counterpropagation Network Notes


Not as general as Backpropagation.
Trains faster than Backpropagation.
May not generalize well on new patterns.
Input clusters must be well separated and well represented.
The competitive layer can become unstable if too few units are present.
Uses include pattern classification.

