Lecture 11 - Supervised Learning - Hopfield Networks - (Part 4)
Qadri Hamarsheh
Figure 1
Common Properties and Architecture:
Every neuron is connected to every other neuron.
Fully connected with symmetric weights: wij = wji.
No self-loops: wii = 0.
Each neuron receives a single input from the outside world.
In the Hopfield model the neurons have a binary (bipolar) output taking the values -1 and 1.
Neurons are updated one at a time, in random order (asynchronous updating).
A Hopfield Network is a model of associative memory. It is based on
Hebbian learning but uses binary neurons.
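The Hebbian construction of the weights can be sketched as follows: each stored bipolar pattern contributes the outer product of itself with itself, and the diagonal is zeroed to enforce the no-self-loop rule. This is a minimal illustrative sketch, not the lecture's own code; the function name is my own.

```python
# Hebbian weight construction for a Hopfield network (illustrative sketch).
# w_ij = sum over patterns of p_i * p_j, with w_ii = 0 (no self-loops).
# Symmetry (w_ij = w_ji) holds by construction.

def hebbian_weights(patterns):
    n = len(patterns[0])
    w = [[0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

# Store the two fundamental memories used later in these notes.
W = hebbian_weights([[1, 1, 1], [-1, -1, -1]])
```

Note that the resulting matrix is symmetric with a zero diagonal, matching the architecture listed above.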
Figure 2
Memories are attractors in state space, as shown in Figure 2.
Figure 3
The dynamics of the system carries starting points into one of the attractors as
shown in the above figure.
A Hopfield net is composed of binary threshold units with recurrent
connections between them. Recurrent networks of non-linear units are
generally very hard to analyze. They can behave in many different ways:
Settle to a stable state.
Oscillate.
Follow chaotic trajectories that cannot be predicted.
But Hopfield realized that if the connections are symmetric, there is a global
energy function.
Each “configuration” of the network has an energy.
The binary threshold decision rule causes the network to settle to an energy
minimum, which lets the network act as a pattern recognizer.
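The energy function and the fact that a binary threshold update never raises it can be demonstrated with a small sketch. The energy here is the standard quadratic form E = -(1/2) Σᵢⱼ wᵢⱼ yᵢ yⱼ (bias terms omitted for simplicity); the weights are those that store (1, 1, 1) and (-1, -1, -1), as in the example later in these notes.

```python
# Sketch: energy of a Hopfield configuration, and one threshold update.
# Assumes bipolar states and symmetric weights with zero diagonal.

def energy(w, y):
    # E = -1/2 * sum_ij w_ij * y_i * y_j  (no bias terms in this sketch)
    n = len(y)
    return -0.5 * sum(w[i][j] * y[i] * y[j] for i in range(n) for j in range(n))

def update_neuron(w, y, i):
    # Binary threshold (sign) decision rule for neuron i.
    net = sum(w[i][j] * y[j] for j in range(len(y)))
    y[i] = 1 if net >= 0 else -1

W = [[0, 2, 2], [2, 0, 2], [2, 2, 0]]  # weights storing (1,1,1) and (-1,-1,-1)
y = [-1, 1, 1]                          # a single-error state
e_before = energy(W, y)
update_neuron(W, y, 0)
e_after = energy(W, y)
# e_after <= e_before: the update moved the state downhill in energy
```

With symmetric weights every such update lowers (or leaves unchanged) the energy, which is why the network must settle into a minimum rather than oscillate.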
Hopfield networks have two applications. First, they can act as associative
memories. Second, they can be used to solve optimization problems.
According to figure 1:
The Hopfield model starts with the standard McCulloch-Pitts model of a
neuron with the sign activation function:
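The equation that followed this colon appears to have been lost in extraction. A standard form, consistent with the sign activation named above (θᵢ denotes neuron i's threshold, an assumption on my part), is:

```latex
y_i = \operatorname{sign}\!\Big(\sum_{j=1}^{n} w_{ij}\, x_j - \theta_i\Big),
\qquad
\operatorname{sign}(u) =
\begin{cases}
+1, & u \ge 0 \\
-1, & u < 0
\end{cases}
```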
The current state is determined by the current outputs of all neurons,
y1, y2, ..., yn. Thus, for a single-layer n-neuron network, the state can
be defined by the state vector as:
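The state vector itself seems to have been dropped in extraction; the standard column-vector form is:

```latex
\mathbf{Y} =
\begin{bmatrix}
y_1 \\ y_2 \\ \vdots \\ y_n
\end{bmatrix}
```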
The remaining six states are all unstable. However, stable states (also
called fundamental memories) are capable of attracting states that
are close to them.
The fundamental memory (1, 1, 1) attracts unstable states (-1, 1, 1),
(1,-1, 1) and (1, 1, -1). Each of these unstable states represents a
single error, compared to the fundamental memory (1, 1, 1).
The fundamental memory (-1, -1, -1) attracts unstable states (-1, -1, 1),
(-1, 1, -1) and (1, -1, -1).
Thus, the Hopfield network can act as an error correction network.
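The error-correction behavior described above can be sketched end to end: starting from any single-error state, repeated sign updates pull the state back to the nearest fundamental memory. This is an illustrative sketch (updates run in fixed index order here rather than random order, for reproducibility); the function name is my own.

```python
# Sketch of Hopfield error correction for the 3-neuron example above.
# Weights store the fundamental memories (1,1,1) and (-1,-1,-1).

def recall(w, y, sweeps=10):
    # Repeatedly apply the sign update to each neuron in turn.
    y = list(y)
    for _ in range(sweeps):
        for i in range(len(y)):
            net = sum(w[i][j] * y[j] for j in range(len(y)))
            y[i] = 1 if net >= 0 else -1
    return y

W = [[0, 2, 2], [2, 0, 2], [2, 2, 0]]
# Each single-error state is corrected back to its fundamental memory.
```

For example, recall(W, [-1, 1, 1]) returns [1, 1, 1], while recall(W, [1, -1, -1]) returns [-1, -1, -1], exactly the attraction pattern listed above.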