Hopfield Neural Network
The input and output patterns of the network can be in either of two forms:
1. Binary (0/1)
2. Bipolar (-1/1)
The weights associated with this network are symmetric in nature and
have the following properties.
1. wij = wji
2. wii = 0
Training Algorithm
For storing a set of input patterns S(p) [p = 1 to P], where S(p) = (S1(p), …, Si(p), …, Sn(p)), the weight matrix is given by:
wij = ∑p=1..P [si(p) sj(p)]   (where wij = 0 for i = j, i.e., wii = 0)
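A minimal sketch of this rule in Python with NumPy (the function name train_hopfield and the assumption that the patterns are already in bipolar form are choices made for this sketch, not part of the text):

import numpy as np

def train_hopfield(patterns):
    # Hebbian (outer-product) rule: wij = sum_p si(p) * sj(p), with wii = 0
    patterns = np.asarray(patterns)
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for s in patterns:
        W += np.outer(s, s)      # accumulate si(p) * sj(p) for this pattern
    np.fill_diagonal(W, 0)       # enforce wii = 0 (no self-connections)
    return W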
Testing Algorithm
Initialize the activations of the network with the given input vector x:
yi = xi   (for i = 1 to n)
For each unit yi, compute the total input:
yini = xi + ∑j=1..n [yj wji]
Apply the activation over the total input to calculate the output as per the equation given below:
yi = 1    if yini > θi
yi = yi   if yini = θi
yi = 0    if yini < θi
Now feed back the obtained output yi to all other units; thus, the activation vector is updated.
Test the network for convergence.
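The testing (recall) procedure above can be sketched as the asynchronous-update loop below; it assumes binary (0/1) activations and NumPy, and the names recall and max_sweeps are chosen only for this illustration:

import numpy as np

def recall(W, x, theta=0.0, max_sweeps=100, seed=0):
    # Asynchronous recall: yini = xi + sum_j yj * wji, then threshold activation.
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    y = x.copy()                              # initialize activations with the input
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(y)):     # update units in random order
            y_in = x[i] + y @ W[:, i]         # total input to unit i
            new = 1.0 if y_in > theta else (0.0 if y_in < theta else y[i])
            if new != y[i]:
                y[i], changed = new, True
        if not changed:                       # a full sweep with no change -> converged
            break
    return y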
Example: Suppose the bipolar vector S = [1 1 1 −1] is to be stored, and the network is then tested with the input vector x = [0 0 1 0] (the stored pattern with some bits corrupted). The weight matrix is the outer product of the stored vector with itself:

W = [1 1 1 −1]ᵀ [1 1 1 −1]

     1  1  1 −1
  =  1  1  1 −1
     1  1  1 −1
    −1 −1 −1  1

Setting the diagonal elements to zero (wii = 0) gives:

     0  1  1 −1
W =  1  0  1 −1
     1  1  0 −1
    −1 −1 −1  0

The activations are initialized with the input: y = x = [0 0 1 0].
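This arithmetic can be checked with a few lines of NumPy (a verification sketch, not part of the worked example):

import numpy as np

s = np.array([1, 1, 1, -1])     # stored bipolar vector
W = np.outer(s, s)              # outer product
np.fill_diagonal(W, 0)          # zero the self-connections
print(W)
# [[ 0  1  1 -1]
#  [ 1  0  1 -1]
#  [ 1  1  0 -1]
#  [-1 -1 -1  0]]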
(We now update each unit yi in turn using these weights and check whether the network converges.)
For unit y1:
yin1 = x1 + ∑j=1..4 [yj wj1]
     = 0 + [0 0 1 0] · [0 1 1 −1]ᵀ
     = 0 + 1
     = 1
Applying the activation, yin1 > 0 ⟹ y1 = 1. Since y1 has changed (from 0 to 1), there is no convergence yet.
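The same net-input computation for unit y1, written out in NumPy (an illustrative check of the arithmetic above; a trace of all four unit updates is given after the last step):

import numpy as np

W = np.array([[ 0,  1,  1, -1],
              [ 1,  0,  1, -1],
              [ 1,  1,  0, -1],
              [-1, -1, -1,  0]])
x = np.array([0, 0, 1, 0])      # test input
y = x.copy()                    # current activations
y_in1 = x[0] + y @ W[:, 0]      # x1 + sum_j yj * wj1
print(y_in1)                    # 1 -> y1 = 1 since y_in1 > 0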
For the next unit, we use the updated activation vector obtained via feedback, i.e. y = [1 0 1 0].
yin3 = x3 + ∑j=1..4 [yj wj3]
     = 1 + [1 0 1 0] · [1 1 0 −1]ᵀ
     = 1 + 1
     = 2
Applying the activation, yin3 > 0 ⟹ y3 = 1.
Hence, no convergence yet.
For the next unit, we again use the updated activation vector, i.e. y = [1 0 1 0].
yin4 = x4 + ∑j=1..4 [yj wj4]
     = 0 + [1 0 1 0] · [−1 −1 −1 0]ᵀ
     = 0 + (−1) + (−1)
     = −2
Applying the activation, yin4 < 0 ⟹ y4 = 0.
Hence, no convergence yet.
For the last unit, we again use the updated activation vector, i.e. y = [1 0 1 0].
yin2 = x2 + ∑j=1..4 [yj wj2]
     = 0 + [1 0 1 0] · [1 0 1 −1]ᵀ
     = 0 + 1 + 1
     = 2
Applying the activation, yin2 > 0 ⟹ y2 = 1.
The activation vector is now y = [1 1 1 0], which matches the stored pattern, so the network has converged.
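The whole asynchronous pass (units updated in the order 1, 3, 4, 2, as in the worked example) can be reproduced with the short NumPy trace below; it is a sketch of the example above, not additional material from the text:

import numpy as np

W = np.array([[ 0,  1,  1, -1],
              [ 1,  0,  1, -1],
              [ 1,  1,  0, -1],
              [-1, -1, -1,  0]])
x = np.array([0, 0, 1, 0])
y = x.astype(float)

for i in [0, 2, 3, 1]:                    # units 1, 3, 4, 2 (0-based indices)
    y_in = x[i] + y @ W[:, i]             # total input to the unit
    y[i] = 1.0 if y_in > 0 else (0.0 if y_in < 0 else y[i])
    print(f"unit {i + 1}: yin = {y_in:+.0f}, y = {y}")
# final y = [1. 1. 1. 0.], the stored pattern in binary form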
In a continuous Hopfield network, the output of each neuron is vi = g(ui), where ui is the internal state of neuron i and g is a continuous, monotonically increasing activation function.
Energy Function
Hopfield networks have an energy function associated with them: it either decreases or remains unchanged on every update (feedback) iteration. The energy function for a continuous Hopfield network is defined as:
E = −0.5 ∑i=1..n ∑j=1..n [wij vi vj] − ∑i=1..n [θi vi]
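To see this non-increasing behaviour on the discrete example above, the quadratic part of this energy (taking all thresholds θi = 0) can be evaluated for the states visited during recall; this is a small illustrative check, with the helper name energy chosen only for the sketch:

import numpy as np

def energy(W, v, theta=None):
    # E = -0.5 * sum_ij wij vi vj - sum_i thetai vi (thresholds default to 0)
    theta = np.zeros(len(v)) if theta is None else theta
    return -0.5 * v @ W @ v - theta @ v

W = np.array([[ 0,  1,  1, -1],
              [ 1,  0,  1, -1],
              [ 1,  1,  0, -1],
              [-1, -1, -1,  0]])
# states visited during recall: initial, after units 1/3/4, after unit 2
for v in ([0, 0, 1, 0], [1, 0, 1, 0], [1, 1, 1, 0]):
    v = np.array(v, dtype=float)
    print(v, energy(W, v))
# energies: 0.0, -1.0, -3.0 -- non-increasing along the update sequence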
The network is guaranteed to converge if the activity of each neuron with respect to time obeys the following differential equation:
dui/dt = −ui/τ + ∑j=1..n [wij vj] + θi
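A minimal numerical sketch of these dynamics (forward-Euler integration, with g = tanh, τ = 1, and a tiny two-neuron symmetric network all chosen purely for illustration):

import numpy as np

W = np.array([[0.0, 2.0],
              [2.0, 0.0]])            # small symmetric weight matrix
theta = np.zeros(2)
tau, dt = 1.0, 0.01
u = np.array([0.5, 0.1])              # arbitrary initial internal states

for _ in range(2000):
    v = np.tanh(u)                    # vi = g(ui), here g = tanh
    du = -u / tau + W @ v + theta     # right-hand side of the differential equation
    u += dt * du                      # Euler step

print(np.tanh(u))                     # outputs settle near a stable state (~[0.96, 0.96])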