Associative Memory
The outer product of the two vectors is the product of the matrices S = s^T and T = t, i.e., the product of an [n x 1] matrix and a [1 x m] matrix. The transpose is taken of the given input vector.
This weight matrix is the same as the weight matrix obtained by the Hebb rule to store the pattern association s:t. For storing a set of associations s(p):t(p),
p = 1 to P, the outer products are summed, wherein wij = Σp si(p) tj(p), p = 1 to P.
Courtesy: Studyglance
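The outer-product construction above can be sketched in NumPy; the bipolar vectors and variable names below are illustrative assumptions, not values from the text.

```python
import numpy as np

# Illustrative bipolar patterns (assumptions, not from the text).
s = np.array([1, -1, 1])        # input vector s, n = 3 components
t = np.array([1, 1, -1, -1])    # target vector t, m = 4 components

# Outer product: an [n x 1] matrix times a [1 x m] matrix gives the
# [n x m] weight matrix W = s^T t.
W = np.outer(s, t)              # same as s.reshape(-1, 1) @ t.reshape(1, -1)

# Storing a set of associations s(p):t(p) sums the outer products.
pairs = [(s, t), (np.array([-1, 1, 1]), np.array([-1, 1, 1, -1]))]
W_total = sum(np.outer(sp, tp) for sp, tp in pairs)
```

Summing the per-pair outer products is exactly the Hebb-rule weight matrix for the whole set of associations.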
Auto Associative Memory Algorithm
Training Algorithm
For training, this network uses the Hebb or delta learning rule.
Step 1 − Initialize all the weights to zero: wij = 0 (i = 1 to n, j = 1 to n)
Step 2 − Perform steps 3 to 5 for each input vector.
Step 3 − Activate each input unit: xi = si (i = 1 to n)
Step 4 − Activate each output unit: yj = sj (j = 1 to n)
Step 5 − Adjust the weights: wij(new) = wij(old) + xi yj
The weights can also be determined from the Hebb rule or the outer products rule of learning.
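The training steps above can be sketched in NumPy; `train_auto` and the sample pattern are hypothetical names chosen for illustration.

```python
import numpy as np

def train_auto(patterns):
    """Hebbian (outer-product) training for an auto-associative net."""
    n = len(patterns[0])
    w = np.zeros((n, n))            # Step 1: wij = 0
    for s in patterns:              # Step 2: for each input vector
        x = np.asarray(s)           # Step 3: xi = si
        y = np.asarray(s)           # Step 4: yj = sj
        w += np.outer(x, y)         # Step 5: wij(new) = wij(old) + xi*yj
    return w

# Training on a single bipolar pattern gives its outer product with itself.
W = train_auto([[1, 1, -1, -1]])
```

Because input and output vectors are the same pattern, the resulting weight matrix is symmetric.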
Testing Algorithm
Step 1 − Set the weights obtained during training for Hebb's rule.
Step 2 − Perform steps 3 to 5 for each input vector.
Step 3 − Set the activation of the input units equal to that of the input vector.
Step 4 − Calculate the net input to each output unit: y_inj = Σi xi wij (j = 1 to n)
Step 5 − Apply the activation function: yj = +1 if y_inj > 0, −1 otherwise
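A minimal recall sketch for the testing phase, assuming a bipolar step activation (a common choice; the text does not fix the activation function):

```python
import numpy as np

def recall_auto(w, x):
    # Net input to each output unit: y_inj = sum_i xi * wij
    y_in = np.asarray(x) @ w
    # Bipolar step activation (an assumed, commonly used choice).
    return np.where(y_in > 0, 1, -1)

s = np.array([1, 1, -1, -1])
W = np.outer(s, s)                  # weights from training on s
y = recall_auto(W, s)               # recovers the stored pattern s
```

Presenting a stored pattern reproduces it, which is the defining behavior of an auto-associative memory.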
In a hetero-associative memory, the training input and the target output vectors are different. The
weights are determined in a way that the network can store a set of pattern associations. Each
association is a pair of training input and target output vectors (s(p), t(p)), with p = 1, 2, …, P. Each
vector s(p) has n components and each vector t(p) has m components. The weights are determined
by using either the Hebb rule or the delta rule. The net finds an appropriate output vector, which
corresponds to an input vector x, that may be either one of the stored patterns or a new pattern.
Training Algorithm
The weights can also be determined from the Hebb rule or the outer products rule of learning.
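Hetero-associative training by the outer products rule can be sketched as follows; `train_hetero` and the sample pair are illustrative assumptions.

```python
import numpy as np

def train_hetero(pairs):
    """Hebb/outer-product training for a hetero-associative net."""
    n, m = len(pairs[0][0]), len(pairs[0][1])
    w = np.zeros((n, m))            # n input units, m output units
    for s, t in pairs:              # accumulate the outer products s(p)^T t(p)
        w += np.outer(s, t)
    return w

# One illustrative association: a 2-component input to a 3-component target.
W = train_hetero([(np.array([1, -1]), np.array([1, 1, -1]))])
```

Unlike the auto-associative case, the weight matrix here is [n x m] rather than square.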
Testing Algorithm
Step 1 − Set the weights obtained during training for Hebb's rule.
Step 2 − Perform steps 3 to 5 for each input vector.
Step 3 − Set the activation of the input units equal to that of the input vector.
Step 4 − Calculate the net input to each output unit: y_inj = Σi xi wij (j = 1 to m)
Step 5 − Apply the activation: yj = +1 if y_inj > 0, 0 if y_inj = 0, −1 if y_inj < 0
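A recall sketch for hetero-associative testing, assuming the common three-level (+1 / 0 / −1) activation; the vectors below are illustrative.

```python
import numpy as np

def recall_hetero(w, x):
    y_in = np.asarray(x) @ w        # y_inj = sum_i xi * wij
    # Three-level activation: +1 if y_inj > 0, 0 if y_inj = 0, -1 if y_inj < 0.
    return np.sign(y_in).astype(int)

s = np.array([1, -1])
t = np.array([1, 1, -1])
W = np.outer(s, t)                  # weights storing the pair s:t
y = recall_hetero(W, s)             # retrieves the stored target t
```

Presenting the stored input s returns the associated target t, not s itself, which is what distinguishes hetero-association from auto-association.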