Topic 2. Bidirectional Associative Memory (BAM)
UNSUPERVISED LEARNING NETWORKS
Figure: Input (𝙓) and output (𝒀) pattern maps assigned to the BAM model; associations stored in memory.
In the figure above, the BAM model learns the associations
of data, encoded into the pattern maps 𝙓 and 𝒀 of the
shapes (𝙨 x 𝙣) and (𝙨 x 𝙢), respectively.
While learning, the values from the input and output patterns
are encoded into the BAM’s correlation (weight) matrix, as discussed below.
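As a concrete illustration, a minimal sketch of such pattern maps in Python (the shapes and values are hypothetical, chosen only for this example; the BAM expects bipolar values +1/-1):

import numpy as np

# s = 2 stored associations, n = 4 input features, m = 3 output features
X = np.array([[ 1, -1,  1, -1],
              [ 1,  1, -1, -1]])   # shape (s x n) = (2 x 4)
Y = np.array([[ 1, -1,  1],
              [-1,  1, -1]])       # shape (s x m) = (2 x 3)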
BAM’s Topology And Structure
In the figure above, the neurons (𝒏) of the input layer perform no
output computation; they simply feed the inputs 𝑿 forward to the memory
cells (𝒎) in the BAM’s output layer.
The BAM’s size depends on the numbers of its inputs and outputs
(i.e., the dimensions of the input and output pattern maps).
The network shown, of shape (4 x 3), consists of 4 neurons in its input
layer and 3 memory cells in its output layer.
Unlike conventional ANNs, the output layer of a BAM consists of
memory cells that perform the BAM’s output computation.
Each memory cell computes its output 𝒀 based on the multiple
inputs 𝑿 forwarded to the cell through the synaptic weights 𝑾, each
of a specific strength.
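For the (4 x 3) network above, the synaptic weights form a single matrix with one entry per neuron/memory-cell pair. A minimal sketch (the variable names are hypothetical; the zero initialization is only a placeholder, since the actual values are set by the learning rule discussed below):

import numpy as np

n, m = 4, 3              # 4 input neurons, 3 memory cells
W = np.zeros((n, m))     # one synaptic weight per neuron/cell pair,
                         # filled in by the Hebbian rule below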
The BAM’s outputs are computed as the weighted sum
of all the BAM’s inputs, 𝑾ᵀ𝙓, applied as an argument to the
bipolar threshold function.
The positive (+1) or negative (-1) value
of each output 𝒀 is determined by the sign
of the sum of the cell’s weighted inputs.
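A minimal sketch of this output computation (building on the numpy arrays above; the handling of a sum of exactly 0 varies between formulations, here it maps to +1):

def bipolar_threshold(v):
    # Bipolar threshold: +1 for non-negative sums, -1 otherwise
    return np.where(v >= 0, 1, -1)

def bam_output(W, x):
    # Memory cells' outputs: Y = f(W^T x)
    return bipolar_threshold(W.T @ x)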
All the synaptic weights interconnecting the input neurons
and memory cells store the fundamental memories
learned by the BAM, based on the Hebbian
learning algorithm discussed below.
BAM’s Learning Algorithm
Algebraically, the ability of BAMs to store and recall
associations relies solely on the bidirectional property
of matrices, studied by T. Kohonen and J. Anderson in the
early 1970s:
The matrix product of two matrices 𝑿 and 𝒀, which
gives the correlation matrix 𝑾, is said to be
bidirectionally stable, as a result of the recollection and
feedback of 𝑿 and 𝒀 through 𝑾.
Generally, this means that the corresponding rows and
columns of both matrices 𝑿 and 𝒀, as well as their
scalar vector products, contribute equally to the values of 𝑾.
In this case, 𝑾 is a matrix that captures the correlation of the 𝑿ₖ
and 𝒀ₖ vectors.
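In a standard formulation of the BAM, consistent with the notation above, the correlation matrix and the bidirectional recall it supports can be written as:
𝑾 = 𝑿ᵀ𝒀 = Σₖ 𝙭ₖ𝙮ₖᵀ (the sum of the outer products of the associated pairs),
𝒀ₖ = 𝒇(𝑾ᵀ𝑿ₖ) (forward recall) and 𝑿ₖ₊₁ = 𝒇(𝑾𝒀ₖ) (backward recall),
where 𝒇 is the bipolar threshold function introduced above.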
The entire learning process amounts to
keeping (𝙬ᵢⱼ ≠ 0) or
discarding (𝙬ᵢⱼ = 0) the specific weights
that interconnect the input neurons and memory cells of the
BAM.
The values of the synaptic weights 𝙬ᵢⱼ ∈ 𝑾 can be
greater than, less than, or equal to 0.
The BAM keeps only those synaptic weights whose values
are either positive or negative (i.e., nonzero).
The bidirectional matrix property was formulated by Donald
Hebb as the famous Hebbian learning rule,
which is fundamental to the Hopfield network and other
associative memory models.
According to the Hebbian algorithm, all associated patterns
are stored as the rows of the 𝑿ₚₓₙ and 𝒀ₚₓₘ matrices, and the
weights are computed as their correlation, 𝑾 = 𝑿ᵀ𝒀.
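A minimal sketch of this Hebbian encoding step (building on the X and Y pattern maps sketched earlier; bam_train is a hypothetical helper name):

def bam_train(X, Y):
    # Hebbian encoding: sum the outer products of each
    # associated (x, y) pair into the correlation matrix W.
    # Equivalent to W = X.T @ Y for patterns stored as rows.
    n, m = X.shape[1], Y.shape[1]
    W = np.zeros((n, m))
    for x, y in zip(X, Y):
        W += np.outer(x, y)
    return W

W = bam_train(X, Y)   # shape (n x m) = (4 x 3)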
An algorithm for recalling an association
from the BAM’s memory is listed below:
For each iteration 𝒌, 𝟏 ≤ 𝒌 ≤ 𝒕, do the following:
1. Initialize the BAM’s inputs with the input vector: 𝑿₁ ← 𝑿.
2. Compute the BAM output vector 𝒀ₖ for the 𝒌-th iteration, as the product of the
transposed weights matrix 𝑾ᵀ and the input vector 𝑿ₖ, applied to the bipolar
threshold function 𝒇: 𝒀ₖ = 𝒇(𝑾ᵀ𝑿ₖ).
3. Obtain the new input vector 𝑿ₖ₊₁ for the next (𝒌+𝟏)-th iteration: 𝑿ₖ₊₁ = 𝒇(𝑾𝒀ₖ).
4. Check whether the new and existing input vectors 𝑿ₖ₊₁ and 𝑿ₖ are the same.
5. If they are not the same (𝑿ₖ₊₁ ≠ 𝑿ₖ), return to step 2 to compute the output 𝒀ₖ₊₁
for the (𝒌+𝟏)-th iteration; otherwise proceed to the next step.
6. Return the output vector 𝒀 ← 𝒀ₖ from the procedure, as the correct association for
the input vector 𝑿.
Steps 2–5 are repeated until convergence.
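A minimal sketch of this recall loop (building on the bipolar_threshold helper above; max_iters is a hypothetical safeguard standing in for the 𝒕 bound):

def bam_recall(W, x, max_iters=100):
    # Bidirectional recall: bounce the pattern between the layers
    # until the input vector stops changing (bidirectional stability).
    x_k = x.copy()
    y_k = bipolar_threshold(W.T @ x_k)       # step 2: Y_k = f(W^T X_k)
    for _ in range(max_iters):
        x_next = bipolar_threshold(W @ y_k)  # step 3: X_k+1 = f(W Y_k)
        if np.array_equal(x_next, x_k):      # steps 4-5: convergence check
            break
        x_k = x_next
        y_k = bipolar_threshold(W.T @ x_k)   # output for the next iteration
    return y_k                               # step 6: stable association

For example, bam_recall(W, X[0]) should reproduce Y[0] when the stored patterns are sufficiently dissimilar from one another.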
Memory Evaluation (Testing)
Perform the model evaluation: for each input pattern
𝙭ᵢ ∈ 𝙓, the BAM recalls an association from its
memory, bidirectionally.
When an association y⃗ₚ for the input pattern x⃗ has been
recalled from the memory, perform a consistency
check of whether the recalled output y⃗ₚ and the target y⃗ pattern are
identical, displaying the results for each of the input patterns 𝙭ᵢ.
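A minimal sketch of this evaluation pass (building on the bam_train and bam_recall helpers sketched above):

def bam_evaluate(W, X, Y):
    # Recall an association for every stored input pattern and
    # check it against the corresponding target output pattern.
    for i, (x, y_target) in enumerate(zip(X, Y)):
        y_p = bam_recall(W, x)
        ok = np.array_equal(y_p, y_target)
        print(f"pattern {i}: recalled {y_p}, target {y_target} -> "
              f"{'consistent' if ok else 'MISMATCH'}")

bam_evaluate(bam_train(X, Y), X, Y)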
THANK YOU