Associative Memory


ASSOCIATIVE MEMORY NETWORK
• One of the primary functions of the brain is associative memory: we associate faces with names and letters with sounds, and we can recognize people even when they are wearing sunglasses or have grown older.
• Recall is triggered by a key pattern.
• It is a content-addressable memory (not an address-addressable memory).

Types of associative memories:
(1) Auto-associative memory
(2) Hetero-associative memory
(3) Bidirectional associative memory

(1) Auto-associative memory
In an auto-associative memory the training input and target output patterns are identical, so the net recalls a stored pattern from a noisy or incomplete version of that same pattern.

(2) Hetero-associative memory
In a hetero-associative memory the target output pattern is different from the input pattern, so the net maps an input pattern to a separate associated output pattern.

(3) Bidirectional associative memory (BAM)
BAM is also known as a recurrent hetero-associative pattern-matching network; it can recall associations in both directions, from the x layer to the y layer and back.
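The slides give no worked BAM example, so the following is a minimal sketch (not the textbook's code) of bidirectional recall. It assumes bipolar (±1) patterns, outer-product weights W = Σ s^T(p) t(p), and a sign activation that maps a zero net input to +1; the function names and the small example pairs are illustrative.

```python
import numpy as np

def bam_weights(S, T):
    """W = sum_p s(p)^T t(p); recall uses W forwards and W^T backwards."""
    return sum(np.outer(s, t) for s, t in zip(S, T))

def bam_recall(W, x, steps=10):
    """Alternate x -> y (through W) and y -> x (through W^T) until stable."""
    y = np.where(x @ W >= 0, 1, -1)
    for _ in range(steps):
        x_new = np.where(W @ y >= 0, 1, -1)        # backward pass: y -> x
        y_new = np.where(x_new @ W >= 0, 1, -1)    # forward pass: x -> y
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break                                  # both layers stopped changing
        x, y = x_new, y_new
    return x, y

# Usage: store two bipolar pairs, then recall from a noisy x-layer pattern.
S = np.array([[1, -1, 1, -1], [1, 1, -1, -1]])
T = np.array([[1, 1], [1, -1]])
W = bam_weights(S, T)
x_noisy = np.array([1, -1, 1, 1])                  # last bit of s(1) flipped
print(bam_recall(W, x_noisy))                      # expected: [1 -1 1 -1], [1 1]
```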

Flowchart for the training process of an auto-associative network:

Start
→ Initialize all weights to zero
→ For each training pair s:t
→ For i = 1 to n and j = 1 to n, adjust the weight w_ij
→ Continue until all training pairs have been presented
→ Stop
Training algorithm:

W = \sum_{p=1}^{P} s^{T}(p) \, t(p)

Testing algorithm:
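The testing (recall) steps are only named on this slide, so here is a minimal sketch, not the book's listing, of both the encoding and the recall step for an auto-associative net. It assumes bipolar (±1) patterns, the outer-product/Hebb weights from the formula above with t(p) = s(p), and a bipolar step activation; the function and variable names are illustrative.

```python
import numpy as np

def train_autoassociative(patterns):
    """Training: W = sum_p s(p)^T s(p), since targets equal inputs."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for s in patterns:           # each s is a bipolar row vector
        W += np.outer(s, s)      # outer product s^T s
    return W

def recall(W, x):
    """Testing: net input y_in_j = sum_i x_i w_ij, then a bipolar step."""
    y_in = x @ W
    return np.where(y_in >= 0, 1, -1)

# Usage: store one pattern, then recall it from a corrupted copy.
s = np.array([[1, 1, -1, -1]])
W = train_autoassociative(s)
noisy = np.array([1, -1, -1, -1])    # second bit flipped
print(recall(W, noisy))              # expected: [ 1  1 -1 -1]
```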
TRAINING ALGORITHMS FOR PATTERN ASSOCIATION

There are two algorithms for training pattern association nets:
(1) Hebb rule
(2) Outer product rule

(1) HEBB RULE FOR PATTERN ASSOCIATION
The Hebb rule has already been covered (refer to the previous slides).

(2) OUTER PRODUCT RULE FOR PATTERN ASSOCIATION
Let s and t be row vectors. Then for a particular training pair s:t,

Input: s = (s_1, ..., s_n)
Output: t = (t_1, ..., t_m)

W(p) = s^{T}(p) \, t(p)
     = \begin{bmatrix} s_1 \\ s_2 \\ \vdots \\ s_n \end{bmatrix}
       \begin{bmatrix} t_1 & \cdots & t_m \end{bmatrix}
     = \begin{bmatrix} s_1 t_1 & \cdots & s_1 t_m \\ s_2 t_1 & \cdots & s_2 t_m \\ \vdots & & \vdots \\ s_n t_1 & \cdots & s_n t_m \end{bmatrix}
     = \begin{bmatrix} w_{11} & \cdots & w_{1m} \\ \vdots & & \vdots \\ w_{n1} & \cdots & w_{nm} \end{bmatrix}

and, summing over all P training pairs,

W = \sum_{p=1}^{P} s^{T}(p) \, t(p)
• Encoding: the process of constructing the connection weight matrix is called encoding.
• Decoding: the process of retrieving a stored pattern, given an input pattern, is called decoding.
• Storage capacity: the number of patterns that can be correctly stored and recalled by a network.
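To make the storage-capacity idea concrete, here is an illustrative sketch (not taken from the slides): it encodes an increasing number of random bipolar patterns into an auto-associative weight matrix and counts how many are decoded back correctly. The network size n = 20, the random seed, and the pattern counts are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20                                      # number of units (arbitrary choice)

def encode(patterns):
    """Encoding: construct the connection weight matrix from the patterns."""
    W = np.zeros((n, n))
    for s in patterns:
        W += np.outer(s, s)
    np.fill_diagonal(W, 0)                  # common choice: no self-connections
    return W

def decoded_correctly(W, s):
    """Decoding: present the stored pattern and check that it is recalled."""
    return np.array_equal(np.where(s @ W >= 0, 1, -1), s)

# Recall quality typically degrades as the number of stored patterns
# grows relative to the number of units.
for P in (2, 5, 10, 15):
    patterns = rng.choice([-1, 1], size=(P, n))
    W = encode(patterns)
    ok = sum(decoded_correctly(W, s) for s in patterns)
    print(f"stored {P:2d} patterns, recalled {ok:2d} correctly")
```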

Reference:

S.N. Sivanandam & S.N. Deepa, “Principles of Soft Computing”, 2nd Edition.

