IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 7, NO. 6, NOVEMBER 1996, Brief Papers

Complex-Valued Multistate Neural Associative Memory

Stanislaw Jankowski, Andrzej Lozowski, and Jacek M. Zurada

Abstract—A model of a multivalued associative memory is presented. This memory has the form of a fully connected attractor neural network composed of multistate complex-valued neurons. Such a network is able to perform the task of storing and recalling gray-scale images. It is also shown that the complex-valued fully connected neural network may be considered as a generalization of a Hopfield network containing real-valued neurons. A computational energy function is introduced and evaluated in order to prove network stability for asynchronous dynamics. Storage capacity as related to the number of accessible neuron states is also estimated.

I. INTRODUCTION

Classic neural networks are usually based on the McCulloch-Pitts neuron model. This model uses two-state neurons and is well suited to the processing of binary-valued vectors. In many situations, however, discrete but nonbinary state values would offer an advantageous representation of information. This includes processing of gray-scale images or mapping of input data into multiclass partitions using a nonbinary class encoding scheme. The concept presented below addresses the theoretical foundations of multivalued neurons and their use in associative memories.

There are several extensions to the basic Hopfield associative memory [1] leading to multivalued processing. Apparently the simplest is grouping a certain number of neurons into one cluster that functionally represents a single multivalued state. A network composed of such clusters can represent gray scale simply by adjusting the density of activated neurons. In reality, however, the size of such a network would become enormously large even for small images unless information compression techniques are employed.
Another choice is to assign each bit of a multivalued state representation to different neurons in a cluster, as proposed in [2]. The corresponding bits can then be split off into separate noninteracting Hopfield networks and processed independently from each other. Once the separate networks have reached a fixed point, the final multivalued state is formed based on the recombination of the bits. Multivalued pattern storage is very efficient in this approach; however, it suffers from the lack of interactions between the bits describing the multivalued states. For instance, in an image recognition task a small change in the pattern brightness (e.g., each pixel intensity increased by one) may result in totally different binary representations of the multivalued states.

Manuscript received April 27, 1995; revised December 18, 1995.
S. Jankowski is with the Institute of Electronic Fundamentals, Warsaw University of Technology, Nowowiejska 15/19, 00-665 Warsaw, Poland.
A. Lozowski and J. M. Zurada are with the Department of Electrical Engineering, University of Louisville, Louisville, KY 40292 USA.
Publisher Item Identifier S 1045-9227(96)07449.
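The brittleness of the bit-sliced scheme is easy to see with 8-bit gray levels: incrementing an intensity by one can flip every bit of its binary code, so each of the independent Hopfield networks sees a large perturbation. A minimal illustration (the 8-bit width is an assumption for the example):

```python
def bits(v, width=8):
    """Binary code of an intensity as a list of 0/1 bits (MSB first)."""
    return [(v >> i) & 1 for i in reversed(range(width))]

a, b = 127, 128  # brightness changed by a single gray level
flipped = sum(x != y for x, y in zip(bits(a), bits(b)))
print(flipped)   # -> 8: all eight independent bit-planes change
```

A phase-encoded multistate neuron, by contrast, moves only one sector step under the same brightness change.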
Theorem 1: For a network with Hermitian interconnections (\(w_{ij} = \bar{w}_{ji}\), \(w_{ii} \ge 0\)) and asynchronous dynamics \(s \to s'\) expressed by (7), the change of the energy function \(\Delta E = E(s') - E(s) \le 0\), with equality holding only when \(s' = s\), if the activation function of the network's units performs the uniform phase quantization represented by (3).

A detailed proof of this theorem is provided in the Appendix. Note that the energy \(E\) is properly defined only when \(w_{ij} = \bar{w}_{ji}\). Also note that for the case \(K = 2\) this condition implies \(w_{ij} = w_{ji}\). Therefore, the case of Hermitian interconnections in the complex-valued associative memory generalizes the symmetry condition of the Hopfield model. Thus the multivalued associative memory presented here constitutes a generalization of the classical Hopfield binary-valued network.

IV. STORAGE CAPACITY ESTIMATION

Similarly to the analysis of the Hopfield network storage capacity [12], the storage capacity of a complex-valued associative memory can be based on an estimate of the probability of an error in the network response. First assume that the network state corresponds to one of the stored patterns: \(s = \varepsilon^l\). Due to (2) and (5), the input \(h_i\) of the \(i\)th neuron is a sum of two terms

\[
h_i = \varepsilon_i^l + \frac{1}{N}\sum_{\mu\neq l}\sum_{j\neq i}\varepsilon_i^{\mu}\bar{\varepsilon}_j^{\mu}\varepsilon_j^{l} \tag{9}
\]

where the first term \(\varepsilon_i^l\) is the proper response of the network and the second can be regarded as cross-correlation noise. Note that \(h_i\) can belong to any of the \(K\) sectors. Rotating \(h_i\) to the first sector is more convenient for the analysis. The rotated representation of (9) becomes

\[
\bar{\varepsilon}_i^{l} h_i = 1 + \frac{1}{N}\sum_{\mu\neq l}\sum_{j\neq i}\bar{\varepsilon}_i^{l}\varepsilon_i^{\mu}\bar{\varepsilon}_j^{\mu}\varepsilon_j^{l} \tag{10}
\]

where the second term of (10) is the crosstalk term \(c_i^l\), which represents the distance between \(\bar{\varepsilon}_i^l h_i\) and one on the complex plane, as shown in Fig. 3. According to (3), if the sum \(1 + c_i^l\) does not belong to the first sector of the complex plane, the state of the \(i\)th neuron after its update changes to an erroneous state. In order to estimate the storage capacity of the network it is necessary to find the probability of such an event by calculating the probability distribution of the crosstalk values \(c_i^l\). The crosstalk \(c_i^l\) is a sum of \(N(p-1)\) components \(\bar{\varepsilon}_i^{l}\varepsilon_i^{\mu}\bar{\varepsilon}_j^{\mu}\varepsilon_j^{l}\).
Assuming that the \(\varepsilon^\mu\) are randomly chosen patterns, we consider the components \(\bar{\varepsilon}_i^{l}\varepsilon_i^{\mu}\bar{\varepsilon}_j^{\mu}\varepsilon_j^{l}\) as independent random complex \(K\)-valued numbers drawn from the distribution

\[
\Pr\left(\bar{\varepsilon}_i^{l}\varepsilon_i^{\mu}\bar{\varepsilon}_j^{\mu}\varepsilon_j^{l} = e^{\mathrm{i}k\theta_0}\right) = \frac{1}{K}, \qquad k = 0, 1, \ldots, K-1 \tag{11}
\]

with a mean value of zero and a variance of one. According to the Lindeberg-Lévy central limit theorem [15], as \(Np\) tends to infinity the distribution of the crosstalk \(c_i^l\) approaches a complex Gaussian distribution with mean zero and variance \(\sigma^2 = p/N\).

Fig. 3. Crosstalk in the complex-valued associative memory in a state \(s\) corresponding to one of the stored patterns.
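The claim that each product term is a zero-mean, unit-variance K-state random variable can be checked with a quick simulation; the value of K and the sample count are arbitrary choices for the experiment:

```python
import cmath, math, random

K = 8
theta0 = 2 * math.pi / K
random.seed(0)

def state():
    """A uniformly random K-state value exp(i*k*theta0)."""
    return cmath.exp(1j * theta0 * random.randrange(K))

# Each crosstalk component is a product of four independent K-state values;
# the product is again a unit-magnitude K-state value, uniform over the K phases.
samples = [state() * state().conjugate() * state() * state().conjugate()
           for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum(abs(z) ** 2 for z in samples) / len(samples)  # E|z|^2 = 1 since |z| = 1

print(abs(mean), var)  # mean magnitude near 0, variance 1
```

Summing \(N(p-1)\) such terms and dividing by \(N\) then gives a crosstalk variance of \(N(p-1)/N^2 \approx p/N\), which is the \(\sigma^2\) used in the capacity estimate.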
The state of the \(i\)th neuron after its update remains correct only if the rotated input \(1 + c_i^l\) belongs to the first sector of the complex plane, i.e., only if its real and imaginary parts \(x\) and \(y\) satisfy

\[
x > |y|\cot\frac{\theta_0}{2}. \tag{14}
\]

The probability \(P_e\) may be estimated by integrating the distribution \(f_{x,y}(x,y)\) over the region corresponding to the first sector of the \(x\)-\(y\) plane and using the axial symmetry of the distribution. Thus we obtain

\[
P_e = 1 - \iint_{x > |y|\cot(\theta_0/2)} f_{x,y}(x,y)\, dx\, dy. \tag{15}
\]

By changing the variables to \(u = (x - m_x)/(\sqrt{2}\sigma)\) and \(v = (y - m_y)/(\sqrt{2}\sigma)\), an explicit formula can be obtained for calculating \(P_e\) for the case when \(\theta_0\) and \(\sigma\) are specified:

\[
P_e = 1 - \frac{1}{\sqrt{\pi}} \int_0^{\infty} e^{-v^2} \operatorname{erfc}\!\left( v \cot\frac{\theta_0}{2} - \frac{1}{\sqrt{2}\sigma} \right) dv. \tag{16}
\]

Fig. 4. Probability distribution \(f_{x,y}(x,y)\). The distribution is Gaussian with mean \((1, 0)\) and variance \(p/N\); the first sector is bounded by \(x > y\cot(\theta_0/2)\) and \(x > -y\cot(\theta_0/2)\).

It should be noted that for the case \(K = 2\), formula (16) is identical with the one obtained for the classic Hopfield model [12] and takes the form

\[
P_e = \frac{1}{2}\left[ 1 - \operatorname{erf}\!\left( \frac{1}{\sqrt{2}\sigma} \right) \right]. \tag{17}
\]

The results of the numerical calculation of the probability \(P_e\) for various resolution factors \(K\) and a range of the load parameter \(\alpha\) are shown in Fig. 5. The load parameter is related to the variance \(\sigma^2\) and is defined as follows:

\[
\alpha = \frac{p}{N}. \tag{18}
\]

Small values of the probability of an erroneous recall are expected for low values of \(K\). As is evident, the probability increases either when storing more patterns (increase of \(\alpha\)) or when shrinking the sectors of the complex plane (increase of \(K\)). We may define the storage capacity by choosing a criterion \(P_{\max}\) for an acceptable level of error probability. Thus, the storage capacity is the largest load parameter \(\alpha\) for which the corresponding error probability still satisfies \(P_e \le P_{\max}\). Fig. 6 shows several \(\alpha\)-\(K\) contour plots satisfying the condition \(P_e = P_{\max}\) for various criteria \(P_{\max}\). It is apparent from the figure that the storage capacity is critically dependent upon the number of values \(K\) of the activation function. As the resolution \(K\) is increased, the capacity of the memory decreases, as shown in Table I.

Fig. 6. The \(\alpha\)-\(K\) contour plots satisfying \(P_e = P_{\max}\) for various criteria \(P_{\max}\).
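The sector-boundary integral for the error probability and its K = 2 reduction can be cross-checked numerically. The sketch below evaluates the erfc form of the integral, as reconstructed in (16) above, by a simple Riemann sum; the load value α = 0.1 is an arbitrary test point and the integration parameters are implementation choices:

```python
import math

def p_error(K, alpha, steps=200_000, vmax=10.0):
    """P_e for resolution K and load alpha = p/N (crosstalk variance sigma^2)."""
    sigma = math.sqrt(alpha)
    cot = 0.0 if K == 2 else 1.0 / math.tan(math.pi / K)  # cot(theta0/2), theta0 = 2*pi/K
    dv = vmax / steps
    # P_e = 1 - (1/sqrt(pi)) * integral_0^inf e^{-v^2} erfc(v*cot - 1/(sqrt(2)*sigma)) dv
    acc = sum(math.exp(-(i * dv) ** 2)
              * math.erfc(i * dv * cot - 1.0 / (math.sqrt(2) * sigma)) * dv
              for i in range(steps))
    return 1.0 - acc / math.sqrt(math.pi)

alpha = 0.1
pe2, pe4 = p_error(2, alpha), p_error(4, alpha)
hopfield = 0.5 * (1.0 - math.erf(1.0 / math.sqrt(2 * alpha)))  # formula (17)

print(abs(pe2 - hopfield) < 1e-4)  # K = 2 reduces to the Hopfield expression -> True
print(pe4 > pe2)                   # narrower sectors raise the error -> True
```

For K = 2 the cotangent vanishes, the erfc factor becomes constant, and the integral collapses to the closed form (17), which is the consistency check performed here.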
The storage capacity estimate for \(K = 2\) agrees with the one known for the classic Hopfield model [12].

Note that the definition of storage capacity used here refers to the number of stored patterns regardless of the resolution factor \(K\). Although the memory capacity is reduced for higher \(K\), each of the patterns contains more information. Thus, one is capable of distributing the same amount of information either among numerous patterns of limited resolution \(K\) or among fewer high-\(K\)-valued patterns. For a class of information processing tasks, the multivalued encoding of information can be recognized as an advantage of the present approach.

V. CONCLUSIONS

This paper formulates the novel concept of a complex-valued fully coupled neural network which computes associations based on phase-encoded information. Each neuron can be assigned a multivalued state from a finite set of complex numbers of unity magnitude and various discrete phases. Since the set of available states possesses a circular symmetry on the complex plane, the states are indistinguishable from each other and none of the states is privileged. Moreover, the memory is able to recognize patterns correctly as long as the relative phase shifts (or gray-level distances) between the pattern components are preserved. In other words, recognition of images is invariant with respect to the absolute level of the pattern brightness. The result of multivalued processing in complex-valued neural networks appears to be more general and yet computationally efficient as compared to real-valued multistate networks.

In a sense, complex-valued calculus generalizes the definitions of Hopfield's bilevel associative memory. Usually, preprocessing of gray-scale (multivalued) patterns is required for attractor neural-network processing since the network operates with bivalent neurons.
Here, the multivalued information representation corresponds to a natural characteristic of complex-valued neural networks, and such an additional preprocessing step is eliminated.

The complex multivalued associative memory is particularly well suited to interpreting images transformed by the 2-D Fourier transform or the 2-D autocorrelation function. Even for binary images, both of these algorithms result in multivalued transforms whose storage requires a multivalued representation.

APPENDIX
PROOF OF ENERGY FUNCTION MINIMIZATION FOR ASYNCHRONOUS DYNAMICS (SEE THEOREM 1)

The change of the energy function \(\Delta E\) for two consecutive states \(s\) and \(s'\) can be derived by inspection of the energy levels of both states:

\[
\Delta E = E(s') - E(s). \tag{19}
\]

Since the \(m\)th neuron is the only neuron that changes its state, the value of \(E(s)\) is expressed from (6) as follows:

\[
E(s) = -\frac{1}{2}\left( \sum_{i\neq m}\sum_{j\neq m} w_{ij}\bar{s}_i s_j + \sum_{j\neq m} w_{mj}\bar{s}_m s_j + \sum_{i\neq m} w_{im}\bar{s}_i s_m + w_{mm}\bar{s}_m s_m \right). \tag{20}
\]

Equation (7) can be thought of as the relationship between the consecutive states \(s\) and \(s'\); the updated state of the \(m\)th neuron is \(s'_m = e^{\mathrm{i}k\theta_0} s_m\) for some \(k \in \{0, 1, \ldots, K-1\}\). Consequently,

\[
E(s') = -\frac{1}{2}\left( \sum_{i\neq m}\sum_{j\neq m} w_{ij}\bar{s}_i s_j + \sum_{j\neq m} w_{mj}\,e^{-\mathrm{i}k\theta_0}\bar{s}_m s_j + \sum_{i\neq m} w_{im}\bar{s}_i\, e^{\mathrm{i}k\theta_0} s_m + w_{mm}\bar{s}_m s_m \right). \tag{21}
\]

Comparing (20) and (21), one obtains the energy change

\[
\Delta E = -\frac{1}{2}\left( \sum_{j\neq m} \left(e^{-\mathrm{i}k\theta_0}-1\right) w_{mj}\bar{s}_m s_j + \sum_{i\neq m} \left(e^{\mathrm{i}k\theta_0}-1\right) w_{im}\bar{s}_i s_m \right). \tag{22}
\]
Thus, according to (2) and (7), (25) is equivalent t0 the following implication: Re[(1-2)%tm] 20> AESO. 2H) Taking (8) into account and using j,sfq = 1 we obtain (1 = 249 ton = Hin Bian — 2 Em = [bm|fe#2" ~ eltF*e-90), 7) Since hm] 2 0, implication (26) may be rewriten by using. Euler's equations for the exponential terms as cos Ay — cos(kigo— Ay) 20 AE <0. (28) For all k from the set {0,--»,K ~ 1} the condition of {implication (28) yields that the bound of the phase shift Ay which preserves energy minimization needs to be lavis $= ago. 2) Note that condition (29) is equivalent to uniform phase- {quantization of the neurons activation function ensured in (3). ‘This concludes the proof of Theorem 1. REFERENCES, a 4.1, Hopi, “Neural networks and physical sysems with emergent csllesve compuaonalailies” in Proc Not. Academy Sei USA, ‘ol 79, 1982, pp. 2554-2558, GA. Koning On the problems of nesral nworks with molisine ‘neuton" J. De Physigue I ol. 2p. 1589-1552, Avg, 1992 [3] J: Hop, "Newron wth raed responses ave callie compat tional properties like those of two-state neurons” in Pro. Nat. Academy ‘Set USA, vol 81,1988, pp. 3088-8052. [s) JM. Zarda, I. Close, snd B. van der Poel, “Generalized Hopfield fetworks with maliple sable ses” Newocompuaing, vol. 13. A 254, Oe. 197. [5] "Neural assosintive memories with maliple sable states,” in roe ind int. Conf Fue Lone Neural Nets. and Sof Comput, Luka, Fukuoka, Japan, 194, pp. 45-51 [61 He Rieger, "Storing a extesive number of gray-‘oned paters in 2 nearal networking muliateseuroas" J-PINE. A, VO. 23, pp 1273-11280, 1990 17) B. Baird and F. Ezckan, “A normal form projection algorithm for sesocative memory.” in Associative Neural Memories—Theory and Implementation, M. H. Hassoun, Ed. Oxford, UIC: Oxford Uni Press, 193 pp 135-166, (8) AUN. Noes “Associative memory in sparse phasor neral network: Europhysics Let. vol. 6, pp. 4-475, 1988 (9) RT Marks 1S. Oh, abd LE. As, "Alerting projection nara networks" IEEE Trane. Circ ya. 
[10] N. N. Aizenberg and Yu. L. Ivaskiv, Multivalued Threshold Logic. Kiev, Ukraine: Naukova Dumka, 1977 (in Russian).
[11] N. N. Aizenberg and I. N. Aizenberg, "CNN based on multi-valued neuron as a model of associative memory for gray-scale images," in Proc. IEEE 2nd Int. Workshop on Cellular Neural Networks and Their Applications (CNNA-92), Munich, Germany, 1992, pp. 36-41.
[12] J. Hertz, A. Krogh, and R. G. Palmer, Introduction to the Theory of Neural Computation. Reading, MA: Addison-Wesley, 1991.
[13] J. M. Zurada, Introduction to Artificial Neural Systems. Boston, MA: PWS, 1992.
[14] S. Jankowski and A. Lozowski, "Complex-valued neural networks," in Proc. XVI Nat. Conf. Circuit Theory and Electronic Networks, Kolobrzeg, Poland, 1993, pp. 582-587.
[15] W. Feller, An Introduction to Probability Theory and Its Applications. New York: Wiley, 1966, pp. 228-230.
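The dynamics analyzed above are straightforward to simulate. The sketch below implements the model as it can be recovered from this excerpt: Hermitian Hebbian-style storage, a K-sector phase-quantizing activation (nearest-state rounding, consistent with condition (29)), and asynchronous updates, followed by a check that the energy never increases, as Theorem 1 guarantees. Equations (2)-(7) are not reproduced in the excerpt, so the exact storage rule and all sizes here are illustrative assumptions:

```python
import cmath, math, random

K, N, p = 8, 64, 3            # resolution, neurons, stored patterns (illustrative)
theta0 = 2 * math.pi / K
random.seed(1)

def csign(z):
    """Quantize the phase of z to the nearest of the K unit-magnitude states."""
    k = round(cmath.phase(z) / theta0) % K
    return cmath.exp(1j * theta0 * k)

# p random K-state patterns and Hermitian weights w_ij (diagonal w_ii = p/N >= 0).
patterns = [[cmath.exp(1j * theta0 * random.randrange(K)) for _ in range(N)]
            for _ in range(p)]
W = [[sum(e[i] * e[j].conjugate() for e in patterns) / N for j in range(N)]
     for i in range(N)]

def energy(s):
    """E = -(1/2) sum_ij w_ij conj(s_i) s_j; real-valued for Hermitian W."""
    return -0.5 * sum(W[i][j] * s[i].conjugate() * s[j]
                      for i in range(N) for j in range(N)).real

# Probe: the first stored pattern with a few corrupted components.
s = list(patterns[0])
for i in random.sample(range(N), 5):
    s[i] = cmath.exp(1j * theta0 * random.randrange(K))

energies, sweeps, changed = [energy(s)], 0, True
while changed and sweeps < 50:            # asynchronous sweeps until a fixed point
    changed, sweeps = False, sweeps + 1
    for m in range(N):
        new = csign(sum(W[m][j] * s[j] for j in range(N)))
        changed |= abs(new - s[m]) > 1e-12
        s[m] = new
        energies.append(energy(s))

nonincreasing = all(b <= a + 1e-9 for a, b in zip(energies, energies[1:]))
print(nonincreasing, sweeps < 50)         # energy monotone, converged -> True True
```

Because each asynchronous update can only lower the energy (strictly, unless the state is unchanged), the loop cannot cycle and must reach a fixed point, which is what the convergence check confirms.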