
A fractional order approach for Hopfield Neural Networks
and its application on nonlinear dynamical systems
Machine learning
• Machine Learning (ML) has emerged as one of the most impactful branches of science in
recent years, revolutionizing fields ranging from healthcare and finance to autonomous
systems and natural language processing. ML empowers systems to learn from data, adapt
to changing environments, and make intelligent decisions without explicit programming.
• When discussing ML, the term often evokes thoughts of Artificial Neural Networks (ANNs),
which have become a cornerstone of modern machine learning. The concept of ANNs
dates back to the pioneering work of McCulloch and Pitts (1943) [13], who introduced a
computational model of neurons inspired by biological neural networks. This foundational
model established the groundwork for subsequent developments in the field.
• Over the years, significant advancements in ANN architectures and training algorithms
have further propelled the field.

[13] McCulloch, W.S., Pitts, W. A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics 5, 115–133 (1943).
Machine learning
• Widrow and Hoff (1960) [12]: Proposed the ADALINE (Adaptive Linear Neuron)
model, introducing the Least Mean Squares (LMS) algorithm for weight adjustment.
• Rosenblatt (1962) [14]: Developed the Perceptron, one of the earliest supervised
learning models capable of binary classification.
• Rumelhart et al. (1986) [10]: Revolutionized ML with the introduction of
backpropagation, a method to efficiently compute gradients for training multi-layer
neural networks, enabling the rise of deep learning.
• These milestones laid the foundation for the sophisticated neural network models
we see today, including convolutional neural networks (CNNs), recurrent neural
networks (RNNs), and transformer-based architectures, all of which drive cutting-
edge applications in artificial intelligence.
[10] Rumelhart, David E., Geoffrey E. Hinton, and Ronald J. Williams (Oct. 1986). “Learning representations by back-propagating errors”. In: Nature 323.6088, pp. 533–536. issn: 0028-0836, 1476-4687. doi: 10.1038/323533a0.
[12] Widrow, Bernard and Marcian E. Hoff (1960). “Adaptive switching circuits”. In: 1960 IRE WESCON Convention Record. New York: IRE, pp. 96–104. Reprinted in: Neurocomputing, Volume 1. Ed. by James A. Anderson and Edward Rosenfeld. The MIT Press (1988), pp. 126–134. isbn: 978-0-262-26713-7. doi: 10.7551/mitpress/4943.003.0012.
[14] Rosenblatt, F. (1962) Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Spartan Books, Washington DC.
ANN and Associative Memory
• The field of pattern recognition is concerned
with the automatic discovery of regularities
in data through the use of computer
algorithms and with the use of these
regularities to take actions such as classifying
the data into different categories (Bishop,
2006) [2].

• The result of running the machine learning
algorithm can be expressed as a function y(x)
which takes a new digit image x as input and
generates an output vector y, encoded in
the same way as the target vectors.
[2] Bishop, Christopher M. (2006). Pattern Recognition and Machine Learning. Information Science and Statistics. New York: Springer. isbn: 978-0-387-31073-2.
Hopfield Neural Networks
• Hopfield Neural Networks (HNNs) are a specialized type of artificial neural network
designed for storing and retrieving patterns, a property commonly referred to as associative
memory.
• The classical HNN model was introduced by J. Hopfield in 1982 [6, 7], where he demonstrated
that associative memory can be described through the phase space flow of a system's
state.
• In this framework, the system's evolution is governed by a carefully designed energy
function, ensuring that the dynamics converge to stable states corresponding to stored
patterns.
• The core concept of HNNs involves adapting the parameters of a dynamical system—
described as either discrete or continuous—such that the long-term behavior of the state,
starting from an initial condition, naturally converges to the nearest stable state 𝑋.
These stable states represent the stored patterns, making HNNs particularly effective for
tasks involving pattern recognition and noise-tolerant retrieval.
[6] Hopfield, J. J. (Apr. 1982). “Neural networks and physical systems with emergent collective computational abilities”. In: Proceedings of the National Academy of Sciences 79.8, pp. 2554–2558. issn: 0027-8424, 1091-6490. doi: 10.1073/pnas.79.8.2554.
[7] Hopfield, J. J. (May 1984). “Neurons with graded response have collective computational properties like those of two-state neurons”. In: Proceedings of the National Academy of Sciences 81.10, pp. 3088–3092. issn: 0027-8424, 1091-6490. doi: 10.1073/pnas.81.10.3088.
Energy
• The property of main interest in an HNN is the energy; in general, every
HNN model has a potential energy associated with it.
• For a classical HNN of N binary neurons s_i ∈ {−1, +1}, storing K memories denoted ξ^μ,
μ = 1, …, K, the corresponding energy is given by

E = −(1/2) Σ_{i,j} w_{ij} s_i s_j,  with Hebbian weights w_{ij} = (1/N) Σ_μ ξ_i^μ ξ_j^μ.

An update rule can be derived from the energy function such that the
energy is minimized, the basic one being the asynchronous rule

s_i ← sgn( Σ_j w_{ij} s_j ),

illustrated in the sketch below.
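A minimal numerical sketch of this classical model is given below (an assumed NumPy implementation, not taken from the slides; the helper names store, energy and retrieve are illustrative):

```python
import numpy as np

def store(patterns):
    """Hebbian weights w_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with zero diagonal."""
    xi = np.asarray(patterns, dtype=float)      # shape (K, N), entries in {-1, +1}
    n = xi.shape[1]
    w = xi.T @ xi / n
    np.fill_diagonal(w, 0.0)
    return w

def energy(w, s):
    """E = -1/2 * s^T W s for a binary state s."""
    return -0.5 * s @ w @ s

def retrieve(w, s, sweeps=10):
    """Asynchronous updates s_i <- sgn(sum_j w_ij s_j); each flip can only lower E."""
    s = np.asarray(s, dtype=float).copy()
    for _ in range(sweeps):
        for i in np.random.permutation(s.size):
            s[i] = 1.0 if w[i] @ s >= 0 else -1.0
    return s
```

Starting retrieve from a noisy copy of a stored pattern should converge back to that pattern, provided only a few patterns are stored (the capacity issue discussed in the following slides).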
Energy Landscape
• An energy landscape is a map of the possible
states of a system, a graph of the free energy
throughout the configurational or
conformational space (Alaei, 2021) [1].
• In an HNN, when presented with an initial
prompt that resembles one of the memory
vectors, the energy descent dynamics finds the
most similar memory vector based on the
similarity between the initial prompt and the
set of available memory patterns (Krotov,
2023) [15].

[1] Alaei, Loghman, Morahem Ashengroph, and Ali A. Moosavi-Movahedi (2021). “The concept of protein folding/unfolding and its impacts on human health”. In: Advances in Protein Chemistry and Structural Biology. Vol. 126. Elsevier, pp. 227–278. isbn: 978-0-323-85317-0. doi: 10.1016/bs.apcsb.2021.01.007.
[15] Krotov, Dmitry (2023). “A new frontier for Hopfield networks”. In: Nature Reviews Physics 5. doi: 10.1038/s42254-023-00595-y.
Example with classical HNN
Storing a single 80x80 pixel image results in a correct retrieval.
But storing more images (saving 3 images, and then 6 images) results in
incorrect retrievals.
Polynomial energy function
• The main problem with the classical HNN is the small number of
memories it can store, which scales only linearly with the number of neurons N
(roughly K ≈ 0.14 N). In 2016 Krotov and Hopfield [8]
proposed a polynomial energy function given by

E = −Σ_μ F( Σ_i ξ_i^μ σ_i ),  with F(x) = x^n,

where n = 2 is the classical case and for n > 2 the energy landscape becomes
sharper. In this case the memory recovery is given by the update

σ_i ← sgn[ Σ_μ ( F( ξ_i^μ + Σ_{j≠i} ξ_j^μ σ_j ) − F( −ξ_i^μ + Σ_{j≠i} ξ_j^μ σ_j ) ) ],

sketched in code after the reference below. For higher powers n the capacity rapidly grows
with N in a non-linear way (on the order of N^(n−1)).

[8] Krotov, Dmitry and John J. Hopfield (2016). “Dense Associative Memory for Pattern Recognition”. In: Advances in Neural Information Processing Systems. Ed. by D. Lee et al. Vol. 29. Curran Associates, Inc.
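As a rough illustration of the update rule above, the following sketch (an assumed implementation; dense_am_update is a hypothetical helper name) flips each spin toward the sign that lowers the polynomial energy:

```python
import numpy as np

def dense_am_update(patterns, sigma, n=4, sweeps=5):
    """Dense associative memory retrieval with interaction F(x) = x**n."""
    xi = np.asarray(patterns, dtype=float)          # shape (K, N)
    sigma = np.asarray(sigma, dtype=float).copy()   # state in {-1, +1}
    F = lambda x: x ** n
    for _ in range(sweeps):
        for i in range(sigma.size):
            rest = xi @ sigma - xi[:, i] * sigma[i]   # overlaps excluding neuron i
            drive = np.sum(F(xi[:, i] + rest) - F(-xi[:, i] + rest))
            sigma[i] = 1.0 if drive >= 0 else -1.0
    return sigma
```

Krotov and Hopfield also consider a rectified version of F; the plain power is used here only to keep the sketch short.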
Exponential energy function
• Further, Demircigil et al. (2017) [3] explore the energy function

E = −Σ_μ exp( Σ_i ξ_i^μ σ_i ),

proving that the number of memories that can be stored is exponential in N.

• The main issue with classical HNN models was the storage capacity being linear,
making them impractical for modern AI applications.
• Based on this work, this issue was solved and HNNs can be more suitable for
modern AI applications.
• Furthermore, it can be proven that the retrieval of a pattern occurs after
one update (see the short sketch after the reference below).
• The only disadvantage is that these HNNs were still discrete models, not allowing
the differentiability necessary for modern AI (gradient descent).
[3] Demircigil, Mete et al. (July 2017). “On a Model of Associative Memory with Huge Storage Capacity”. en. In: Journal of Statistical
Physics 168.2, pp. 288–299. issn: 0022-4715, 1572-9613. doi: 10.1007/s10955-017-1806-y.
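Under the same assumptions as the polynomial sketch above, only the interaction function changes; with the exponential interaction a single sweep is typically enough, in line with the one-update retrieval mentioned above:

```python
import numpy as np

def exp_am_update(patterns, sigma, sweeps=1):
    """Same spin-flip retrieval as the polynomial sketch, but with F(x) = exp(x) from [3]."""
    xi = np.asarray(patterns, dtype=float)
    sigma = np.asarray(sigma, dtype=float).copy()
    for _ in range(sweeps):
        for i in range(sigma.size):
            rest = xi @ sigma - xi[:, i] * sigma[i]
            drive = np.sum(np.exp(xi[:, i] + rest) - np.exp(-xi[:, i] + rest))
            sigma[i] = 1.0 if drive >= 0 else -1.0
    return sigma
```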
Example with exponential energy
function
Storing the six images used in the classical HNN example, we obtain
adequate results.

Even for a larger data set, adequate retrievals are obtained.


Modern Hopfield Networks
• Up until this point, HNN models were discrete.
• In Ramsauer et al. (2020) [9] a generalized version of the HNN with
continuous states is given, allowing differentiability and with it a
wider range of applications (gradient descent).
• This version of the HNN keeps the huge memory capacity and retrieval
after one update. The novel energy function is

E = −lse(β, Xᵀξ) + (1/2) ξᵀξ + β⁻¹ log N + (1/2) M²,

where

lse(β, z) = β⁻¹ log( Σ_{i=1}^{N} exp(β z_i) )

is the log-sum-exp function, X = (x₁, …, x_N) is the matrix of the N stored (continuous)
patterns, ξ is the state, and M is the largest pattern norm; and an update rule

ξ^new = X softmax(β Xᵀξ),

sketched in code after the reference below.
[9] Ramsauer, Hubert et al. (2020). Hopfield Networks is All You Need. Version 3. doi: 10.48550/ARXIV.2008.02217.
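A hedged sketch of the continuous energy and update above (an assumed NumPy implementation; the column convention X of shape (d, N) is an assumption of this sketch):

```python
import numpy as np

def softmax(z):
    z = z - z.max()                     # numerical stability
    e = np.exp(z)
    return e / e.sum()

def modern_hopfield_energy(X, xi, beta=1.0):
    """E = -lse(beta, X^T xi) + 0.5*xi^T xi + log(N)/beta + 0.5*M^2."""
    n_patterns = X.shape[1]                       # N stored patterns of dimension d
    m = np.linalg.norm(X, axis=0).max()           # largest stored-pattern norm M
    lse = np.log(np.sum(np.exp(beta * (X.T @ xi)))) / beta
    return -lse + 0.5 * xi @ xi + np.log(n_patterns) / beta + 0.5 * m ** 2

def modern_hopfield_update(X, xi, beta=1.0, steps=1):
    """xi_new = X softmax(beta * X^T xi); one step usually suffices."""
    for _ in range(steps):
        xi = X @ softmax(beta * (X.T @ xi))
    return xi
```

Note that the update has the same form as transformer attention, with the stored patterns playing the role of keys and values; this is the link exploited in the Transformers slide below.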
Modern continuous Hopfield Neural
Networks
• In general, this new approach to HNNs gives the following advantages:
- Exponential storage capacity (Theorem 3).
- Convergence after only one update (Theorem 4).
- Global convergence to local minima (Theorem 2).
Such a model has garnered significant attention from researchers, as
evidenced by the impact of the paper, reflected in its citation record on
Google Scholar.
Example with modern Hopfield
Networks
• Since the neurons are now continuous, the images are no longer restricted to black and white.
• Results for the same 24 patterns are presented as follows.
Transformers
• The introduction of transformers has marked a milestone in the field of
neural networks. Transformers rely on a technique called attention,
used to assign weights to different parts of an input sequence
so that a better understanding of its underlying context is achieved.
Attention makes it possible to perform machine translation, text generation and
many other Natural Language Processing tasks.
• Transformers were proposed in the highly influential paper “Attention is
all you need” (close to 150,000 citations).
• Recent advancements in HNNs (Hoover et al., 2023 [5]) allow us to use this
model as the attention mechanism in transformers, opening an impactful field to be
further improved.
[5] Hoover, Benjamin et al. (2023). “Energy Transformer”. arXiv, Version 2. doi: 10.48550/ARXIV.2302.07253.
Modern Hopfield Networks
Some applications of modern HNNs are:
- Association of two sets
- Pattern search in sets
- Pooling operations
- Memories (LSTMs, GRUs)
- Vector quantization
- Transformer attention
- Sequence-to-sequence
- Point set processing
- Multiple instance learning
- SVMs, k-nearest neighbors

** I highlighted in black the most impactful and the most plausible applications in dynamical
systems.
Comparisons between ML methods
• Some comparisons for multiple ML tasks have been done in Ramsauer et al. [9], some
of them being:
- Hopfield layers that equip neural network layers with memories improved the state of
the art in three out of four considered multiple instance learning problems, on
immune repertoire classification, and on two drug design datasets. They yielded the
best results among different machine learning methods on the UCI benchmark
collections of small classification tasks.
Fractional Calculus
• Fractional calculus (FC) is a generalization of integer-order calculus in which
the order of differentiation can be a real number. Various definitions of
fractional derivatives can be found in the literature; among them, the
Riemann–Liouville, Caputo and Grünwald–Letnikov definitions are the most well
known.
• FC has several advantages over traditional calculus, some of them
being:
- One additional degree of freedom.
- Dependence on all previous states (memory), as illustrated in the sketch below.
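To make the dependence on all previous states concrete, here is a small sketch (an assumed implementation, not from the slides) of the Grünwald–Letnikov approximation of a fractional derivative of order α on a uniformly sampled signal:

```python
import numpy as np

def gl_fractional_derivative(f, h, alpha):
    """Grunwald-Letnikov estimate: D^alpha f[n] ~ h**(-alpha) * sum_k c_k * f[n-k].

    The coefficients c_k = (-1)**k * binom(alpha, k) are built with the recursion
    c_0 = 1, c_k = c_{k-1} * (1 - (alpha + 1)/k); every past sample contributes,
    which is the memory property mentioned above."""
    f = np.asarray(f, dtype=float)
    c = np.ones(f.size)
    for k in range(1, f.size):
        c[k] = c[k - 1] * (1.0 - (alpha + 1.0) / k)
    d = np.array([np.dot(c[: i + 1], f[i::-1]) for i in range(f.size)])
    return d / h ** alpha

# Example: the half-order derivative of f(t) = t is 2*sqrt(t/pi).
t = np.linspace(0.0, 1.0, 101)
approx = gl_fractional_derivative(t, t[1] - t[0], alpha=0.5)
```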
Fractional Neural Networks
• The implementation of fractional neural networks is a relatively new and
emerging field of study. Researchers have increasingly combined machine
learning with fractional calculus to enhance certain characteristics of neural
networks. Several advantages have been identified, some of which are
outlined below (Joshi, 2023) [16]:
- Combining FC with an ANN endows it with the memory feature.
- Fractional derivative-based activation functions in an ANN provide additional
adjustable hyperparameters to the network.
- A FANN has more degrees of freedom for adjusting parameters compared
to an ordinary ANN.
- Multiple types of activation functions can be employed in a FANN.
[16] Joshi, M., Bhosale, S. & Vyawahare, V.A. A survey of fractional calculus applications in artificial neural networks. Artif Intell Rev 56,
13897–13950 (2023).
Fractional-order application on HNN
• Some applications of Fractional Hopfield Neural Networks (FHNN) can be found
in the literature. For instance, Fazzino et al. (2021) [17] successfully utilized a FHNN for
parameter estimation. Their study concluded that the fractional-order
implementation enhanced the performance of the parameter identification
process compared to integer-order implementations, highlighting its novelty
and effectiveness.
• In another contribution, Ahmad and Al-Solami (2020) [18] proposed a time-delayed
Fractional Hopfield Neural Network (FHNN) to address systems with high
nonlinearity. Their construction of key-dependent dynamic S-boxes, supported
by comparative analysis, demonstrated improved security features compared to
many existing chaos-based and other S-boxes.
[17] Fazzino, S., Caponetto, R. & Patanè, L. A new model of Hopfield network with fractional-order neurons for parameter
estimation. Nonlinear Dyn 104, 2671–2685 (2021). https://doi.org/10.1007/s11071-021-06398-z.
[18] Ahmad, M.; Al-Solami, E. Evolving Dynamic S-Boxes Using Fractional-Order Hopfield Neural Network Based
Scheme. Entropy 2020, 22, 717. https://doi.org/10.3390/e22070717.
Justification
• Modern Hopfield Neural Networks (HNNs) offer exponentially large
storage capacity, making them well-suited for contemporary AI and
deep learning applications.
• Given that modern HNNs are continuous, exploring their behavior in
the fractional-order domain presents an opportunity for further
enhancement. This is because fractional calculus improves memory
characteristics and introduces an additional degree of freedom,
potentially leading to more efficient and versatile networks.
• The exploration of modern HNNs with FC has not yet been carried out.
Justification
• The study of modern HNNs in the fractional-order domain
can yield promising applications in the field of dynamical systems;
one such application is:
- Chaos and nonlinear control: Fractional modern Hopfield Neural
Networks (HNNs) can be utilized to learn and stabilize nonlinear and
chaotic systems. By training the network to recognize periodic,
quasiperiodic, and chaotic patterns and accurately predict their
behavior, these networks can aid in designing control strategies to
suppress or leverage unwanted dynamics. Applications include systems
such as power grids, fluid dynamics, mechanical vibrations, or biological
rhythms. The inherent memory properties of fractional-order
dynamical systems enhance the sensitivity of predictors, enabling
faster and more accurate pattern identification.
Objectives
• General objective:
Explore the fractional-order domain of modern Hopfield Neural Networks and apply their
associative memory attributes to control a highly chaotic system within a physical environment.
• Specific objectives:
- Mathematical Modeling: Develop a mathematical framework for integrating fractional-order
calculus into modern Hopfield Neural Networks, ensuring compatibility with their continuous
nature and associative memory properties.
- System Analysis: Analyze the dynamics of fractional-order modern Hopfield Neural Networks,
including stability, convergence, and memory capacity, in the context of nonlinear systems.
- Design and Simulation: Design and implement a simulation platform to study the behavior of
fractional-order modern Hopfield Neural Networks when applied to nonlinear complex
systems, such as the Lorenz or Rössler attractor.
Objectives
- Control Strategy Development: Formulate and test control strategies based on
fractional-order Hopfield Neural Networks for stabilizing or managing periodic,
quasiperiodic and chaotic behavior in a physical system (e.g., a double
pendulum, electrical circuit, or fluid dynamics setup).
- Experimental Validation: Apply the proposed control strategies to a physical
chaotic system, demonstrating real-world feasibility and effectiveness in
reducing chaos or guiding the system to desired states. The implementation is
based on analog computers (FPAAs).
- Performance Evaluation: Compare the performance of fractional-order
modern Hopfield Neural Networks against traditional neural networks or
integer-order implementations in controlling chaotic systems, focusing on
metrics such as accuracy, computational efficiency, and robustness.
- Applications Exploration: Investigate potential applications of the proposed
method in fields such as robotics, telecommunications, or cryptography, where
controlling chaotic systems is critical.
Infrastructure
● Laboratorio de Circuitos y Sistemas Caóticos de Orden Fraccional,
1FCE4/201

● Specialized software

● Laptops and workstations

● Electronic boards

Financing
● VIEP-BUAP 2025 research project.
References
[1] Alaei, Loghman, Morahem Ashengroph, and Ali A. Moosavi-Movahedi (2021). “The concept of protein folding/unfolding and its impacts on human health”. In: Advances in Protein Chemistry and Structural Biology. Vol. 126. Elsevier, pp. 227–278. isbn: 978-0-323-85317-0. doi: 10.1016/bs.apcsb.2021.01.007.
[2] Bishop, Christopher M. (2006). Pattern Recognition and Machine Learning. Information Science and Statistics. New York: Springer. isbn: 978-0-387-31073-2.
[3] Demircigil, Mete et al. (July 2017). “On a Model of Associative Memory with Huge Storage Capacity”. In: Journal of Statistical Physics 168.2, pp. 288–299. issn: 0022-4715, 1572-9613. doi: 10.1007/s10955-017-1806-y.
[4] Garling, D. J. H. (Dec. 2017). Analysis on Polish Spaces and an Introduction to Optimal Transportation. 1st ed. Cambridge University Press. isbn: 978-1-108-42157-7. doi: 10.1017/9781108377362.
[5] Hoover, Benjamin et al. (2023). “Energy Transformer”. arXiv, Version 2. doi: 10.48550/ARXIV.2302.07253.
[6] Hopfield, J. J. (Apr. 1982). “Neural networks and physical systems with emergent collective computational abilities”. In: Proceedings of the National Academy of Sciences 79.8, pp. 2554–2558. issn: 0027-8424, 1091-6490. doi: 10.1073/pnas.79.8.2554.
[7] Hopfield, J. J. (May 1984). “Neurons with graded response have collective computational properties like those of two-state neurons”. In: Proceedings of the National Academy of Sciences 81.10, pp. 3088–3092. issn: 0027-8424, 1091-6490. doi: 10.1073/pnas.81.10.3088.
[8] Krotov, Dmitry and John J. Hopfield (2016). “Dense Associative Memory for Pattern Recognition”. In: Advances in Neural Information Processing Systems. Ed. by D. Lee et al. Vol. 29. Curran Associates, Inc.
[9] Ramsauer, Hubert et al. (2020). Hopfield Networks is All You Need. Version 3. doi: 10.48550/ARXIV.2008.02217.
[10] Rumelhart, David E., Geoffrey E. Hinton, and Ronald J. Williams (Oct. 1986). “Learning representations by back-propagating errors”. In: Nature 323.6088, pp. 533–536. issn: 0028-0836, 1476-4687. doi: 10.1038/323533a0.
[11] Van Der Malsburg, C. (1986). “Frank Rosenblatt: Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms”. In: Brain Theory. Ed. by Günther Palm and Ad Aertsen. Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 245–248. isbn: 978-3-642-70913-5. doi: 10.1007/978-3-642-70911-1_20.
[12] Widrow, Bernard and Marcian E. Hoff (1960). “Adaptive switching circuits”. In: 1960 IRE WESCON Convention Record. New York: IRE, pp. 96–104. Reprinted in: Neurocomputing, Volume 1. Ed. by James A. Anderson and Edward Rosenfeld. The MIT Press (1988), pp. 126–134. isbn: 978-0-262-26713-7. doi: 10.7551/mitpress/4943.003.0012.
[13] McCulloch, W. S. and W. Pitts (1943). “A logical calculus of the ideas immanent in nervous activity”. In: Bulletin of Mathematical Biophysics 5, pp. 115–133.
[14] Rosenblatt, F. (1962). Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Washington, DC: Spartan Books.
[15] Krotov, Dmitry (2023). “A new frontier for Hopfield networks”. In: Nature Reviews Physics 5. doi: 10.1038/s42254-023-00595-y.
[16] Joshi, M., S. Bhosale, and V. A. Vyawahare (2023). “A survey of fractional calculus applications in artificial neural networks”. In: Artificial Intelligence Review 56, pp. 13897–13950.
[17] Fazzino, S., R. Caponetto, and L. Patanè (2021). “A new model of Hopfield network with fractional-order neurons for parameter estimation”. In: Nonlinear Dynamics 104, pp. 2671–2685. doi: 10.1007/s11071-021-06398-z.
[18] Ahmad, M. and E. Al-Solami (2020). “Evolving Dynamic S-Boxes Using Fractional-Order Hopfield Neural Network Based Scheme”. In: Entropy 22, 717. doi: 10.3390/e22070717.
