Evolution of Neuromorphic Computing
Abstract—With the advancement of artificial intelligence (AI) technologies, novel and inventive approaches for addressing complex problems are coming to the forefront. Neuromorphic computing based on AI technologies stands as an exemplar, endeavoring to mimic the human brain's intricate neural architecture and computational principles within electronic devices. Contrary to the conventional Von Neumann architecture, the neuromorphic computing architecture offers a promising path toward intelligent and efficient computational systems that excel in tasks requiring low power consumption, real-time processing, and adaptability. Consequently, it is employed in various applications such as robotics, sensory processing, neuromorphic vision, and edge computing. This paper explores the conventional Von Neumann architecture and outlines its shortcomings. Next, the neuromorphic architecture is described as an alternative, along with its evolution. The characteristics of neuromorphic computing and its diverse applications are then illustrated. The paper also addresses the key challenges hindering the development of neuromorphic computing.

Index Terms—Neuromorphic Computing; Von Neumann Architecture; Neuromorphic Computing Architecture; Neuromorphic Computing Challenges; SpiNNaker

I. INTRODUCTION

The rapid advancement in AI technologies has resulted in the emergence of divergent solutions to address complex problems [1]. Neuromorphic computing is one such solution, aiming to replicate the structure and functioning of the neural networks in the human brain [2]. Neuromorphic computing provides parallel processing, low power consumption, event-driven processing, real-time processing, and more. These features play a prominent role in providing solutions in various fields, such as artificial intelligence [3], robotics, image and speech recognition, brain-computer interfaces, pattern recognition, sensory processing, and cognitive computing [4]. Neuromorphic engineering is another name for neuromorphic computing. It is a subfield of computer science and engineering that tries to create computer architectures and systems motivated by the human brain's structure and operation [3], making it possible for them to work faster than conventional computing architectures [5].

The fundamental goal of neuromorphic computing is to develop electronic brain-like systems that can process information like the human brain, extending AI beyond the bounds of conventional computing [2], [6]. The neural networks in the human brain process enormous volumes of data with astounding energy efficiency and agility. By providing unmatched energy efficiency and real-time processing capabilities, these architectures [7] have the potential to revolutionize computing. Neuromorphic computing aims to develop hardware and software that process data similarly to the brain's neural networks [3], potentially offering several advantages over traditional computing systems. This paper investigates the terrain of neuromorphic computing, its evolutionary trajectory, and its architectural aspects. It emphasizes the paradigm shift from the conventional Von Neumann architecture, delving into the alternative domains of neuromorphic technology. Additionally, the paper underscores the characteristics of neuromorphic computing and its extensive applications. Furthermore, it examines the challenges that hinder the progress of neuromorphic computing.

The rest of the paper is organized as follows: Section II describes the Von Neumann architecture. Section III discusses neuromorphic computing. Section IV includes the characteristics and applications of neuromorphic computing. Section V presents the challenges of neuromorphic computing. Section VI concludes the paper.

II. VON NEUMANN ARCHITECTURE

The Von Neumann architecture, often called the Princeton architecture, is a crucial idea in computer architecture and the foundation of most contemporary computers. John von Neumann, a mathematician and computer scientist, first proposed it in the late 1940s [8].

A. Basic Components

The architecture comprises four fundamental components, which specify how a computer system is organized [8]. Figure 2 shows an abstract view of the Von Neumann architecture.
• Central Processing Unit (CPU): The main component of the architecture is the CPU. The CPU executes instructions and performs calculations, hence acting as the "brain" of the machine. It comprises the Arithmetic Logic Unit (ALU), which performs mathematical computations, and the Control Unit (CU), which interprets and sequences instructions.
• Memory: The next component is memory, a storage unit. The data and instructions that the CPU requires to perform tasks are stored in memory. The memory in the Von Neumann architecture is a single, integrated component that stores both data and program instructions. This behaviour, a feature of modern computers, is described by the "stored-program concept."
• Input/Output (I/O) Devices: These devices enable interaction between the computer and the outside world. I/O devices include keyboards, mice, displays, printers, scanners, webcams, and network interfaces. The CPU manages data transfers between memory and the I/O devices.
• Control Unit (CU): The fourth component is the CU, which retrieves program instructions from memory, decodes them, and directs how they are executed. It sends signals to the ALU and other CPU components to perform the required tasks.

Fig. 1: Neuromorphic Computing Architecture

Fig. 2: Von Neumann Architecture

B. Benefits

The Von Neumann architecture serves as the basis for most contemporary computer systems [9], and it has several significant advantages that have helped it become a popular and influential design [8]. A few advantages are seen in Figure 3.
• Simplicity: The architectural simplicity makes building and implementing computer systems easier, since the design is clear-cut and straightforward to comprehend. The overall system structure is made simpler because it uses a single shared memory for data and instructions.
• Flexibility: The design permits the simultaneous storage of data and instructions in memory, enabling dynamic modification of programs and data during execution.
• Efficiency: Because of its sequential instruction execution, the design is effective at handling linear operations. The simplicity and effectiveness of the architecture have led to its success in various computing devices.
• Cost-effectiveness: Owing to its simplicity, the architecture is less expensive to create and install, easy to maintain, and straightforward to upgrade for multiple applications.
• Universality: The architecture's adaptable instruction set enables it to run various programs without requiring hardware modifications.
• Incremental Development: The architecture enables the progressive growth and enhancement of computer systems. Expanding the instruction set and adding new functionality is possible without significantly altering the architecture.
• Standardisation: The architecture is recognized as a commonly used model in the computing industry. This standardization makes it easier for programmers to produce software that operates on diverse systems.

C. Challenges

The Von Neumann architecture presents a set of challenges [10], as shown in Figure 4.
• Von Neumann Bottleneck: The Von Neumann bottleneck is a major problem because the CPU and RAM are connected by the same bus, causing data transfers to be serialized [11].
Authorized licensed use limited to: J.R.D. Tata Memorial Library Indian Institute of Science Bengaluru. Downloaded on April 23,2024 at 17:39:33 UTC from IEEE Xplore. Restrictions apply.
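As a toy illustration of the serialization just described, the following Python sketch (entirely illustrative; the opcodes, memory layout, and accumulator machine are invented for this example, not taken from the paper) runs a fetch-decode-execute loop in which instructions and data travel through the same shared store:

```python
# A minimal stored-program machine: one shared memory holds both
# instructions and data, so every fetch -- instruction or operand --
# crosses the same pathway. This serialization is the Von Neumann
# bottleneck discussed above.

def run(memory, pc=0):
    """Fetch-decode-execute loop over a single shared instruction/data store."""
    acc = 0                            # accumulator register
    while True:
        op, arg = memory[pc]           # fetch: the instruction comes from memory
        pc += 1
        if op == "LOAD":               # decode + execute
            acc = memory[arg]          # the operand comes from the SAME memory
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program in cells 0-3, data in cells 4-6 of the same store.
mem = {0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", 0),
       4: 2, 5: 3, 6: 0}
print(run(mem)[6])  # -> 5
```

Because each loop iteration must pull the next instruction before touching any data, instruction and data traffic can never overlap here, which is exactly what the shared bus imposes in hardware.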
Fig. 5: History of Neuromorphic Computing

Fig. 7: Perceptron Model

Fig. 9: SpiNNaker

Fig. 12: Neuron

Fig. 14: Neural Network
ers [3]. Neuromorphic systems provide unique and innovative solutions for artificial intelligence compared to the von Neumann computer architecture. In contrast to traditional computing's Von Neumann architecture, which has separate memory and processing units, neuromorphic computing aims to replicate the neural networks and cognitive processes of the brain to achieve enhanced efficiency. In a neuromorphic computer, processing and memory are handled by the neurons and synapses, whereas in the Von Neumann architecture, memory and processing are controlled by a central processing unit (CPU) [21]. Neuromorphic computers exploit the features and organization of neural networks, unlike Von Neumann computers, which are programmed with explicit instructions. Significant and unique requirements for neuromorphic architectures include increased connectivity and parallelism, low power consumption, and collocated memory and processing. Compared to conventional Von Neumann architectures, a neuromorphic architecture can execute complicated computations quickly, using less power and leaving a smaller environmental footprint. Since these properties represent the von Neumann architecture's bottleneck, the neuromorphic architecture has been considered suitable for implementing machine learning algorithms [22].

Fig. 16: Benefits of Neuromorphic Computing
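The collocation of memory and processing described above can be sketched with a toy leaky integrate-and-fire neuron (the class, weights, and constants below are illustrative assumptions, not a model from the paper): the synaptic weights live with the neuron that uses them, and computation happens only when an input spike event arrives.

```python
# A toy event-driven spiking neuron: synaptic weights sit with the neuron
# (memory and processing collocated), and work is done only on spike events
# rather than on every clock tick.

class LIFNeuron:
    def __init__(self, weights, threshold=1.0, leak=0.9):
        self.w = weights          # synaptic weights = local memory
        self.v = 0.0              # membrane potential
        self.threshold = threshold
        self.leak = leak          # decay factor applied per event

    def on_spike(self, synapse):
        """Process a single input spike event arriving on the given synapse."""
        self.v = self.v * self.leak + self.w[synapse]
        if self.v >= self.threshold:   # integrate-and-fire
            self.v = 0.0               # reset after firing
            return True                # emit an output spike event
        return False

neuron = LIFNeuron(weights=[0.6, 0.5])
events = [0, 1, 0]                    # indices of synapses that spike, in order
out = [neuron.on_spike(s) for s in events]
print(out)  # -> [False, True, False]
```

Note that there is no CPU fetching instructions here: the "program" is entirely encoded in the weights and threshold, which is the sense in which neuromorphic hardware replaces explicit instructions with network structure.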
• Cognitive Computing for Natural Language Processing: Neuromorphic processors, such as IBM's TrueNorth or Intel's Loihi, are integrated into the cognitive computing system [30]. These processors mimic the behavior of neurons and synapses, enabling parallel and event-driven processing. The neuromorphic device is crucial for cognitive computing due to its low-power operation and parallel signal processing [35], [36].
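The event-driven processing credited above for low-power operation can be sketched at the network level (the function, fan-out table, and thresholds below are hypothetical, not TrueNorth's or Loihi's actual programming model): only neurons that actually receive a spike event do any work, instead of every neuron being updated on every clock tick.

```python
# Event-driven spike propagation: pending spike events sit in a queue, and
# computation happens only where an event lands -- the basis of the
# low-power, parallel style of processing described above.

from collections import deque

def simulate(fanout, thresholds, initial_spikes):
    """Propagate spike events; fanout maps a neuron to (target, weight) pairs."""
    potential = {n: 0.0 for n in thresholds}
    fired = []
    queue = deque(initial_spikes)          # pending (target, weight) events
    while queue:
        target, weight = queue.popleft()
        potential[target] += weight        # work only where an event lands
        if potential[target] >= thresholds[target]:
            potential[target] = 0.0        # reset after firing
            fired.append(target)
            for nxt, w in fanout.get(target, []):
                queue.append((nxt, w))     # firing generates new events
    return fired

# Neuron A drives B and C; A's spike pushes only B over its threshold.
fanout = {"A": [("B", 0.8), ("C", 0.2)]}
thresholds = {"A": 0.5, "B": 0.5, "C": 0.5}
print(simulate(fanout, thresholds, [("A", 1.0)]))  # -> ['A', 'B']
```

Neuron C receives its event but stays silent, and no other neuron consumes any cycles at all, which is why activity-sparse workloads map so well onto this execution model.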
that several obstacles must be solved before neuromorphic computing can realize its full potential. Considerations for developing and putting into use neuromorphic systems include the difficulty of replicating the brain, the absence of industry standards, the limited applications, and the high expense of doing so. The lack of easily accessible functional software and hardware for the whole computational and computer science community is a significant obstacle to algorithmic and application development for neuromorphic computers. Thus, the domain offers diverse research perspectives.

REFERENCES

[1] Y. Chen, H. H. Li, C. Wu, C. Song, S. Li, C. Min, H.-P. Cheng, W. Wen, and X. Liu, "Neuromorphic computing's yesterday, today, and tomorrow–an evolutional view," Integration, vol. 61, pp. 49–61, 2018.
[2] Z. Yu, A. M. Abdulghani, A. Zahid, H. Heidari, M. A. Imran, and Q. H. Abbasi, "An overview of neuromorphic computing for artificial intelligence enabled hardware-based Hopfield neural network," IEEE Access, vol. 8, pp. 67085–67099, 2020.
[3] I. Sharma and Vanshika, "Evolution of neuromorphic computing with machine learning and artificial intelligence," in 2022 IEEE 3rd Global Conference for Advancement in Technology (GCAT), pp. 1–6, 2022.
[4] C. D. Schuman, S. R. Kulkarni, M. Parsa, J. P. Mitchell, P. Date, and B. Kay, "Opportunities for neuromorphic computing algorithms and applications," Nature Computational Science, vol. 2, no. 1, pp. 10–19, 2022.
[5] S. Furber, "Large-scale neuromorphic computing systems," Journal of Neural Engineering, vol. 13, no. 5, p. 051001, 2016.
[6] T. Manjunath and B. Bhowmik, "Quantum machine learning and recent advancements," in 2023 International Conference on Artificial Intelligence and Smart Communication (AISC), pp. 206–211, 2023.
[7] J. Mack, R. Purdy, K. Rockowitz, M. Inouye, E. Richter, S. Valancius, N. Kumbhare, M. S. Hassan, K. Fair, J. Mixter, and A. Akoglu, "RANC: Reconfigurable architecture for neuromorphic computing," IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 40, no. 11, pp. 2265–2278, 2021.
[8] I. Arikpo, F. Ogban, and I. Eteng, "Von Neumann architecture and modern computers," Global Journal of Mathematical Sciences, vol. 6, no. 2, pp. 97–103, 2007.
[9] M. Godfrey and D. Hendry, "The computer as von Neumann planned it," IEEE Annals of the History of Computing, vol. 15, no. 1, pp. 11–21, 1993.
[10] M. Shaafiee, R. Logeswaran, and A. Seddon, "Overcoming the limitations of von Neumann architecture in big data systems," in 2017 7th International Conference on Cloud Computing, Data Science and Engineering - Confluence, pp. 199–203, 2017.
[11] Z. Cai and X. Li, "Neuromorphic brain-inspired computing with hybrid neural networks," in 2021 IEEE International Conference on Artificial Intelligence and Industrial Design (AIID), pp. 343–347, 2021.
[12] A. Ganguly, R. Muralidhar, and V. Singh, "Towards energy efficient non-von Neumann architectures for deep learning," in 20th International Symposium on Quality Electronic Design (ISQED), pp. 335–342, 2019.
[13] J. Vijaychandra, B. S. Sai, B. S. Babu, and P. Jagannadh, "A comprehensive review on McCulloch-Pitts neuron model," International Journal of Innovative Technology and Exploring Engineering, vol. 8, no. 6, 2019.
[14] P. Verschure, "Connectionist explanation: Taking positions in the mind-brain dilemma," Neural Networks and a New Artificial Intelligence, pp. 133–188, 1997.
[15] R. Eberhart and R. Dobbins, "Early neural network development history: the age of Camelot," IEEE Engineering in Medicine and Biology Magazine, vol. 9, no. 3, pp. 15–18, 1990.
[16] S. B. Furber, F. Galluppi, S. Temple, and L. A. Plana, "The SpiNNaker project," Proceedings of the IEEE, vol. 102, no. 5, pp. 652–665, 2014.
[17] F. Akopyan, J. Sawada, A. Cassidy, R. Alvarez-Icaza, J. Arthur, P. Merolla, N. Imam, Y. Nakamura, P. Datta, G.-J. Nam, B. Taba, M. Beakes, B. Brezzo, J. B. Kuang, R. Manohar, W. P. Risk, B. Jackson, and D. S. Modha, "TrueNorth: Design and tool flow of a 65 mW 1 million neuron programmable neurosynaptic chip," IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 34, no. 10, pp. 1537–1557, 2015.
[18] T. C. Südhof, "The cell biology of synapse formation," Journal of Cell Biology, vol. 220, no. 7, p. e202103052, 2021.
[19] T. Wunderlich, A. F. Kungl, E. Müller, A. Hartel, Y. Stradmann, S. A. Aamir, A. Grübl, A. Heimbrecht, K. Schreiber, D. Stöckel, et al., "Demonstrating advantages of neuromorphic computation: a pilot study," Frontiers in Neuroscience, vol. 13, p. 260, 2019.
[20] O. Krestinskaya, A. P. James, and L. O. Chua, "Neuromemristive circuits for edge computing: A review," IEEE Transactions on Neural Networks and Learning Systems, vol. 31, no. 1, pp. 4–23, 2020.
[21] H. D. Nguyen, J. Yu, L. Xie, M. Taouil, S. Hamdioui, and D. Fey, "Memristive devices for computing: Beyond CMOS and beyond von Neumann," in 2017 IFIP/IEEE International Conference on Very Large Scale Integration (VLSI-SoC), pp. 1–10, 2017.
[22] F. Staudigl, F. Merchant, and R. Leupers, "A survey of neuromorphic computing-in-memory: Architectures, simulators, and security," IEEE Design and Test, vol. 39, no. 2, pp. 90–99, 2022.
[23] X. Zhang, V. Mohan, and A. Basu, "CRAM: Collocated SRAM and DRAM with in-memory computing-based denoising and filling for neuromorphic vision sensors in 65 nm CMOS," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 67, no. 5, pp. 816–820, 2020.
[24] A. Shrestha, H. Fang, Z. Mei, D. P. Rider, Q. Wu, and Q. Qiu, "A survey on neuromorphic computing: Models and hardware," IEEE Circuits and Systems Magazine, vol. 22, no. 2, pp. 6–35, 2022.
[25] D. Kuzum, R. G. D. Jeyasingh, S. Yu, and H.-S. P. Wong, "Low-energy robust neuromorphic computation using synaptic devices," IEEE Transactions on Electron Devices, vol. 59, no. 12, pp. 3489–3494, 2012.
[26] R. Vishwa, R. Karthikeyan, R. Rohith, and A. Sabaresh, "Current research and future prospects of neuromorphic computing in artificial intelligence," in IOP Conference Series: Materials Science and Engineering, vol. 912, p. 062029, IOP Publishing, 2020.
[27] B. Sun, T. Guo, G. Zhou, S. Ranjan, Y. Jiao, L. Wei, Y. N. Zhou, and Y. A. Wu, "Synaptic devices based neuromorphic computing applications in artificial intelligence," Materials Today Physics, vol. 18, p. 100393, 2021.
[28] S. Kumar and B. Bhowmik, "COVID-19 waves and their impacts to society," in 2023 IEEE Guwahati Subsection Conference (GCON), pp. 1–5, 2023.
[29] A. Hindu and B. Bhowmik, "Impact of stress during COVID-19 pandemic," in 2023 9th International Conference on Advanced Computing and Communication Systems (ICACCS), vol. 1, pp. 1719–1724, 2023.
[30] C. D. James, J. B. Aimone, N. E. Miner, C. M. Vineyard, F. H. Rothganger, K. D. Carlson, S. A. Mulder, T. J. Draelos, A. Faust, M. J. Marinella, et al., "A historical survey of algorithms and hardware architectures for neural-inspired and neuromorphic computing applications," Biologically Inspired Cognitive Architectures, vol. 19, pp. 49–64, 2017.
[31] R. Mondal and B. Bhowmik, "Decs: A deep neural network framework for cold start problem in recommender systems," in 2022 IEEE Region 10 Symposium (TENSYMP), pp. 1–6, 2022.
[32] M. R. Prathyusha and B. Biswajit, "IoT-enabled smart applications and challenges," in 2023 8th International Conference on Communication and Electronics Systems (ICCES), pp. 354–360, 2023.
[33] M. R. Prathyusha and B. Biswajit, "IoT evolution and recent advancements," in 2023 9th International Conference on Advanced Computing and Communication Systems (ICACCS), vol. 1, pp. 1725–1730, 2023.
[34] F. C. Bauer, D. R. Muir, and G. Indiveri, "Real-time ultra-low power ECG anomaly detection using an event-driven neuromorphic processor," IEEE Transactions on Biomedical Circuits and Systems, vol. 13, no. 6, pp. 1575–1582, 2019.
[35] S. Yamamichi, A. Horibe, T. Aoki, K. Hosokawa, T. Hisada, and H. Mori, "Implementation challenges for scalable neuromorphic computing," in 2017 Symposium on VLSI Technology, pp. T182–T183, 2017.
[36] K. K. Girish and B. Bhowmik, "Recent advancements and challenges in fintech," in 2023 14th International Conference on Computing Communication and Networking Technologies (ICCCNT), pp. 1–7, 2023.
[37] M. R. Ahmed and B. Sujatha, "A review on methods, issues and challenges in neuromorphic engineering," in 2015 International Conference on Communications and Signal Processing (ICCSP), pp. 0899–0903, 2015.
[38] E. Klein, T. Brown, M. Sample, A. R. Truitt, and S. Goering, "Engineering the brain: ethical issues and the introduction of neural devices," Hastings Center Report, vol. 45, no. 6, pp. 26–35, 2015.
[39] Y. Zhao, J. Kang, and D. Ielmini, "Materials challenges and opportunities for brain-inspired computing," MRS Bulletin, vol. 46, pp. 978–986, 2021.