
2024 Fourth International Conference on Advances in Electrical, Computing, Communication and Sustainable Technologies (ICAECT) | 979-8-3503-4367-0/24/$31.00 ©2024 IEEE | DOI: 10.1109/ICAECT60202.2024.10469389

Evolution of Neuromorphic Computing


Vakada G Sai Sree Vaishnavi, Biswajit Bhowmik
Ishwarchandra Vidyasagar AIT Lab, BRICS Laboratory
Department of Computer Science and Engineering
National Institute of Technology Karnataka
Surathkal, Mangalore-575025, Bharat
Email: {vakadagsaisreevaishnavi.222is035, brb}@nitk.edu.in

Abstract—With the advancement of artificial intelligence (AI) technologies, novel and inventive approaches for addressing complex problems are coming to the forefront. Neuromorphic computing based on AI technologies stands as an exemplar, endeavoring to mimic the human brain's intricate neural architecture and computational principles within electronic devices. Contrary to the conventional Von Neumann architecture, the neuromorphic computing architecture offers a promising path to building intelligent and efficient computational systems that excel in tasks requiring low power consumption, real-time processing, and adaptability. Consequently, it is employed in various applications such as robotics, sensory processing, neuromorphic vision, and edge computing. This paper explores the conventional Von Neumann architecture and outlines its shortcomings. Next, neuromorphic architecture as an alternative and its evolution are described. The characteristics of neuromorphic computing and its diverse applications are then illustrated. The paper also addresses the key challenges hindering neuromorphic computing development.

Index Terms—Neuromorphic Computing; Von Neumann Architecture; Neuromorphic Computing Architecture; Neuromorphic Computing Challenges; SpiNNaker

I. INTRODUCTION

The rapid advancement in AI technologies has resulted in the emergence of divergent solutions to address complex problems [1]. Neuromorphic computing is one such solution, aiming to replicate the structure and functioning of the neural networks in the human brain [2]. Neuromorphic computing provides parallel processing, low power consumption, event-driven processing, real-time processing, and more. These features play a prominent role in providing solutions in various fields, such as artificial intelligence [3], robotics, image and speech recognition, brain-computer interfaces, pattern recognition, sensory processing, and cognitive computing [4]. Neuromorphic engineering is another name for neuromorphic computing: a subfield of computer science and engineering that tries to create computer architectures and systems motivated by the human brain's structure and operation [3], making it possible for them to work faster than conventional computing architectures [5].

The fundamental goal of neuromorphic computing is to develop electronic brain-like systems that can process information like the human brain, extending AI beyond the bounds of conventional computing [2], [6]. The neural networks in the human brain process enormous volumes of data with astounding energy efficiency and agility. By providing unmatched energy efficiency and real-time processing capabilities, these architectures [7] have the potential to revolutionize computing. Neuromorphic computing aims to develop hardware and software that can process data similarly to how the brain's neural networks [3] do, potentially offering several advantages over traditional computing systems. This paper investigates the terrain of neuromorphic computing, its evolutionary trajectory, and its architectural aspects. It emphasizes the paradigm shift from the conventional Von Neumann architecture, delving into the alternative domain of neuromorphic technology. Additionally, the paper underscores the characteristics of neuromorphic computing and its extensive applications. Furthermore, it examines the challenges that hinder the progress of neuromorphic computing.

The rest of the paper is organized as follows: Section II describes the Von Neumann architecture. Section III discusses neuromorphic computing. Section IV covers the characteristics and applications of neuromorphic computing. Section V presents the challenges of neuromorphic computing. Section VI concludes the paper.

II. VON NEUMANN ARCHITECTURE

The Von Neumann architecture, often called the Princeton architecture, is a crucial idea in computer architecture and the foundation of most contemporary computers. John von Neumann, a mathematician and computer scientist, first proposed it in the late 1940s [8].

A. Basic Components

The architecture comprises four fundamental components, which specify how a computer system is organized [8]. Figure 2 shows an abstract view of the Von Neumann architecture.

• Central Processing Unit (CPU): The main component of the architecture is the CPU. The CPU executes instructions and performs calculations, acting as the "brain" of the machine. It comprises the Arithmetic Logic Unit (ALU), which performs mathematical computations, and the Control Unit (CU), which interprets and sequences instructions.

• Memory: The next component is memory, a storage unit. The data and instructions that the CPU requires to perform tasks are stored in memory. The memory in the Von

Authorized licensed use limited to: J.R.D. Tata Memorial Library Indian Institute of Science Bengaluru. Downloaded on April 23,2024 at 17:39:33 UTC from IEEE Xplore. Restrictions apply.
Neumann architecture is a single, integrated component that stores both data and program instructions. This behaviour is described by the "stored-program concept", a defining feature of modern computers.

• Input/Output (I/O) Devices: These devices enable interaction between the computer and the outside world. I/O devices include keyboards, mice, displays, printers, scanners, webcams, and network interfaces. The CPU manages data transfers between memory and the I/O devices.

• Control Unit (CU): The fourth component is the CU, which retrieves program instructions from memory, decodes them, and directs how they are executed. It sends signals to the ALU and other CPU components to perform the required tasks.

Fig. 1: Neuromorphic Computing Architecture

Fig. 2: Von Neumann Architecture

B. Benefits

The Von Neumann architecture serves as the basis for most contemporary computer systems [9], and it has several significant advantages that have helped it become a popular and influential design [8]. A few advantages are shown in Figure 3.

Fig. 3: Benefits of Von Neumann architecture

• Simplicity: The architecture's simplicity makes building and implementing computer systems easier since it is clear-cut and straightforward to comprehend. The overall system structure is simpler because it uses a single shared memory for data and instructions.

• Flexibility: The design permits the simultaneous storage of data and instructions in memory, enabling dynamic modification of programs and data during execution.

• Efficiency: Because of its sequential instruction execution, the design is effective at handling linear operations. The simplicity and effectiveness of the architecture have led to its success in a wide range of computing devices.

• Cost-effectiveness: Thanks to its simplicity, the architecture is less expensive to create and install, easy to maintain, and straightforward to upgrade for multiple applications.

• Universality: The architecture's adaptable instruction set enables it to run various programs without requiring hardware modifications.

• Incremental Development: The architecture enables the progressive growth and enhancement of computer systems. Expanding the instruction set and adding new functionality is possible without significantly altering the architecture.

• Standardisation: The architecture is recognized as a commonly used model in the computing industry. This standardization simplifies programmers' ability to produce software that operates on diverse systems.

C. Challenges

The Von Neumann architecture also presents a set of challenges [10], as shown in Figure 4.

• Von Neumann Bottleneck: The Von Neumann bottleneck is a major problem because the CPU and RAM are connected by the same bus, causing data transfers to be serialized [11].
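The stored-program organisation and the shared-bus serialization described above can be illustrated with a toy machine. The three-instruction ISA below is hypothetical, invented purely as a sketch: code and data occupy one memory, and every instruction fetch and data access passes through the same access function, so the transfer count is a stand-in for traffic over the shared bus.

```python
# Toy stored-program machine (hypothetical LOAD/ADD/STORE/HALT ISA,
# assumed for illustration). Code and data share one memory; every
# fetch or data access goes through bus_read/bus_write -- the single
# shared path whose serialized traffic is the Von Neumann bottleneck.

def run(memory):
    accesses = 0            # counts trips over the shared CPU-memory path

    def bus_read(addr):     # every instruction fetch AND data read uses this
        nonlocal accesses
        accesses += 1
        return memory[addr]

    def bus_write(addr, value):
        nonlocal accesses
        accesses += 1
        memory[addr] = value

    pc, acc = 0, 0          # program counter and accumulator
    while True:
        op, operand = bus_read(pc)      # fetch (one bus transfer)
        if op == "LOAD":                # acc <- memory[operand]
            acc = bus_read(operand)
        elif op == "ADD":               # acc <- acc + memory[operand]
            acc += bus_read(operand)
        elif op == "STORE":             # memory[operand] <- acc
            bus_write(operand, acc)
        elif op == "HALT":
            return acc, accesses
        pc += 1

# Instructions (cells 0-3) and data (cells 4-6) live in the same
# memory: the stored-program concept.
memory = {
    0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", None),
    4: 7, 5: 35, 6: 0,
}
result, transfers = run(dict(memory))
assert result == 42
assert transfers == 7   # 4 instruction fetches + 2 data reads + 1 write
```

Even this three-instruction program needs seven serialized transfers, four of them only to fetch instructions, which is the fetch overhead and serialization the challenges in this section describe.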

• Sequential Execution: For modern, highly parallel applications, sequential instruction execution becomes a constraint, even though it was appropriate for early computing jobs [12].

• Limited Scalability: The shared bus and sequential execution raise scalability problems as the size and complexity of computer systems increase.

• Memory-Processor Mismatch: Data access from memory is much slower than CPU processing speed. This memory-processor mismatch, the discrepancy between CPU and memory speeds, can cause latency problems and performance snags.

• Power Inefficiency: Particularly in contemporary high-performance computing systems, the architecture's reliance on sequential execution and a shared bus contributes to power inefficiency [12].

• Instruction Fetch Overhead: The CPU must spend time retrieving instructions from memory before executing them [11].

• Susceptibility to Malware: The Von Neumann architecture is susceptible to several malware and security exploits, such as buffer overflows and code injection attacks, because data and instructions are stored in the same memory.

• Limited Application of AI and Neural Networks: Artificial intelligence (AI) and neural network applications were not considered when the architecture was designed.

Fig. 4: Drawbacks of Von Neumann architecture

III. NEUROMORPHIC COMPUTING

A new method of computing that takes its cues from the structure and operation of the human brain is known as "neuromorphic computing". Neuromorphic computing aims to develop hardware and software that can process data similarly to how the brain's neural networks [3] do, potentially offering several advantages over traditional computing systems.

A. Evolution of Neuromorphic Computing

The history of neuromorphic computing dates back several decades, with notable milestones and developments. Figure 5 illustrates various neuromorphic computing models that have evolved since 1943.

Fig. 5: History of Neuromorphic Computing

• McCulloch-Pitts Model (1943): In 1943, Warren McCulloch and Walter Pitts put forth a model that has been regarded as the start of the modern era of Artificial Neural Networks (ANNs). This model establishes a logical calculus for ANNs. The neuron in this model permits only the binary states 0 and 1 because it is a binary threshold neuron [13]. The McCulloch-Pitts model is shown in Figure 6.

Fig. 6: McCulloch-Pitts Model

• Perceptron (1957): Frank Rosenblatt introduced the perceptron, a computational model based on the McCulloch-Pitts neuron, capable of learning and making binary decisions. It was one of the earliest examples of a neural network and was influential in the early stages of the development of artificial intelligence and machine learning [1]. Understanding the perceptron model is crucial to understanding the historical context and growth of neural networks, which influenced the development of neuromorphic computing. The structure of the perceptron is shown in Figure 7.

• Connectionism (1980s): In the 1980s, the field of connectionism gained prominence, focusing on the study of neural networks and their applications. Connectionism explains mental phenomena through the parallel activation and interaction of artificial synapses between model neurons [14]. Connectionism is illustrated in Figure 8.

• Neural Network Renaissance (late 1990s): Neural networks experienced a resurgence of interest in the late 1990s due to advances in computing power and the

availability of larger datasets [15]. This period saw the development of more sophisticated neural network architectures. Deeper and more complicated neural networks could be trained using advanced technologies such as powerful GPUs (Graphics Processing Units), TPUs (Tensor Processing Units), and distributed computing.

• SpiNNaker Project (2005): The SpiNNaker (Spiking Neural Network Architecture) project was designed to create a highly parallel neuromorphic computing system capable of simulating complex brain models in real time [3]. The 48-node SpiNNaker PCB (printed circuit board) used in the SpiNNaker project [16] is shown in Figure 9.

• IBM's TrueNorth (2011): IBM Research introduced the TrueNorth neuromorphic chip, which employed a network of spiking neurons. TrueNorth chips were designed to mimic the brain's parallelism and low power consumption, with potential applications in sensory processing and cognitive computing [17]. The IBM TrueNorth chip is shown in Figure 10.

• BrainScaleS Project (2011): The "BrainScaleS" project was a prominent initiative in the field of neuromorphic computing [1]. It focused on implementing large-scale, mixed-signal neuromorphic chips that could emulate the behavior of biological neural networks.

• Intel's Loihi (2017): Intel unveiled Loihi as a self-learning neuromorphic research chip. Loihi employs a highly efficient asynchronous digital architecture with on-chip learning [3]. Intel's Loihi chip is shown in Figure 11.

Fig. 7: Perceptron Model

Fig. 8: Connectionism

Fig. 9: SpiNNaker

Fig. 10: IBM TrueNorth chip

Fig. 11: Intel's Loihi Chip

B. Architecture of Neuromorphic Computing

The architecture of neuromorphic computing is composed of various components that collectively imitate the brain's behavior [7]. To make computer systems more effective at tasks like pattern recognition, learning, and decision-making, the field tries to design and construct computer systems that mirror the neural networks and synapses of the brain. The basic components are described next.

Fig. 12: Neuron

Fig. 13: Synapse

Fig. 14: Neural Network

Fig. 15: Spiking Activity [4]. (a) Structure of a spike-based quasi-backpropagation network; (b) training of a standard ANN; (c) architecture of a typical reservoir computing solution; (d) structures and parameters of an SNN.

1) Neurons: Neuromorphic systems use artificial neurons as the basic building blocks. "Spikes", or action potentials, are the primary means of communication between neurons. These are asynchronous impulses in which the only information sent is the identity of the spiking neuron and the moment at which it spiked, produced by the electrochemical regeneration mechanism employed to ensure the reliable propagation of signals along the "wires" [16]. Figure 12 shows the neuron's structure.

2) Synapses: Synapses in a brain circuit quickly transport information between neurons while also transforming that information. The connections between pre- and postsynaptic neurons influence the many computational characteristics of synapses. It is largely unknown how synapses come together to create a brain circuit and how the specificity of synaptic connections is produced [18]. A synapse is shown in Figure 13.

3) Neural Network: Neuromorphic systems organize neurons and synapses into neural networks. Depending on the specific application and computational requirements, these networks can have different topologies, such as feedforward, recurrent, or spiking neural networks. Figure 14 shows the neural network structure.

4) Spiking Activity: Unlike traditional computing systems that rely on continuous-valued signals, neuromorphic systems often employ spiking neural networks (SNNs), where information is encoded in discrete spikes or pulses. Spiking activity is a crucial feature of this architecture, enabling efficient event-driven computation and communication [4]. Figure 15 shows the stages of spiking activity.

5) Plasticity: One of the critical features of neuromorphic computing is synaptic plasticity, which refers to the ability of synapses to adapt and change their strength based on activity patterns and learning rules. Plasticity allows for learning, memory, and the ability to recognize patterns and make predictions.

C. Benefits of Neuromorphic Computing

The neuromorphic computing paradigm provides multiple benefits, as shown in Figure 16.

• Robustness and fault tolerance: Robustness and fault tolerance are provided by the distributed nature of neuromorphic architectures [19]. The system is more resilient in real-world applications because it can keep running even if some of its components malfunction.

• Brain-Inspired Algorithms: Neuromorphic computing promotes the creation of brain-inspired algorithms, which

may result in creative and practical answers to various computational issues.

• Applications for Edge Computing: Neuromorphic computing is highly suited for edge computing applications because of its energy-efficient and real-time processing capabilities [20]. It eliminates the need to send massive volumes of data to centralized cloud servers by processing data locally on the device or at the network's edge.

• Neuromorphic Hardware: Driven by neuromorphic computing, specialized hardware is being created that can optimize neural computations and speed up AI activities, resulting in more effective and potent computer systems [19].

• Brain-computer interactions: By permitting a more direct and practical connection between the human brain and computers or robotic systems, neuromorphic computing can improve brain-computer interactions.

Fig. 16: Benefits of Neuromorphic Computing

D. Von Neumann Vs Neuromorphic Computers

A neuromorphic computer is more powerful than a Von Neumann computer. Figure 17 shows an essential comparison between them.

Fig. 17: Neuromorphic vs Von Neumann computer by characteristics

Neuromorphic computers have neural connections modeled after the human brain and operate unlike Von Neumann computers [3]. Neuromorphic systems provide unique and innovative solutions for artificial intelligence compared to the Von Neumann computer architecture. In contrast to traditional computing's Von Neumann architecture, which has separate memory and processing units, neuromorphic computing aims to replicate the neural networks and cognitive processes of the brain to achieve enhanced efficiency. In a neuromorphic computer, processing and memory are handled by the neurons and synapses, whereas in the Von Neumann architecture, memory and processing are controlled by a central processing unit (CPU) [21]. Neuromorphic computers utilize the features and organization of a neural network rather than the explicit instructions that define programming on Von Neumann computers. Significant and unique requirements for neuromorphic architectures include increased connectivity and parallelism, low power consumption, and the collocation of memory and processing. Compared to conventional Von Neumann architectures, a neuromorphic architecture can execute complicated computations quickly, using less power and leaving a smaller environmental footprint. Since these properties address the Von Neumann architecture's bottleneck, the neuromorphic architecture has been considered suitable for implementing machine learning algorithms [22].

IV. CHARACTERISTICS AND APPLICATIONS OF NEUROMORPHIC COMPUTING

Neuromorphic computing can transform industries end to end, from software to hardware. The paradigm is advancing in diverse directions.

A. Characteristics of Neuromorphic Computing

Figure 18 presents essential neuromorphic computing characteristics.

Fig. 18: Characteristics of Neuromorphic computing

• High parallel operation: Parallelism is a fundamental and essential aspect of neuromorphic computing. The interconnected network of synthetic neurons and synapses, modeled after the composition and operation of organic brain networks, enables this parallel processing [4].

• Collocated processing and memory: The concept of a separation between processor and memory is absent in

neuromorphic technology [4]. Collocating processing and memory reduces the Von Neumann bottleneck caused by the processor/memory separation, which limits the maximum achievable throughput [23].

• Inherent scalability: Neuromorphic computers are built to be intrinsically scalable [24], because adding additional neuromorphic chips increases the possible number of neurons and synapses. It is conceivable to treat several physical neuromorphic chips as a single enormous neuromorphic implementation to run larger and larger networks [25].

• Event-driven computation: As neuromorphic systems are event-driven, they only process data when the input changes or a pertinent event occurs. By avoiding pointless computations, this event-driven nature lessens the computing burden and conserves power.

B. Applications of Neuromorphic Computing

Neuromorphic computing is widely used in various fields, as shown in Figure 19.

Fig. 19: Applications of Neuromorphic Computing

• AI for Image Recognition: A neuromorphic computing architecture, such as a spiking neural network (SNN), can be employed for image recognition [26]. SNNs mimic the behavior of biological neurons, enabling efficient and event-based processing [27]–[29].

• Robotics for Object Recognition and Navigation: Neuromorphic computing can be applied in robotics, specifically for object recognition and navigation tasks [30], [31].

• IoT for Anomaly Detection in Smart Environments: A neuromorphic computing architecture, such as an SNN, can be utilized for anomaly detection. The SNN is trained to learn the standard patterns and behaviors of the IoT system using labeled data [32], [33]. The neuromorphic processor then processes event-based sensor data in real time and detects anomalies based on deviations from the learned patterns [34]. The anomaly detection model is trained using neuromorphic learning algorithms such as spike-timing-dependent plasticity (STDP).

• Cognitive Computing for Natural Language Processing: Neuromorphic processors, such as IBM's TrueNorth or Intel's Loihi, can be integrated into a cognitive computing system [30]. These processors mimic the behavior of neurons and synapses, enabling parallel and event-driven processing. The neuromorphic device is crucial for cognitive computing due to its low-power operation and parallel signal processing [35], [36].

V. CHALLENGES OF NEUROMORPHIC COMPUTING

Although neuromorphic computing has potential for various applications, it also faces some challenges. Figure 20 shows a few of them.

Fig. 20: Challenges of Neuromorphic computing

• Building neuromorphic systems may be challenging because their intricate structures make it tough to accurately duplicate a specific behaviour in one system [37].

• Due to their processing powers, neuromorphic systems raise ethical issues [38]. Many people are concerned about the misuse or unauthorized use of these technologies because no particular legal frameworks control their use.

• Discussions on the uses and implications of this technology frequently center on the social effects of giving these programs human rights [39].

• Although second-generation neuromorphic sensor technology is effective, there are still issues with its dependability, speed, and accuracy in particular applications, such as voice recognition, where background noise can cause incorrect data entry [39].

VI. CONCLUSION

In conclusion, neuromorphic computing is a fascinating and promising area of research. It aims to mimic the organization and operation of the human brain's neural networks. By taking cues from nature, neuromorphic computing tries to overcome the constraints of conventional computing and open up fresh possibilities for effectively solving challenging issues.

With the development of more effective and potent computer systems, neuromorphic computing promises to revolutionize the field of artificial intelligence. It is crucial to note

that several obstacles must be overcome before neuromorphic computing can realize its full potential. Considerations in developing and deploying neuromorphic systems include the difficulty of replicating the brain, the absence of industry standards, the limited applications, and the high expense involved. The lack of easily accessible functional software and hardware for the whole computational and computer science community is a significant obstacle to algorithmic and application development for neuromorphic computers. Thus, the domain has diverse research perspectives.

REFERENCES

[1] Y. Chen, H. H. Li, C. Wu, C. Song, S. Li, C. Min, H.-P. Cheng, W. Wen, and X. Liu, "Neuromorphic computing's yesterday, today, and tomorrow – an evolutional view," Integration, vol. 61, pp. 49–61, 2018.
[2] Z. Yu, A. M. Abdulghani, A. Zahid, H. Heidari, M. A. Imran, and Q. H. Abbasi, "An overview of neuromorphic computing for artificial intelligence enabled hardware-based Hopfield neural network," IEEE Access, vol. 8, pp. 67085–67099, 2020.
[3] I. Sharma and Vanshika, "Evolution of neuromorphic computing with machine learning and artificial intelligence," in 2022 IEEE 3rd Global Conference for Advancement in Technology (GCAT), pp. 1–6, 2022.
[4] C. D. Schuman, S. R. Kulkarni, M. Parsa, J. P. Mitchell, P. Date, and B. Kay, "Opportunities for neuromorphic computing algorithms and applications," Nature Computational Science, vol. 2, no. 1, pp. 10–19, 2022.
[5] S. Furber, "Large-scale neuromorphic computing systems," Journal of Neural Engineering, vol. 13, no. 5, p. 051001, 2016.
[6] T. Manjunath and B. Bhowmik, "Quantum machine learning and recent advancements," in 2023 International Conference on Artificial Intelligence and Smart Communication (AISC), pp. 206–211, 2023.
[7] J. Mack, R. Purdy, K. Rockowitz, M. Inouye, E. Richter, S. Valancius, N. Kumbhare, M. S. Hassan, K. Fair, J. Mixter, and A. Akoglu, "RANC: Reconfigurable architecture for neuromorphic computing," IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 40, no. 11, pp. 2265–2278, 2021.
[8] I. Arikpo, F. Ogban, and I. Eteng, "Von Neumann architecture and modern computers," Global Journal of Mathematical Sciences, vol. 6, no. 2, pp. 97–103, 2007.
[9] M. Godfrey and D. Hendry, "The computer as von Neumann planned it," IEEE Annals of the History of Computing, vol. 15, no. 1, pp. 11–21, 1993.
[10] M. Shaafiee, R. Logeswaran, and A. Seddon, "Overcoming the limitations of von Neumann architecture in big data systems," in 2017 7th International Conference on Cloud Computing, Data Science and Engineering (Confluence), pp. 199–203, 2017.
[11] Z. Cai and X. Li, "Neuromorphic brain-inspired computing with hybrid neural networks," in 2021 IEEE International Conference on Artificial Intelligence and Industrial Design (AIID), pp. 343–347, 2021.
[12] A. Ganguly, R. Muralidhar, and V. Singh, "Towards energy efficient non-von Neumann architectures for deep learning," in 20th International Symposium on Quality Electronic Design (ISQED), pp. 335–342, 2019.
[13] J. Vijaychandra, B. S. Sai, B. S. Babu, and P. Jagannadh, "A comprehensive review on McCulloch-Pitts neuron model," International Journal of Innovative Technology and Exploring Engineering, vol. 8, no. 6, 2019.
[14] P. Verschure, "Connectionist explanation: Taking positions in the mind-brain dilemma," in Neural Networks and a New Artificial Intelligence, pp. 133–188, 1997.
[15] R. Eberhart and R. Dobbins, "Early neural network development history: The age of Camelot," IEEE Engineering in Medicine and Biology Magazine, vol. 9, no. 3, pp. 15–18, 1990.
[16] S. B. Furber, F. Galluppi, S. Temple, and L. A. Plana, "The SpiNNaker project," Proceedings of the IEEE, vol. 102, no. 5, pp. 652–665, 2014.
[17] F. Akopyan, J. Sawada, A. Cassidy, R. Alvarez-Icaza, J. Arthur, P. Merolla, N. Imam, Y. Nakamura, P. Datta, G.-J. Nam, B. Taba, M. Beakes, B. Brezzo, J. B. Kuang, R. Manohar, W. P. Risk, B. Jackson, and D. S. Modha, "TrueNorth: Design and tool flow of a 65 mW 1 million neuron programmable neurosynaptic chip," IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 34, no. 10, pp. 1537–1557, 2015.
[18] T. C. Südhof, "The cell biology of synapse formation," Journal of Cell Biology, vol. 220, no. 7, p. e202103052, 2021.
[19] T. Wunderlich, A. F. Kungl, E. Müller, A. Hartel, Y. Stradmann, S. A. Aamir, A. Grübl, A. Heimbrecht, K. Schreiber, D. Stöckel, et al., "Demonstrating advantages of neuromorphic computation: A pilot study," Frontiers in Neuroscience, vol. 13, p. 260, 2019.
[20] O. Krestinskaya, A. P. James, and L. O. Chua, "Neuromemristive circuits for edge computing: A review," IEEE Transactions on Neural Networks and Learning Systems, vol. 31, no. 1, pp. 4–23, 2020.
[21] H. D. Nguyen, J. Yu, L. Xie, M. Taouil, S. Hamdioui, and D. Fey, "Memristive devices for computing: Beyond CMOS and beyond von Neumann," in 2017 IFIP/IEEE International Conference on Very Large Scale Integration (VLSI-SoC), pp. 1–10, 2017.
[22] F. Staudigl, F. Merchant, and R. Leupers, "A survey of neuromorphic computing-in-memory: Architectures, simulators, and security," IEEE Design and Test, vol. 39, no. 2, pp. 90–99, 2022.
[23] X. Zhang, V. Mohan, and A. Basu, "CRAM: Collocated SRAM and DRAM with in-memory computing-based denoising and filling for neuromorphic vision sensors in 65 nm CMOS," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 67, no. 5, pp. 816–820, 2020.
[24] A. Shrestha, H. Fang, Z. Mei, D. P. Rider, Q. Wu, and Q. Qiu, "A survey on neuromorphic computing: Models and hardware," IEEE Circuits and Systems Magazine, vol. 22, no. 2, pp. 6–35, 2022.
[25] D. Kuzum, R. G. D. Jeyasingh, S. Yu, and H.-S. P. Wong, "Low-energy robust neuromorphic computation using synaptic devices," IEEE Transactions on Electron Devices, vol. 59, no. 12, pp. 3489–3494, 2012.
[26] R. Vishwa, R. Karthikeyan, R. Rohith, and A. Sabaresh, "Current research and future prospects of neuromorphic computing in artificial intelligence," in IOP Conference Series: Materials Science and Engineering, vol. 912, p. 062029, IOP Publishing, 2020.
[27] B. Sun, T. Guo, G. Zhou, S. Ranjan, Y. Jiao, L. Wei, Y. N. Zhou, and Y. A. Wu, "Synaptic devices based neuromorphic computing applications in artificial intelligence," Materials Today Physics, vol. 18, p. 100393, 2021.
[28] S. Kumar and B. Bhowmik, "Covid-19 waves and their impacts to society," in 2023 IEEE Guwahati Subsection Conference (GCON), pp. 1–5, 2023.
[29] A. Hindu and B. Bhowmik, "Impact of stress during covid-19 pandemic," in 2023 9th International Conference on Advanced Computing and Communication Systems (ICACCS), vol. 1, pp. 1719–1724, 2023.
[30] C. D. James, J. B. Aimone, N. E. Miner, C. M. Vineyard, F. H. Rothganger, K. D. Carlson, S. A. Mulder, T. J. Draelos, A. Faust, M. J. Marinella, et al., "A historical survey of algorithms and hardware architectures for neural-inspired and neuromorphic computing applications," Biologically Inspired Cognitive Architectures, vol. 19, pp. 49–64, 2017.
[31] R. Mondal and B. Bhowmik, "Decs: A deep neural network framework for cold start problem in recommender systems," in 2022 IEEE Region 10 Symposium (TENSYMP), pp. 1–6, 2022.
[32] M. R. Prathyusha and B. Biswajit, "IoT-enabled smart applications and challenges," in 2023 8th International Conference on Communication and Electronics Systems (ICCES), pp. 354–360, 2023.
[33] M. R. Prathyusha and B. Biswajit, "IoT evolution and recent advancements," in 2023 9th International Conference on Advanced Computing and Communication Systems (ICACCS), vol. 1, pp. 1725–1730, 2023.
[34] F. C. Bauer, D. R. Muir, and G. Indiveri, "Real-time ultra-low power ECG anomaly detection using an event-driven neuromorphic processor," IEEE Transactions on Biomedical Circuits and Systems, vol. 13, no. 6, pp. 1575–1582, 2019.
[35] S. Yamamichi, A. Horibe, T. Aoki, K. Hosokawa, T. Hisada, and H. Mori, "Implementation challenges for scalable neuromorphic computing," in 2017 Symposium on VLSI Technology, pp. T182–T183, 2017.
[36] K. K. Girish and B. Bhowmik, "Recent advancements and challenges in fintech," in 2023 14th International Conference on Computing Communication and Networking Technologies (ICCCNT), pp. 1–7, 2023.
[37] M. R. Ahmed and B. Sujatha, "A review on methods, issues and challenges in neuromorphic engineering," in 2015 International Conference on Communications and Signal Processing (ICCSP), pp. 0899–0903, 2015.
[38] E. Klein, T. Brown, M. Sample, A. R. Truitt, and S. Goering, "Engineering the brain: Ethical issues and the introduction of neural devices," Hastings Center Report, vol. 45, no. 6, pp. 26–35, 2015.
[39] Y. Zhao, J. Kang, and D. Ielmini, "Materials challenges and opportunities for brain-inspired computing," MRS Bulletin, vol. 46, pp. 978–986, 2021.

