Memristor-Based Neural Network Implementation With Adjustable Synaptic Weights in LTSPICE

2023 International Conference Automatics and Informatics (ICAI) | 979-8-3503-1291-1/23/$31.00 ©2023 IEEE | DOI: 10.1109/ICAI58806.2023.10339092

Abstract — Memristors are innovative electronic elements with a nano-sized structure and very good memory and switching abilities. They have very low power consumption and good compatibility with CMOS integrated chips, and they can be used in neural networks, memories, and many other schematics. In this paper, an LTSPICE model of an artificial neural network with memristor-based synapses is proposed. In this network, each synapse is realized with only one memristor, providing a substantial reduction in circuit complexity, with the main benefit that each individual memristor resistance value can be adjusted with external control voltage signals. The summing and scaling components are implemented with op-amps and memristors. We use the most common logarithmic-sigmoidal activation function, realized by a voltage-controlled source. The operation of the proposed memristor neural network is analyzed and simulated in both LTSPICE and MATLAB, and the derived results are compared and verified successfully. The proposed memristor-based neural network is a significant step toward engineering low-power complex networks in very high-density integrated circuits and chips.

Keywords — memristor, neural network, activation function, modelling and simulation, electrical engineering, artificial intelligence, LTSPICE, MATLAB

I. INTRODUCTION

Neural networks are architectures inspired by the electro-chemical communication between neural cells in the human brain and other similar biological units [1], [2]. The application of neural networks provides parallelism, energy efficiency and potential usage in low-power electronic modules [3], [4]. Neuromorphic chips have the ability to realize complex computations quickly and effectively while dissipating minimal power. In the past, optical neural networks were the dominant solution for realizing analogue neural networks, using light instead of electrical signals for the neural computations, image and video processing, and other applications [5], [6], [7]. In recent years, some of the newest advancements in the neuromorphic computing field are neural networks based on memristor logic [8], [9]. Memristors are two-terminal, passive electronic elements with the ability to store and retain the electric charge passed through them [10], [11]. The resistance of memristors can be altered by pulses with specific amplitudes and durations; therefore, memristors can function as tunable resistors [12]. They can be utilized in analogue electric circuit implementations of neural networks and are associated with computing and memory in a single unit [13], [14], [15]. Memristors can accumulate amounts of information proportional to the change of their resistance [16], [17], [18].

In analog implementations of machine learning and artificial intelligence, memristor-based neural networks can be applied in cases where very low power consumption, high speed and real-time signal processing are required simultaneously, and they can help the direct porting of trained classical neural networks into hardware. During the early 1990s, a problem occurred in the realization of optoelectronic analog neural networks, related to the impossibility of realizing negative synaptic weights [19]. To overcome this limitation, an appropriate solution was proposed: scaling the input signals and weights into the range between 0 and 1, and applying the "logsig" activation function instead of the standard "tansig" function [20]. A modified learning algorithm was applied to enable operation with positive synaptic weights. All input signals and the respective weights were limited to positive values, and the weights were scaled and limited between zero and unity. When trying to scale or transfer the synaptic weights of a neural net trained with positive and negative weights into the range between 0 and 1, the corresponding outputs can no longer be reproduced. Therefore, it becomes necessary to train the neural network with a modified algorithm; in the context of circuit realization, the consequence is that an already trained classical neural network could not be used. With the use of memristors, this limitation can be remedied.

The first physical memristor element [21], based on titanium dioxide, was invented in the Hewlett-Packard research labs by Stanley Williams [10], [22]. Some metal oxides, such as hafnium dioxide, tantalum oxide, niobium oxide and many others, are applied for implementations of memristors [23], [24]. Some useful properties of memristors are their switching and memory effects, very low power dissipation, high switching rate, nano-scale sizes and very good compatibility with CMOS integrated chips [25], [26]. Memristors are successfully applicable in memory arrays, reconfigurable electronic circuits, neural networks, and many other applications [27], [28]. The utilization of memristors in neural networks is a novel trend; after a literature overview on the realization of a synapse [3], [9], [25], [26], [27], [34], [35], [36], [37], [38], [39], we found that usually three or more memristors and two or more operational amplifiers are used. This leads to a very high growth of the number of electronic components, and the complexity of the hardware-implemented neural networks increases correspondingly. As described, in the optical neural network implementation only positive weights could be used, requiring a different, modified training algorithm. With memristor-based weight coefficients, the added benefit is that both positive and negative synaptic weights can be implemented, giving the option to apply classical feedforward multilayer or deep neural network architectures directly. Our motivation for a new contribution was to design, simulate and verify whether potential implementations of
Authorized licensed use limited to: MANIPAL INSTITUTE OF TECHNOLOGY. Downloaded on March 25,2025 at 10:29:30 UTC from IEEE Xplore. Restrictions apply.
neural networks with reduced complexity are possible, with synaptic circuits realized using only one memristor per synapse. With memristor logic, we can obtain tunable gain coefficients through adjustable resistance levels. With an inverting summing operational amplifier, we can realize negative weights, with an amplification ratio set by resistance ratios. This provides the possibility to realize each synapse with a single memristor, substantially reducing the complexity, with both positive and negative weights and both positive and negative inputs, and enabling the direct application of classical feedforward multilayer or deep learning neural networks in electronic implementations.

For verification of this idea, a simulation model needs to be implemented, and the best simulation environments are the Simulation Program with Integrated Circuit Emphasis (SPICE) software products [29]. LTSPICE has been the preferable environment for electronic circuit analyses and simulations in recent years, due to its free license, strong company support and updates, user-friendly interface, and very good convergence [30]. Due to this, LTSPICE is used in the present work. The main purpose of this paper is to suggest a simple memristor-based neural network with positive and negative synaptic weights and a minimal number of electronic elements in the LTSPICE environment. To realize this objective and simplify the model, every neuron has two major types of inputs, one for the positive and one for the negative weights. This is implemented with memristors and op-amps, where the respective synapses are realized by single memristor elements. The activation functions are realized by voltage-controlled sources in LTSPICE. For model verification and comparison, an example trained neural network with six inputs, three neurons in one hidden layer and two neurons in the output layer is created and analyzed in both the LTSPICE and MATLAB environments.

The rest of the paper is organized as follows. Section 2 presents memristor operation, modeling and memristor-based neural networks. The used simple LTSPICE memristor model is described in Section 3. The proposed memristor-based neural network in LTSPICE is presented in Section 4, together with analysis and verification of the derived results, followed by the Conclusion and References sections.

II. MEMRISTORS, MODELING AND MEMRISTOR-BASED NEURAL NETWORKS

For better understanding of the following paragraphs, a brief overview of the basic aspects of the operation and modeling of memristors is first provided.

A. A short description of memristor operation

The switching and memory properties of memristors are related to their capability to alter their resistance, which is proportional to a state variable, when external voltage or current signals are applied [12], [31]. As written in [], the state variable x expresses the ratio between the length of the doped layer and the overall length of the memristor. The state variable is between zero and unity [32].

B. Memristor Modeling

Each memristor model contains two equations [12]. The first one expresses the i-v relation, and the second one relates the time derivative of the memristor state variable to the current (or the voltage). A collection of frequently used metal-oxide memristor models, such as those of Williams-Strukov [10], Joglekar and Biolek [12], is represented by the following system of equations [12], [14]:

  i = v / (RON·x + ROFF·(1 − x))
  dx/dt = k · i · f(x)                                  (1)

where RON and ROFF are the respective ON-state and OFF-state resistances, k is a physical constant dependent on memristor parameters [12], and f(x) is a window function applied for restriction of the state variable x [14]. A widely applied window is the one proposed by Biolek [12]:

  fB(i, x) = 1 − (x − stp(−i))^(2p)                     (2)

Here, p is a positive integer exponent and stp is the classical Heaviside step function [12]. The Lehtonen-Laiho model is a frequently used one; it has good accuracy and operates at high-frequency signals. It is represented by Eq. (3) [13]:

  i = χ·[exp(γv) − 1] + x^n · β · sinh(αv)
  dx/dt = a · v^m · fB(i, x)                            (3)

where the coefficients α, β, n, χ, γ, m and a are used for its adjustment according to experimental i-v relations [13]. In [15], a simple model based on the Biolek memristor model and a Hann window is presented, where j is a physical constant [15]:

  i = v / (RON·x + ROFF·(1 − x))
  dx/dt = j · i · (1/2)·(1 − cos(2πx))                  (4)

Fig. 1 a) A schematic of a memristor-based neural network; b) A block schematic of a neuron with memristor synapses

A simple memristor-based neural network is presented in Fig. 1 a) for explanation of its structure and functioning. A detailed diagram of a neuron with memristor-based synapses is shown in Fig. 1 b). The input signals are denoted as x1, x2, … and xn. They are applied to memristor synapses with weights denoted as w1, w2, … and wn. Each synapse is realized by a memristor whose resistance is related to the respective synaptic weight w. The signals obtained after the synapses are applied to a summing scheme, implemented with memristors and op-amps operating as linear resistors. The signal gained after the adder is denoted as y_in and is
applied to a log-sig activation function, implemented by a voltage-dependent voltage source. The output signal of the memristor-based neuron is denoted by y.

The neural network contains input nodes for applying the input signals, a hidden layer and an output layer [20], [33]. The operation of the discussed memristor-based neural network is related to a feed-forward and back-error correction algorithm [20]. Various activation functions can be used after the adder. Smooth and differentiable transfer functions, such as the logarithmic-sigmoidal and tangent-sigmoidal expressions, are regularly used. A relay transfer function is also used in neural networks. Linear activation functions are also applied, especially in the neurons of the output layers, using a gain factor. In the next section, a modified version of an enhanced memristor model [32] is presented.

III. THE APPLIED LTSPICE MEMRISTOR MODEL

The first equation of the modified model is based on the Biolek memristor model [12]. The second one is founded on the Lehtonen-Laiho model and utilizes a sine window sin²(πx):

  i = v / (RON·x + ROFF·(1 − x))
  dx/dt = k · v^m · sin²(πx) · stp(v − vthr)            (5)

The used voltage activation threshold is vthr = 0.2 V. The minimal resistance value is 100 Ω, while the maximal is 16 kΩ. When the voltage is at levels between -0.2 V and 0.2 V, the memristor acts as a common resistor, while when the input signals are above the activation threshold, the memristor resistance is changed [12]. The utilized memristor model is analyzed for sinusoidal and impulse signals in soft-switching and hard-switching modes. Using (5), the LTSPICE memristor model is obtained. It corresponds to the mathematical model, representing the i-v relation and the state equation. The LTSPICE code of the discussed memristor model is given below for additional explanation.

1 .subckt A14m te be
2 .params ron=100 roff=16e3 m=3 k=10 mm=1e-21 m0=300 vthr=0.2
3 C1 Y 0 {1}
4 .IC V(Y)={(roff-m0)/(roff-ron)}
5 R1 Y 0 10G
6 G2 0 Y value={((k*pow(V(te,be),m))*((pow(sin(pi*V(Y)),2)))*(stpp((abs(V(te,be))-vthr),mm)))}
7 G1 te be value={V(te,be)*((1/(ron*(V(Y))+roff*(1-V(Y)))))}
8 .func stpp(x,p)={0.5*(1+(x/sqrt(pow(x,2)+p)))}
9 .ends A14m

The first row contains the name of the model, denoted by A14m. The terminals are depicted as te (top electrode) and be (bottom electrode). The parameters RON, ROFF, m, k and vthr are set in the next row. The initial value of the voltage of the integrating capacitor C1 is proportional to the respective memristance. The fifth row represents the additional resistor Rad of 10 GΩ, connected in parallel to capacitor C1. The controlled source G2 in the sixth row expresses the time derivative of the memristor state variable x. The source G1 presents the memristor current. The code finishes with the command ".ends" [18], [30]. The code is included in a library, available at https://github.com/mladenovvaleri/Advanced-Memristor-Modeling-in-LTSpise [18].

The model is analyzed with sinusoidal signals with an amplitude of 1.3 V and different frequencies. The derived i-v relationships are presented in Fig. 2 for confirmation of the correct operation of the model. When the signal frequency increases, a decrease of the area of the i-v loop is observed. This is in agreement with the main memristor fingerprints [12].

Fig. 2 Current-voltage characteristics of applied memristor model at different frequencies: a) f1 = 600 Hz; b) f2 = 3 kHz; c) f3 = 200 kHz

IV. THE PROPOSED LTSPICE MODEL OF MEMRISTOR-BASED NEURAL NETWORK. ANALYSIS AND SIMULATIONS OF THE PROPOSED NEURAL NETWORK

In order for the schematic to work properly, we need to keep all input voltage signals to the memristors below the activation threshold of 0.2 V. To ease the computations and avoid reaching 0.2 V in border cases, we scale and rescale all memristor inputs to a maximal voltage level of 100 mV, which guarantees that the memristors always operate normally, in their resistor-mode range. For some junctions, where signals higher than 0.2 V are always present, we use memristors at their cut-off value, which provides a resistance of 16000 Ω regardless of the voltage. In this respect, the input voltages are in the -0.1 V to 0.1 V range. The resulting signals after the sigmoidal activation function are between 0 and 1 V, and we scale them down by a coefficient of 0.1, resulting in voltages of -0.1 V to 0.1 V at the inputs of the hidden layers; together with a multiplication of the weight coefficients by 10, this yields a one-to-one weight mapping from a trained neural network to an electronic analogue circuit implementation.

In Fig. 3, a principal schematic of the applied memristor-based neuron is presented for description of its structure and operation. A neuron in LTSPICE is shown in Fig. 4.

Fig. 3 A memristor-based neuron with positive and negative synaptic weights

In Fig. 4, the structure of an artificial neuron in LTSPICE is presented.

Fig. 4 LTSPICE schematic model of a neuron
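The qualitative behavior described above can be cross-checked outside LTSPICE. Below is a minimal Python sketch of our own (not part of the paper's toolchain): a forward-Euler integration of state equation (5), using the parameter values and the smoothed step function stpp from the A14m subcircuit listing; the step count and cycle count are arbitrary choices. It reproduces two fingerprints of the model: below the 0.2 V threshold the state variable does not move, so the element behaves as a plain resistor, and the state excursion per period, and with it the i-v loop area, shrinks as the excitation frequency grows.

```python
import math

def simulate(freq, amp=1.3, cycles=2, steps_per_cycle=20000,
             ron=100.0, roff=16e3, k=10.0, m=3, vthr=0.2,
             m0=300.0, mm=1e-21):
    """Forward-Euler integration of the modified memristor model:
    i = v/(ron*x + roff*(1-x)),
    dx/dt = k * v^m * sin^2(pi*x) * stpp(|v| - vthr, mm).
    Returns the maximal deviation of the state x from its initial value."""
    x = (roff - m0) / (roff - ron)          # initial state, as in the .IC row
    x0 = x
    dt = 1.0 / (freq * steps_per_cycle)
    max_dev = 0.0
    for n in range(cycles * steps_per_cycle):
        v = amp * math.sin(2 * math.pi * freq * n * dt)
        # smoothed Heaviside step stpp(x, p) from row 8 of the subcircuit
        arg = abs(v) - vthr
        stp = 0.5 * (1.0 + arg / math.sqrt(arg * arg + mm))
        dxdt = k * v**m * math.sin(math.pi * x)**2 * stp
        x = min(max(x + dxdt * dt, 0.0), 1.0)   # keep x in [0, 1]
        max_dev = max(max_dev, abs(x - x0))
    return max_dev
```

With the default 1.3 V amplitude the state excursion at 600 Hz is orders of magnitude larger than at 200 kHz, matching the loop shrinkage seen in Fig. 2, while a 100 mV amplitude leaves the state essentially frozen (resistor mode).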
In order to verify the model, we first train a neural network with specific design features and an architecture that is simple for visualization and comparison. For this purpose, we used the interactive example from playground.tensorflow.org to train a neural network with a 6-dimensional input space, 3 hidden neurons and 2 output neurons, as a structure complex enough for the comparison but still simple enough to signal-trace and analyze, as shown in Fig. 5. In Fig. 6 we present the resulting LTSPICE model of the proposed memristor-based neural network with 6 inputs connected to 3 hidden neurons and with 2 output units.

Fig. 5 The trained neural network model of the example that is to be implemented in LTSPICE

Table2    Sum1      Sum2     Sum3     Tf1      Tf2      Tf3      Sum4     Sum5      Out1 (TF4)  Out2 (TF5)
LTSPICE   -0.9754V  9.5241V  2.2586V  0.2738V  0.9999V  0.9054V  1.0936V  -0.8920V  0.7491V     0.2907V
MATLAB    -0.9754V  9.5388V  2.2589V  0.2738V  0.9999V  0.9054V  1.0943V  -0.8921V  0.7492V     0.2907V
In Table 1 are presented the levels of the output signals and all relevant resulting signals in LTSPICE, compared to the results derived in MATLAB.

Table 1 Input signals, synaptic weights and output signals

...and U18 correspond to v1 and v2, while U19 – U22 are related to v3 – v6. The op-amps U3 and U6 are supplied by a DC voltage of 3 V. The feedback memristors M7, M8 and M9 can be replaced by memristor blocks containing two or more memristors connected in series, if a higher base resistance for the gain ratio is demanded, and to avoid the corresponding voltage exceeding the activation threshold when values below 16 kΩ are desired. The exact number of memristors in a block can be calculated using the levels and number of the input signals, the synaptic weights and the feedback resistors. The activation function is realized by a voltage-controlled voltage source.

The neuron is analyzed for several definite cases, and the levels of the input signals are presented in Table 2. The weights of the synapses and the corresponding resistances of memristors M1 – M6 are also presented in Table 2. The time diagrams of the input signals are presented in Fig. 7 a) for visual comparison. The corresponding diagrams of the signals after the adders are shown in Fig. 7 b), and the output signals in Fig. 7 c). Observing the results, a very good matching between the output signals is established.
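The bookkeeping behind the single-memristor synapse follows the inverting-amplifier gain relation: the weight magnitude is the ratio of the feedback resistance to the input memristance, and the sign is set by routing the signal through the positive- or negative-weight input of the neuron. The following Python helpers sketch this mapping (illustrative only; the 16 kΩ feedback value and the 100 Ω to 16 kΩ memristance window are taken from the model parameters, while the helper names are ours):

```python
R_ON, R_OFF = 100.0, 16e3   # memristance window from the A14m model
R_F = 16e3                  # assumed feedback resistance of the adder

def memristance_for_weight(w, r_f=R_F):
    """Input memristance R_m realizing |w| = r_f / R_m in an inverting
    summing amplifier; rejects weights needing R_m outside the window."""
    if w == 0.0:
        raise ValueError("zero weight: leave this synapse input open")
    r_m = r_f / abs(w)
    if not (R_ON <= r_m <= R_OFF):
        raise ValueError(f"weight {w} needs R_m = {r_m:.1f} ohm, "
                         f"outside [{R_ON}, {R_OFF}]")
    return r_m

def realized_weight(r_m, negative, r_f=R_F):
    """Weight realized by memristance r_m: the inverting path yields
    -r_f/r_m; a second inversion makes the weight positive."""
    g = r_f / r_m
    return -g if negative else g
```

For example, a weight of magnitude 2.5 maps to a 6.4 kΩ memristance, and with this window a single synapse covers weight magnitudes from 1 (16 kΩ) up to 160 (100 Ω).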
V. CONCLUSION

In this paper, a simple LTSPICE model of a memristor-based neuron is suggested. The synapses are realized with only one memristor per synapse and with summing operational amplifiers, using their inverting inputs. The proposed synapses realize positive and negative weights. The conducted analyses and simulations in both MATLAB and LTSPICE show a good matching between the derived results and confirm the correct operation of the proposed memristor-based neuron. An advantage of the suggested memristor neuron is its simple implementation with the use of nano-sized and low-power memristors, with a good compatibility to CMOS high-density integrated circuits.

ACKNOWLEDGMENT

This research was done at the Neurocomputers Laboratory of the Technical University of Sofia. The authors would like to thank the Research and Development Sector at the Technical University of Sofia for the financial support.

REFERENCES

[1] M. P. Sah, H. Kim and L. O. Chua, "Brains Are Made of Memristors," IEEE Circuits and Systems Magazine, vol. 14, no. 1, pp. 12-36, Firstquarter 2014, doi: 10.1109/MCAS.2013.2296414.
[2] Xu, W., Wang, J. and Yan, X., 2021. "Advances in memristor-based neural networks," Frontiers in Nanotechnology, 3, p. 645995.
[3] Zhang, Y., Wang, X., Friedman, E.G., "Memristor-Based Circuit Design for Multilayer Neural Networks," IEEE Transactions on Circuits and Systems I: Regular Papers, 2018, 65, pp. 677-686.
[4] Aggarwal, C., "Neural Networks and Deep Learning," Springer Int. Publ. AG: Berlin, Germany, 2018, ISBN 978-3-319-94463-0.
[5] B. Li, M. Yang and G. Shi, "Design of Analog CMOS-Memristive Neural Network Circuits for Pattern Recognition," 2021 IEEE 14th International Conf. on ASIC (ASICON), China, 2021, pp. 1-4.
[6] O. Krestinskaya, K. N. Salama and A. P. James, "Learning in Memristive Neural Network Architectures Using Analog Backpropagation Circuits," IEEE Transactions on Circuits and Systems I: Regular Papers, 2019, vol. 66, no. 2, pp. 719-732.
[7] W. M. D. Bradley and R. J. Mears, "Backpropagation learning using positive weights for multilayer optoelectronic neural networks," Conference Proceedings LEOS'96 9th Annual Meeting IEEE Lasers and Electro-Optics Society, vol. 1, USA, 1996, pp. 294-295.
[8] Su, B., Cai, J., Wang, Z., Chu, J. and Zhang, Y., 2022. "A π-Type Memristor Synapse and Neuron With Structural Plasticity," Frontiers in Physics, 9, p. 798971.
[9] Y. Wang et al., "A Configurable Artificial Neuron Based on a Threshold-Tunable TiN/NbOₓ/Pt Memristor," IEEE Electron Device Letters, vol. 43, no. 4, pp. 631-634, 2022.
[10] Strukov, D.B., Snider, G.S., Stewart, D.R., Williams, S., "The missing memristor found," Nature 2008, vol. 453, pp. 80-83.
[11] Baker, M., Jaoude, A., Kumar, V., Al Homouz, D., Nahla, Heba, A., Al-Qutayri, M., Christoforou, N., "State of the art of metal oxide memristor devices," Nanotechnology Reviews, vol. 5, no. 3, doi: 10.1515/ntrev-2015-0029, 2016, pp. 311-329.
[12] Biolek, Z., Biolek, D., Biolkova, V., "SPICE Model of Memristor with Nonlinear Dopant Drift," Radioengineering 2009, 18, pp. 210-214.
[13] Lehtonen, E., Laiho, M., "CNN using memristors for neighborhood connections," In Proceedings of the IEEE CNNA 2010 Conf., Berkeley, CA, USA, 3-5 February 2010, pp. 1-4.
[14] Mladenov, V., "Advanced Memristor Modeling - Memristor Circuits and Networks," MDPI: Basel, Switzerland, 2019, ISBN 978-3-03897-104-7 (Hbk), pp. 172.
[15] Zafar, M., Awais, M., Shehzad, M., "Computationally efficient memristor model based on Hann window function," Microelectronics Journal, vol. 125, 2022, doi.org/10.1016/j.mejo.2022.105476, pp. 1-12.
[16] Ascoli, A., Tetzlaff, R., Biolek, Z., Kolka, Z., Biolkova, V., Biolek, D., "The Art of Finding Accurate Memristor Model Solutions," IEEE J. Emerg. Sel. Top. Circuits Syst., 2015, vol. 5, pp. 133-142.
[17] Solovyeva, E. B., Azarov, V. A., "Comparative Analysis of Memristor Models with a Window Function Described in LTspice," 2021 IEEE Conf. (ElConRus), 2021, pp. 1097-1101.
[18] Mladenov, V., "A Unified and Open LTSPICE Memristor Model Library," MDPI Electronics, 2021, vol. 10, no. 13: 1594, https://doi.org/10.3390/electronics10131594.
[19] Dickey, F.M. and DeLaurentis, J.M., 1993. "Optical neural networks with unipolar weights," Optics Communications, 101(5-6), pp. 303-305.
[20] Fausett, L., "Fundamentals of neural networks - architectures, algorithms and applications," p. 471, Prentice-Hall, 1994, ISBN 9780133341867.
[21] Chua, L., "Memristor - The missing circuit element," IEEE Transactions on Circuit Theory 1971, 18, pp. 507-519.
[22] Ascoli, A., Weiher, M., Herzig, M., Slesazeck, S., Mikolajick, T. and Tetzlaff, R., 2022. "Graph Coloring via Locally-Active Memristor Oscillatory Networks," J. Low Power Electr. Appl., 12(2), p. 22.
[23] James, A., "Memristors - Circuits and Applications of Memristor Devices," IntechOpen, 2019, doi: 10.5772/intechopen.77562, ISBN 978-1-78984-074-2, p. 132.
[24] Krestinskaya, O., Choubey, B. and James, A.P., 2020. "Memristive GAN in analog," Scientific Reports, vol. 10, issue 1, pp. 1-14.
[25] Zhang, X., Wang, X., Ge, Z., Li, Z., Wu, M. and Borah, S., 2022. "A Novel Memristive Neural Network Circuit and Its Application in Character Recognition," Micromachines, 13(12), p. 2074.
[26] Wang, Z., Joshi, S., Savel'ev, S., Song, W., Midya, R., Li, Y., Rao, M., Yan, P., Asapu, S., Zhuo, Y. and Jiang, H., 2018. "Fully memristive neural networks for pattern classification with unsupervised learning," Nature Electronics, 1(2), pp. 137-145.
[27] Q. Hong, R. Yan, C. Wang and J. Sun, "Memristive Circuit Implementation of Biological Nonassociative Learning Mechanism and Its Applications," IEEE Transactions on Biomedical Circuits and Systems, vol. 14, no. 5, pp. 1036-1050, Oct. 2020, doi: 10.1109/TBCAS.2020.3018777.
[28] Li, C., Belkin, D., Li, Y., Yan, P., Hu, M., Ge, N., Jiang, H., Montgomery, E., Lin, P., Wang, Z. and Song, W., 2018. "Efficient and self-adaptive in-situ learning in multilayer memristor neural networks," Nature Communications, 9(1), p. 2385.
[29] Yang, Y., Lee, S.C., "Circuit Systems with MATLAB and PSpice," John Wiley & Sons: Hoboken, USA, 2008, ISBN 978-04-7082-240-1.
[30] May, C., "Passive Circuit Analysis with LTspice® - An Interactive Approach," Springer Nat. 2020, ISBN 978-3-030-38304-6, pp. 763.
[31] T.D. Dongale, K.P. Patil, S.R. Vanjare, A.R. Chavan, P.K. Gaikwad, R.K. Kamat, "Modelling of nanostructured memristor device characteristics using Artificial Neural Network (ANN)," Journal of Computational Science, vol. 11, 2015, pp. 82-90, ISSN 1877-7503, https://doi.org/10.1016/j.jocs.2015.10.007.
[32] Mladenov, V., Kirilov, S., "An Improved Memristor Model and Applications," presented at IEEE MOCAST 2023 Conf., Athens, Greece, pending publication in IEEE Xplore.
[33] González-Díaz_Conti, G., Vázquez-Castillo, J., Longoria-Gandara, O., Castillo-Atoche, A., Carrasco-Alvarez, R., Espinoza-Ruiz, A. and Ruiz-Ibarra, E., 2021. "Hardware-based activation function-core for neural network implementations," Electronics, 11(1), p. 14.
[34] Nugent, M.A. and Molter, T.W., 2014. "AHaH computing - from metastable switches to attractors to machine learning," PLoS ONE, 9(2), p. e85175.
[35] Wen, S., Xie, X., Yan, Z., Huang, T. and Zeng, Z., 2018. "General memristor with applications in multilayer neural networks," Neural Networks, 103, pp. 142-149.
[36] L. Danial, K. Sharma and S. Kvatinsky, "A Pipelined Memristive Neural Network Analog-to-Digital Converter," 2020 IEEE International Symposium on Circuits and Systems (ISCAS), Seville, Spain, 2020, pp. 1-5, doi: 10.1109/ISCAS45731.2020.9181108.
[37] Yakopcic, C., Alom, M.Z. and Taha, T.M., 2016. "Memristor crossbar deep network implementation based on a convolutional neural network," 2016 International Joint Conference on Neural Networks (IJCNN), pp. 963-970, IEEE.
[38] Zhang, L., Zhou, Y., Hu, X., Sun, F. and Duan, S., 2022. "MSL-MNN: image deraining based on multi-scale lightweight memristive neural network," Neural Computing and Applications, 34(9), pp. 7299-7309.
[39] Hong, Q., Zhao, L. and Wang, X., 2019. "Novel circuit designs of memristor synapse and neuron," Neurocomputing, 330, pp. 11-16.