Quantum Machine Learning: Exploring Quantum Algorithms For Enhancing Deep Learning Models
International Journal of Advanced Engineering Research and Science (IJAERS)
Peer-Reviewed Journal
ISSN: 2349-6495(P) | 2456-1908(O)
Vol-11, Issue-5; May, 2024
Journal Home Page Available: https://ijaers.com/
Article DOI: https://dx.doi.org/10.22161/ijaers.115.5
Received: 03 Apr 2024
Received in revised form: 05 May 2024
Accepted: 15 May 2024
Available online: 23 May 2024
©2024 The Author(s). Published by AI Publication. This is an open access article under the CC BY license (https://creativecommons.org/licenses/by/4.0/)

Keywords—Quantum Machine Learning (QML), Deep learning, QNN, Qiskit, Estimator QNN, Sampler QNN

Abstract—Using quantum algorithms to improve deep learning models' capabilities is becoming increasingly popular as quantum computing develops. In this work, we investigate how quantum algorithms using quantum neural networks (QNNs) might enhance the effectiveness and performance of deep learning models. We examine the effects of quantum-inspired methods on tasks including regression, sorting, and optimization by thoroughly analyzing quantum algorithms and how they integrate with deep learning systems. We experiment with Estimator QNN and Sampler QNN implementations using Qiskit Machine Learning, analyzing their forward and backward pass outcomes to assess the effectiveness of quantum algorithms in improving deep learning models. Our research clarifies the scope, intricacy, and scalability issues surrounding QNNs and offers insights into the possible advantages and difficulties of quantum-enhanced deep learning. By clarifying the interaction between quantum algorithms and deep learning systems, this work adds to the continuing investigation of quantum computing's potential to advance machine learning and artificial intelligence paradigms.
www.ijaers.com Page | 35
Gonaygunta et al. International Journal of Advanced Engineering Research and Science, 11(5)-2024
Quantum machine learning is also promising for solving intrinsic quantum problems, including optimizing quantum circuits or mimicking quantum systems (Avramouli et al., 2023).

1.1 Research Motivations

Realizing the inherent constraints of classical computers and conventional deep learning approaches drives research at the nexus of quantum science and deep learning. Due to constraints like the exponential increase in computational resources needed for larger and more complicated optimization problems, traditional computers have difficulty processing large-scale datasets and solving these challenging issues. In the meantime, despite their great potential, deep learning models frequently suffer from problems including overfitting, sluggish convergence rates, and the requirement for large amounts of labeled training data (Santosh et al., 2022). By exploiting quantum-mechanical entanglement and superposition, quantum computing enables a paradigm change by allowing calculations to be completed tenfold more quickly than on traditional computers (Liu et al., 2024). By investigating quantum algorithms to improve deep learning models, researchers hope to overcome these obstacles and uncover new possibilities for solving challenging problems in various domains, from voice and image recognition to drug development and optimization. The ultimate goal of this study is to push the limits of computation and machine learning to facilitate revolutionary advances in artificial intelligence and science.

II. BACKGROUND STUDY

Two important areas are explored in the background research for this work: deep learning and quantum technology. With the potential for exponential computational speedups, quantum computing uses the concepts of quantum physics to manipulate data in ways that traditional computers cannot (Egon et al., 2023). Meanwhile, deep learning, a branch of machine learning that autonomously derives abstractions from data, has achieved impressive performance in several disciplines. Scaling problems, sluggish convergence rates, and the curse of dimensionality beset conventional deep learning models. By incorporating quantum computing concepts into deep learning frameworks, researchers hope to get beyond these constraints and open up new possibilities for improved performance and efficiency when tackling challenging tasks (Fikadu & Pandey, 2023). Laying the foundations for investigating quantum algorithms to improve deep learning models requires understanding the fundamental ideas and difficulties in both quantum computing and deep learning.

2.1 Quantum Computing Fundamentals

Based on concepts fundamentally different from classical computing, quantum computing draws on the field of quantum mechanics. Qubits, the quantum equivalents of classical bits, are the fundamental building blocks of quantum computing (Kharsa et al., 2023). Because of superposition, qubits can occupy many states concurrently, as opposed to traditional bits, which are limited to two possible values: 0 and 1. This greatly increases the range of possible computations, since a qubit may simultaneously be a mixture of 0 and 1. Quantum algorithms build on superposition, enabling them to investigate several possible solutions to a problem simultaneously (Ramezani et al., 2020).

Entanglement: Qubits can also display entanglement, a distinctive quantum phenomenon in which the states of two qubits are inextricably connected regardless of their distance. This phenomenon greatly increases the computing capabilities of quantum computers by allowing them to execute coordinated operations on entangled qubits. By utilizing these characteristics, quantum computing can potentially address computationally demanding problems beyond the capabilities of conventional computers (Zahorodko et al., 2021). The possibilities for quantum computing are enormous and potentially revolutionary, ranging from modeling intricate quantum systems to optimizing massive logistical networks.

Realizing this promise requires solving major technical obstacles, such as decoherence and error correction, as well as creating scalable quantum hardware. Research on quantum computing is still driven by the fascination of using qubits and superposition to solve problems and open up new computational and problem-solving possibilities (Avramouli et al., 2023).

2.2 Quantum Gates and Circuits

Quantum gates and circuits are the fundamental components of quantum computing, providing the means of controlling qubits and carrying out calculations. Quantum gates are elementary operations that change the state of qubits, much like logic gates in conventional computers carry out logical operations (Alchieri et al., 2021). In contrast to classical gates, which operate on bits (0s and 1s), quantum gates can execute operations that exploit the special characteristics of quantum physics, as shown in Figure 1.
Data Encoding into Quantum States: For quantum circuits to handle input data in the context of QNNs, classical data must be converted into a quantum description. Mapping classical data onto quantum states is a common first step in an encoding procedure. This can be done using amplitude encoding, angle encoding, or other encoding approaches (Weigold et al., 2021). This phase is demonstrated in the given code when using QuantumCircuit from Qiskit to create the quantum circuits (qc1 and qc2) that process the quantum-ready data during the forward pass; these circuits are initialized with parameters (params1 and inputs2) that reflect the classical input data.

Quantum Circuit Parameterization: Once the classical data has been encoded into quantum states, it is processed by the quantum circuits of the QNN. These quantum circuits are parametrized, meaning that variables (weights) are adjusted during the training phase to maximize the network's performance on the assigned task (Shi et al., 2023). The given code builds the quantum circuits (qc1 and qc2) using parameters (params1 and inputs2) and then modifies them using gates like RX and RY to represent the QNN's processing phases. The network's performance is then enhanced by training these parameters with optimization methods like backpropagation to minimize a specified loss function.

3.4 Implementation and Measurement

Quantum Neural Networks (QNNs), which are application-agnostic computational units tailored to various use cases, are available through the Qiskit Machine Learning package. These QNNs have two distinct implementations organized around a common interface:

Fig.5: Input QNN Estimator QNN

Neural Network: The interface for all neural networks within the Qiskit Machine Learning framework. It is an abstract class from which all QNNs inherit.

Estimator QNN: This implementation evaluates quantum mechanical observables for its operations.

Sampler QNN: In contrast, Sampler QNN operates on the samples obtained from measuring a quantum circuit.

The Estimator QNN and Sampler QNN implementations use Qiskit primitives (Figure 5), the building blocks for running QNNs on simulators or real quantum hardware. Each of these implementations accepts an optional instance of the corresponding primitive: BaseSampler for Sampler QNN and BaseEstimator for Estimator QNN (Innan et al., 2023). If no instance is explicitly supplied, the QNN classes automatically instantiate the proper reference primitive (Sampler or Estimator). Let us explore the theory of using a Quantum Neural Network (QNN) in Qiskit Machine Learning to perform a forward and a backward pass (Abbas et al., 2021). We will review the underlying idea of these procedures and provide code examples.

Forward Pass

In a QNN, a forward pass entails transferring the input data through the quantum circuit, calculating the output, and possibly applying some post-processing. This is how it operates:

Input Preparation: The incoming data is encoded into quantum states, typically represented as qubits in a quantum circuit. Each qubit may correspond to a feature of the input data.

Quantum Circuit Execution: The encoded quantum states are processed by operations determined by the quantum circuit's trainable weights.

Measurement: The quantum circuit is executed, and the qubits are then measured. The results of these measurements yield classical data that can be processed further.

The output of the Sampler QNN is a probability distribution across all potential measurement results, with each element representing the likelihood of observing a particular measurement outcome. The output vector in this instance has shape (1, 4), meaning that there is one sample and four potential measurement outcomes. The corresponding probabilities are around 0.018, 0.257, 0.527, and 0.198. Conversely, the Estimator QNN yields a single value, the expectation of an observable, for every input sample. The output vector's shape of (2, 1) denotes that two input samples were processed concurrently, with one value computed for each sample. In this instance, the value obtained for both samples is around 0.297.

Backward Pass:

The backward pass in a QNN involves calculating gradients of the loss function with respect to the quantum circuit's trainable parameters (weights). Here is how it works:

Compute Loss: First, the loss function is computed from the predicted output of the forward pass and the target output.
Gradient Calculation: Gradients of the loss function with respect to the trainable parameters (weights) are calculated using methods like backpropagation, as shown in Figure 6.

Parameter Update: The gradients are used to update the parameters of the quantum circuit so as to minimize the loss function.

Figure 6: Gradient model training

The quantum neural network (QNN) propagates input data forward during each epoch, and gradients calculated by gradient descent then propagate backward. The objective is to repeatedly adjust the QNN's parameters to reduce the loss function and enhance its performance. Attaining an accuracy score of 83% signifies that the QNN has successfully learned to categorize or predict outcomes with a high degree of accuracy after several training epochs. By utilizing gradient descent, the QNN can effectively modify its parameters to better fit the training data, which improves its performance in deep learning tasks.

IV. CONCLUSION

In summary, the depth and complexity of quantum neural networks (QNNs) are critical to their effectiveness and generalizability across applications. The number of layers in a neural network architecture, conventional or quantum, is called its depth. Deeper QNNs are often associated with a higher ability to identify complex patterns and correlations in the input data, which may enhance performance on challenging tasks. However, deepening a QNN has drawbacks, including greater computational complexity, noise sensitivity, and the possibility of vanishing or exploding gradients during training.

The debate on QNN depth entails weighing the trade-offs between computational viability and model expressiveness. The improved representational capacity of deeper QNNs allows them to take on more difficult learning tasks and extract higher-level features from the data. However, the effective use of deep QNNs necessitates strong optimization methods, effective resource management, and noise and quantum-error mitigation approaches to guarantee realistic scalability and practicality on quantum computing platforms. The depth of QNNs must be matched to the quantum resources available, such as the number of qubits, circuit coherence times, and gate fidelities.

REFERENCES
[1] Abbas, A., Sutter, D., Zoufal, C., Lucchi, A., Figalli, A., & Woerner, S. (2021). The power of quantum neural networks. Nature Computational Science, 1(6), 403–409. https://doi.org/10.1038/s43588-021-00084-1
[2] Alchieri, L., Badalotti, D., Bonardi, P., & Bianco, S. (2021). An introduction to quantum machine learning: From quantum logic to quantum deep learning. Quantum Machine Intelligence, 3(2). https://doi.org/10.1007/s42484-021-00056-8
[3] Avramouli, M., Savvas, I. K., Vasilaki, A., & Garani, G. (2023). Unlocking the potential of quantum machine learning to advance drug discovery. Electronics, 12(11), 2402. https://doi.org/10.3390/electronics12112402
[4] Batra, K., Zorn, K. M., Foil, D. H., Minerali, E., Gawriljuk, V. O., Lane, T. R., & Ekins, S. (2021). Quantum machine learning algorithms for drug discovery applications. Journal of Chemical Information and Modeling, 61(6), 2641–2647. https://doi.org/10.1021/acs.jcim.1c00166
[5] Beer, K., Bondarenko, D., Farrelly, T., Osborne, T. J., Salzmann, R., Scheiermann, D., & Wolf, R. (2020). Training deep quantum neural networks. Nature Communications, 11(1). https://doi.org/10.1038/s41467-020-14454-2
[6] Bishwas, A. K., Mani, A., & Palade, V. (2020). From classical to quantum machine learning. Quantum Machine Learning, 67–88. https://doi.org/10.1515/9783110670707-004
[7] Buffoni, L., & Caruso, F. (2020). New trends in quantum machine learning. Europhysics Letters, 132(6), 60004. https://doi.org/10.1209/0295-5075/132/60004
[8] Das, P. (2023). Design and comparative analysis of quantum hashing algorithms using Qiskit. https://doi.org/10.36227/techrxiv.24037440.v1
[9] Dave, R. (2022). Machine learning applications of data security and privacy: A systematic review. https://doi.org/10.36227/techrxiv.20063219.v1
[10] Dunjko, V., & Wittek, P. (2020). A non-review of quantum machine learning: Trends and explorations. Quantum Views, 4, 32. https://doi.org/10.22331/qv-2020-03-17-32
[11] Egon, K., Rosinski, J., Karl, L., & Eugene, R. (2023). Quantum machine learning: The confluence of quantum computing and AI. https://doi.org/10.31219/osf.io/rf4xp
[12] Fikadu Tilaye, G., & Pandey, A. (2023). Investigating the effects of hyperparameters in quantum-enhanced deep reinforcement learning. Quantum Engineering, 2023, 1–16. https://doi.org/10.1155/2023/2451990
[13] Gil-Fuster, E., Eisert, J., & Bravo-Prieto, C. (2024). Understanding quantum machine learning also requires rethinking generalization. Nature Communications, 15(1). https://doi.org/10.1038/s41467-024-45882-z
[14] Innan, N., Khan, M. A.-Z., & Bennai, M. (2023). Financial fraud detection: A comparative study of quantum machine learning models. International Journal of Quantum Information, 22(02). https://doi.org/10.1142/s0219749923500442
[15] Jadhav, A., Rasool, A., & Gyanchandani, M. (2023). Quantum machine learning: Scope for real-world problems. Procedia Computer Science, 218, 2612–2625. https://doi.org/10.1016/j.procs.2023.01.235
[16] Jerbi, S., Trenkwalder, L. M., Poulsen Nautrup, H., Briegel, H. J., & Dunjko, V. (2021). Quantum enhancements for deep reinforcement learning in large spaces. PRX Quantum, 2(1). https://doi.org/10.1103/prxquantum.2.010328
[17] Jhanwar, A., & Nene, M. J. (2021). Enhanced machine learning using quantum computing. 2021 Second International Conference on Electronics and Sustainable Communication Systems (ICESC). https://doi.org/10.1109/icesc51422.2021.9532638
[18] Khan, T. M., & Robles-Kelly, A. (2020). Machine learning: Quantum vs classical. IEEE Access, 8, 219275–219294. https://doi.org/10.1109/access.2020.3041719
[19] Kharsa, R., Bouridane, A., & Amira, A. (2023). Advances in quantum machine learning and deep learning for image classification: A survey. Neurocomputing, 560, 126843. https://doi.org/10.1016/j.neucom.2023.126843
[20] Lewis, L., Huang, H.-Y., Tran, V. T., Lehner, S., Kueng, R., & Preskill, J. (2024). Improved machine learning algorithm for predicting ground state properties. Nature Communications, 15(1). https://doi.org/10.1038/s41467-024-45014-7
[21] Liu, J., Liu, M., Liu, J.-P., Ye, Z., Wang, Y., Alexeev, Y., Eisert, J., & Jiang, L. (2024). Towards provably efficient quantum algorithms for large-scale machine-learning models. Nature Communications, 15(1). https://doi.org/10.1038/s41467-023-43957-x
[22] Priyanka, G. S., Venkatesan, M., & Prabhavathy, P. (2023). Advancements in quantum machine learning and quantum deep learning: A comprehensive review of algorithms, challenges, and future directions. 2023 International Conference on Quantum Technologies, Communications, Computing, Hardware and Embedded Systems Security (iQ-CCHESS). https://doi.org/10.1109/iq-cchess56596.2023.10391745
[23] Rahmaniar, W., Ramzan, B., & Ma'arif, A. (2024). Deep learning and quantum algorithms approach to investigating the feasibility of wormholes: A review. Astronomy and Computing, 47, 100802. https://doi.org/10.1016/j.ascom.2024.100802
[24] Ramezani, S. B., Sommers, A., Manchukonda, H. K., Rahimi, S., & Amirlatifi, A. (2020). Machine learning algorithms in quantum computing: A survey. 2020 International Joint Conference on Neural Networks (IJCNN). https://doi.org/10.1109/ijcnn48605.2020.9207714
[25] Santosh, K., Das, N., & Ghosh, S. (2022). Deep learning models. Deep Learning Models for Medical Imaging, 65–97. https://doi.org/10.1016/b978-0-12-823504-1.00013-1
[26] Shi, J., Li, Z., Lai, W., Li, F., Shi, R., Feng, Y., & Zhang, S. (2023). Two end-to-end quantum-inspired deep neural networks for text classification. IEEE Transactions on Knowledge and Data Engineering, 35(4), 4335–4345. https://doi.org/10.1109/tkde.2021.3130598
[27] Surjeet, S., Bulla, C., Arya, A., Idrees, S., Singh, G., Rao, S. G., & Shirahatti, A. (2023, March 6). A quantum machine learning approach for bridging the gap between quantum and classical computing. International Journal of Intelligent Systems and Applications in Engineering. http://www.ijisae.org/index.php/IJISAE/article/view/4539
[28] Tychola, K. A., Kalampokas, T., & Papakostas, G. A. (2023). Quantum machine learning: An overview. Electronics, 12(11), 2379. https://doi.org/10.3390/electronics12112379
[29] Valdez, F., & Melin, P. (2022). A review on quantum computing and deep learning algorithms and their applications. Soft Computing, 27(18), 13217–13236. https://doi.org/10.1007/s00500-022-07037-4
[30] Weigold, M., Barzen, J., Leymann, F., & Salm, M. (2021). Expanding data encoding patterns for quantum algorithms. 2021 IEEE 18th International Conference on Software Architecture Companion (ICSA-C). https://doi.org/10.1109/icsa-c52384.2021.00025
[31] Wichert, A. (2023). Qiskit. Quantum Artificial Intelligence with Qiskit, 41–62. https://doi.org/10.1201/9781003374404-3
[32] Zahorodko, P. V., Semerikov, S. O., Soloviev, V. N., Striuk, A. M., Striuk, M. I., & Shalatska, H. M. (2021). Comparisons of performance between quantum-enhanced and classical machine learning algorithms on the IBM Quantum Experience. Journal of Physics: Conference Series, 1840(1), 012021. https://doi.org/10.1088/1742-6596/1840/1/012021
[33] Zeguendry, A., Jarir, Z., & Quafafou, M. (2023). Quantum machine learning: A review and case studies. Entropy, 25(2), 287. https://doi.org/10.3390/e25020287