
International Journal of Advanced Engineering Research and Science (IJAERS)
Peer-Reviewed Journal
ISSN: 2349-6495(P) | 2456-1908(O)
Vol-11, Issue-5; May, 2024
Journal Home Page Available: https://ijaers.com/
Article DOI: https://dx.doi.org/10.22161/ijaers.115.5

Quantum Machine Learning: Exploring Quantum Algorithms for Enhancing Deep Learning Models

Hari Gonaygunta*, Mohan Harish Maturi, Geeta Sandeep Nadella, Karthik Meduri, Snehal Satish

Department of Information Technology, University of the Cumberlands, USA

*Email: [email protected]

Received: 03 Apr 2024,
Received in revised form: 05 May 2024,
Accepted: 15 May 2024,
Available online: 23 May 2024

©2024 The Author(s). Published by AI Publication. This is an open access article under the CC BY license (https://creativecommons.org/licenses/by/4.0/)

Keywords—Quantum Machine Learning (QML), Deep learning, QNN, Qiskit, Estimator QNN, Sampler QNN

Abstract—Using quantum algorithms to improve deep learning models' capabilities is becoming increasingly popular as quantum computing develops. In this work, we investigate how quantum algorithms using quantum neural networks (QNNs) might enhance the effectiveness and performance of deep learning models. We examine the effects of quantum-inspired methods on tasks including regression, sorting, and optimization by thoroughly analyzing quantum algorithms and how they integrate with deep learning systems. We experiment with Estimator QNN and Sampler QNN implementations using Qiskit Machine Learning, analyzing their forward and backward pass outcomes to assess the effectiveness of quantum algorithms in improving deep learning models. Our research clarifies the scope, intricacy, and scalability issues surrounding QNNs and offers insights into the possible advantages and difficulties of quantum-enhanced deep learning. This work adds to the continuing investigation of quantum computing's potential to advance machine learning and artificial intelligence paradigms by clarifying the interaction between quantum algorithms and deep learning systems.

I. INTRODUCTION

Two of the most exciting areas in computer science are quantum data processing and machine learning (Tychola et al., 2023). With the ability to use the laws of quantum physics to solve complicated computational problems, quantum technology could transform an array of sectors by tackling certain problems dramatically faster than classical computers. Meanwhile, challenges like image recognition, natural language processing, and drug discovery have been addressed remarkably well by machine learning, especially deep learning (Liu et al., 2024).

Deep learning and quantum computing have fundamental constraints, notwithstanding their respective triumphs. Issues including the curse of dimensionality, sluggish convergence rates, and the requirement for enormous volumes of labeled training data are common to classical deep learning models (Valdez & Melin, 2022). Despite its unparalleled computational capacity, quantum computing is still in its infancy and faces obstacles including noise, decoherence, and the difficulty of scaling up quantum machines.

The combined characteristics of deep learning and quantum computing provide impetus for investigating their interaction. Quantum phenomena like superposition and entanglement allow quantum computing to get around restrictions imposed on classical computation (Jadhav et al., 2023). These characteristics may be used to create new algorithms that handle and analyze big datasets faster than their classical equivalents.

Quantum machine learning methods may also solve some of the core problems in deep learning. Quantum algorithms, for instance, may make feature mapping, dimensionality reduction, and optimization strategies more effective, improving the functionality of deep neural networks.

www.ijaers.com Page | 35
Gonaygunta et al. International Journal of Advanced Engineering Research and Science, 11(5)-2024

Quantum machine learning is also promising for solving intrinsically quantum problems, including optimizing quantum circuits or simulating quantum systems (Avramouli et al., 2023).

1.1 Research Motivations

Realizing the inherent constraints of classical computers and conventional deep learning approaches drives research at the nexus of quantum science and deep learning. Due to constraints like the exponential increase in computational resources needed for larger and more complicated optimization problems, traditional computers have difficulty processing large-scale datasets and solving these challenging issues. In the meantime, despite their great potential, deep learning models frequently suffer from problems including overfitting, sluggish convergence rates, and the requirement for large amounts of labeled training data (Santosh et al., 2022). By utilizing entanglement and superposition, quantum computing enables a paradigm change that can allow certain calculations to be completed far more quickly than on traditional computers (Liu et al., 2024). By investigating quantum algorithms to improve deep learning models, scientists hope to overcome these obstacles and uncover new possibilities for resolving challenging issues in various domains, from voice and image recognition to medication development and optimization. The ultimate goal of this study is to push the limits of computation and machine learning to facilitate revolutionary advances in artificial intelligence and science.

II. BACKGROUND STUDY

Two important areas are explored in the background research for this work: deep learning and quantum technology. With the potential for exponential computational speedups, quantum computing uses the concepts of quantum physics to manipulate data in ways that traditional computers cannot (Egon et al., 2023). Meanwhile, by autonomously deriving abstractions from data, deep learning, as part of machine learning, has achieved impressive performance in several disciplines. Scaling problems, sluggish convergence rates, and the curse of dimensionality beset conventional deep learning models. By incorporating quantum computing concepts into deep learning frameworks, researchers hope to get beyond these constraints and open up new possibilities for improved performance and efficiency when tackling challenging tasks (Fikadu Tilaye & Pandey, 2023). Laying the foundations for investigating quantum algorithms to improve deep learning models requires understanding the fundamental ideas and difficulties in both quantum computing and deep learning.

2.1 Quantum Computing Fundamentals

Based on concepts fundamentally different from classical computing, quantum computing builds on the field of quantum mechanics. Qubits, the quantum equivalents of classical bits, are the fundamental building blocks of quantum computing (Kharsa et al., 2023). Because of superposition, qubits can concurrently occupy many states, as opposed to traditional bits, which are limited to two possible values: 0 and 1. This greatly increases the range of possible computations, since a qubit may simultaneously be a mixture of 0 and 1. Quantum algorithms build on superposition, enabling them to investigate several possible solutions to a problem simultaneously (Ramezani et al., 2020).

Quantum bits and superposition: Qubits can also display entanglement, a distinctive quantum phenomenon in which the states of two qubits are inextricably connected regardless of their distance. This phenomenon greatly increases the computing capabilities of quantum computers by allowing them to execute coordinated operations on entangled qubits. By utilizing these characteristics, quantum computing can potentially address computationally demanding issues beyond the capabilities of conventional computers (Zahorodko et al., 2021). The possibilities for quantum computing are enormous and potentially revolutionary, ranging from modeling intricate quantum systems to optimizing massive logistical networks.

Solving major technical obstacles, such as decoherence and error correction, as well as creating scalable quantum hardware, is necessary to realize this promise. Research on quantum computing is still driven by the fascination of using qubits and superposition to solve problems and open up new computational and problem-solving possibilities (Avramouli et al., 2023).

2.2 Quantum Gates and Circuits

Quantum gates and circuits are the fundamental components of quantum computing, providing the means of controlling qubits and carrying out calculations. Quantum gates are simple operations that change the state of qubits, much like logic gates are used in conventional computers to carry out logical operations (Alchieri et al., 2021). In contrast to classical gates, which operate on bits (0s and 1s), quantum gates can execute operations that use the special characteristics of quantum physics, as shown in Figure 1.
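To make superposition and entanglement concrete, the following is a small statevector calculation in plain NumPy (no quantum SDK is assumed; the circuit is our own minimal illustration): a Hadamard gate puts one qubit into an equal superposition, and a CNOT then entangles it with a second qubit, producing a Bell state.

```python
import numpy as np

# Single-qubit |0> state and basic gates
ket0 = np.array([1.0, 0.0])
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)   # Hadamard: |0> -> (|0> + |1>)/sqrt(2)
CNOT = np.array([[1, 0, 0, 0],         # flips the second qubit
                 [0, 1, 0, 0],         # when the first qubit is |1>
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Superposition: H|0> gives equal measurement probabilities for 0 and 1
plus = H @ ket0
print(np.round(plus ** 2, 3))          # [0.5 0.5]

# Entanglement: CNOT on (H|0>) (x) |0> yields the Bell state (|00> + |11>)/sqrt(2)
bell = CNOT @ np.kron(plus, ket0)
print(np.round(bell ** 2, 3))          # [0.5 0.  0.  0.5]
```

The final probabilities show only the correlated outcomes 00 and 11 ever occur, which is exactly the inseparability described above.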


Fig.1: Quantum Gates and Circuits

The Hadamard gate, which produces superposition by changing a qubit from a definite state (0 or 1) to a state that mixes 0 and 1, is one of the basic quantum gates (Jerbi et al., 2021). This gate is essential for creating quantum states that allow for simultaneous exploration of many solutions and parallel computing. Similar to the traditional NOT gate, the Pauli-X gate is another crucial gate that allows a qubit to be switched from 0 to 1 or vice versa (Buffoni & Caruso, 2020). To accomplish desired functions on qubits, quantum circuits are chains of quantum gates organized in a certain order. Qubits are shown as lines in these circuit representations, and gates are shown as symbols operating on these lines (Khan & Robles-Kelly, 2020). The order and configuration of gates in a quantum circuit dictate how the computation is performed.

Quantum circuits of different gates are used to implement fundamental algorithms, such as the integer-factoring algorithm proposed by Shor and the search algorithm developed by Grover (Batra et al., 2021). Entanglement, which happens whenever a number of qubits are connected to the point that their states rely on each other even if the particles are separated by enormous distances, is a further significant idea in quantum devices. Due to their ability to generate and modify entangled states, quantum gates are useful for applications like quantum cryptography and teleportation (Dunjko & Wittek, 2020).

2.3 Challenges in Deep Learning

Despite its astounding achievements, deep learning still has several issues that prevent mainstream acceptance and use in various fields (Liu et al., 2024). Among these difficulties, summarized in Figure 2, are:

Data Availability: Deep learning models need large amounts of labeled data to discover patterns and provide precise predictions. Getting labeled data may be expensive, time-consuming, and sometimes not feasible (Surjeet et al., 2023). Labeled data might not always be easily accessible for certain tasks or domains, a major obstacle to deep learning model training.

Fig.2: Challenges in deep learning models

High-Performance Hardware: Large computing resources, such as powerful GPUs or specialized hardware like TPUs (Tensor Processing Units), are frequently required to train deep learning models (Bishwas et al., 2020). For smaller businesses or academics with limited funding, deep learning solutions may not be as scalable due to the high cost of obtaining such hardware. This problem is made more difficult by the increasing need for more powerful hardware as deep learning models become larger and more complicated (Gil-Fuster et al., 2024).

Suboptimal Hyper-parameter Optimization: Deep learning models usually involve many hyperparameters that control their construction, training procedure, and evaluation process. Optimizing the hyperparameter combination can greatly influence how well the model performs. Manually adjusting hyperparameters takes time and effort and frequently calls for domain knowledge and experience (Lewis et al., 2024). Although there are automated methods for optimizing hyperparameters, their effectiveness may not always be guaranteed, resulting in less-than-ideal model performance.

Data Security and Privacy: Data privacy and security are challenges raised by deep learning models trained on private or sensitive data. Protecting the privacy and integrity of data used for training and test inference is critical as deep-learning models proliferate in various applications, such as cybersecurity, finance, and healthcare. Strong data security procedures are even more crucial in light of worries about possible weaknesses, hostile attacks, and unintentional biases in deep learning models (Dave, 2022).
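The automated hyperparameter-optimization methods mentioned above can be illustrated with an exhaustive grid search. The search space and the objective below are hypothetical stand-ins: a real objective would train the model and return a validation loss, while this made-up function keeps the sketch self-contained.

```python
import itertools

# Hypothetical search space -- stand-ins for a real model's hyperparameters
space = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "batch_size": [16, 32, 64],
    "num_layers": [2, 4, 8],
}

def validation_loss(cfg):
    # Placeholder for "train the model, return validation loss";
    # a made-up smooth objective keeps the example self-contained.
    return ((cfg["learning_rate"] - 1e-3) ** 2
            + abs(cfg["num_layers"] - 4) / 10
            + 1 / cfg["batch_size"])

def grid_search(space):
    """Evaluate every configuration; keep the one with the lowest loss."""
    keys = list(space)
    best_cfg, best_loss = None, float("inf")
    for values in itertools.product(*(space[k] for k in keys)):
        cfg = dict(zip(keys, values))
        loss = validation_loss(cfg)
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

best, loss = grid_search(space)
print(best)  # {'learning_rate': 0.001, 'batch_size': 64, 'num_layers': 4}
```

Grid search is the simplest automated tuner; random search or Bayesian optimization scale better when the space is large, which is the effectiveness caveat noted in the text.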


III. RESEARCH METHODOLOGY

This study's research approach uses a series of discrete phases to investigate Qiskit, a quantum computing framework, and the use of quantum neural networks for classification problems (Wichert, 2023). There are several intriguing ways to improve deep learning models using quantum algorithms. First, using quantum parallelism and entanglement to build deep learning models might speed up optimization processes, leading to faster convergence and more effective model training. Second, using quantum computing's enhanced capacity to handle high-dimensional data, approaches such as quantum feature mapping and dimensionality reduction enable more effective feature representation and decrease computational complexity (Das, 2023). Quantum data encoding techniques make it possible to encode conventional data into quantum states, which may make data processing and representation in quantum-based models more effective. The following is an outline of the methodology.

3.1 Quantum Algorithms for Enhancing Deep Learning

Traditional neural networks, with linked nodes or neurons structured in layers, are computational models inspired by biological intelligence and capable of identifying patterns in data and solving complicated problems (Priyanka et al., 2023). Modifying parameters with machine learning or deep learning approaches trains these networks.

Quantum Machine Learning (QML) aims to combine ideas from conventional and quantum computing to develop and improve learning approaches (Beer et al., 2020). This merging is embodied by Quantum Neural Networks (QNNs), which combine parametrized quantum circuits with conventional neural networks. QNNs are positioned at the nexus of the two domains and provide two views.

From the machine learning perspective, QNNs work similarly to classical models in that they are computationally trained to find underlying patterns in data. As shown in Figure 3, they function by loading classical input into quantum states, processing it with quantum gates defined by adaptable weights, and measuring the output state.

3.2 Data Loading

When discussing data loading concerning the Estimator QNN, we mean converting traditional input data into a quantum-processing-ready format. This entails converting traditional data into quantum states controlled by the QNN's configured quantum circuit (Das, 2023). The input settings provided during creation are used to initialize the quantum circuit, enabling it to handle the quantum-ready data following its design and specifications. This is important because it bridges the quantum and classical worlds, allowing the QNN to deal with classical data in a quantum context and explore quantum-enhanced learning techniques.

Fig.3: QNN Model

The Estimator QNN computes expectation values in the forward pass, taking a parametrized quantum circuit and a quantum mechanical observable as inputs. Lists of observables may also be entered into the Estimator QNN to create more intricate QNNs.

Let us use a basic example, Figure 4, to demonstrate how an Estimator QNN works. Building the parametrized circuit is where we begin. Two parameters make up this quantum circuit: one denotes a QNN input, and the other a trainable weight.

Figure 4: QNN input

Now that we have defined the expected-value computation, we can construct an observable. The Estimator QNN will automatically create a default observable if one is not set. The number of qubits in this quantum circuit is $n$.

3.3 Data Preprocessing

Two primary phases are involved in data preparation for quantum neural networks (QNNs), based on the code snippets that have been provided:


Data Encoding into Quantum States: For quantum circuits to handle input data in the context of QNNs, classical data must be converted into a quantum description. Shifting classical data onto quantum states is a common step in an encoding procedure. This can be done using amplitude encoding, angle encoding, or other encoding approaches (Weigold et al., 2021). This phase is demonstrated in the given code when using QuantumCircuit from Qiskit to create the quantum circuits (qc1 and qc2) that process the quantum-ready data during the forward pass; these circuits are initialized with parameters (params1 and inputs2) that reflect the classical input data.

Quantum Circuit Parameterization: The quantum circuits of the QNN process the classical data once it has been encoded into quantum states. These quantum circuits are parametrized, which means that the variables (weights) are changed during the training phase to maximize the network's performance on the assigned job (Shi et al., 2023). The given code builds the quantum circuits (qc1 and qc2) using parameters (params1 and inputs2) and then modifies them using gates like RX and RY to represent the QNN's processing phases. The network's performance is then enhanced by training these parameters using optimization methods like backpropagation to minimize a specified loss function.

3.4 Implementation and Measurement

Quantum Neural Networks (QNNs), which are application-agnostic compute units tailored to various use cases, are available through the Qiskit Machine Learning package. These QNNs have two distinct implementations that are organized around an interface:

Fig.5: Input QNN Estimator QNN

Neural Network: This is the interface for all neural networks within the Qiskit Machine Learning framework. It is an abstract class from which all QNNs inherit.

Estimator QNN: This implementation evaluates quantum mechanical observables for its operations.

Sampler QNN: In contrast, the Sampler QNN operates on the samples acquired by measuring a quantum circuit.

The Estimator QNN and Sampler QNN implementations use Qiskit primitives, shown in Figure 5, as the building blocks for running QNNs on simulators or real quantum hardware. Each of these implementations accepts an instance of the corresponding primitive: a BaseSampler for the Sampler QNN and a BaseEstimator for the Estimator QNN (Innan et al., 2023). The QNN classes automatically instantiate the proper primitive (Sampler or Estimator) for smooth operation if no instance is explicitly supplied. Let us explore the theory of utilizing a Quantum Neural Network (QNN) in Qiskit Machine Learning to do a forward and a backward pass together (Abbas et al., 2021). We will review the underlying idea of these procedures and provide code examples.

Forward Pass

In a QNN, a forward pass entails calculating the output by transferring the input data through the quantum circuit, possibly followed by some post-processing. This is how it operates:

Input Preparation: Quantum states, typically represented as qubits in a quantum circuit, are created by encoding the incoming data. A feature of the incoming data may be associated with each qubit.

Quantum Circuit Execution: The quantum circuit's trainable weights determine how the encoded quantum states are processed.

Measurement: The quantum circuit is executed, and then the qubits are measured. The results of these measurements yield classical data that can be handled further.

The output of the Sampler QNN is a probability distribution across all potential measurement results, with each element representing the likelihood of detecting a particular measurement outcome. The output vector in this instance is shaped like (1, 4), meaning that there is one sample and four potential measurement results. The corresponding probabilities for the possible events are around 0.018, 0.257, 0.527, and 0.198. Conversely, the Estimator QNN yields a single value for every input sample. The output vector's structure of (2, 1) denotes that two input samples were processed concurrently, with one value computed for each sample. In this instance, the value obtained from both samples is around 0.297.

Backward Pass

The backward pass in a QNN involves calculating gradients of the loss function with respect to the quantum circuit's trainable parameters (weights). Here is how it works:

Compute Loss: First, the loss function is computed using the predicted output from the forward pass and the target output.
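Anticipating the gradient-calculation and parameter-update steps described next, the whole forward-and-backward loop can be sketched in plain NumPy rather than Qiskit. The one-qubit circuit below (angle-encode input x with an RY rotation, apply a trainable RY weight w, measure) and its numbers are our own illustrative choices, not the paper's exact circuits or reported outputs; the gradient uses the parameter-shift rule, one standard way quantum gradients are computed.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def forward_probs(x, w):
    """Sampler-style forward pass: encode input x, apply trainable
    weight w, return measurement probabilities for outcomes 0 and 1."""
    psi = ry(w) @ ry(x) @ np.array([1.0, 0.0])
    return psi ** 2

def expectation(x, w):
    """Estimator-style forward pass: expectation <Z> = P(0) - P(1)."""
    p = forward_probs(x, w)
    return p[0] - p[1]

def grad_w(x, w):
    """Backward pass: gradient of <Z> w.r.t. w via the parameter-shift rule."""
    return (expectation(x, w + np.pi / 2) - expectation(x, w - np.pi / 2)) / 2

# Tiny training loop: adjust w so <Z> matches a target value
x, w, target, lr = 0.4, 1.1, 0.5, 0.5
for _ in range(100):
    err = expectation(x, w) - target    # compute loss err**2
    w -= lr * 2 * err * grad_w(x, w)    # parameter update
print(round(expectation(x, w), 3))      # -> 0.5
```

For this particular circuit the two rotations compose, so the expectation reduces to cos(x + w) and the parameter-shift gradient equals -sin(x + w); the loop drives the expectation to the target exactly as the epoch-by-epoch training described in the text.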


Gradient Calculation: Gradients of the loss function with respect to the trainable parameters (weights) are calculated using methods like backpropagation, shown in Figure 6.

Parameter Update: The gradients are used to update the parameters of the quantum circuit to minimize the loss function.

Figure 6: Gradient model training

The quantum neural network (QNN) propagates input data forward during each epoch, and then gradients calculated by gradient descent propagate backward. The objective is to repeatedly adjust the QNN's parameters to reduce the loss function and enhance its functionality. Attaining an accuracy score of 83% signifies that the QNN has successfully learned to categorize or predict outcomes with a high degree of accuracy after several training epochs. By utilizing gradient descent, the QNN may effectively modify its parameters to better suit the training data, which improves the QNN's performance in deep learning tasks.

IV. CONCLUSION

In summary, the depth and complexity of quantum neural networks (QNNs) are critical to their effectiveness and generalizability in various applications. The quantity of layers in a neural network architecture, conventional or quantum, is called the network's depth. Deeper QNNs are often associated with a higher ability to identify complex patterns and correlations in the input data, which may enhance performance on challenging tasks. Moreover, there are drawbacks to deepening a QNN, including greater computing complexity, noise sensitivity, and the possibility of gradients vanishing or exploding during training.

The debate on QNN depth entails weighing the trade-offs between computational viability and model expressiveness. The potential for improved representation capacity in deeper QNNs allows them to take on more difficult learning tasks and extract higher-level features from the data. However, the effective use of deep QNNs necessitates strong optimization methods, effective resource management, and noise- and quantum-error-mitigation approaches to guarantee realistic scalability and practicality on quantum computing platforms; the depth of QNNs must be matched with the quantum resources available, such as the number of qubits, circuit coherence times, and gate fidelities.

REFERENCES
[1] Abbas, A., Sutter, D., Zoufal, C., Lucchi, A., Figalli, A., & Woerner, S. (2021). The power of quantum neural networks. Nature Computational Science, 1(6), 403–409. https://doi.org/10.1038/s43588-021-00084-1
[2] Alchieri, L., Badalotti, D., Bonardi, P., & Bianco, S. (2021). An introduction to quantum machine learning: From quantum logic to quantum deep learning. Quantum Machine Intelligence, 3(2). https://doi.org/10.1007/s42484-021-00056-8
[3] Avramouli, M., Savvas, I. K., Vasilaki, A., & Garani, G. (2023). Unlocking the potential of quantum machine learning to advance drug discovery. Electronics, 12(11), 2402. https://doi.org/10.3390/electronics12112402
[4] Batra, K., Zorn, K. M., Foil, D. H., Minerali, E., Gawriljuk, V. O., Lane, T. R., & Ekins, S. (2021). Quantum machine learning algorithms for drug discovery applications. Journal of Chemical Information and Modeling, 61(6), 2641–2647. https://doi.org/10.1021/acs.jcim.1c00166
[5] Beer, K., Bondarenko, D., Farrelly, T., Osborne, T. J., Salzmann, R., Scheiermann, D., & Wolf, R. (2020). Training deep quantum neural networks. Nature Communications, 11(1). https://doi.org/10.1038/s41467-020-14454-2
[6] Bishwas, A. K., Mani, A., & Palade, V. (2020). From classical to quantum machine learning. Quantum Machine Learning, 67–88. https://doi.org/10.1515/9783110670707-004
[7] Buffoni, L., & Caruso, F. (2020). New trends in quantum machine learning. Europhysics Letters, 132(6), 60004. https://doi.org/10.1209/0295-5075/132/60004
[8] Das, P. (2023). Design and comparative analysis of quantum hashing algorithms using Qiskit. https://doi.org/10.36227/techrxiv.24037440.v1
[9] Dave, R. (2022). Machine learning applications of data security and privacy: A systematic review. https://doi.org/10.36227/techrxiv.20063219.v1
[10] Dunjko, V., & Wittek, P. (2020). A non-review of quantum machine learning: Trends and explorations. Quantum Views, 4, 32. https://doi.org/10.22331/qv-2020-03-17-32
[11] Egon, K., Rosinski, J., Karl, L., & Eugene, R. (2023). Quantum machine learning: The confluence of quantum computing and AI. https://doi.org/10.31219/osf.io/rf4xp


[12] Fikadu Tilaye, G., & Pandey, A. (2023). Investigating the effects of hyperparameters in quantum-enhanced deep reinforcement learning. Quantum Engineering, 2023, 1–16. https://doi.org/10.1155/2023/2451990
[13] Gil-Fuster, E., Eisert, J., & Bravo-Prieto, C. (2024). Understanding quantum machine learning also requires rethinking generalization. Nature Communications, 15(1). https://doi.org/10.1038/s41467-024-45882-z
[14] Innan, N., Khan, M. A.-Z., & Bennai, M. (2023). Financial fraud detection: A comparative study of quantum machine learning models. International Journal of Quantum Information, 22(02). https://doi.org/10.1142/s0219749923500442
[15] Jadhav, A., Rasool, A., & Gyanchandani, M. (2023). Quantum machine learning: Scope for real-world problems. Procedia Computer Science, 218, 2612–2625. https://doi.org/10.1016/j.procs.2023.01.235
[16] Jerbi, S., Trenkwalder, L. M., Poulsen Nautrup, H., Briegel, H. J., & Dunjko, V. (2021). Quantum enhancements for deep reinforcement learning in large spaces. PRX Quantum, 2(1). https://doi.org/10.1103/prxquantum.2.010328
[17] Jhanwar, A., & Nene, M. J. (2021). Enhanced machine learning using quantum computing. 2021 Second International Conference on Electronics and Sustainable Communication Systems (ICESC). https://doi.org/10.1109/icesc51422.2021.9532638
[18] Khan, T. M., & Robles-Kelly, A. (2020). Machine learning: Quantum vs classical. IEEE Access, 8, 219275–219294. https://doi.org/10.1109/access.2020.3041719
[19] Kharsa, R., Bouridane, A., & Amira, A. (2023). Advances in quantum machine learning and deep learning for image classification: A survey. Neurocomputing, 560, 126843. https://doi.org/10.1016/j.neucom.2023.126843
[20] Lewis, L., Huang, H.-Y., Tran, V. T., Lehner, S., Kueng, R., & Preskill, J. (2024). Improved machine learning algorithm for predicting ground state properties. Nature Communications, 15(1). https://doi.org/10.1038/s41467-024-45014-7
[21] Liu, J., Liu, M., Liu, J.-P., Ye, Z., Wang, Y., Alexeev, Y., Eisert, J., & Jiang, L. (2024). Towards provably efficient quantum algorithms for large-scale machine-learning models. Nature Communications, 15(1). https://doi.org/10.1038/s41467-023-43957-x
[22] Priyanka, G. S., Venkatesan, M., & Prabhavathy, P. (2023). Advancements in quantum machine learning and quantum deep learning: A comprehensive review of algorithms, challenges, and future directions. 2023 International Conference on Quantum Technologies, Communications, Computing, Hardware and Embedded Systems Security (iQ-CCHESS). https://doi.org/10.1109/iq-cchess56596.2023.10391745
[23] Rahmaniar, W., Ramzan, B., & Ma'arif, A. (2024). Deep learning and quantum algorithms approach to investigating the feasibility of wormholes: A review. Astronomy and Computing, 47, 100802. https://doi.org/10.1016/j.ascom.2024.100802
[24] Ramezani, S. B., Sommers, A., Manchukonda, H. K., Rahimi, S., & Amirlatifi, A. (2020). Machine learning algorithms in quantum computing: A survey. 2020 International Joint Conference on Neural Networks (IJCNN). https://doi.org/10.1109/ijcnn48605.2020.9207714
[25] Santosh, K., Das, N., & Ghosh, S. (2022). Deep learning models. Deep Learning Models for Medical Imaging, 65–97. https://doi.org/10.1016/b978-0-12-823504-1.00013-1
[26] Shi, J., Li, Z., Lai, W., Li, F., Shi, R., Feng, Y., & Zhang, S. (2023). Two end-to-end quantum-inspired deep neural networks for text classification. IEEE Transactions on Knowledge and Data Engineering, 35(4), 4335–4345. https://doi.org/10.1109/tkde.2021.3130598
[27] Surjeet, S., Bulla, C., Arya, A., Idrees, S., Singh, G., Rao, S. G., & Shirahatti, A. (2023, March 6). A quantum machine learning approach for bridging the gap between quantum and classical computing. International Journal of Intelligent Systems and Applications in Engineering. http://www.ijisae.org/index.php/IJISAE/article/view/4539
[28] Tychola, K. A., Kalampokas, T., & Papakostas, G. A. (2023). Quantum machine learning—An overview. Electronics, 12(11), 2379. https://doi.org/10.3390/electronics12112379
[29] Valdez, F., & Melin, P. (2022). A review on quantum computing and deep learning algorithms and their applications. Soft Computing, 27(18), 13217–13236. https://doi.org/10.1007/s00500-022-07037-4
[30] Weigold, M., Barzen, J., Leymann, F., & Salm, M. (2021). Expanding data encoding patterns for quantum algorithms. 2021 IEEE 18th International Conference on Software Architecture Companion (ICSA-C). https://doi.org/10.1109/icsa-c52384.2021.00025
[31] Wichert, A. (2023). Qiskit. Quantum Artificial Intelligence with Qiskit, 41–62. https://doi.org/10.1201/9781003374404-3
[32] Zahorodko, P. V., Semerikov, S. O., Soloviev, V. N., Striuk, A. M., Striuk, M. I., & Shalatska, H. M. (2021). Comparisons of performance between quantum-enhanced and classical machine learning algorithms on the IBM Quantum Experience. Journal of Physics: Conference Series, 1840(1), 012021. https://doi.org/10.1088/1742-6596/1840/1/012021
[33] Zeguendry, A., Jarir, Z., & Quafafou, M. (2023). Quantum machine learning: A review and case studies. Entropy, 25(2), 287. https://doi.org/10.3390/e25020287
