Comparative Analysis of Classification Efficiency of Quantum Machine Learning Algorithms
Abstract- Machine learning has achieved outstanding results across various applications. The collaboration between quantum physics and machine learning can potentially deliver new and groundbreaking opportunities for scientific exploration and business applications, creating a new and exciting field of research known as quantum machine learning (QML). Quantum devices such as quantum classifiers, designed to address classification challenges in machine learning, have recently garnered significant interest. This study thoroughly examines the research on quantum classifiers, with particular emphasis on recent advancements. Various quantum classification algorithms, such as Quantum KNN, Quantum Neural Networks, and Quantum Support Vector Machines, are explored in depth, and a detailed comparative analysis of these techniques is conducted for real-world scenarios. The study is anticipated to yield insight into the best QML algorithm for classification problems.

Keywords- Quantum Machine Learning, QKNN, QSVM, QNN.

I. INTRODUCTION

Machine learning algorithms gather data and forecast new samples. Unlike other mathematical methods, these algorithms use a training dataset (known data) to build and update their predictive model. Machine learning has immense applications in signal processing, image processing, voice recognition, and autonomous systems, and it is one of the fastest-growing technological topics, combining computer science, statistics, AI, and data science. Machine learning has evolved because of new learning theories, algorithms, internet data, and low-cost computing, enhancing the financial modeling, marketing, manufacturing, and healthcare industries through data-intensive machine learning [1].

QML combines machine learning and quantum computing. It uses quantum computers to achieve several-fold faster data processing speeds. "QML" refers to using quantum systems to construct algorithms that improve computer programs over time, exploiting quantum computers' efficiency to address machine learning problems. The ability to examine numerous superposed quantum states at once helps speed up the analysis [2]. Every day, about 328.77 million terabytes of data are produced, a volume expected to rise to 120 zettabytes this year [3]; such data feed the forecasting of business strategies, climate change factors, complex biological processes, and economic events.

Quantum computing is one of the astonishing discoveries of the 21st century. Its motivation can be traced to Moore's Law, which states that the capacity of an integrated circuit doubles roughly every two years (observed since 1965) while its feature size halves in the same period. Presently, feature sizes have reached the nanoscale, and further reduction is not possible without entering the atomic scale. Quantum computing in the atomic regime uses quantum phenomena to tackle difficult mathematical problems where classical computers fail [4]. The subsequent sub-sections describe these important issues in detail.

Quantum Computing: It is a physical process that brings physics into computing theory. The most accurate description of nature up to this point is quantum mechanics, which was developed in the last century. Applying the principles of quantum mechanics to computer science is known as quantum computing. In digital computers, all information is converted into a series of bits, and logic gates are used for all computations. The state of the bits in a digital computer determines its state; therefore, a computer with n bits is in one of 2^n potential states. A quantum computer likewise comprises bits, but instead of representing only 1's or 0's, these bits can simultaneously represent a linear combination of 0's and 1's, a phenomenon known as superposition [5].

A. The Quantum Bit:
The most fundamental unit of operation in quantum computing is the quantum bit (qubit). It is expressed as a vector in a two-dimensional vector space over the complex numbers, with |0⟩ and |1⟩ as the usual basis. As illustrated in Fig. 1, a classical bit is 100% either 0 or 1, whereas a qubit can be, for example, 50% |0⟩ and 50% |1⟩ at the same time.

Fig. 1. Classical Bit vs. Qubit
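The amplitude picture of a qubit can be made concrete with a short numerical sketch (an illustration, not from the paper; plain NumPy is assumed, with a state stored as a 2-component complex vector):

```python
import numpy as np

# A single-qubit state a|0> + b|1> as a complex 2-vector; measurement
# probabilities are the squared amplitudes |a|^2 and |b|^2.
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

# Equal superposition |+> = (|0> + |1>)/sqrt(2): the "50% / 50%" qubit.
plus = (ket0 + ket1) / np.sqrt(2)

def measure_probs(state):
    """Return (P(0), P(1)) for a single-qubit state vector."""
    probs = np.abs(state) ** 2
    return probs[0], probs[1]

p0, p1 = measure_probs(plus)
print(p0, p1)                        # 0.5 0.5
assert abs(p0 + p1 - 1.0) < 1e-12   # amplitudes are normalized
```

Measuring collapses the superposition, so a run of the physical experiment would return 0 or 1 with exactly these probabilities.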
A qubit is not always in either |0⟩ or |1⟩; instead, it can be in any superposition, i.e., any quantum state |ψ⟩ = α|0⟩ + β|1⟩, where α and β are complex numbers with |α|² + |β|² = 1. Qubits can be thought of as being in multiple parallel worlds, since they have a specific probability of being both a '1' and a '0' simultaneously. For instance, if a qubit splits probabilistically into two parallel universes, it can be reunited once more as a photon with either a 1 or a 0. Consequently, this enables quicker calculations in several parallel universes.

B. Geometrical Representation:
The resulting 2-D geometrical representation of the state of a qubit is illustrated in Fig. 2, where only real-valued amplitudes are assumed for a quantum state. A qubit's state can then be expressed as |q⟩ = cos θ|0⟩ + sin θ|1⟩ [6], with the basis states |0⟩ and |1⟩ on the axes and, at θ = π/4, the states |+⟩ = (|0⟩ + |1⟩)/√2 and |−⟩ = (|0⟩ − |1⟩)/√2 on the diagonals.

Fig. 2. Geometric Depiction of a Qubit

C. Classical Machine Learning Algorithms
Machine learning (ML) improves how computers handle data. We cannot always evaluate or generalize data by hand; machine learning is then applied. Following the proliferation of datasets, the need for machine learning has increased, and it is utilized across numerous industries to isolate pertinent data. The objective of machine learning is to gain knowledge from data. Many studies have investigated the phenomenon of unprogrammed robot learning; this challenge involves managing enormous quantities of data, and consequently programmers and mathematicians employ various techniques [4].

D. How Quantum Machine Learning Works
The interaction of quantum physics with machine learning could open up historically unimaginable possibilities for both disciplines. Machine learning has in recent years driven numerous innovative methods in science and commerce, and prior research has identified possible quantum benefits in the domains of supervised [7] and unsupervised learning [8]. In Ref. [7], the authors present quantum algorithms for classification problems that have the potential to achieve an exponential speedup. Classification, a crucial field in machine learning, is currently extensively utilized in both commercial and academic domains, with applications ranging from facial sentiment analysis and recommendation systems for sentiment analysis [9] to cancer detection [10]. Due to the recent advancements in quantum computing theory, there is potential to create classification models that are strengthened by quantum technology and capable of handling intricate classification problems. So far, several quantum classification models and procedures have been developed, some of which have already been demonstrated experimentally [11]. This research has enhanced the analysis of quantum classifiers from multiple perspectives and could offer useful insights for future, more sophisticated models.

II. CLASSIFICATION IN QUANTUM MACHINE LEARNING

In a classification procedure, the primary objective is to categorize a set of items by assigning each object a label that corresponds to its true class (such as the category of felines, the category of malignant cells, the category of human visages, etc.). When applying a traditional method of categorization for supervised learning systems, we start by choosing the d characteristics that define all the items in a specific dataset. Therefore, every item is denoted by a d-dimensional real vector X, where X ∈ ℝ^d and is written as X = (x1, ..., xd).

Formally, a pattern is indicated by a pair (Xi, li), where Xi represents the d-dimensional vector that relates to the item and li indicates the class to which the object belongs. The objective of the classification procedure is to create a classifier that can assign a class (or label) to any unlabeled object with the highest level of accuracy. The technique consists of two distinct stages: first the training of the classifier, and then the proper test. To propose a quantum methodology for a conventional classifier, one must adhere to the following three steps:

A. Encoding:
A density operator, specifically a theoretical quantum entity, is linked to each pattern, referred to as a density pattern in this context. Consider X as a vector in ℝ^d. We transform the vector X in ℝ^d into a vector X1 in ℝ^(d+1) using the expression X1 = α(2x1, ..., 2xd), where α is a suitable normalization factor [12].

B. Classification:
A quantum-inspired alternative to the conventional classification method, which is usually applied to datasets consisting of real vectors, is instead applied to the collection of density operator data, also known as density patterns.

C. Decoding:
The outcomes of the classification process are decoded back into the realm of real vectors [12]. Fig. 3 depicts the quantum-computer-based pipeline for executing a supervised machine learning task, i.e., classification.

Fig. 3. Steps for Classification Task on QML
III. LITERATURE REVIEW

This section provides an analysis of the most recent classification methods that have been proposed, which are either quantum or hybrid in nature.

Maria Schuld et al. outlined the benefits of a quantum pattern classification algorithm based on handwritten digit recognition data from the MNIST database and Trugenberger's proposition for a quantum computer to measure the Hamming distance [13].

Chen Ding et al. presented a support vector machine (SVM) procedure inspired by quantum mechanics that outperforms previous classical algorithms exponentially. Through experimentation, the viability of the proposed algorithm is demonstrated: it exhibits favorable performance on low-rank datasets, or datasets that can be adequately approximated by low-rank matrices, akin to the quantum SVM algorithm [14].

Amer Delilbasic et al. proposed a QSVM for remote sensing data classification, compared with a classical SVM in terms of accuracy on the IBM Qiskit platform [15].

Alina O. Savchuk et al. proposed a QSVM that is compared with a classical SVM. Before applying the QSVM, QPCA is used to reduce the features. IBM Qiskit is used to implement the QSVM, and Google Colab is used for the classical SVM. The ElectricalFaultDetection dataset is used for both algorithms [16].

Mohammed Zidan et al. classify an input into one of two binary classes, irrespective of the incompleteness of the input pattern. Right after the execution of the unitary operators, the opposition between neurons is executed using the entanglement measure, determining the winning class according to the winner-take-all criterion [18].

Ba Dema et al. developed a QSVM on a quantum annealer for multiclass classification. The QSVM results were better in terms of precision [19].

Yue Ruan et al. developed a Hamming-distance-based quantum k-nearest neighbors (QKNN) classification algorithm using the MNIST dataset. Because the time cost is pertinent only to the feature vectors, not the dataset size, this approach can greatly increase QKNN performance in big-data scenarios; QKNN exceeds Centroid and QNN in classification accuracy and time [20].

Gurmohan Singh et al. suggested a Quantum Support Vector Machine (QSVM) approach to classify the MNIST benchmark, analyzing the acceleration of quantum SVM variational back-ends for physical quantum processors and kernel matrix techniques. Classical and quantum SVM algorithms were compared for execution time and accuracy [7].

The above literature review concludes that the work done in quantum machine learning is mainly based on the three algorithms depicted in Table I.

TABLE I. QUANTUM MACHINE LEARNING ALGORITHMS

Algorithm | References | Description
Quantum Support Vector Machine | [7],[14],[15],[16],[19],[21] | Faster quantum classifiers outperform traditional ones. Quantum simulators and superconducting quantum processors back the quantum algorithms.
Quantum Neural Network | [18],[22],[23],[24],[25],[26],[27],[28],[29] | Addressed multi-class classification and regression issues in numerous applications.
Quantum K Nearest Neighbor | [13],[30],[31],[20],[11] | The quantum minimum search algorithm accelerates similarity searches.

IV. QUANTUM-INSPIRED TECHNIQUES FOR CLASSIFICATION

Due to the inherent vulnerability to errors in contemporary quantum computers, a prevalent strategy involves executing a portion of the complete operation on a quantum computer while delegating the other components to a classical computer. Specifically, a quantum kernel estimator employs a quantum circuit as the kernel function. By employing a quantum feature map, this quantum circuit converts classical input into quantum states before computing the inner product of said quantum states.

The inner product is utilized for subsequent processing by the traditional Support Vector Machine (SVM). The categorization is executed as a final step, employing a kernel-based SVM on a conventional computer, utilizing the computed kernel. To summarize, the quantum algorithm is responsible for calculating the kernel matrix, while the traditional SVM method runs on a normal computer [17].

A. Quantum Support Vector Machine (QSVM):
The SVM is specifically designed for binary classification of new testing vectors. SVM refers to a type of problem characterized by having more equations than unknowns, known as an over-determined system of equations. The algorithm produces a hyperplane defined by the equation w·x + b = 0. This hyperplane ensures that a training point x in the positive class satisfies w·x + b ≥ 1, and a training point in the negative class satisfies w·x + b ≤ −1. During the training process, the algorithm strives to maximize the distance between the two classes. This is logical, since we wish to create a clear separation between the two classes in order to obtain a more accurate classification result for new data samples such as x. Mathematically, the task of the SVM is to identify a hyperplane that maximizes the distance between two parallel hyperplanes, subject to the restriction that the product of the predicted label (yi) and the linear combination of the weights (w) and the input vector (xi) satisfies the margin constraint yi(w·xi + b) ≥ 1 [32]. Within the quantum context, it is presumed that oracles for the training data are available, which furnish quantum vectors.
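The hybrid kernel scheme described in Section IV, where a fidelity kernel matrix is handed to a conventional SVM, can be sketched as follows (an illustration, not the paper's implementation: the quantum overlap is simulated classically with NumPy, scikit-learn's precomputed-kernel SVC stands in for the classical back-end, and the toy data are hypothetical):

```python
import numpy as np
from sklearn.svm import SVC

# "Quantum kernel" K[i, j] = |<phi(x_i)|phi(z_j)>|^2 between encoded
# states, emulated here as a squared dot product of unit vectors.
def encode(x):
    x = np.asarray(x, dtype=float)
    return x / np.linalg.norm(x)          # amplitude-style encoding

def kernel_matrix(A, B):
    """Fidelity kernel between two sets of feature vectors."""
    A = np.array([encode(a) for a in A])
    B = np.array([encode(b) for b in B])
    return (A @ B.T) ** 2

X_train = [[1.0, 0.1], [0.9, 0.2], [0.1, 1.0], [0.2, 0.9]]
y_train = [0, 0, 1, 1]

# The precomputed kernel matrix is all the classical SVM ever sees.
svm = SVC(kernel="precomputed").fit(kernel_matrix(X_train, X_train), y_train)

X_test = [[0.8, 0.1], [0.1, 0.7]]
print(svm.predict(kernel_matrix(X_test, X_train)))   # -> [0 1]
```

Replacing `kernel_matrix` with estimates from a real quantum kernel circuit would leave the classical SVM step unchanged, which is exactly the division of labor summarized above.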
These oracles also provide the norms |x| and the labels y. The performance of quantum machine learning is dependent on these oracles and can be seen as a minimum measure for the actual complexity. An effective method for creating these states involves utilizing quantum RAM, which requires O(MN) hardware resources but just O(log MN) instructions for accessing them. Table II shows some of the literature where a quantum SVM outperformed classical support vector machine algorithms.

TABLE II. STUDY OF QUANTUM SVM USING DIFFERENT DATASETS

Ref | Problem addressed | Data sets | Platform used for Quantum
[7] | Classification | MNIST | QASM Simulator
[14] | Classification | Low-rank data set | Julia
[33] | Classification | Semcity Toulouse | Quantum Simulator
[10] | Classification and Recognition | Breast Cancer | IBM Quantum
[16] | Classification | Electrical Fault Detection | Qiskit public library
[19] | Classification | IRIS | Quantum Annealing Machine
[21] | Classification | Wine, Breast Cancer | IBMQX4

B. Quantum Neural Network (QNN):
In 1995, Kak introduced the notion of quantum neural computing for the first time. The concept of integrating neural computing with quantum computing to establish a novel computing paradigm has since been advanced in quantum research. T. Menneer provided a comprehensive analysis of quantum neural networks (QNN) from a multi-universe perspective and concluded that QNNs outperform conventional neural networks.

Fig. 4. Quantum M-P Model

A weight coefficient, denoted W, is assigned to every input of a neuron. This coefficient signifies the degree of excitation or inhibition in connections between neurons in the brain and is used to quantify the connection's strength. Figure 4 depicts the theoretical model of a quantum M-P model, which is based on the classical M-P model. As depicted in Fig. 4, the output of a neuron corresponding to inputs φ1, ..., φn is the weighted sum Σi Wi φi. The symbol Σ signifies the sum of all inputs, the symbol |ψ⟩ represents a quantum state, and the symbol X, with components x1, ..., xd, represents a vector [34]. Table III shows some of the literature where a quantum NN (QNN) outperformed some quantum and classical algorithms.

TABLE III. STUDY OF QNN WITH DIFFERENT DATASETS

Ref | Problem addressed | Data set used | Platform for Quantum | Classical counterpart
[18] | Classification | Reactor Coolant Pump data set | Quantum Simulator | Quantum competitive NN
[22] | Classification | Iris, MNIST | IBMQx4 | TTN, MERA
[23] | Classification | IRIS, BAS, Synth | IBM QASM simulator | Individual Variational Quantum Circuit
[24] | Classification | MNIST, FashionMNIST and ORL face | Quantum Simulator | Classical Feedforward Neural Network
[25] | Detection and Classification | Airpush, Dowgin, FakeInst, Fusob, Jisut, and Mecor | TensorFlow | AlexNet, MobileNet, EfficientNet, VGG16, VGG19
[35] | Classification | DiCOVA, COUGHVID | PennyLane | Classical Neural Network
[36] | Classification | ant/bee and potato leaf image | Qiskit (Terra) | ResNet18, AlexNet, VGG16
[27] | Classification | Medical Imaging Dataset | PennyLane and PyTorch | Classical CNN
[28] | Classification | Wine, Breast Cancer | PyTorch | Quantum Unoptimized classifier
[29] | Classification | breast cancer, PIMA and BUPA | QC | Q-BNN, EF, EW, IEM

C. Quantum K Nearest Neighbor (QKNN):
The k-nearest neighbor algorithm is a widely used and straightforward standard textbook approach for pattern classification. When T is a training set consisting of feature vectors and their corresponding classifications, together with a random input vector x, the objective is to determine the class cx for the new input by selecting the class that is most frequently observed among its k nearest neighbors. This is predicated on the idea that feature vectors that are "close" to each other encode similar examples, a fact that holds true for numerous applications. Aimeur, Brassard, and Gambs proposed the concept of utilizing the overlap, or fidelity, between two quantum states |a⟩ and |b⟩ as an "identity measure"; this fidelity can be obtained using a straightforward quantum circuit.
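A classical sketch of this fidelity-based nearest-neighbor rule (an illustration only: the quantum overlap is simulated as a squared dot product of unit vectors, and the toy training set is hypothetical):

```python
import numpy as np
from collections import Counter

# QKNN idea: distance is replaced by the fidelity |<a|b>|^2 between
# (unit-norm) encoded states, and the class c_x is the majority label
# among the k highest-fidelity neighbors.
def encode(x):
    x = np.asarray(x, dtype=float)
    return x / np.linalg.norm(x)

def qknn_predict(x, T, k=3):
    """T is a list of (feature_vector, label) pairs; returns c_x."""
    fidelities = [(np.dot(encode(x), encode(v)) ** 2, label) for v, label in T]
    fidelities.sort(reverse=True)              # most similar first
    top_k = [label for _, label in fidelities[:k]]
    return Counter(top_k).most_common(1)[0][0]

T = [([1.0, 0.0], "A"), ([0.9, 0.1], "A"), ([0.0, 1.0], "B"), ([0.1, 0.8], "B")]
print(qknn_predict([0.8, 0.2], T, k=3))   # prints A
```

On quantum hardware the squared dot product would be estimated by an overlap-measuring circuit rather than computed directly; the majority vote remains classical.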
The comparative classification accuracy of the algorithms on MNIST, IRIS, and WINE is shown in Fig. 5. The graph shows that for the MNIST dataset, QSVM had 97.5% accuracy and outperformed QNN (95%) and QKNN (95%). For the IRIS dataset, QSVM has the highest accuracy at 94%, and for the WINE dataset, QNN beat the remaining two with 97%. Figures 6, 7 and 8, respectively, depict the results of the QSVM, QKNN and QNN algorithms on the different benchmarking datasets.

Fig. 6. Performance of QSVM on Different Datasets

Fig. 7. Performance of QKNN on Different Datasets

Fig. 8. Performance of QNN on Different Datasets

VI. ILLUSTRATIVE DISCUSSION ON QML

Building quantum computers with many qubits is crucial for implementing and testing QML algorithms and handling enormous data sets in the near future. Figure 9 illustrates qubit numbers from several technology companies, including Rigetti, IBM, D-Wave, Xanadu, Google, and Microsoft [34]. The data illustrate the progressive increase in the number of qubits attained by different organizations and firms in the field of quantum computing over time. The quantum computer developed by Google, known as Sycamore, consists of 53 qubits, equivalent to the qubit count of a quantum computer revealed by IBM around the same time. At the IBM Quantum Summit in 2023, IBM showcased a chip with 1,121 qubits.

Fig. 9. Growth of Qubits Company & Year-wise

VII. CONCLUSION

Until now, small-scale quantum computers have limited data utilization, so researchers create algorithms for small-scale and "noisy intermediate-scale quantum" (NISQ) devices compatible with modest numbers of qubits. Qubits with restricted functionalities can lose crucial data and hinder massive data processing on quantum devices. Future work includes enhancing feature extraction methods and designing quantum implementation solutions.
REFERENCES

[1] M. I. Jordan and T. M. Mitchell, "Machine learning: Trends, perspectives, and prospects," Science, vol. 349, no. 6245, pp. 255–260, 2015.
[2] S. B. Ramezani, A. Sommers, H. K. Manchukonda, S. Rahimi, and A. Amirlatifi, "Machine Learning Algorithms in Quantum Computing: A Survey," Proc. Int. Jt. Conf. Neural Networks (IJCNN), 2020.
[3] "Amount of Data Created Daily: How Much Data Is Generated Every Day?," pp. 1–7, 2023.
[4] M. Schuld, I. Sinayskiy, and F. Petruccione, "An introduction to quantum machine learning," Sep. 2014.
[5] H. A. Bhat, F. A. Khanday, B. K. Kaushik, F. Bashir, and K. A. Shah, "Quantum Computing: Fundamentals, Implementations and Applications," IEEE Open J. Nanotechnol., vol. 3, pp. 61–77, 2022.
[6] P. Botsinis et al., "Quantum Search Algorithms for Wireless Communications," IEEE Commun. Surv. Tutorials, vol. 21, no. 2, pp. 1209–1242, Apr. 2019.
[7] G. Singh, M. Kaur, M. Singh, and Y. Kumar, "Implementation of Quantum Support Vector Machine Algorithm Using a Benchmarking Dataset," Indian J. Pure Appl. Phys., vol. 60, no. 5, pp. 407–414, 2022.
[8] J. S. Otterbach et al., "Unsupervised Machine Learning on a Hybrid Quantum Computer," 2017.
[9] V. Martinez and G. Leroy-Meline, "A multiclass Q-NLP sentiment analysis experiment using DisCoCat," pp. 2–5, 2022.
[10] Z. Shan et al., "Demonstration of Breast Cancer Detection Using QSVM on IBM Quantum Processors," pp. 0–10, 2022.
[11] C. Feng, B. Zhao, X. Zhou, X. Ding, and Z. Shan, "An Enhanced Quantum K-Nearest Neighbor Classification Algorithm Based on Polar Distance," Entropy, vol. 25, no. 1, pp. 1–12, 2023.
[12] G. Sergioli, R. Giuntini, and H. Freytes, "A new quantum approach to binary classification," PLoS One, vol. 14, no. 5, pp. 1–14, 2019.
[13] M. Schuld, I. Sinayskiy, and F. Petruccione, "Quantum computing for pattern classification," Lect. Notes Comput. Sci., vol. 8862, pp. 208–220, 2014.
[14] C. Ding, T. Y. Bao, and H. L. Huang, "Quantum-Inspired Support Vector Machine," IEEE Trans. Neural Networks Learn. Syst., vol. 33, no. 12, pp. 7210–7222, 2022.
[15] A. Delilbasic, G. Cavallaro, M. Willsch, F. Melgani, M. Riedel, and K. Michielsen, "Quantum Support Vector Machine Algorithms for Remote Sensing Data Classification," Int. Geosci. Remote Sens. Symp. (IGARSS), pp. 2608–2611, 2021.
[16] A. O. Savchuk and N. N. Shapovalova, "Classification problem solving using quantum machine learning mechanisms," CEUR Workshop Proc., vol. 3077, pp. 160–173, 2022.
[17] R. D. M. Simoes, P. Huber, N. Meier, N. Smailov, R. M. Fuchslin, and K. Stockinger, "Experimental Evaluation of Quantum Machine Learning Algorithms," IEEE Access, vol. 11, pp. 6197–6208, 2023.
[18] M. Zidan et al., "Quantum classification algorithm based on competitive learning neural network and entanglement measure," Appl. Sci., vol. 9, no. 7, pp. 1–15, 2019.
[19] B. Dema, J. Arai, and K. Horikawa, "Support Vector Machine for Multiclass Classification using Quantum Annealers," pp. 1–6, 2020.
[20] Y. Ruan, X. Xue, H. Liu, J. Tan, and X. Li, "Quantum Algorithm for K-Nearest Neighbors Classification Based on the Metric of Hamming Distance," Int. J. Theor. Phys., vol. 56, no. 11, pp. 3496–3507, 2017.
[21] C. L. Havenstein, D. T. Thomas, and S. Chandrasekaran, "Comparisons of Performance between Quantum and Classical Machine Learning," SMU Data Sci. Rev., vol. 1, no. 4, p. 11, 2018.
[22] E. Grant et al., "Hierarchical quantum classifiers," npj Quantum Inf., vol. 4, no. 1, pp. 17–19, 2018.
[23] D. Arthur and P. Date, "A Hybrid Quantum-Classical Neural Network Architecture for Binary Classification," 2022.
[24] P. Easom-McCaldin, A. Bouridane, A. Belatreche, R. Jiang, and S. Al-Maadeed, "Efficient Quantum Image Classification Using Single Qubit Encoding," IEEE Trans. Neural Networks Learn. Syst., pp. 1–15, 2022.
[25] F. Mercaldo, G. Ciaramella, G. Iadarola, M. Storto, F. Martinelli, and A. Santone, "Towards Explainable Quantum Machine Learning for Mobile Malware Detection and Classification," Appl. Sci., vol. 12, no. 23, 2022.
[26] S. Otgonbaatar, G. Schwarz, M. Datcu, and D. Kranzlmuller, "Quantum Transfer Learning for Real-World, Small, and High-Dimensional Remotely Sensed Datasets," IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., vol. 16, pp. 9223–9230, 2023.
[27] A. Matic, M. Monnet, J. M. Lorenz, B. Schachtner, and T. Messerer, "Quantum-classical convolutional neural networks in radiological image classification," Proc. 2022 IEEE Int. Conf. Quantum Comput. Eng. (QCE), pp. 56–66, 2022.
[28] I. C. S. Araujo and A. J. Silva, "Quantum ensemble of trained classifiers," 2020.
[29] O. P. Patel and A. Tiwari, "Novel quantum inspired binary neural network algorithm," Sadhana - Acad. Proc. Eng. Sci., vol. 41, no. 11, pp. 1299–1309, 2016.
[30] Y. Dang, N. Jiang, H. Hu, Z. Ji, and W. Zhang, "Image classification based on quantum K-Nearest-Neighbor algorithm," Quantum Inf. Process., vol. 17, no. 9, 2018.
[31] N. Wiebe, A. Kapoor, and K. M. Svore, "Quantum Algorithms for Nearest-Neighbor Methods for Supervised and Unsupervised Learning," pp. 1–31.
[32] A. Kariya and B. K. Behera, "Investigation of Quantum Support Vector Machine for Classification in NISQ era," pp. 1–15, 2021.
[33] A. Delilbasic, G. Cavallaro, M. Willsch, F. Melgani, M. Riedel, and K. Michielsen, "Quantum Support Vector Machine Algorithms for Remote Sensing Data Classification," Int. Geosci. Remote Sens. Symp. (IGARSS), pp. 2608–2611, 2021.
[34] Z. Abohashima, M. Elhosen, E. H. Houssein, and W. M. Mohamed, "Classification with Quantum Machine Learning: A Survey," pp. 1–16, 2020.
[35] M. Esposito, G. Uehara, and A. Spanias, "Quantum Machine Learning for Audio Classification with Applications to Healthcare," 13th Int. Conf. Information, Intell. Syst. Appl. (IISA), pp. 1–4, 2022.
[36] G. Subbiah, S. S. Krishnakumar, N. Asthana, P. Balaji, and T. Vaiyapuri, "Quantum transfer learning for image classification," Telkomnika (Telecommunication Comput. Electron. Control.), vol. 21, no. 1, pp. 113–122, 2023.

About Authors:
Prof. (Dr.) Savita Kumari Sheoran holds a Ph.D. in Computer Science from Banasthali Vidyapith, Rajasthan, in the area of mobile ubiquitous computing. She has more than 19 years of teaching experience in India and abroad, has published over 75 research papers in journals and conferences, and has been granted one patent for a wireless communication system. She has published 07 books/chapters and supervised Ph.D. students for doctoral research.

Vikesh Yadav is a Ph.D. scholar in the Dept. of Computer Science, Indira Gandhi University Meerpur, in the area of quantum computing and machine learning. She has obtained an M.Tech degree in Computer Science & Engineering from GJUS&T Hisar (Haryana).