Quantum Machine Learning Classifier and Neural Network Transfer Learning
Abstract
This chapter explores quantum machine learning (QML) and neural network transfer learning. It begins by describing the potential of QML. The discussion then shifts to transfer learning, which leverages pre-trained neural models across diverse domains. A demonstration of advancements in both fields forms the core of the chapter, showcasing how QML classifiers can be combined with classical neural networks for enhanced performance. To improve the accuracy of COVID-19 screening, an ensemble method and a sliding window mechanism were employed, using computer vision on frequency-domain spectrograms of audio files. In parallel, the authors investigated whether the accuracy of these measurements could be further improved by quantum machine transfer learning. The chapter describes a case study in which a hybrid approach demonstrated significant improvements in data processing accuracy, offering an understanding of practical applications. In conclusion, the authors present ideas on how the combination of QML and transfer learning could open new horizons in various fields with complex, large-scale datasets. The chapter closes with predictions about the trajectory of these technologies, emphasizing their role in shaping the future of transfer learning. This combination of current research and visionary thinking is intended to inspire further exploration at the intersection of quantum machine learning and neural network transfer learning.
1. Introduction
Transfer learning is a technique where a data model developed for one task is
reused as the starting point for a model on a second task. It is particularly useful in
scenarios where the second task has limited data available for training. In essence,
transfer learning allows for the leveraging of pre-trained models to achieve quicker
and more efficient learning in a new, but related, problem. Transfer learning is an idea taken from how the human mind works: previous experience is brought to bear when a new task must be learned, helping the mind master the new task more quickly. An example is learning a new foreign language. If one language has already been learned, that previous experience guides the new language training and speeds up the process, as described in "A computer science perspective on models of the mind" by Brooks et al. [1].
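As a minimal sketch of this idea (a generic PyTorch illustration with a torchvision ResNet-18 and a hypothetical three-class target task, not code from any work cited here), the pre-trained weights are kept fixed and only a small task-specific layer is trained on the new task:

```python
import torch.nn as nn
from torchvision import models

# Start from a network pre-trained on a large source task
# (here, ImageNet classification).
model = models.resnet18(weights="IMAGENET1K_V1")

# Freeze the pre-trained weights so the learned features are
# reused rather than retrained on the new task.
for param in model.parameters():
    param.requires_grad = False

# Replace the final layer with a new head for the second, related
# task -- a hypothetical three-class problem with limited data.
model.fc = nn.Linear(model.fc.in_features, 3)
```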
Quantum transfer learning involves reusing a model that has been trained on a quantum computer for one task for a different but related task. This is particularly beneficial given the computational expense of training models on quantum computers.
Transfer learning in the quantum domain can be especially powerful for tasks
where classical data needs to be augmented with quantum data or vice versa. For
example, a quantum machine learning model initially trained on quantum simulation
data might be adapted for a more specialized task in quantum chemistry with mini-
mal retraining.
The rest of this chapter illustrates the terms described above in the Introduction
with a survey of the research that serves as an example of each of the concepts, a
practical example of using quantum machine transfer learning in a hybrid
with classical neural networks, and a conclusion.
academic access to the older machines. To gain access to the Rigetti real quantum
computers, one can use Amazon Web Services (AWS) and pay for each “shot”.
Results and analysis: the study reports the successful application of transfer
learning in hybrid systems, with specific emphasis on the classical-to-quantum (CQ)
approach due to its relevance to current quantum technology. The CQ approach is
used to classify high-resolution images using real quantum processors.
Conclusion: the authors conclude that transfer learning is a promising approach
in the context of near-term quantum devices. They note the potential benefits of
combining the power of quantum computers with the well-established methods of
classical machine learning, especially for tasks like image processing.
This research is significant as it explores the intersection of quantum comput-
ing and machine learning, demonstrating the feasibility and potential advantages
of applying transfer learning techniques in hybrid classical-quantum settings.
The results indicate that such hybrid approaches could be valuable in efficiently
processing complex data using the combined strengths of classical and quantum
computation.
The article “Screening for COVID-19 via Acoustics using Artificial Intelligence” by
Bakhitov et al. [5] presented a novel approach for COVID-19 screening by analyzing
audio files using deep learning techniques. That research did not include quantum computing or transfer learning; however, in subsequent research done for this chapter, the authors have extended the work to include both.
The original hybrid approach started with crowdsourced audio files of indi-
viduals exhibiting COVID-19 symptoms and compared them to those of healthy
subjects. The methodology involved processing these audio samples into log-power
spectrograms (image format) using the librosa Python library, which were then
analyzed by Convolutional Neural Networks (CNNs) to identify patterns indicative
of the virus.
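The following sketch shows how such a log-power spectrogram can be produced with librosa (the file name is hypothetical, and the exact parameters of the original study are not reproduced here):

```python
import numpy as np
import librosa
import librosa.display
import matplotlib.pyplot as plt

# Load one (hypothetical) Coswara recording at its native sample rate.
y, sr = librosa.load("breathing-deep.wav", sr=None)

# Short-time Fourier transform -> power spectrogram -> log-power (dB).
S = np.abs(librosa.stft(y)) ** 2
S_db = librosa.power_to_db(S, ref=np.max)

# Render the spectrogram as an image for the CNN to classify.
librosa.display.specshow(S_db, sr=sr, x_axis="time", y_axis="hz")
plt.savefig("spectrogram.png")
```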
The study utilized a dataset from the Coswara project found at Ref. [6], which
contained audio samples from 1433 healthy individuals and 681 positive COVID-19
cases. Each dataset entity contained nine files recorded by one individual such as
breathing heavy, counting fast, vowel sounds, etc. The researchers trained their
model on 70% of this data, validated it on 20%, and tested it on the remaining 10%,
achieving promising initial results with an 85% Area Under the Receiver Operating Characteristic (ROC) Curve (AUC).
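A data split of this kind, and the AUC metric, can be sketched as follows (the arrays here are random stand-ins for the real spectrogram features and labels):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Stand-in feature matrix and labels: 1 = COVID-19 positive,
# 0 = healthy (1433 + 681 = 2114 samples in the Coswara subset).
X = np.random.rand(2114, 128)
y = np.random.randint(0, 2, size=2114)

# 70% train, 20% validation, 10% test, as in the original study.
X_train, X_rest, y_train, y_rest = train_test_split(
    X, y, test_size=0.30, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(
    X_rest, y_rest, test_size=1 / 3, random_state=0)

# After training, AUC is computed from the model's predicted scores;
# random scores give ~0.5, while the original study reported 0.85.
scores = np.random.rand(len(y_test))
print(roc_auc_score(y_test, scores))
```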
This original research emphasized the importance of a hybrid approach that
combined audio preprocessing, image transformation, and advanced deep learn-
ing (specifically CNNs) to address the challenge of rapid and accessible COVID-19
screening. The significant aspect of this approach was its potential to reduce the
costs and logistical challenges associated with traditional testing methods while also
minimizing the risk of false negatives. The article also highlighted the high incidence
of false negatives in current COVID-19 tests and suggested that this AI-driven method
could offer a more reliable alternative for preliminary screening.
Future directions at that time for this work included improving the model through
more advanced techniques such as exploring the use of quantum computing and
transfer learning. The team aimed to process the entire Coswara dataset with the improved model and to test their approach on additional COVID-19 audio datasets as they became available; however, such datasets did not become available.
The way this research was improved and extended for this chapter was to add quantum transfer learning. To illustrate the addition of quantum computing to the neural network to create a hybrid network, take the classical neural network built for the original research, as seen in Figure 1, and replace it with a hybrid classical-quantum neural network, as seen in Figure 2. The difference between the two figures is that the final fully connected (fc) layer of the neural network in Figure 2 has been cut out and replaced by a quantum circuit layer, simulated in this instance with the Pennylane quantum computing simulation software platform, found at Ref. [7], in Python. The effect of this was that the first neural network from Figure 1 became a feature extractor for the second, hybrid neural network from Figure 2. This is done by freezing the pre-trained layers from the first neural network used for the transfer learning before adding the quantum circuit layers for the hybrid. The authors placed the data model on the data scientist sharing platform Hugging Face at Ref. [8] and the code on GitHub at Ref. [9]. Similar to the approach in Ref. [4], where transfer learning was done with the materials of solar cells, the authors picked a similar dataset for the transfer learning of the model: a different subset of the Coswara data that had not been used for training the original model. Transfer learning was applied from the original counting audio files subset to train the model for the deep breathing subset.
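The freeze-and-replace step can be sketched as follows, using a torchvision ResNet-18 as a stand-in for the pre-trained classical network, with hypothetical qubit counts and circuit depths (the actual model and code are at Refs. [8, 9]):

```python
import torch
import torch.nn as nn
import pennylane as qml
from torchvision import models

n_qubits = 4  # hypothetical; chosen small for the simulator
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def quantum_circuit(inputs, weights):
    # Encode the classical features as single-qubit rotation angles.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    # Trainable entangling layers form the quantum circuit layer.
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

class DressedQuantumNet(nn.Module):
    """Classical pre-net -> quantum circuit -> classical post-net."""
    def __init__(self, in_features):
        super().__init__()
        self.pre_net = nn.Linear(in_features, n_qubits)
        self.q_layer = qml.qnn.TorchLayer(
            quantum_circuit, weight_shapes={"weights": (6, n_qubits)})
        self.post_net = nn.Linear(n_qubits, 2)  # positive / negative

    def forward(self, x):
        x = torch.tanh(self.pre_net(x))  # squash into angle range
        return self.post_net(self.q_layer(x))

# Stand-in for the CNN pre-trained on the counting audio spectrograms.
model = models.resnet18(weights="IMAGENET1K_V1")
for param in model.parameters():
    param.requires_grad = False  # freeze the feature extractor
# Cut out the final fc layer and replace it with the dressed circuit.
model.fc = DressedQuantumNet(model.fc.in_features)
```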
The original research resulted in data model summary measurements as seen in
Ref. [5]. This data model was simplified for the purposes of illustrating classical and
quantum transfer learning for this chapter. The classical transfer learning resulted in
the data model summary measurements as seen in Table 1.
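A summary of this form can be generated with the torchsummary package, assuming the hybrid model sketched above and three-channel 224x224 spectrogram images:

```python
from torchsummary import summary

# Prints each layer, its output shape, and its parameter count,
# plus the totals of the kind reported in Tables 1 and 2.
summary(model, input_size=(3, 224, 224), device="cpu")
```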
The classical model, originally adept at processing and classifying audio files for
medical screening, provided a solid foundation due to its effective pattern recognition
in audio data. By incorporating a quantum circuit to process the extracted features,
the goal was to leverage quantum computing’s potential to handle high-dimensional
data and execute computations beyond the reach of classical systems alone.
This hybrid approach was not only a test of quantum transfer learning’s feasibility
but also an investigation into its potential to enrich classical machine learning models
with quantum efficiency. The process involved addressing the distinctive challenges of quantum computing, such as error rates and qubit coherence, with the Pennylane simulator, while also scrutinizing the model's scalability and performance against purely classical or quantum solutions.

Figure 1.
Classic neural network processing images down to 50 nodes, then 2 nodes, for the decision of COVID-19 positive or negative, in the original research from Bakhitov et al. [5].

Figure 2.
Addition of a quantum layer to form the hybrid network for the research done for this chapter.

Table 1.
Original classical neural network without the quantum transfer learning.
The outcome of this case study illustrated the practical application of quantum-
enhanced machine learning models, shedding light on both the obstacles and advan-
tages of integrating quantum circuits into classical neural networks. By successfully
implementing this hybrid model, the team contributed to the quantum machine
learning field, showcasing an effective strategy for employing quantum computing to
augment classical machine learning tasks. This case study not only demonstrated the
model’s high accuracy in the specific context of medical screening but also under-
scored the broader potential of quantum computing to revolutionize various sectors,
marking a significant step forward in the fusion of quantum and classical computing
technologies.
The addition of the quantum layer resulted in the data model summary measurements as seen in Table 2.

Table 2.
Measurements of quantum transfer learning. Note the great reduction in the trainable parameters with the pre-trained model.

Layer (type)            Output Shape    Param #
Linear-17               [-1, 2]         10
DressedQuantumNet-18    [-1, 2]         0
Total params: 16,876,942
Trainable params: 2,062
Non-trainable params: 16,874,880
Input size (MB): 0.25
Forward/backward pass size (MB): 33.76
Params size (MB): 64.38
Estimated total size (MB): 98.39

Although in our example we did not cross domains outside of the Coswara dataset to apply our trained model to a new domain of data, there are examples of this in "Hybrid model of quantum transfer learning to classify face images with a COVID-19 mask" by Soto-Paredes and Sulla-Torres [10], in which the classical transfer learning model chosen was ResNet-18 and the quantum layers of the target model used a basic entangler layers template for four qubits on the Pennylane quantum simulator. Their main finding was 99.05% accuracy in classifying the
correct protective mask images (no mask, incorrectly worn mask, correctly worn mask). Mari et al.'s "Transfer learning in hybrid classical-quantum neural networks" is a foundational paper, posted on the Pennylane site, that describes the theory of transfer learning in hybrid classical-quantum and quantum-quantum neural networks [11]. "Quantum deep transfer learning" by Wang et al. [12] describes the theory of transfer learning across domains in four steps: "(1) For a given task with the dataset, find a source domain dataset for knowledge transfer. (2) Train a model on source domain dataset. (3) Build a criteria for the transfer process…depending on the specific task… (4) Train the target task model on the target domain dataset using the learning information obtained by (3)." "COVID-19 detection on IBM quantum computer with classical-quantum transfer learning" by Acar and Yilmaz [13] describes using transfer learning on MRI images of the lungs of people positive for COVID-19 as compared to healthy individuals. Leider et al.'s "Quantum machine learning classifier" uses the Iris dataset and the Pennylane simulator [14], and Leider et al.'s "Hybrid Quantum Machine Learning Classifier with Classical Neural Network Transfer Learning" uses the wine dataset and the Pennylane simulator [15].
The reason for using the Pennylane quantum simulator is to overcome the current constraints on the capabilities of quantum computing, known as the Noisy Intermediate-Scale Quantum (NISQ) era. Today's quantum computers are still quite primitive and error prone, and therefore most research work done on them is academic at this time, because the noise creates inconsistent results. Quantum computers are probabilistic, meaning that quantum programs have to be run repeatedly, in "shots" of 1000 times or more, in order to get the most probable result.
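As a minimal illustration of this probabilistic behavior (a generic Bell-state example, not the chapter's model), a Pennylane device can be configured with a finite shot count:

```python
import pennylane as qml

# A two-qubit simulated device run with 1000 "shots"; each shot is
# one probabilistic execution and measurement of the circuit.
dev = qml.device("default.qubit", wires=2, shots=1000)

@qml.qnode(dev)
def bell_state():
    qml.Hadamard(wires=0)
    qml.CNOT(wires=[0, 1])
    return qml.probs(wires=[0, 1])

# The probabilities are estimated from the 1000 repetitions, so they
# fluctuate around the ideal values [0.5, 0, 0, 0.5].
print(bell_state())
```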
3. Conclusions
Figure 3 shows that when using quantum machine transfer learning, the model learns much faster and is more accurate in the earlier epochs. This is the major reason that transfer learning is attractive; it saves computing time. Figure 4 shows that the loss is also reduced in the earlier epochs using transfer learning. Quantum machine transfer learning currently uses "dressed" quantum circuits that have classical layers of the hybrid neural network before and after the quantum circuit layer, and there is a significant slowdown in translating the information from classical to quantum form and back.

Figure 3.
Accuracy of the fully connected (fc) layer to the quantum layer is better in the earlier epochs in the pre-trained model.

Figure 4.
Loss of the fully connected layer to the quantum layer is better in the earlier epochs in the transfer learning example.
Acknowledgements
Conflict of interest
Author details
© 2024 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of
the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0),
which permits unrestricted use, distribution, and reproduction in any medium, provided
the original work is properly cited.
References
[6] GitHub Dataset Repository and Description of the Coswara Project. Available from: https://github.com/iiscleap/Coswara-Data

[7] Pennylane. Available from: https://pennylane.ai/

[8] Bakhitov D. Hugging Face Repository for the Counting Normal Dataset Used for the Original Model. Available from: https://huggingface.co/bakhitovd/covid_conv_10

[15] Leider A, Jaoude GG, Mosley P. Hybrid quantum machine learning classifier with classical neural network transfer learning. In: Future of Information and Communication Conference. Cham, Switzerland: Springer Nature; 2023. pp. 102-116

[16] Bakhitov D. GitHub Code Repository for the Data Model Used for this Chapter. Available from: https://github.com/Bakhitovd/covid_screening