
Turkish Journal of Electrical Engineering and Computer Sciences

Volume 30 Number 5 Article 9

1-1-2022

Breast cancer-caps: a breast cancer screening system based on


capsule network utilizing the multiview breast thermal infrared
images
DEVANSHU TIWARI

MANISH DIXIT

KAMLESH GUPTA

Follow this and additional works at: https://journals.tubitak.gov.tr/elektrik

Part of the Computer Engineering Commons, Computer Sciences Commons, and the Electrical and
Computer Engineering Commons

Recommended Citation
TIWARI, DEVANSHU; DIXIT, MANISH; and GUPTA, KAMLESH (2022) "Breast cancer-caps: a breast cancer
screening system based on capsule network utilizing the multiview breast thermal infrared images,"
Turkish Journal of Electrical Engineering and Computer Sciences: Vol. 30: No. 5, Article 9.
https://doi.org/10.55730/1300-0632.3906
Available at: https://journals.tubitak.gov.tr/elektrik/vol30/iss5/9

This Article is brought to you for free and open access by TÜBİTAK Academic Journals. It has been accepted for
inclusion in Turkish Journal of Electrical Engineering and Computer Sciences by an authorized editor of TÜBİTAK
Academic Journals. For more information, please contact [email protected].
Turkish Journal of Electrical Engineering & Computer Sciences Turk J Elec Eng & Comp Sci
(2022) 30: 1804 – 1820
http://journals.tubitak.gov.tr/elektrik/
© TÜBİTAK
Research Article doi:10.55730/1300-0632.3906

Breast cancer-caps: a breast cancer screening system based on capsule network


utilizing the multiview breast thermal infrared images

Devanshu TIWARI1,∗, Manish DIXIT2 , Kamlesh GUPTA3


1 Department of Computer Science and Engineering, Rajiv Gandhi Proudyogiki Vishwavidyalaya, Bhopal, India
2 Department of Computer Science and Engineering, Madhav Institute of Technology and Science, Gwalior, India
3 Department of Information Technology, Rustamji Institute of Technology, BSF Academy, Tekanpur, India

Received: 21.10.2021 • Accepted/Published Online: 28.04.2022 • Final Version: 22.07.2022

Abstract: This paper proposes an accurate and fully automated early breast cancer screening system called "Breast Cancer-Caps". For the first time, a capsule network is used for breast cancer detection from thermal infrared images. The capsule network is trained on Dynamic as well as Static breast thermal image datasets consisting of left, right and frontal views along with new multiview thermal images. These multiview breast thermal images are fabricated by concatenating the conventional left, frontal and right view breast thermal images. Current and popular deep transfer learning models, namely Visual Geometry Group 19 (VGG 19), Residual Network 50 (ResNet50V2) and InceptionV3, are also trained on the same augmented Static and Dynamic breast thermal image datasets in order to compare their performance with the proposed system. The "Breast Cancer-Caps" system delivers the best testing and validation accuracies among all the compared deep transfer learning models. The proposed system achieves an encouraging testing accuracy of more than 99% with multiview breast thermal images as input over the Dynamic breast thermal images testing dataset, whereas testing accuracies of 95%, 94% and 89% are achieved by the VGG 19, ResNet50V2 and InceptionV3 models, respectively, over the Dynamic breast thermal images testing dataset with the same multiview input.

Key words: Thermal images, breast cancer, capsule network, ResNet50V2, InceptionV3, VGG 19, Static, Dynamic,
multiview

1. Introduction
Breast cancer is the most commonly diagnosed cancer among women worldwide. Its early diagnosis followed by prognosis can greatly improve a patient's chances of survival [1]. As per GLOBOCAN data [2], breast cancer has surpassed lung cancer as the most commonly diagnosed cancer and ranks fifth among the causes of cancer mortality. GLOBOCAN 2020 is an online database providing worldwide cancer statistics and estimates of incidence and mortality in 185 countries for 36 types of cancer, and for all cancer sites combined. Breast cancer originates in the interior lining of the milk ducts or lobules of the breast, and its cells are the outcome of DNA or RNA mutations [3]. The major causes of these harmful mutations include excessive exposure to chemicals, electromagnetic radiation, viruses, fungi, parasites, mechanical cell-level injury, heat, water, food, free radicals, aging of DNA or RNA, etc. [4]. One of the most effective ways of diagnosing breast cancer is
∗ Correspondence: [email protected]

This work is licensed under a Creative Commons Attribution 4.0 International License.

to perform regular screening, either manually by a doctor or with medical imaging techniques [5]. Medical imaging techniques such as mammography, histopathology, ultrasound and infrared thermography are often used by doctors for the diagnosis of breast cancer. Infrared thermography is mostly used for regular screening of the breast in order to detect any abnormality or lesion. Such regular screening as part of a normal health checkup is imperative for detecting breast cancer in its early stages and thus reducing its mortality rate. Some earlier research even claims that infrared thermography is capable of detecting breast abnormalities prior to cancer development [6]. At the same time, infrared thermography is painless and can be conducted in a contactless manner, which makes it popular among female patients worldwide, whereas mammography, histopathology and ultrasound are painful, expensive for the patient and require physical contact. Therefore, infrared thermography is mostly prescribed by doctors as a complementary test for early breast cancer screening. For cross-validating a suspicious breast thermal image, doctors rely on mammography or histopathology to reach a final diagnostic decision [7, 8]. Thermography captures the infrared radiation emitted from the skin of the breast, which exhibits bilaterally symmetric patterns in normal cases; any variation in this symmetry highlights the presence of anomalies or cancer cells [9]. Static and Dynamic are the two acquisition protocols generally used for recording breast thermal infrared images [10, 11]. Dynamic thermal images are obtained over a period of time following a cooling step [12], whereas Static thermal images are taken at a single point in time, one per patient, at different angles.
Various computer-aided diagnosis (CAD) systems based on machine learning and deep learning have been developed over the last decade for the automated early detection, screening and classification of various cancer types [13]. Such CAD systems are even more important at present, when the whole world is experiencing an extreme shortage of physicians and medical facilities for the real-time diagnosis and treatment of patients [14]. Substantial, high-quality research has been carried out during the last two years to develop CAD systems for automated breast cancer detection, mainly utilizing mammography, ultrasound, infrared thermography and other imaging modalities, which could assist doctors. The current research trend lies in developing CAD systems based on deep learning and deep transfer learning models for the correct binary classification of breast images into normal and abnormal cases. Thermography in particular is popular among doctors for the regular annual screening of the breast as part of a normal health checkup. Consequently, there is an imperative need to develop more accurate and efficient CAD systems for the early screening of breast thermography images into normal or abnormal cases. The contributions of this research study are stated below:
1. This research article presents, for the first time, an accurate, fully automated breast cancer screening system based on the capsule network utilizing both Static and Dynamic thermal images.
2. This research article presents the novel idea of training the capsule network with multiview breast thermal images, fabricated for the first time by concatenating the conventional left, right and frontal view breast thermal images. Experimentation and simulation then show that these concatenated multiview thermal images deliver better results than the conventional single-view thermal images.
3. This research article also presents a brief experiment utilizing both Static and Dynamic breast thermal images to show which acquisition protocol delivers better results during the


training and testing phases.


This research study consists of six sections. The introduction presents basic information on breast cancer and its diagnosis using infrared thermal images, along with the role of machine and deep learning in automating this procedure. The literature review section then presents the major state-of-the-art automated approaches utilizing infrared breast thermal images for breast cancer detection, as well as the grey areas present in this research domain. The third section illustrates the proposed method, including a brief dataset description, the preprocessing and augmentation used to increase the number of training samples, and the description and algorithm of the proposed "Breast Cancer-Caps" system based on the capsule network. The results section presents the performance of the proposed system along with that of the VGG19, ResNet50 and InceptionV3 based systems used for comparison over the Static and Dynamic breast thermal image testing datasets. The discussion section presents the merits and issues associated with training and testing the proposed "Breast Cancer-Caps" system using the left, right, frontal and multiview breast thermal images. Finally, the conclusion section presents a brief analysis of the performance of the proposed "Breast Cancer-Caps" over all four views of breast thermal images and some deeper insights of this study for future work.

2. Literature review
Machine learning based approaches for breast cancer detection using thermal images generally consist of three major stages: segmentation, feature extraction and reduction, and finally classifier training. For accurate segmentation of the breast cancer region of interest, Fuzzy c-means, Otsu's thresholding, level-set methods, edge-based segmentation, k-means, the extended hidden Markov model, etc., are commonly used. Gray level cooccurrence matrix (GLCM), gray level run length (GLRL), Haralick, gray level size zone matrix (GLSZM) and neighborhood grey tone difference matrix (NGTDM) methods, etc., [15–17] are then used for feature extraction. Popular machine learning classifiers such as the support vector machine (SVM), naïve Bayes (NB), random forest (RF), AdaBoost, k-nearest neighbor (kNN), decision tree, least squares support vector machine (LSSVM), etc., [18–21] are trained with the extracted features in order to perform accurate classification. Approaches based on these methods deliver 90% to 97% accuracy over local as well as global breast thermal image datasets. The majority of existing approaches for breast cancer detection using thermal images are based on conventional machine learning classifiers, whereas the number of approaches based on deep learning and deep transfer learning is far smaller. This fact is also illustrated by Graph 1 in Appendix 1, which compares the number of research and review articles on breast cancer classification using thermal images based on machine learning and deep learning, as indexed by the IEEE Xplore and PubMed research databases.
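To make the classical pipeline above concrete, the short Python sketch below extracts a handful of GLCM texture features from a grayscale thermogram and feeds them to an SVM classifier. It is an illustration only, assuming scikit-image (0.19 or later) and scikit-learn, and is not code taken from any of the cited studies.

import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def glcm_features(gray_image_uint8):
    # Build a normalized GLCM at distance 1 for two angles and summarize it
    # with four Haralick-style texture properties.
    glcm = graycomatrix(gray_image_uint8, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# X (feature vectors of the training thermograms) and y (normal/abnormal labels)
# are assumed to be prepared beforehand:
# clf = SVC(kernel="rbf").fit(X, y)
# predictions = clf.predict(X_test)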
Deep learning or deep transfer learning based breast cancer classification consists of only one major stage, i.e. tuning and training convolutional neural networks on augmented or ordinary large breast thermal image datasets. These deep learning based approaches are completely automated, as the convolutional layers of filters perform segmentation, feature extraction and learning by themselves. Some of the recent and accurate breast cancer detection approaches using thermal images have employed MobileNet, InceptionNet, VGG 16, VGG 19, ResNet152, ResNet18, ResNet101, ResNet50, ResNet34, etc. [22–25]. A brief comparison among these state-of-the-art thermography based breast cancer


classification approaches is presented in Appendix 1. From the literature review above, the following points can be concluded:
1. Deep learning based approaches are fully automated and also tend to be more accurate than machine learning based approaches. Capsule networks are a new and evolving class of neural networks, which tend to deliver better results in various cancer classification and other computer vision tasks. There is therefore a need to develop a fully automated capsule network based breast cancer screening system using thermal infrared images, as no such system has existed until now. Another reason for using the capsule network is that this class of neural network tends to perform better on augmented datasets than other deep transfer learning models.
2. The majority of the abovementioned approaches have used only frontal view breast thermal images for breast cancer classification. No approach proposed so far has experimented with different views of the breast thermal images. A concatenated multiview breast thermal image consisting of the frontal, left and right views may carry more information and hence prove more useful for obtaining good results.
3. Experimentation is also required on the use of Static and Dynamic thermal images for training as well as testing, in order to show which of these two thermal acquisition protocols delivers better results. None of the existing approaches has carried out this type of experimentation.

3. Proposed approach

Preprocessing, augmentation and training of the capsule network are the major stages of the proposed approach for the binary classification of breast images into normal or abnormal cases. The overall proposed approach is presented in Figure 1 below:

Figure 1. The overall proposed approach based on the capsule network utilizing the multiview breast thermal images
for the breast cancer detection.


3.1. Dataset description


The DMR-IR dataset [16] is used in this research study1. It is the only available breast thermal image database containing images acquired using both the Static and the Dynamic protocol, and it holds breast thermal images of 287 women volunteers. The thermal images were acquired with a FLIR SC-620 thermal camera at a resolution of 640 × 480 and a thermal sensitivity of 40 mK (at 20 °C). For this study, and in order to cover the entire breast region, three views per patient (left, frontal and right) of both Static and Dynamic thermal images of 150 subjects were taken for training, of which 80 are unhealthy and 70 are healthy subjects. Hence the training Static breast thermal image dataset consists of a total of 600 concatenated multiview, left, right and frontal view thermal images of healthy and unhealthy subjects, and likewise the training Dynamic breast thermal image dataset consists of a total of 600 concatenated multiview, left, right and frontal view thermal images. For testing, the Static and Dynamic testing datasets consist of thermal images of 50 subjects, of which 25 are diagnosed as healthy and the remaining 25 as unhealthy (having breast cancer). The Static testing dataset consists of 50 multiview, 50 left, 50 right and 50 frontal view breast thermal images of healthy and unhealthy subjects, and the Dynamic testing dataset likewise consists of 50 multiview, 50 left, 50 right and 50 frontal view breast thermal images.

3.2. Preprocessing and augmentation


Initially, both Static and Dynamic thermal images undergo cropping, contrast enhancement, normalization and finally resizing as preprocessing steps. Multiview breast thermal images are then generated by concatenating the left, frontal and right views using the hconcat() function for horizontal concatenation and the vconcat() function for vertical concatenation, available in the OpenCV and NumPy libraries in Python. The resulting images are resized to a dimension of 480 × 120, as this input size offered the best accuracy with the least training time. Both preprocessed Static and Dynamic datasets are then augmented with the Keras ImageDataGenerator function using four types of transformation: rotation range, shear range, rescaling and zoom range [26, 27]. After a number of experiments, the best values were found to be rotation range = 5, shear range = 0.02, zoom range = 0.02 and rescale = 1./255. The augmentation is performed in order to obtain a large dataset of infrared breast thermal images, so that the capsule network trains well and the problem of overfitting is avoided.
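A minimal Python sketch of this multiview construction and augmentation is given below. The directory layout and the per-view size of 160 × 120 are assumptions made for illustration; this is not the authors' exact code.

import cv2
from tensorflow.keras.preprocessing.image import ImageDataGenerator

def make_multiview(left_path, frontal_path, right_path, view_size=(160, 120)):
    # Resize each single view to 160 wide x 120 high and join them side by
    # side with hconcat(), giving a 480 x 120 multiview thermal image.
    views = [cv2.resize(cv2.imread(p), view_size)
             for p in (left_path, frontal_path, right_path)]
    return cv2.hconcat(views)

# Augmentation with the four transformation types reported above.
datagen = ImageDataGenerator(rotation_range=5, shear_range=0.02,
                             zoom_range=0.02, rescale=1. / 255)
# "static_multiview/train" is a hypothetical folder with normal/ and abnormal/ subfolders.
train_flow = datagen.flow_from_directory("static_multiview/train",
                                         target_size=(120, 480),  # (height, width)
                                         batch_size=16, class_mode="binary")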

3.3. Capsule network


A capsule network primarily consists of capsules, which in turn are clusters of neurons [28]. The activity vector of a capsule's neurons represents the different instantiation parameters of the concerned entity, whereas the length of this vector denotes the probability that the spatial entity exists. In conventional neural networks, the pooling layers are responsible for most of the deficiencies. The performance issues related to pooling layers are overcome by the routing-by-agreement principle used in the capsule network [29]. Under this principle, the outputs generated by the first layer are sent to the next layer consisting of parent-level capsules, and each lower-level capsule attempts to predict the parent capsules'
1 http://visual.ic.uff.br/dmi


outputs, each prediction weighted by its own coupling coefficient [30]. Agreement between a parent capsule's actual output and these predictions increases the coupling coefficients of the related capsules. Given the output $l_x$ of capsule $x$, its prediction for the parent capsule $y$ is computed using the formula in Equation (1):

$\hat{l}_{y|x} = O_{xy}\, l_x$.   (1)

Here $\hat{l}_{y|x}$ represents the output vector of the $y$th higher-layer capsule as predicted by capsule $x$ in the preceding layer, and $O_{xy}$ is the weight matrix obtained through the learning process. The coupling coefficient $h_{xy}$ depends on the degree of agreement between the parent capsules and the capsules of the layer beneath, and is given by Equation (2):

$h_{xy} = \exp(g_{xy}) \big/ \sum_{z} \exp(g_{xz})$.   (2)

At the beginning of the process, $g_{xy}$ is equal to zero; it simply represents the log prior probability that capsule $x$ is coupled to capsule $y$. The input vector $q_y$ of the parent capsule is then defined by Equation (3):
$q_y = \sum_{x} h_{xy}\, \hat{l}_{y|x}$.   (3)

Equation (4) defines how this input vector is squashed to obtain the final output of each capsule; it also ensures that the length of a capsule's output never exceeds 1:

$m_y = \dfrac{\|q_y\|^2}{1 + \|q_y\|^2} \, \dfrac{q_y}{\|q_y\|}$.   (4)

The terms $q_y$ and $m_y$ denote the input and output vectors of capsule $y$, respectively. The log probabilities are updated repeatedly during this routing process: when $m_y$ and $\hat{l}_{y|x}$ agree, the two supporting vectors produce a large inner product. Finally, Equation (5) gives the agreement $f_{xy}$ used to amend the log probability and hence the coupling coefficients:


$f_{xy} = m_y \cdot \hat{l}_{y|x}$.   (5)

The loss function is computed with Equation (6) below. It is a margin loss that assigns a high loss to a capsule with a long output vector when the corresponding entity is not actually present:
$t_z = R_z \, \max(0,\, m^{+} - \|m_z\|)^2 + \delta \, (1 - R_z) \, \max(0,\, \|m_z\| - m^{-})^2$.   (6)

$R_z$ equals 1 whenever class $z$ is present and 0 otherwise. Once the parameters $\delta$, $m^{-}$ and $m^{+}$ are established in advance, the learning procedure starts.
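For illustration, the NumPy sketch below implements Equations (2)-(6): the squash nonlinearity, the softmax coupling, the routing loop and the margin loss. The variable names mirror the notation above; the three routing iterations and the values $m^{+} = 0.9$, $m^{-} = 0.1$, $\delta = 0.5$ are common defaults assumed here, not values stated in the paper.

import numpy as np

def squash(q, eps=1e-8):
    # Eq. (4): keep the direction of q but shrink its length into [0, 1).
    norm_sq = np.sum(q ** 2, axis=-1, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * q / np.sqrt(norm_sq + eps)

def softmax(g, axis=-1):
    # Eq. (2): coupling coefficients h_xy from the routing logits g_xy.
    e = np.exp(g - g.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def routing_by_agreement(l_hat, iterations=3):
    # l_hat has shape (num_lower, num_upper, dim): the predictions for each parent capsule.
    num_lower, num_upper, _ = l_hat.shape
    g = np.zeros((num_lower, num_upper))            # logits g_xy start at zero
    for _ in range(iterations):
        h = softmax(g, axis=1)                      # Eq. (2)
        q = np.einsum("xy,xyd->yd", h, l_hat)       # Eq. (3)
        m = squash(q)                               # Eq. (4)
        g = g + np.einsum("yd,xyd->xy", m, l_hat)   # Eq. (5): agreement update
    return m                                        # outputs m_y of the parent capsules

def margin_loss(lengths, one_hot_labels, m_plus=0.9, m_minus=0.1, delta=0.5):
    # Eq. (6): lengths holds ||m_z|| per class, one_hot_labels holds R_z.
    present = one_hot_labels * np.maximum(0.0, m_plus - lengths) ** 2
    absent = delta * (1 - one_hot_labels) * np.maximum(0.0, lengths - m_minus) ** 2
    return np.sum(present + absent, axis=-1).mean()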

3.3.1. The proposed capsule network architecture


The architecture of the proposed capsule network for breast cancer detection consists of four layers along with an input layer, which takes a breast thermal image of dimension 480 × 120. It is followed by a convolutional layer (Conv2D) with 256 filters and the "ReLU" activation function in order to extract the primary feature maps; this layer uses a kernel size of 9 with a stride of two. The next


layer is the primary capsule layer, which divides the feature maps into capsules. In this layer, a weight matrix of size 8 × 16 is multiplied with each capsule after convolution. Finally, the squashing and dynamic routing operations are applied to each capsule in the DigitCaps layer [31]. An auxiliary layer then replaces each capsule with its length, performs the final classification, and delivers the output as a normal or abnormal breast. The architecture of the proposed capsule network for breast cancer screening is illustrated in Figure 2, and its model summary is given in Appendix 2. The algorithm of the proposed "Breast Cancer-Caps" based on the capsule network is given below as the proposed breast cancer-caps algorithm:

Figure 2. The proposed capsule network architecture for the breast cancer screening.
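As a rough Keras sketch consistent with the layer stack described above: a 3-channel 120 × 480 input and 8D primary capsules are assumptions made for illustration, the DigitCaps routing layer is only indicated by a comment, and this is not the authors' implementation.

import tensorflow as tf
from tensorflow.keras import layers, models

def squash(s, eps=1e-8):
    # Length-preserving squash nonlinearity applied to each capsule vector.
    norm_sq = tf.reduce_sum(tf.square(s), axis=-1, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * s / tf.sqrt(norm_sq + eps)

inputs = layers.Input(shape=(120, 480, 3))                                    # multiview thermal image
x = layers.Conv2D(256, kernel_size=9, strides=2, activation="relu")(inputs)  # primary feature maps
x = layers.Conv2D(256, kernel_size=9, strides=2)(x)                          # primary capsule convolution
x = layers.Reshape((-1, 8))(x)                                               # split feature maps into 8D capsules
primary_caps = layers.Lambda(squash)(x)
# A DigitCaps layer (two 16D capsules with routing by agreement) would follow here;
# the lengths of its output capsules give the normal/abnormal decision.
backbone = models.Model(inputs, primary_caps)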

3.4. Comparison approach


The VGG19 [32], ResNet50 [33] and InceptionV3 [34] deep transfer learning (DTL) models are employed for comparison and evaluation on both the Static and the Dynamic breast thermal image datasets. The three models are first fine-tuned and trained using the same augmented Static and Dynamic breast thermal image datasets. The objective of this comparison is to show how efficiently these popular and complex DTL models perform on the abovementioned datasets in comparison with the capsule network. The hyperparameter values of the VGG19, ResNet50V2 and InceptionV3 DTL models are chosen by an experimental method [35] offering the best performance and are presented in Table 1. Adam [36] is used as the optimizer for weight adjustment, with a learning rate of 0.00001 and a mini-batch size of 16, for all three DTL models. The overall algorithm for the comparison approach is given below as the comparison approach algorithm:

4. Results
Python 3.6 is used as the implementation language, and the Google Colaboratory (Colab) platform is used for the experimentation and simulation in this research study. This results section consists of two subsections. The first illustrates the performance of the proposed "Breast Cancer-Caps" system


Algorithm 1 Proposed breast cancer-caps algorithm.


Input:
1. Augmented Dynamic breast thermal images dataset consists of normal and abnormal images
2. Augmented Static breast thermal images dataset consists of normal and abnormal images
Output: The Fine-tuned and trained capsule network as the ”Breast Cancer-Caps”
Procedure:
1: Preprocess the breast thermal images taken from the DMR-IR database in order to remove noise, symbols and artifacts, and develop two datasets, i.e. the Static and Dynamic breast thermal image datasets.
2: Then generate a multiview breast thermal image by concatenating the conventional frontal, left and right breast region views in both the Static and Dynamic breast thermal image datasets using the hconcat() and vconcat() functions available in the OpenCV and NumPy libraries.
3: Then perform the augmentation of both datasets using the Keras ImageDataGenerator function with four transformation types, i.e. rotation range = 5, shear range = 0.02, zoom range = 0.02 and rescale = 1./255.
4: The augmented dataset images are resized to a dimension of 480 × 120 for the capsule network. The capsule network is based on the routing-by-agreement principle, whose step-by-step procedure is given below:

Routing_CapsNet($\hat{l}_{y|x}$, i, n)

I. Initialize the log prior probabilities (logits) of all couplings between layer n and layer n+1:
   for all capsules x in layer n and capsules y in layer n+1: $g_{xy} \leftarrow 0$.

II. Repeat steps III to VI for i iterations.

III. Compute the softmax, i.e. how strongly capsule x of layer n is coupled to the capsules of layer n+1:
   for all capsules x in layer n: $h_x \leftarrow \mathrm{softmax}(g_x)$.

IV. Compute the input to each capsule y in the layer just above (n+1):
   for all capsules y in layer n+1: $q_y = \sum_x h_{xy}\, \hat{l}_{y|x}$.

V. Compute the output of each capsule y in layer n+1 using the squash function:
   for all capsules y in layer n+1: $m_y = \dfrac{\|q_y\|^2}{1 + \|q_y\|^2}\dfrac{q_y}{\|q_y\|}$.

VI. Update the logits for the next iteration:
   for all capsules x in layer n and capsules y in layer n+1: $g_{xy} \leftarrow g_{xy} + m_y \cdot \hat{l}_{y|x}$.

VII. Return the outputs $m_y$ of the capsules in layer n+1.


5: The capsule network tends to converge at 500 epochs as the training loss remains constant after this.

using various performance evaluation metrics on the single-view as well as multiview breast thermal images. The two testing datasets, Static and Dynamic, were developed beforehand from the DMR-IR database and consist of multiview, left, right and frontal view breast thermal images of 25 breast cancer patients (unhealthy subjects) and 25 healthy subjects. Accuracy, sensitivity, specificity, precision, F1 score, mean square error [37], cross entropy loss [38], area under the receiver operating characteristic curve (ROC-AUC) [39, 40] and the Kappa coefficient [41] are the performance evaluation metrics used. The performance of the proposed "Breast Cancer-Caps" system over both the Static and Dynamic testing datasets is presented in Table 2. The testing procedure is repeated five times and the mean values of the performance evaluation metrics are reported. The best confusion matrices obtained during this testing


Table 1. The VGG19, InceptionV3 and ResNet50V2 hyperparameters values.

DTL model parameters VGG19 InceptionV3 ResNet50


Input image size 224 × 224 299 × 299 224 × 224
Number of layers 19 48 50
Learning rate 0.00001 0.00001 0.00001
Batch size 16 16 16
Number of epochs for training 500 500 500
Momentum 0.9 0.9 0.9
Optimizer Adam Adam Adam
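A minimal sketch of how one of these comparison models might be fine-tuned with the Table 1 hyperparameters is given below; the global-average-pooling head and sigmoid output are assumptions made for illustration, not details taken from the paper.

import tensorflow as tf
from tensorflow.keras import layers, models, optimizers

base = tf.keras.applications.VGG19(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))
x = layers.GlobalAveragePooling2D()(base.output)
outputs = layers.Dense(1, activation="sigmoid")(x)            # normal vs. abnormal
model = models.Model(base.input, outputs)

model.compile(optimizer=optimizers.Adam(learning_rate=1e-5),  # learning rate from Table 1
              loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_flow, validation_data=val_flow, epochs=500)  # batch size 16 is set in the generator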

Algorithm 2 Comparison approach algorithm.


Input:
1. Augmented Dynamic breast thermal images dataset consists of normal and abnormal images
2. Augmented Static breast thermal images dataset consists of normal and abnormal images
Output:The fine-tuned and trained VGG 19, ResNet50 and InceptionV3 models for breast cancer screening

Procedure:
1: Preprocess the breast thermal images taken from the DMR-IR database in order to remove noise, symbols and artifacts, and form two datasets, i.e. the Static and Dynamic breast thermal image datasets.
2: Then generate a multiview breast thermal image by concatenating the conventional frontal, left and right breast views in both the Static and Dynamic breast thermal image datasets.
3: Then perform the augmentation of both datasets using the Keras ImageDataGenerator function with four transformation types, i.e. rotation range = 5, shear range = 0.02, zoom range = 0.02 and rescale = 1./255.
4: The augmented dataset images are resized to a dimension of 224 × 224 for the VGG 19 and ResNet50V2 models and 299 × 299 for the InceptionV3 DTL model.
5: The fine-tuning and training of the three DTL models is done over both augmented datasets.
6: The VGG19 model tends to converge at 100 epochs but is trained for 500 epochs.
7: The ResNet50 and InceptionV3 models tend to converge at 200 epochs but are also trained for 500 epochs.

phase over the Static and Dynamic breast thermal testing datasets are presented in Appendix 3. The proposed "Breast Cancer-Caps" system delivers 100% training accuracy over the 70% split of the augmented dataset and more than 96% validation accuracy over the remaining 30% split. The ROC curve obtained over the validation dataset, as well as the training loss and training & validation accuracy graphs of the proposed "Breast Cancer-Caps" system over both the Static and Dynamic training datasets, are presented in Figure 3 below.
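As an illustration of how the reported metrics could be computed with scikit-learn (y_true and y_score are placeholders for the test labels and the model's predicted probabilities, not data from the paper):

import numpy as np
from sklearn.metrics import (accuracy_score, recall_score, precision_score,
                             f1_score, mean_squared_error, log_loss,
                             roc_auc_score, cohen_kappa_score)

def evaluate(y_true, y_score, threshold=0.5):
    # Threshold the predicted probabilities to obtain class labels.
    y_pred = (np.asarray(y_score) >= threshold).astype(int)
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "sensitivity": recall_score(y_true, y_pred),               # recall on the abnormal class
        "specificity": recall_score(y_true, y_pred, pos_label=0),  # recall on the normal class
        "precision": precision_score(y_true, y_pred),
        "f1": f1_score(y_true, y_pred),
        "mse": mean_squared_error(y_true, y_score),
        "cross_entropy": log_loss(y_true, y_score),
        "auc": roc_auc_score(y_true, y_score),
        "kappa": cohen_kappa_score(y_true, y_pred),
    }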
The performance of the VGG 19, ResNet50V2 and InceptionV3 DTL models over both the Static and Dynamic testing datasets is presented in Tables 3–5, while the ROC curves over the validation dataset, the training loss graphs and the training & validation accuracy graphs of these DTL models over both training datasets are illustrated in Figures 4–6. In addition, a performance comparison of the proposed "Breast Cancer-Caps" system with some of the existing state-of-the-art approaches for breast cancer classification over the Static and Dynamic testing datasets is also presented in Appendix 3. The performance of the proposed "Breast Cancer-Caps" system is much superior to that of these three DTL models and the existing state-of-the-art approaches. The GUI screenshot of the working "Breast


Table 2. The performance of the proposed ”Breast Cancer-Caps” over the Static and Dynamic testing datasets.

Performance evaluation metrics | Multiview breast thermal image (Static, Dynamic) | Left view breast thermal image (Static, Dynamic) | Right view breast thermal image (Static, Dynamic) | Frontal view breast thermal image (Static, Dynamic)
Accuracy 98 99.5 83 85 87 87 93 94
Sensitivity 98 98.04 82.35 84.31 87.76 86.27 95.74 95.83
Specificity 98 100 83.67 85.71 86.27 87.76 90.57 92.31
Precision 98 100 84 86 86 88 90 92
F1 score 98 99.01 83.17 85.15 86.87 87.13 92.78 93.88
Mean square error 0.021 0.019 0.178 0.174 0.151 0.150 0.071 0.070
Cross entropy loss 0.086 0.084 0.66 0.64 0.340 0.338 0.451 0.448
AUC 98.2 99.1 90.5 91.4 92.5 92.7 96.5 97.1
Kappa coefficient 0.96 1 0.68 0.72 0.72 0.73 0.88 0.88

Figure 3. The ROC curve, training loss, and training & validation accuracy graphs of the proposed "Breast Cancer-Caps" over the Static breast thermal images dataset are represented by (a), (b), (c) and over the Dynamic breast thermal images dataset by (d), (e), (f).

Cancer-Caps" system is illustrated in Appendix 3.

5. Discussion
The proposed "Breast Cancer-Caps" system based on the capsule network proves to be very accurate, especially in the binary classification of breast thermal images, as compared to the other deep transfer learning models. The major reason for employing the capsule network is its ability to deal with affine transformations and noise, which are very common in breast infrared thermal images. Initially, breast


Table 3. The performance of the VGG19 DTL model over Static and Dynamic testing datasets.

Performance evaluation metrics | Multiview breast thermal image (Static, Dynamic) | Left view breast thermal image (Static, Dynamic) | Right view breast thermal image (Static, Dynamic) | Frontal view breast thermal image (Static, Dynamic)
Accuracy 95 96 82 83 86 88 89 90
Sensitivity 94.12 96 83.3 83.67 89.13 91.3 91.4 91.67
Specificity 95.92 96 80.7 82.35 83.3 85.19 86.7 88.46
Precision 96 96 80 82 82 84 86 88
F1 score 95.05 96 81.6 82.8 85.4 87.5 88.6 89.8
Mean square error 0.071 0.069 0.181 0.178 0.174 0.150 0.148 0.147
Cross entropy loss 0.152 0.149 0.68 0.66 0.64 0.438 0.438 0.418
AUC 96.7 97.4 89.5 90.5 91.4 92.7 92.7 92.8
Kappa coefficient 0.96 0.96 0.64 0.64 0.68 0.75 0.76 0.8

Table 4. The performance of the ResNet 50V2 DTL model over Static and Dynamic testing datasets.

Performance evaluation metrics | Multiview breast thermal image (Static, Dynamic) | Left view breast thermal image (Static, Dynamic) | Right view breast thermal image (Static, Dynamic) | Frontal view breast thermal image (Static, Dynamic)
Accuracy 94 95 81 82 82 83 90 91
Sensitivity 94 94.12 80.39 82 83.3 83.67 95.4 93.6
Specificity 94 95.9 81.6 82 80.7 82.35 85.7 88.68
Precision 94 96 82 82 80 82 84 88
F1 score 94 95.05 81.19 82 81.6 82.8 89.39 90.7
Mean square error 0.070 0.068 0.184 0.181 0.181 0.178 0.147 0.145
Cross entropy loss 0.148 0.146 0.69 0.68 0.68 0.66 0.518 0.511
AUC 97.1 97.7 89.9 89.5 90.2 90.5 92.8 92.9
Kappa coefficient 0.88 0.96 0.61 0.64 0.62 0.64 0.80 0.82

Table 5. The performance of the InceptionV3 DTL model over Static and Dynamic testing datasets.

Performance evaluation metrics | Multiview breast thermal image (Static, Dynamic) | Left view breast thermal image (Static, Dynamic) | Right view breast thermal image (Static, Dynamic) | Frontal view breast thermal image (Static, Dynamic)
Accuracy 89 91 80 82 81 83 84 85
Sensitivity 89.8 91.8 82.6 84.78 82.98 86.67 88.6 88.8
Specificity 88.24 90.2 77.7 79.63 79.2 80 80.36 81.8
Precision 88 90 76 78 78 78 78 80
F1 score 88.9 90.9 79.17 81.25 80.41 82.1 82.9 84.21

Mean square error 0.148 0.145 0.187 0.179 0.184 0.178 0.175 0.174
Cross entropy loss 0.438 0.411 0.695 0.67 0.69 0.66 0.64 0.64
AUC 92.7 92.9 89.2 90.1 89.9 90.5 90.8 91.4
Kappa coefficient 0.76 0.82 0.6 0.64 0.61 0.64 0.65 0.65


Figure 4. The ROC curve, training loss, and training & validation accuracy graphs of the comparison approach based on the VGG19 model over the Static breast thermal images dataset are represented by (a), (b), (c) and over the Dynamic breast thermal images dataset by (d), (e), (f).

Figure 5. The ROC curve, training loss, and training & validation accuracy graphs of the comparison approach based on the ResNet50V2 model over the Static breast thermal images dataset are represented by (a), (b), (c) and over the Dynamic breast thermal images dataset by (d), (e), (f).


Figure 6. The ROC curve, training loss, and training & validation accuracy graphs of the comparison approach based on the InceptionV3 model over the Static breast thermal images dataset are represented by (a), (b), (c) and over the Dynamic breast thermal images dataset by (d), (e), (f).

thermal images of different sizes were tried as input in order to find the size that offers the least training time while giving the best accuracy. The proposed capsule network based model takes the longest training time with an input size of 512 × 512, whereas it offers the best accuracy with the least training time with 480 × 120 breast thermal images as input. This input size is therefore used in this research study for training and testing the "Breast Cancer-Caps" system. The other input sizes used in the experimentation are 512 × 480, 480 × 480, 480 × 240, 480 × 60, etc. The performance comparison with the more complex deep transfer learning models in the results section shows that the capsule network based system is more robust and accurate. The parameters for properly tuning the capsule network were decided after performing a number of experiments. The capsule network tends to converge at 500 epochs during training, whereas the other three DTL models converge much earlier, at either 100 or 200 epochs. Apart from the fact that the capsule network takes more time to converge, there are no other drawbacks associated with the proposed system. Although doctors are more inclined towards using the frontal breast view for diagnosis than the left and right views, we decided to utilize all three views along with the newly concatenated multiview breast thermal images for the experimentation in this research study.
Remarks: Some of the major insights from the main results are as follows:
1. Multiview breast thermal images are more useful for obtaining reliable and accurate results than single-view images in this research domain, so 3D breast thermal images could be generated and used in the future for better classification accuracy.
2. Multiview thermal images with a resolution of 480 × 120 are the most appropriate for training and testing this capsule network based "Breast Cancer-Caps" system, since at this resolution the proposed system offers the least training time without compromising classification accuracy.
3. The Dynamic acquisition protocol based breast thermal images are far better than the


Static protocol based breast thermal images in terms of performance for breast cancer or abnormality detection.

6. Conclusion
The proposed "Breast Cancer-Caps" system based on the capsule network is the first approach to utilize the concept of multiview breast thermal images for the detection of breast cancer abnormality. In this research study, the concatenated multiview breast thermal images deliver higher accuracy than conventional single-view breast thermal images, as these multiview images offer more information to the proposed "Breast Cancer-Caps" system. The concept of multiview breast thermal images could therefore also be used with other new and complex deep learning networks, which might offer even better results in the future. The "Breast Cancer-Caps" system delivers better results with Dynamic multiview as well as single-view breast thermal images than with Static breast thermal images, so Dynamic breast thermal images prove more effective for the correct detection of breast cancer abnormality in general.
After the concatenated multiview breast thermal images, it is the single-view frontal breast thermal images that deliver satisfactory results. The single-view left and right breast thermal images are not as effective for the detection of breast cancer abnormality, as they both yield the lowest testing accuracies. The proposed "Breast Cancer-Caps" system outperforms the three deep transfer learning models, i.e. VGG19, ResNet50V2 and InceptionV3, which are very popular and also deliver good results in other medical imaging classification tasks. The proposed "Breast Cancer-Caps" system achieves testing accuracies of 99.5% and 98% with concatenated multiview Dynamic and Static breast thermal images, respectively. The fact that capsule networks perform better than other conventional neural networks, especially over augmented datasets, is well established and confirmed in this research study. The capsule network based "Breast Cancer-Caps" system performs better than popular deep transfer learning models such as ResNet50V2, InceptionV3 and VGG19.
Future work involves the development of a large single-view as well as multiview breast thermal image dataset along with the clinical information of each woman volunteer, so that this clinical information can also be taken into account for accurate breast cancer classification. Apart from this, the multiview breast thermal images could be used in the future with other advanced 3D CNN networks as well as DTL models to further enhance accuracy and robustness. A setup consisting of thermal cameras together with the proposed "Breast Cancer-Caps" system could be installed in hospitals for the regular screening of the breast in order to diagnose any type of breast cancer abnormality, and could hence play an important role in providing proper healthcare.

References

[1] Duffy SW, Tabár L, Yen AM, Dean PB, Smith RA et al. Beneficial Effect of Consecutive Screening Mammog-
raphy Examinations on Mortality from Breast Cancer: A Prospective Study. Radiology 2021; 299 (3): 541-547.
doi:10.1148/radiol.2021203935

[2] Sung H, Ferlay J, Siegel RL, Laversanne M, Soerjomataram I et al. Global Cancer Statistics 2020: GLOBOCAN
Estimates of Incidence and Mortality Worldwide for 36 Cancers in 185 Countries. CA: A Cancer Journal for Clinicians
2021; 71 (3): 209-249. doi: 10.3322/caac.21660

[3] Xu X, Yuan X, Ni Jiali, Guo J, Gao Y et al. MAGI2‐AS3 inhibits Breast cancer by downregulating DNA methylation
of MAGI2. Journal of Cellular Physiology 2021; 236 (2): 1116-1130. doi: 10.1002/jcp.29922


[4] Jahan N, Jones C, Rahman RL. Endocrine prevention of Breast cancer. Molecular and Cellular Endocrinology 2021;
530: 111284. doi: 10.1016/j.mce.2021.111284
[5] Allweis TM, Hermann N, Berenstein-Molho R, Guindy M. Personalized Screening for Breast Cancer: Rationale,
Present Practices, and Future Directions. Annals of surgical oncology 2021; 28 (8): 4306-4317. doi: 10.1245/s10434-
020-09426-1
[6] Lozano A, Hassanipour F. Infrared imaging for Breast cancer detection: An objective review of foundational
studies and its proper role in Breast cancer screening. Infrared Physics & Technology 2019; 97: 244-257. doi:
10.1016/j.infrared.2018.12.017
[7] Gogoi UR, Majumdar G, Bhowmik MK, Ghosh AK. Evaluating the Efficiency of Infrared Breast Thermography
for Early Breast Cancer Risk Prediction in Asymptomatic Population. Infrared Physics & Technology 2019; 99:
201-211. doi: 10.1016/j.infrared.2019.01.004
[8] Ter-Minassian M, Schaeffer ML, Jefferson CR, Shapiro SC, Suwannarat P et al. Screening and Preventative
Strategies for Patients at High Risk for Breast Cancer. JCO Oncology Practice 2021; 17 (4) : e575-e581. doi:
10.1200/OP.20.00262
[9] Yao X, Wei W, Li J, Wang L, Xu Z et al. A comparison of mammography, ultrasonography, and far-infrared
thermography with pathological results in screening and early diagnosis of Breast cancer. Asian Biomedicine 2014;
8 (1): 11–19. doi: 10.5372/1905-7415.0801.257
[10] Hakim A, Awale RN. Thermal Imaging - An Emerging Modality for Breast Cancer Detection: A Comprehensive
Review. Journal of Medical Systems 2020;44 (8):136. doi: 10.1007/s10916-020-01581-y
[11] Mambou SJ, Maresova P, Krejcar O, Selamat A, Kuca K. Breast Cancer Detection Using Infrared thermal Imaging
and a Deep Learning Model. Sensors 2018; 18 (9): 2799. doi: 10.3390/s18092799
[12] Zuluaga-Gomez J, Al Masry Z, Benaggoune K, Meraghni S, Zerhouni N. A CNN-based methodology for Breast
cancer diagnosis using thermal images. Computer Methods in Biomechanics and Biomedical Engineering: Imaging
& Visualization 2020; 9 (2): 131–145. doi: 10.1080/21681163.2020.1824685
[13] González FJ, González R, López JC. Thermal contrast of active dynamic thermography versus Static thermography.
Biomedical Spectroscopy and Imaging 2019; 8 (1-2): 41–45. doi: 10.3233/BSI-190188
[14] Roslidar R, Rahman A, Muharar R, Syahputra MR, Arnia F et al. A Review on Recent Progress in thermal
Imaging and Deep Learning Approaches for Breast Cancer Detection. IEEE Access 2020; 8: 116176-116194. doi:
10.1109/ACCESS.2020.3004056
[15] Husaini MASA, Habaebi MH, Hameed SA, Islam MR, Gunawan TS. A Systematic Review of Breast Cancer
Detection Using Thermography and Neural Networks. IEEE Access 2020; 8: 208922-208937. doi: 10.1109/AC-
CESS.2020.3038817
[16] Silva L, Saade D, Sequeiros G, Silva AC, Paiva AC et al. A new database for Breast research with infrared image.
Journal of Medical Imaging and Health Informatics 2014; 4 (1): 92–100. doi: 10.1166/jmihi.2014.1226
[17] Silva LF, Santos AAS, Bravo RS, Silva AC, Muchaluat-Saade DC et al. Hybrid analysis for indicating patients with
Breast cancer using temperature time series. Computer Methods and Programs in Biomedicine 2016; 130: 142–153.
doi: 10.1016/j.cmpb.2016.03.002
[18] Lashkari A, Pak F, Firouzmand M. Full Intelligent Cancer Classification of thermal Breast Images to Assist
Physician in Clinical Diagnostic Applications. Journal of Medical Signals and Sensors 2016; 6 (1): 12-24.
[19] Raghavendra U, Acharya UR, Ng EYK, Tan JH, Gudigar A. An integrated index for Breast cancer identification
using histogram of oriented gradient and kernel locality preserving projection features extracted from thermal
images. Quantitative Infrared Thermography Journal 2016; 13: 195-209. doi: 10.1080/17686733.2016.1176734
[20] Santana MA, Pereira JMS, Silva FL, Lima NM, Sousa FN et al. Breast cancer diagnosis based on mammary
thermography and extreme learning machines. Research on Biomedical Engineering 2018; 34 (1): 45–53. doi:
10.1590/2446-4740.05217


[21] Madhavi V, Thomas CB. Multi-view Breast thermal images analysis by fusing texture features. Quantitative
InfraRed Thermography Journal 2019; 16 (1): 111–128. doi: 10.1080/17686733.2018.1544687
[22] AlFayez F, El-Soud MWA, Gaber T. Thermal images Breast Cancer Detection: A Comparative Study of Two
Machine Learning Techniques. Applied Sciences 2020; 10 (2): 551. doi: 10.3390/app10020551
[23] Mishra V, Rath SK. Detection of Breast cancer tumours based on feature reduction and classification of thermal
images. Quantitative InfraRed Thermography Journal 2021; 18 (5): 300-313. doi: 10.1080/17686733.2020.1768497
[24] Yadav SS, Jadhav SM. Thermal infrared imaging based Breast cancer diagnosis using machine learning techniques.
Multimedia Tools and Applications 2020; 81: 13139–13157. doi: 10.1007/s11042-020-09600-3
[25] Ekici S, Jawzal H. Breast cancer diagnosis using thermography and convolutional neural networks. Medical Hy-
potheses 2020; 137: 109542. doi: 10.1016/j.mehy.2019.109542
[26] Galván JCT, Guevara E, Machuca EMK, Villanueva AC, Flores JL et al. Deep convolutional neural networks for
classifying Breast cancer using infrared thermography. Quantitative InfraRed Thermography Journal 2021. doi:
10.1080/17686733.2021.1918514.
[27] Perez L, Wang J. The effectiveness of data augmentation in image classification using deep learning. Cornell
University arXiv 2017. doi: 10.48550/arXiv.1712.04621
[28] Roth HR, Lee CT, Shin HC, Seff A, Kim L et al. Anatomy-specific classification of medical images using deep
convolutional nets. In: IEEE 12th International Symposium on Biomedical Imaging (ISBI); New York, NY, USA;
2015. pp. 101 - 104.
[29] Hinton GE, Krizhevsky A, Wang SD. Transforming auto-encoders. In: 21st International Conference on Artificial
Neural Networks; Espoo, Finland; 2011. pp. 44-51.
[30] Sabour S, Frosst N, Hinton GE. Dynamic Routing Between Capsules. In: 31st Conference on Neural Information
Processing Systems; Long Beach, CA, USA; 2017. pp. 3859–3869.
[31] Shahroudnejad A, Mohammadi A, Plataniotis KN. Improved Explainability of Capsule Networks: Relevance Path
by Agreement. In: IEEE Global Conference on Signal and Information Processing (GlobalSIP); Anaheim, CA, USA;
2018. pp. 549-553.
[32] Liu T, Wang Z. HiCNN: a very deep convolutional neural network to better enhance the resolution of Hi-C data.
Bioinformatics 2019; 35 (21): 4222-4228. doi: 10.1093/bioinformatics/btz251
[33] He K, Zhang X, Ren S, Sun J. Deep residual learning for image recognition. In: IEEE Conference on Computer
Vision and Pattern Recognition (CVPR); Las Vegas, NV, USA; 2016. pp. 770–778.
[34] Szegedy C, Liu W, Jia Y, Sermanet P, Reed S et al. Going Deeper with Convolutions. In: IEEE Conference on
Computer Vision and Pattern Recognition (CVPR); Boston, MA, USA; 2015. pp. 1-9.
[35] Ozdemir MA, Ozdemir GD, Guren O. Classification of COVID-19 electrocardiograms by using hexaxial feature
mapping and deep learning. BMC Medical Informatics and Decision Making 2021; 21: 170. doi: 10.1186/s12911-
021-01521-x.
[36] Kingma DP. Adam: a method for stochastic optimization. Cornell university arXiv 2015. doi:
10.48550/arXiv.1412.6980
[37] Amyar A, Modzelewski R, Li H, Ruan S. Multi-task deep learning based CT imaging analysis for covid-
19 pneumonia: classification and segmentation. Computers in Biology and Medicine 2020; 126: 104037. doi:
10.1016/j.compbiomed.2020.104037.
[38] Yeung M, Sala E, Schönlieb CB, Leonardo R. Unified Focal loss: Generalising Dice and cross entropy-based losses to
handle class imbalanced medical image segmentation. Computerized Medical Imaging and Graphics 2022; 95: 102026.
doi: 10.1016/j.compmedimag.2021.102026
[39] Ozdemir MA, Degirmenci M, Izci E, Akan A. EEG-based emotion recognition with deep convolutional neural net-
works. Biomedical Engineering / Biomedizinische Technik 2021; 66 (1): 43–58. doi: 10.1515/bmt-2021-frontmatter1


[40] Li WT, Ma J, Shende N, Castaneda G, Chakladar J et al. Using machine learning of clinical data to diagnose
Covid-19: a systematic review and meta-analysis. BMC medical informatics and decision making 2020; 20 (1): 247.
doi: 10.1186/s12911-020-01266-z.
[41] Fatourechi M, Ward RK, Mason SG, Huggins J, Schlögl A et al. Comparison of Evaluation Metrics in Classification
Applications with Imbalanced Datasets. In: 2008 Seventh International Conference on Machine Learning and
Applications; San Diego, CA, USA; 2008. pp. 777-782.

Appendix 1

The main objective of this Appendix 1 is to assess the existing research on the application of machine and deep learning to the classification of breast cancer using infrared thermal images. For this purpose, popular research databases such as IEEE Xplore and PubMed were searched exhaustively with particular search terms. The research studies included in this appendix are selected according to the following criteria:
1. Only deep learning based approaches for breast cancer classification using infrared thermal images are included.
2. Only machine learning based approaches for breast cancer classification using infrared thermal images are included.
3. The research studies were limited to the interval from 2016 to October 2021.
4. Only classification or detection approaches are included, whereas prediction approaches utilizing big data are excluded from this study.
5. Research studies that mention future directions or at least offer some narrative on improving the existing work are also included.
Table 1 below summarizes the search terms employed to find these unique research studies on thermography based breast cancer classification.
Table 1. The list of research article sources and search items used
Research studies sources Search terms
IEEE Xplore "Breast cancer" AND "Deep learning" AND "Infrared
thermal images" OR “Thermography”
" Breast cancer " AND "CNN" AND “Infrared thermal
images" OR “Thermography”
" Breast cancer " AND "Deep transfer learning" AND
“Infrared thermal images" OR “Thermography”
" Breast cancer " AND "ML" AND “Infrared thermal
images" OR “Thermography” OR “Thermal images”
" Breast cancer " AND "Machine learning" AND
“Infrared thermal images" OR “Thermography” OR
“Thermal images”
filter applied: journals only
Publication year: 2016 to 2021
PubMed " Breast cancer" AND "Deep learning" AND "Infrared
thermal images" OR “Thermography”
" Breast cancer " AND "CNN" AND "Infrared thermal
images" OR “Thermography”
" Breast cancer " AND "Deep transfer learning" AND
"Infrared thermal images" OR “Thermography”
" Breast cancer " AND "Machine learning" AND
"Infrared thermal images" OR “Thermography”
" Breast cancer " AND "ML" AND "Infrared thermal
images" OR “Thermography”

Graph 1 below presents the search results, i.e. the paper counts for the five years from 2016 to 2021, comparing the number of research and review articles published on breast cancer classification using thermography images based on machine learning and deep learning, as indexed by the IEEE Xplore and PubMed research databases.

Graph 1. The search results for the five years from 2016 to 2021, comparing the number of research and review articles published on breast cancer classification using thermography images based on machine learning and deep learning, as per the IEEE Xplore and PubMed research databases.
A brief comparison among these state-of-the-art thermography or infrared thermal image based breast cancer classification approaches is presented in Tables 2 and 3.
Table 2: Comparison among the major state of the art breast cancer classification approaches
using thermography based on machine learning
Author and year | Machine learning classifiers used | Thermal image lateral view used | Thermography acquisition protocol used | Results (accuracy percentage)
Silva et al. and BayesNet and Random Frontal view Dynamic 95.36%
2016 [1] Forest (RF), KNN
classifiers
Lashkari et al. AdaBoost, SVM, KNN, Frontal view - AdaBoost = 85.33 %
and 2016 [2] NB etc. classifiers (left breast)
AdaBoost = 87.42%
(right breast)
Raghavendra et KNN, Decision Tree Frontal view - DT = 98%
al. and 2016[3] (DT), NB, PNN, SVM
etc.
Sathish et al. Linear, Polynomial, Frontal view Static Gaussian SVM = 91%
and 2017 [4] Gaussian and quadratic
SVM classifier
Gogoi et al. DT, KNN, SVM, NB, Frontal view Static SVM = 97.33% (Best)
and 2017 [5] ANN, RF, LDA and ANN = 92.5% (Best)
AdaBoost

Santana et al. Multilayer Perceptron Left, right and - 83% (Overall)
and 2018 [6] networks (MLP) and frontal view
Extreme Learning alone
Machines (ELM)
Madhavi et al. GLCM, GLSZM, Left, right and Static 96%
and 2019 [7] GLRLM, NGTDM frontal view
along with LSSVM alone
classifier
AlFayez et al. Extreme Learning Frontal view Static ELM = 100%
and 2020 [8] Machine (ELM) only MLP = 82.2%
Classifier and
Multilayer Perceptron
(MLP) Classifier
Mishra et al. GLCM, GLRL along Frontal view Static RF=95.45%
and 2020 [9] with RF, KNN etc. only
Karthiga et al. Logistic regression(LR), Frontal view Static and Dynamic SVM = 93.3%.
and 2021 [10] Linear SVM, Quadratic only
SVM, Cubic SVM, Fine
Gaussian SVM,
Medium Gaussian
SVM, KNN etc.

Table 3: Comparison among the major state of the art Breast cancer classification approaches
using Thermography based on Deep learning
Author and year | Model used | Thermal image lateral view used | Thermography acquisition protocol used | Results (accuracy percentage)
Roslidar et al. and ResNet101, Frontal view Static and Dynamic DenseNet201
2019[11] ShuffleNetV2, only = 100%
DenseNet and
MobileNetV2
Fernández-Ovies et ResNet18, ResNet34, Frontal view Static ResNet50 = 98.65%
al. and 2019 [12] ResNet50, ResNet152, only
VGG16 and VGG19
Yadav et al. and Augmentation method Frontal view - Inception V3
2020[13] along with Baseline only = 98.5%
CNN Model,
VGG16,
InceptionV3
Ekici et al. and Convolutional neural Left, Right and Static and Dynamic 98.95%
2020 [14] networks optimized by Frontal alone
Bayes algorithm
Galván et al. and ResNet-101 Frontal view Static ResNet-101
2021 [15] only delivers sensitivity
of 92.3% and
specificity of 53.8%

References
[1] L.F. Silva, A.A.S. Santos, R.S. Bravo, A.C. Silva, D.C. Muchaluat-Saade, A. Conci, Hybrid analysis for indicating patients with breast cancer using temperature time series, Computer Methods Programs Biomed. 130 (2016) 142–153.
[2] A. Lashkari, F. Pak, M. Firouzmand, Full intelligent cancer classification of thermal breast images to assist physician in clinical diagnostic applications, J. Med. Signals Sens. 6(1) (2016) 12–24.
[3] U. Raghavendra, U.R. Acharya, E.Y.K. Ng, J.H. Tan, A. Gudigar, An integrated index for breast cancer identification using histogram of oriented gradient and kernel locality preserving projection features extracted from thermal images, Quant. Infr. Thermography J. 13 (2016) 195–209.
[4] D. Sathish, S. Kamath, K. Prasad, R. Kadavigere, Role of normalization of breast thermal images and automatic classification of breast cancer, Visual Computer 35 (2017) 57–70.
[5] U.R. Gogoi, M.K. Bhowmik, D. Bhattacharjee, A.K. Ghosh, Singular value based characterization and analysis of thermal patches for early breast abnormality detection, Australasian Physical & Engineering Sciences in Medicine (2018). doi: https://doi.org/10.1007/s13246-018-0681-4.
[6] M.A. Santana, J.M.S. Pereira, F.L. Silva, N.M. Lima, F.N. Sousa, G.M.S. Arruda, W.P. Santos, Breast cancer diagnosis based on mammary thermography and extreme learning machines, Research on Biomedical Engineering 34(1) (2018) 45–53.
[7] V. Madhavi, C.B. Thomas, Multi-view breast thermal image analysis by fusing texture features, Quantitative InfraRed Thermography Journal 16(1) (2019) 111–128. doi: 10.1080/17686733.2018.1544687.
[8] F. AlFayez, M.W.A. El-Soud, T. Gaber, Thermal images breast cancer detection: a comparative study of two machine learning techniques, Applied Sciences 10(2) (2020) 551.
[9] V. Mishra, S.K. Rath, Detection of breast cancer tumours based on feature reduction and classification of thermal images, Quantitative InfraRed Thermography Journal (2021). doi: 10.1080/17686733.2020.1768497.
[10] R. Karthiga, K. Narasimhan, Medical imaging technique using curvelet transform and machine learning for the automated diagnosis of breast cancer from thermal image, Pattern Anal. Application (2021). doi: https://doi.org/10.1007/s10044-021-00963-3.
[11] R. Roslidar, K. Saddami, F. Arnia, M. Syukri, K. Munadi, A study of fine-tuning CNN models based on thermal imaging for breast cancer classification, In: Proceedings of IEEE Int. Conf. Cybern. Comput. Intell. (CyberneticsCom) (2019) 77–81.
[12] F.J. Fernández-Ovies, E.S. de Alférez-Baquero, E.J. Andrés-Galiana, Detection of breast cancer using infrared thermography and deep neural networks, In: International Work-Conference on Bioinformatics and Biomedical Engineering, Granada, Spain (2019) 514–523.
[13] S.S. Yadav, S.M. Jadhav, Thermal infrared imaging based breast cancer diagnosis using machine learning techniques, Multimedia Tools and Applications (2020). doi: https://doi.org/10.1007/s11042-020-09600-3.
[14] S. Ekici, H. Jawzal, Breast cancer diagnosis using thermography and convolutional neural networks, Medical Hypotheses 137 (2020). doi: https://doi.org/10.1016/j.mehy.2019.109542.
[15] J.C.T. Galván, E. Guevara, E.M.K. Machuca, A.C. Villanueva, J.L. Flores, F.J. González, Deep convolutional neural networks for classifying breast cancer using infrared thermography, Quantitative InfraRed Thermography Journal (2021). doi: https://doi.org/10.1080/17686733.2021.1918514.

Appendix 2

Layer (type) Output Shape Param #

input_12 (InputLayer) [(None, 480, 120, 3)] 0

Conv1 (Conv2D) (None, 472, 112, 256) 62464

primarycap_conv2d (Conv2D) (None, 232, 52, 256) 5308672

primarycap_reshape (Reshape) (None, 386048, 8) 0

primarycap_squash (Lambda) (None, 386048, 8) 0

digitcaps(CapsuleLayer) (None, 2, 16) 98828288

capsnet (Length) (None, 2) 0

Total params: 104,199,424

Trainable params: 104,199,424

Non-trainable params: 0

Table 1: Model summary of the proposed capsule network for breast cancer detection utilizing the infrared thermal images.
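For clarity, the listing below gives a minimal Keras/TensorFlow sketch that reproduces the layer stack and output shapes summarized in Table 1. The 9 x 9 kernels, the stride of 2 in the primary-capsule convolution, and the routing-by-agreement implementation of the digitcaps layer are assumptions inferred from the shapes above and from the standard capsule network formulation of Sabour et al.; it is an illustrative sketch, not the exact training code of the proposed system.

import tensorflow as tf
from tensorflow.keras import layers, models, backend as K

def squash(vectors, axis=-1):
    # Non-linear "squash": keeps the direction of a capsule vector, maps its length into [0, 1).
    s_norm = K.sum(K.square(vectors), axis, keepdims=True)
    return s_norm / (1 + s_norm) / K.sqrt(s_norm + K.epsilon()) * vectors

class CapsuleLayer(layers.Layer):
    # Fully connected capsule layer with dynamic routing (assumed formulation, Sabour et al. 2017).
    def __init__(self, num_capsule, dim_capsule, routings=3, **kwargs):
        super().__init__(**kwargs)
        self.num_capsule, self.dim_capsule, self.routings = num_capsule, dim_capsule, routings

    def build(self, input_shape):
        self.in_caps, self.in_dim = input_shape[1], input_shape[2]
        # One (dim_capsule x in_dim) transformation matrix per (output capsule, input capsule) pair:
        # 2 x 386048 x 16 x 8 = 98,828,288 weights, matching the digitcaps row of Table 1.
        self.W = self.add_weight(shape=(self.num_capsule, self.in_caps,
                                        self.dim_capsule, self.in_dim),
                                 initializer="glorot_uniform", name="W")

    def call(self, u):
        # Prediction vectors u_hat[b, j, i, :] = W[j, i] @ u[b, i, :]
        u_hat = tf.einsum("jidk,bik->bjid", self.W, u)
        b = tf.zeros(tf.stack([tf.shape(u)[0], self.num_capsule, self.in_caps]))
        for r in range(self.routings):           # routing by agreement
            c = tf.nn.softmax(b, axis=1)          # coupling coefficients
            v = squash(tf.einsum("bji,bjid->bjd", c, u_hat))
            if r < self.routings - 1:
                b += tf.einsum("bjd,bjid->bji", v, u_hat)
        return v                                  # (batch, num_capsule, dim_capsule)

class Length(layers.Layer):
    # The L2 norm of each output capsule acts as the class score (normal vs. breast cancer).
    def call(self, inputs):
        return K.sqrt(K.sum(K.square(inputs), -1) + K.epsilon())

def build_breast_cancer_caps(input_shape=(480, 120, 3), n_class=2, routings=3):
    x = layers.Input(shape=input_shape)
    conv1 = layers.Conv2D(256, 9, strides=1, padding="valid",
                          activation="relu", name="Conv1")(x)
    primary = layers.Conv2D(256, 9, strides=2, padding="valid",
                            name="primarycap_conv2d")(conv1)
    primary = layers.Reshape((-1, 8), name="primarycap_reshape")(primary)   # 386,048 capsules of 8-D
    primary = layers.Lambda(squash, name="primarycap_squash")(primary)
    digitcaps = CapsuleLayer(n_class, 16, routings, name="digitcaps")(primary)
    out = Length(name="capsnet")(digitcaps)
    return models.Model(x, out)

model = build_breast_cancer_caps()
model.summary()   # layer names, shapes and parameter counts should match Table 1

A margin loss, as in the original capsule network formulation, would typically be used for training; a reconstruction decoder, if any, is omitted from this sketch.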

Appendix 3

The confusion matrices of the proposed “Breast Cancer-Caps” system over the test Static and Dynamic breast thermal image datasets, each consisting of 25 breast cancer and 25 normal breast thermal images for every view (multiview, left, right and frontal), are presented in the figures below. In these confusion matrices, a true positive means that a thermal image belonging to a patient diagnosed with breast cancer is correctly classified by the proposed system as an abnormal (breast cancer) image, whereas a false negative means that such an image is incorrectly classified as a normal image. Conversely, a true negative means that a thermal image belonging to a healthy patient is correctly classified as a normal image, whereas a false positive means that such an image is incorrectly classified as an abnormal image.
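To make explicit how these four counts relate to the scalar scores reported in Tables 1 and 2 below, a short illustrative Python sketch follows; the function name and the example counts are hypothetical and are not taken from the actual experiments. MSE, loss and AUC depend on the predicted probabilities rather than on the counts alone, so they are not reproduced here.

def metrics_from_confusion(tp, fp, tn, fn):
    # Derive the headline scores of the comparison tables from binary confusion-matrix counts
    # (breast cancer = positive class, normal = negative class).
    total = tp + fp + tn + fn
    accuracy    = (tp + tn) / total
    precision   = tp / (tp + fp)          # of the images predicted as cancer, how many truly are
    sensitivity = tp / (tp + fn)          # recall on the cancer images
    specificity = tn / (tn + fp)          # recall on the normal images
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    # Cohen's kappa: observed agreement corrected for the agreement expected by chance.
    p_obs = accuracy
    p_exp = ((tp + fp) * (tp + fn) + (tn + fn) * (tn + fp)) / total ** 2
    kappa = (p_obs - p_exp) / (1 - p_exp)
    return {"accuracy": accuracy, "precision": precision, "sensitivity": sensitivity,
            "specificity": specificity, "f1": f1, "kappa": kappa}

# Hypothetical example: 24 of 25 cancer images and 25 of 25 normal images classified correctly.
print(metrics_from_confusion(tp=24, fp=0, tn=25, fn=1))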

Similarly, the confusion matrices of the comparison systems based on the VGG19, ResNet50 and InceptionV3 models over the test Static and Dynamic breast thermal image datasets, each consisting of 25 breast cancer and 25 normal breast thermal images for every view (multiview, left, right and frontal), are presented in the figures below.
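For reference, a minimal sketch of the kind of deep transfer-learning classifier used for this comparison is shown below (InceptionV3 is instantiated here; VGG19 or a ResNet backbone can be substituted in the same way). The input size, frozen backbone, classification head and optimizer settings are illustrative assumptions and not the exact configuration behind the reported results.

import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionV3

def build_transfer_classifier(input_shape=(299, 299, 3), n_class=2, freeze_backbone=True):
    # ImageNet-pretrained backbone reused as a feature extractor, plus a small dense head.
    backbone = InceptionV3(include_top=False, weights="imagenet", input_shape=input_shape)
    backbone.trainable = not freeze_backbone
    inputs = layers.Input(shape=input_shape)
    x = backbone(inputs, training=False)          # keep batch-norm statistics frozen
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dropout(0.3)(x)
    outputs = layers.Dense(n_class, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_transfer_classifier()
model.summary()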

Table 1: Performance comparison of the proposed “Breast Cancer-Caps” system with the existing deep learning based state-of-the-art approaches for breast cancer classification over the Static breast thermal images testing dataset

Author and year | Breast view | Accuracy (%) | Precision (%) | Sensitivity (%) | Specificity (%) | F1 score (%) | MSE | Loss | AUC (%) | Kappa
Fernández-Ovies et al., 2019 [1] | Multi | 94 | 94 | 94 | 94 | 94 | 0.07 | 0.448 | 97.1 | 0.88
Fernández-Ovies et al., 2019 [1] | Left | 81 | 82 | 80.3 | 81.6 | 81.1 | 0.184 | 0.69 | 89.9 | 0.61
Fernández-Ovies et al., 2019 [1] | Right | 82 | 80 | 83.3 | 80.7 | 81.6 | 0.181 | 0.68 | 90.2 | 0.62
Fernández-Ovies et al., 2019 [1] | Frontal | 90 | 84 | 95.4 | 85.7 | 89.3 | 0.147 | 0.418 | 92.8 | 0.80
Yadav et al., 2020 [2] | Multi | 89 | 88 | 89.8 | 88.24 | 88.9 | 0.148 | 0.438 | 92.7 | 0.76
Yadav et al., 2020 [2] | Left | 80 | 76 | 82.6 | 77.7 | 79.17 | 0.187 | 0.695 | 89.2 | 0.6
Yadav et al., 2020 [2] | Right | 81 | 78 | 82.9 | 79.2 | 80.4 | 0.184 | 0.69 | 89.9 | 0.61
Yadav et al., 2020 [2] | Frontal | 84 | 78 | 88.6 | 80.3 | 82.9 | 0.175 | 0.64 | 90.8 | 0.65
Proposed Breast Cancer-Caps system | Multi | 98 | 98 | 98 | 98 | 98 | 0.021 | 0.086 | 98.2 | 0.96
Proposed Breast Cancer-Caps system | Left | 83 | 84 | 82.3 | 83.6 | 83 | 0.178 | 0.66 | 90.5 | 0.68
Proposed Breast Cancer-Caps system | Right | 87 | 86 | 87 | 86.2 | 86 | 0.151 | 0.340 | 92.5 | 0.72
Proposed Breast Cancer-Caps system | Frontal | 93 | 90 | 95 | 92 | 92 | 0.071 | 0.151 | 96.5 | 0.88

Table 2: Performance comparison of the proposed “Breast Cancer-Caps” system with the existing deep learning based state-of-the-art approaches for breast cancer classification over the Dynamic breast thermal images testing dataset

Author and year | Breast view | Accuracy (%) | Precision (%) | Sensitivity (%) | Specificity (%) | F1 score (%) | MSE | Loss | AUC (%) | Kappa
Fernández-Ovies et al., 2019 [1] | Multi | 95 | 96 | 94.12 | 95.9 | 95 | 0.068 | 0.452 | 96.7 | 0.96
Fernández-Ovies et al., 2019 [1] | Left | 82 | 82 | 82 | 82 | 82 | 0.181 | 0.68 | 89.5 | 0.64
Fernández-Ovies et al., 2019 [1] | Right | 83 | 82 | 83.6 | 82.3 | 82.8 | 0.178 | 0.66 | 90.5 | 0.64
Fernández-Ovies et al., 2019 [1] | Frontal | 91 | 88 | 93.6 | 88.68 | 90.7 | 0.145 | 0.411 | 92.9 | 0.82
Yadav et al., 2020 [2] | Multi | 91 | 90 | 91.8 | 90.2 | 91 | 0.145 | 0.411 | 92.9 | 0.82
Yadav et al., 2020 [2] | Left | 82 | 78 | 84.7 | 79.6 | 81.2 | 0.179 | 0.67 | 90.1 | 0.64
Yadav et al., 2020 [2] | Right | 83 | 78 | 86.6 | 80 | 82.1 | 0.178 | 0.66 | 90.5 | 0.64
Yadav et al., 2020 [2] | Frontal | 85 | 80 | 88.8 | 81.8 | 84.2 | 0.174 | 0.64 | 91.4 | 0.65
Proposed Breast Cancer-Caps system | Multi | 99.5 | 98.04 | 100 | 100 | 99.01 | 0.019 | 0.084 | 99.1 | 1
Proposed Breast Cancer-Caps system | Left | 85 | 86 | 84.3 | 85.7 | 85.5 | 0.174 | 0.640 | 91.4 | 0.72
Proposed Breast Cancer-Caps system | Right | 87 | 88 | 86.2 | 87.7 | 87.1 | 0.150 | 0.338 | 92.7 | 0.73
Proposed Breast Cancer-Caps system | Frontal | 94 | 92 | 95.8 | 92.31 | 93.88 | 0.070 | 0.148 | 97.1 | 0.88

Figure 1: GUI screenshot of the working “Breast Cancer-Caps” early screening system

References
[1] F.J. Fernández-Ovies, E.S. de Alférez-Baquero, E.J. Andrés-Galiana, Detection of breast cancer using infrared thermography and deep neural networks, In: International Work-Conference on Bioinformatics and Biomedical Engineering, Granada, Spain (2019) 514–523.
[2] S.S. Yadav, S.M. Jadhav, Thermal infrared imaging based breast cancer diagnosis using machine learning techniques, Multimedia Tools and Applications (2020). doi: https://doi.org/10.1007/s11042-020-09600-3.

