The Open Neuroimaging Journal
REVIEW ARTICLE
Abstract:
Breast cancer is a potentially fatal disease. Treatment is beneficial in the event of an early diagnosis. Medical imaging is critical in detecting breast cancer early and accurately; nevertheless, it is plagued by false negative and false positive findings, which frequently lead to wrong diagnosis and therapy. The response time is also lengthy, resulting in delayed treatment for patients whose lives could have been saved if the condition had been recognised sooner. Increasing survival rates and decreasing treatment-related side effects of breast cancer have long been established as goals of screening programs. Artificial intelligence is a vast discipline with numerous algorithms that improve the selectivity, specificity, and accuracy of breast cancer detection imaging modalities. The initial sections of the paper highlight various risk factors for the disease, followed by an in-depth study of existing work related to the different imaging processes involved in breast cancer detection. The later sections describe some very recent deep learning algorithms and their achievements in breast cancer detection. The various data sets available are also tabulated in this paper.
Keywords: Breast cancer, BIRADS, Artificial intelligence (AI), Convolutional neural networks (CNN), Digital mammographs (DM), Precision, Magnetic resonance imaging (MRI), Digital breast tomosynthesis.
Article History Received: September 06, 2022 Revised: February 22, 2023 Accepted: March 21, 2023
Women make up about 50% of the world's population, and over 27% of cancer patients are breast cancer patients. Breast cancer mortality is second among all malignancies [6]. The milk-producing glands in the breast are called lobules, and cancer of the breast originates in these tissues [7, 8]. Moreover, it is not contagious, although women with a family history are at a higher risk of its development [9, 10]. Fig. (1) represents risk factors that can allow breast cancer to develop inside a patient's body [11 - 18].

The patient has a chance of survival if the condition is caught early. Cancer patients are at risk of death if therapy is delayed for any reason [19]. If a woman notices a lump in her breast, she should see a doctor immediately, as it may be an early sign of breast cancer, which requires more clinical study and, ultimately, a surgical biopsy. It is not always the case that breast lumps indicate cancer; they can be harmless in some cases [20, 21]. This paper aims to provide a comprehensive overview of the many strategies for the early diagnosis of breast cancer, in order to assess their capabilities and identify areas of improvement that can be addressed using cutting-edge technology. This work has the potential to enhance the current system of early diagnosis of breast cancer to save lives and reduce the likelihood of false detection.
Fig. (1). Different risk factors for breast cancer to develop inside a patient’s body.
[Fig. (2) residue: the four quadrants (upper outer, upper inner, lower outer, lower inner) of the left and right breast with tumour occurrence percentages; the upper outer quadrant accounts for the largest share (50%), with roughly 10-20% in each of the remaining quadrants.]
Fig. (2). Four Quadrants of the breast to study the location of the tumour.
Breast Cancer Detection Methodologies. The Open Neuroimaging Journal, 2023, Volume 16.
Fig. (2) represents the four quadrants into which the breast region is divided to study and analyse the probability of occurrence of a tumour, with the probability of occurrence expressed as a percentage. Generally, left breast cancers are more likely to occur than bilateral ones. The early detection of fatal diseases finds its way into medical imaging processes, also called cancer screening [22]. Breast cancer screenings include mammography; mammograms can even detect the disease without symptoms. Deceptive positive and negative reports lead to unnecessary biopsies and undetected malignancy, respectively. Double testing [23] to obtain a precise conclusion adds expense, time, and labour for the suspect. Computer-aided solutions [24] can detect and diagnose; they can help doctors analyse results more accurately and clearly, allowing the suspect to arrive at a certain test result [25]. Several of the latest technologies can also assist in making the process more flawless and in favour of the affected women, as timely detection of the disease is the only tool to fight the deadly disease and defeat death. To make the most of the available technologies, relevant research must be reviewed to assess the untapped potential of the underlying technologies and pinpoint the areas where more investigation is needed [26].

2. BREAST CANCER DETECTION

Once a patient is suspected of breast cancer, certain diagnostic tests are required for its diagnosis and prognosis. The confirmation of the disease can be performed in two basic steps.

2.1. Breast Cancer Imaging

The primary goal of these imaging techniques is to pinpoint the location of the tumour; subsequent procedures are used to verify whether the tumour is malignant. Mammograms, ultrasound, MRI, and tomosynthesis are all mentioned as potential methods of breast cancer diagnosis in this article. Breast cancer imaging approaches and the biopsy procedure for detecting breast cancer are contrasted in Fig. (3).
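The deceptive positive and negative reports mentioned above are what sensitivity and specificity quantify. A minimal sketch of these screening metrics, with invented counts for illustration:

```python
def screening_metrics(tp, fp, tn, fn):
    """Evaluation metrics from confusion-matrix counts.

    tp: cancers correctly flagged; fn: cancers missed (deceptive negatives);
    fp: healthy cases wrongly flagged (deceptive positives); tn: healthy cleared.
    """
    return {
        "sensitivity": tp / (tp + fn),          # fraction of cancers detected
        "specificity": tn / (tn + fp),          # fraction of healthy cleared
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical counts for 1000 screenings (not from the paper)
m = screening_metrics(tp=85, fp=50, tn=850, fn=15)
```

A screening programme that lowers fn raises sensitivity, while one that lowers fp raises specificity; this is exactly the trade-off the combined imaging processes discussed later try to improve.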
2.1.1. Mammographs

Mammographs are X-rays of the breast, taken at special angles to identify hidden tumours in the breast. The technique uses ionising radiation to detect these tumours. It becomes difficult to detect tumours using mammographs in denser breasts. Generally, the accuracy of mammographs lies between 85% and 90%, and additional testing is used to become doubly sure about the disease after the first mammography. Debelee in 2019 elaborated on the different cancer imaging processes and techniques to which deep learning algorithms can be applied. Screen-film mammography (SFM) is not a digital cancer screening imaging technique, but it is highly sensitive and is still in practice in countries like Ethiopia, with a trade-off between dynamic range and contrast resolution [27, 28]. Digital mammography (DM) is an advancement on film mammography: being a digital technique, it can be combined with computer-aided detection (CAD) solutions and is quicker in response, but it suffers from low specificity [29]. Fig. (4) categorises breast cancer into the different grading categories from BIRADS 1 to BIRADS 6. BIRADS is an acronym for the Breast Imaging Reporting and Data System, used to grade the mammogram test result for breast cancer detection. It gives information about breast density and tests whether there are any abnormalities that need further investigation. BIRADS is a numeric scale ranging from 1 to 6 (BIRADS 0 indicates an incomplete test, and a retest is required).

2.1.2. Ultrasound

Ultrasound uses sound waves to detect the location of the tumour. It is most helpful for knowing whether the mass discovered is a cyst filled with fluid or a solid mass: cancer is less likely if a cyst is found, but the probability of cancer increases if a solid mass is found. It acts as the guidance system in the case of biopsy, guiding the needle to take out tissue for further tests. It also tests the armpit lymph nodes. In thick breasts, it is utilised when mammograms cannot detect malignancy. US is adopted as a second choice to DM, as it cannot be interpreted straightforwardly and is operator dependent; however, it does not involve ionising radiation and is therefore safe in this regard [30, 31]. Fig. (5) distinguishes the cancerous lump on the basis of fluid or solid mass after the ultrasound method.
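The BIRADS grading described under mammographs can be sketched as a simple lookup; the one-line category summaries below are the commonly used shorthand, included for illustration rather than as clinical definitions:

```python
# Common shorthand for BIRADS assessment categories (illustrative summaries).
BIRADS = {
    0: "Incomplete: additional imaging needed (retest required)",
    1: "Negative",
    2: "Benign finding",
    3: "Probably benign: short-interval follow-up suggested",
    4: "Suspicious abnormality: biopsy should be considered",
    5: "Highly suggestive of malignancy",
    6: "Known biopsy-proven malignancy",
}

def birads_assessment(category: int) -> str:
    """Map a BIRADS category (0-6) to its summary description."""
    if category not in BIRADS:
        raise ValueError("BIRADS categories range from 0 to 6")
    return BIRADS[category]
```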
Fig. (8). Enhancement in sensitivity (*93%) and specificity (**70%) by the combination of two processes (DM + DBT, needed for patients at high disease risk) for breast cancer detection.
The imaging processes are combined according to the prevailing situation of the cancerous growth present in the breast to confirm the actual status of the disease inside the patient's body. Digital mammography is the basic preliminary examination practised nowadays and is followed, in combination with other imaging processes, to fetch other important details related to the suspected growth of tissues forming a lump or tumour. These combinations are also helpful in gathering associated information, such as the nature of the mass (solid or fluid), the extent of its growth, and finding other such growths hidden inside the dense breast layers. The enhancement in the outcome of the imaging processes for breast cancer detection when used in combination is schematically represented in Fig. (8). The two evaluation parameters considered in this figure are specificity and sensitivity, and the outcome is measured and evaluated on these bases. The two combinations are made on top of the preliminary process of digital mammography [31].

2.2. BIOPSY

Suppose a patient is found with suspected lesions in the breast. In that case, tissues are removed from the ROI located by any of the imaging processes discussed in Section 2.1, using clinical processes of tissue removal, for further testing. This process of removal of tissue from the patient's breast is termed a biopsy. Further discussion of biopsy is beyond the scope of this paper. Fig. (9) shows the possible outcomes of a biopsy, which can be cancerous or non-cancerous. Histopathological images are obtained after a biopsy is performed in the case of breast cancer detection. These images are used to identify the cancer and the treatment to be followed as the next step by the patient. There exist several methods to obtain histopathological images.
Fig. (9). Possible outcomes of biopsy for breast cancer detection [45]. The images are taken from the BreakHis dataset, publicly available at https://web.inf.ufpr.br/vri/databases/breast-cancer-histopathological-database-BreakHis/.
[Fig. (10): input images obtained from the imaging processes are segmented over the breast region and ROI, features are extracted from the image, the identified features are detected, and the images are classified according to those features for prediction.]
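The stages of this pipeline can be sketched as a chain of functions; the bodies below are toy placeholders (threshold segmentation, intensity statistics, a one-rule classifier) standing in for the real models discussed later:

```python
def segment(image):
    """Placeholder segmentation: keep pixels brighter than a threshold as ROI."""
    return [p for row in image for p in row if p > 128]

def extract_features(roi_pixels):
    """Placeholder feature extraction: simple intensity statistics of the ROI."""
    if not roi_pixels:
        return {"mean": 0.0, "count": 0}
    return {"mean": sum(roi_pixels) / len(roi_pixels), "count": len(roi_pixels)}

def classify(features):
    """Placeholder classification: a trivial rule instead of a trained model."""
    return "suspicious" if features["count"] > 4 else "unremarkable"

def predict(image):
    """Run the full pipeline: segmentation -> features -> classification."""
    return classify(extract_features(segment(image)))
```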
3. ROLE OF ARTIFICIAL INTELLIGENCE IN BREAST CANCER IMAGING

In Sections 2.1 and 2.2 of this paper, different types of images are obtained after each process discussed so far. These images are analysed to draw conclusions related to the suspected growth of tissues located as the ROI in the images. It is a difficult task to recognise and identify these tumours manually and then draw conclusions based on naked-eye observation of the images obtained from the different cancer detection methodologies discussed in the previous sections. Fig. (10) demonstrates the basic steps in image analysis: segmentation, feature extraction, classification, and prediction.

The latest technological trends of data acquisition, pre-processing, and processing of the data can be used in alliance with medical imaging for deriving maximum information from the images obtained by the different imaging processes discussed in previous sections of the paper. Artificial intelligence is the field in which an artificial brain is embedded in a machine, with which the machine is expected to be as good as humans. This artificial brain is a neural network composed artificially with a multi-layered architecture having many hidden layers. Many algorithms are currently being developed in this field for optimised usage and better performance. Once a machine is AI-enabled, it has the inherent advantage of making decisions and reacting to the environment. Letting the machine learn from experience and its surroundings is the key here. Machine learning is a subset of AI that is specific to letting the machine learn. The learning is further categorised by its mode, such as supervised learning or unsupervised learning. Supervised learning is a type of machine learning that requires training data to train the neural network; in the case of medical imaging, this training data is available in the form of different datasets. It is a process similar to learning with a tutor, where each answer is matched with the answer in the tutor's mind and, if a match is found, the fact is stored in the student's mind. The datasets used are annotated and marked with the ROI, etc. Unsupervised learning is not guided learning but exploration-based learning; the datasets are not annotated and marked. Deep learning is a further subdomain of ML and AI. It can be categorised on the basis of the input data received at the input stage of the DL model: artificial neural networks accept data in the form of numbers, convolutional neural networks receive data in the form of images, and recurrent neural networks accept data in the form of time series. In this paper, we are concentrating on medical imaging, so the input is in the form of images, and hence our technology of interest is deep convolutional neural networks as a facilitator and enhancer of medical imaging. Fig. (11) establishes the relation between these technologies and highlights the work that can be used in favour of medical imaging.

3.1. Deep Learning Convolutional Neural Networks for Breast Cancer Detection

Recent advances in data gathering have provided researchers with an excellent opportunity to obtain and analyse data linked to medical diagnosis, which they are doing to deploy various machine learning algorithms on it to automate and improve the accuracy of the screening process [45, 46]. Deep learning, a machine learning technology, has been developing for years [47]. With this method, the machine learns through experience, and a number of algorithms are designed to control and direct this learning process [48, 49]. This technology can facilitate medical imaging for breast cancer screen testing, decrease the chances of falsified reports, and initiate an appropriate response post diagnosis. Deep learning accepts data in various forms, such as text, images, and time series: for text input, an artificial neural network (ANN) is used [50]; for images, a convolutional neural network (CNN) [51, 52]; and for time series, a recurrent neural network (RNN). A full-scale digital mammograph is usually a high-pixel image of about 4000 x 3000 pixels, while the ROI, that is, the possible area occupied by a probable cancerous growth, is about 100 x 100 pixels, which is a limited zone of focus for classification of the lesions. This is one of the major challenges in this investigation, especially beyond the known regions of clinical interpretation and investigation. Region-based convolutional neural networks (R-CNN) [53], along with other variants [54 - 57] of deep learning techniques, are used to establish object detection and classification algorithms in this case. Efforts are also seen in the literature to train the neural network with fully annotated datasets and with larger datasets carrying the status of cancer growth, to make the algorithms more effective in use and application. It is also observed that pretraining, for instance on ImageNet, is a favourable approach for letting a deep learning model be trained on a larger dataset. Shen [58] used a fully annotated dataset with ROI knowledge to pre-train a classifier model for local image patches. In this work, the weight parameters associated with the local image classifiers are used to initialize the whole-image weight parameters, which are further refined during training. The patch and image classifiers referred to in the work are developed using a large dataset of fully digitised mammographs without information on the ROI; it contains more than a thousand images, later reduced to a set of small digitised film mammographs containing one-tenth of the previously existing images. Computer-aided detection systems such as R2 ImageChecker, Cenova 1.0 and iCAD SecondLook 1.4, which do not involve any deep learning techniques, have proven inferior to approaches involving deep learning techniques on digitised mammographic platforms [58, 59]. Turkki [59] also used a CNN to extract local image descriptors for the detection of breast cancer; computational requirements are less stringent in this technique, as the CNN is trained on ImageNet. Other work [59, 60] opted for supervised and unsupervised CNNs for the analysis of breast cancer histopathological images instead of mammographs. In this work, transfer learning techniques and deep learning tools are used to classify the histopathological images. Further clustering analysis is performed on the classified images, followed by an autoencoder network whose dimension-reduction functionality maps the extracted features to a lower-dimensional space [60]. Deep convolutional neural networks applicable to the analysis of images obtained for breast cancer detection are shown in Fig. (12). The diseased datasets for analysis and experimentation are acquired from references [61 - 65].
[Fig. (11) residue: supervised learning and unsupervised learning feed deep learning (DL), realised as deep artificial neural networks (ANN) and convolutional neural networks (CNN).]
Fig. (11). Contributing technologies for medical imaging for breast cancer detection.
[Fig. (12): images obtained from the different imaging processes (SFM, DM, US, DBT, MRI) and histopathological images of breast cancer are passed to convolutional neural networks for image analysis: segmentation, feature extraction, detection, and classification and prediction.]
Wang [66], in his work, compared a deep learning strategy named stacked denoising autoencoder (SAE) with existing machine learning benchmark classifiers such as support vector machines (SVM), linear discriminant analysis (LDA) and k-nearest neighbour (KNN) on the basis of accuracy, sensitivity, specificity and AUC, found SAE to be the most efficient, and thus concluded that deep learning strategies are more advanced for classifying masses and lesions as benign or malignant. A recent work proposed the deep learning CNN Inception V3 for the detection of lymph node metastasis on US images [67]. The newly developed method of deep learning radiomics [68] detects early breast cancer using US images with shear wave elastography (SWE) features, which measure tissue stiffness and use colour maps to demonstrate the distribution of shear wave velocity (SWV); the combination has exhibited better performance in distinguishing benign and malignant lesions [69]. A multiple-instance deep learning neural network for the determination of estrogen receptor status (ERS) using haematoxylin and eosin (H&E) staining to highlight the circular morphology proved to be cost-friendly and less time consuming, and involved a smaller number of variables for preparation [70].

4. APPLICATION OF AI-ENABLED TECHNOLOGICAL ADVANCES IN THE MOST COMMON MEDICAL IMAGING PROCESSES FOR RADIOLOGICAL DIAGNOSIS OF BREAST CANCER

The most common medical imaging processes available for breast cancer detection are mammographs, ultrasound, magnetic resonance imaging, and digital breast tomosynthesis. Further, these common processes are followed by biopsy to find whether the tumour is cancerous or not, which also yields images for prediction and treatment. Thus, realising the need for enhanced image analysis for better breast cancer prediction, the study in this section is categorised by the different medical imaging processes utilising advanced AI, ML, and DCNN tools for making the images more informative and reducing the response time for reading the images.

4.1. Breast Cancer Detection using AI-enabled Mammographs

Kim focussed on mammography images collected from three different countries, South Korea, the USA and the UK, for developing a large-scale variable dataset to test an AI-based algorithm for the detection and diagnosis of breast cancer. The
average age of women in the said mammography data is 50.3 years. 170,230 digital mammograms [71 - 74], collected between January 2000 and December 2018, have been considered in this work for the training of the AI-based algorithms. The dataset used in this work overcomes one major methodological deficiency: inadequate data to train the AI algorithm. The proposed AI algorithm is a two-stage trained algorithm based on the ResNet-34 CNN architecture. Fully supervised learning is practiced using annotated mammograms in Stage I (patch learning), followed by semi-supervised learning using only mammograms in Stage II (image fine-tuning). Lunit INSIGHT MMG is used as diagnostic support software for evaluating abnormality scores per breast [75]. Table 2 shows the systematic arrangement of the process of obtaining the AI-enabled digital mammography images.

4.2. Breast Cancer Detection using AI Tools Enabled MRI

Adachi [76] proposed the use of RetinaNet to train an artificial intelligence system for detecting malignancy in MRI images. The AI system was also evaluated and compared with human readers on the basis of sensitivity, specificity, and AUC, and it was found that performance with the AI system was better than without AI assistance. Fig. (13) represents the images obtained from the MRI process [76].
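AUC, one of the metrics in the comparisons above, can be computed without any library via the rank (Mann-Whitney) formulation: the fraction of malignant/benign pairs in which the malignant case scores higher, with ties counting half. The scores below are toy values for illustration:

```python
def auc(scores, labels):
    """Area under the ROC curve by pairwise comparison.

    scores: model outputs, higher meaning more likely malignant;
    labels: 1 for malignant, 0 for benign.
    """
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Perfectly separated toy scores give AUC = 1.0
perfect = auc([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0])
```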
Fig. (13). Different images obtained by MRI process of breast imaging [76].
Table 1. Aim of the most common imaging processes for radiological diagnosis of breast cancer.

- Digital Mammography (DM) [34, 36, 37]. Aim: to locate the breast cancer using low-energy X-rays (30 kVp). Penetrating waves used: ionising radiation. Sample images (benign / malignant): [38] / [38].
- Ultrasound (US) [31, 34]. Aim: to identify whether the lump is a fluid-filled mass or a solid mass. Penetrating waves used: acoustic waves. Sample images: [39] / [39].
- Magnetic Resonance Imaging (MRI) [40, 41]. Aim: to know the extent of an already located cancer lump. Penetrating waves used: radio waves. Sample images: [42] / [42].
- Digital Breast Tomosynthesis (DBT) [43]. Aim: to locate hidden lesions skipped by mammographs. Penetrating waves used: ionising radiation. Sample images: [44] / [44].
Note: [34] Debelee (2019); [36] Yeon (2019); [37] Song (2021); [38] Huang (2020); [39] Dhabyani (2019); [40] Petrov (2014); [41] Mohan (2013); [42] Zhou (2020); [43] Teertstra (2010); [44] Jeong (2007).
Fig. (14). Images obtained for negative cases after the MRI process: (a) true negative case, (b) false negative case [76]. LHS images are the originals; RHS images show the detections of the AI system.
These images are classified on the basis of the level of infection found by the radiologist at the time of imaging [77]. LabelMe, the graphic annotation tool, is used to annotate these images with a rectangular box, and the respective annotations are stored in the COCO format. The AI systems proposed in the work can come up with true or false reports. The aim is to make the system efficient enough to generate only true reports, with no erroneous reports as the outcome of the AI system. As both instances are likely to occur in the current setting, Fig. (14) depicts the cases in which the patient obtained a negative final report following the MRI process for breast cancer diagnosis. In case (a), the report generated is correct, that is, the patient does not have breast cancer. In case (b), the patient does have breast cancer, but the imaging process failed to detect it and generated a negative report, which is incorrect and can mislead the treatment and diagnosis of the current situation of the disease in the patient's body. The chance of the occurrence of case (b) must be lowered using AI and its enabling technologies, allowing the suspect of the lethal sickness to be sure about the outcome of the imaging process.
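A rectangular-box annotation stored in COCO format, as described above, reduces to a small JSON structure. The file name, image size, and box coordinates below are made-up values for illustration; COCO bounding boxes are [x, y, width, height] in pixels:

```python
import json

# Minimal COCO-style annotation for one MRI slice with one lesion box.
coco = {
    "images": [{"id": 1, "file_name": "mri_slice_001.png",  # hypothetical name
                "width": 512, "height": 512}],
    "annotations": [{"id": 1, "image_id": 1, "category_id": 1,
                     "bbox": [120, 200, 64, 48],   # [x, y, width, height]
                     "area": 64 * 48, "iscrowd": 0}],
    "categories": [{"id": 1, "name": "lesion"}],
}

serialised = json.dumps(coco)  # what would be written to the annotation file
```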
Fig. (16). Ultrasound image enabled by AI algorithm for breast cancer detection [78].
Fig. (17). DBT images enabled with AI: (a) DM image, with which 25% of readers were able to detect the cancer, with a reading time of 77.6 secs; (b) AI-enabled DBT image, with which 75% of readers were able to detect the cancer, with a reading time of 50.3 secs [84].
Another case is when a suspect of the disease is detected positive in the report of the MRI process. Fig. (15) represents the cases in which the MRI process has generated a positive report. In cases (a) and (b), the patient is detected positive when the patient is actually suffering from the disease; in this way, the right treatment can be started at the right moment, and the survival rate of the cancer victim can be increased.

4.3. Breast Cancer Detection using AI Tools Enabled Ultrasound

The S-Detect AI technique is used to highlight breast lumps on ultrasound pictures of the breast, reducing the likelihood of false negative findings. In comparison to US pictures that are not backed by AI algorithms, it has increased overall specificity, sensitivity, and accuracy. Fig. (16) depicts the contour marking of the margins of cancer masses in US pictures [78].

The same kind of AI algorithms are applicable to mammographs and tomosynthesis images, and the performance metrics of specificity, sensitivity and selectivity rise after applying an AI algorithm to the normal tomosynthesis image for breast cancer detection, which in turn increases the probability of true positive and true negative reports. Fig. (17) shows the diagrammatic representation of the AI-enabled DBT images.

AI-enabled DBT images have reduced the reading time by 19%, and the enhanced dimensional perspective makes the lesion more identifiable, particularly lesions buried in folds, which become evident with 3D views when bookmarking and flagging the lesion [79]. The relevance of ethical behaviour must be addressed by the relevant authorities for the deployment of AI in breast cancer detection: because it deals with the digitization of cancer patients' data, measures for data confidentiality are required for societal advantage and balance [80]. AI algorithms have been effectively deployed on several digital screening procedures for breast cancer detection, outperforming traditional methods without the support of AI-enabled imaging processes. This has undoubtedly improved the erroneous status of the reports generated by various imaging processes, and in addition to improving the accuracy of breast cancer diagnosis, the incorporation of AI algorithms into the digital screening process has resulted in a reduction in reading time [67, 81]. Researchers employing AI to improve breast cancer screenings discovered that using AI reduced radiologists' workload and improved the diagnostic process, avoiding patient deaths [82]. XGBoost defeated logistic regression for early cancer diagnosis; other algorithms used included random forest and deep neural networks [83].

5. VARIOUS STAGES INVOLVED IN RADIOLOGICAL ANALYSIS FOR BREAST CANCER

This section provides the layout of the steps involved in image analysis after the image is available for further prognosis and diagnosis. The latest AI algorithms used in the different stages are also mentioned and compared.

5.1. Stage 1 (Image Acquisition)

Although there are many ways of image acquisition for breast cancer detection, within the scope of this paper the most common ways discussed so far are mammographs, ultrasound, MRI and tomosynthesis. These images are then converted into different formats for the investigation of cancer in the suspected area of the breast.

5.2. Stage 2 (Image Pre-processing)

In this stage, the image is subjected to a variety of pre-processing steps for analysis, such as a change in orientation, removal of noise and artifacts, and improvement of contrast and appearance, for example a reduction in blurring in the image.

5.3. Stage 3 (Image Processing)

This step includes colour processing, multi-resolution processing, compression of the image to a smaller size with minimum deterioration in quality, morphological processing to extract useful components, and description of the shape of the image, which are some of the commonly used techniques employed on the images obtained from sensor captures using the different imaging modalities for breast cancer detection.

5.4. Stage 4 (Segmentation)

Segmentation is a technique for partitioning an image in order to study it better for further investigation; the pixel set of a segment of interest is a superpixel. It is typically applied to images in order to minimise their complexity for analysis. The procedure labels the pixels and organises them into different categories in order to prioritise one group of pixel categories over another, based on the necessity for observation from those images, thereby delineating regions with lines and curves. In the field of medical imaging, these segmentation approaches are commonly used on digital mammograms, where they outperformed the K-means algorithm for segmentation [84, 85]. The combination of the K-means algorithm and the region growing algorithm produces better results, because the K-means technique is used for segmentation while the region growing algorithm is used to eliminate the pectoral muscles from the denser breast regions [86]. The combination of grow cut with a Gaussian mixture model (GCGMM), as a segmentation strategy on MRI images of breast cancer lesions for mass and non-mass enhancement as well as background parenchymal enhancement, reached 95% accuracy [87]. Breast parenchymal tissue is segmented from the air and other tissues using a 3-D multiplanar multiprotocol CNN-based segmentation technique [88]. Convolutional neural networks outperformed the watershed method for segmenting skin, fibroglandular, bulk, and fatty tissues on 3-dimensional ultrasound images, achieving a JSI of 85.1% (Table 3).

5.5. Stage 5 (Feature Extraction)

Feature extraction is the method used to reduce redundant data in the image. More data in the image means a larger number of variables, and hence more computation. Feature extraction helps to group these variables into useful features of the image, which are easy to process. Table 4 presents different traditional and deep learning-based feature extraction methods; many existing research works have established that deep learning methods of feature extraction can extract more complex features and are more robust to operations such as scale, occlusion, rotation, etc. [89, 90].
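The K-means segmentation described in Stage 4 can be sketched on grayscale intensities alone; this is a generic one-dimensional K-means, not the exact algorithm of the cited works:

```python
def kmeans_1d(pixels, k=2, iters=20):
    """Cluster grayscale intensities into k groups; returns (centres, labels)."""
    lo, hi = min(pixels), max(pixels)
    centres = [lo + (hi - lo) * i / (k - 1) for i in range(k)]  # spread evenly
    labels = [0] * len(pixels)
    for _ in range(iters):
        # Assign each pixel to its nearest centre, then recompute centres.
        labels = [min(range(k), key=lambda c: abs(p - centres[c]))
                  for p in pixels]
        for c in range(k):
            members = [p for p, l in zip(pixels, labels) if l == c]
            if members:
                centres[c] = sum(members) / len(members)
    return centres, labels

# Toy "image": dark background pixels vs bright lesion-like pixels
centres, labels = kmeans_1d([10, 12, 8, 11, 200, 210, 205], k=2)
```

Region growing from a seed inside one cluster could then be combined with this, as in the cited approach for removing pectoral-muscle pixels.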
5.6. Stage 6 (Classification and Prediction)

From the above-mentioned stages, features are extracted and, on that basis, the observed instances are categorised into different classes for the prediction of breast cancer and further prognosis. Many algorithms have been designed in the literature using AI, ML, and deep learning techniques for this purpose. Some well-known algorithms are naïve Bayes, random forest, k-nearest neighbour, support vector machine, artificial neural networks, and decision tree; these are all mainly machine learning algorithms. Deep learning algorithms, such as convolutional neural networks and ResNet-34, are a bit more advanced for the purpose, alongside baselines such as logistic regression. The prediction is generally malignant or benign. Python is used as the coding language for the above algorithms (Table 5).
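Of the classifiers listed, k-nearest neighbour is the simplest to sketch in Python; the two-dimensional feature vectors below are invented for illustration:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs; majority vote of k nearest."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Invented 2-D features, e.g. (normalised mean intensity, normalised lesion area)
train = [((0.20, 0.10), "benign"), ((0.25, 0.15), "benign"),
         ((0.30, 0.20), "benign"),
         ((0.80, 0.90), "malignant"), ((0.85, 0.80), "malignant")]

print(knn_predict(train, (0.82, 0.85)))  # prints "malignant"
```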
Table 3. Different segmentation algorithms used for breast cancer detection.

| Imaging modality | Segmentation approach | Outcome |
| DCE-MRI images [90, 91] | Improved segmentation with volumetric delineations, such as the grow-cut Gaussian mixture model (GCGMM) approach for mass, non-mass enhancement and background parenchymal enhancement | Attained the best result, with accuracy of 95%; it is a volumetric segmentation approach and is recognised as the most reproducible segmentation |
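The Gaussian-mixture idea behind GCGMM-style segmentation can be illustrated with a toy example: fit a two-component mixture to pixel intensities by EM, then label each pixel with its most probable component. This is a 1-D sketch of the general GMM principle only, not the volumetric GCGMM method of the cited works; the intensity values are synthetic.

```python
import math
import random

# Toy EM fit of a two-component Gaussian mixture to pixel intensities,
# the core idea behind GMM-based segmentation (synthetic 1-D data).

def em_gmm(pixels, iters=50):
    mu = [min(pixels), max(pixels)]      # crude initialisation from the range
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each pixel.
        resp = []
        for x in pixels:
            dens = [pi[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                    for k in range(2)]
            total = sum(dens) or 1e-12
            resp.append([d / total for d in dens])
        # M-step: re-estimate weights, means and variances.
        for k in range(2):
            nk = max(sum(r[k] for r in resp), 1e-12)
            pi[k] = nk / len(pixels)
            mu[k] = sum(r[k] * x for r, x in zip(resp, pixels)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(resp, pixels)) / nk + 1e-6
    return mu, var, pi

def segment(pixels, mu, var, pi):
    """Label each pixel with its most responsible component (0 or 1)."""
    labels = []
    for x in pixels:
        dens = [pi[k] / math.sqrt(2 * math.pi * var[k])
                * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                for k in range(2)]
        labels.append(0 if dens[0] >= dens[1] else 1)
    return labels

random.seed(1)
# Synthetic intensities: dark background vs. a bright enhancing region.
background = [random.gauss(40, 5) for _ in range(200)]
lesion = [random.gauss(160, 10) for _ in range(50)]
mu, var, pi = em_gmm(background + lesion)
labels = segment(background + lesion, mu, var, pi)
```

Real DCE-MRI segmentation operates on 3-D volumes with spatial regularisation (e.g. grow-cut), but the per-voxel mixture modelling follows the same pattern.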
Table 4. Different methods of feature extraction implemented in image analysis: (above) traditional methods, (below) deep learning methods [91, 92].

Feature Extraction Methods for Breast Cancer Imaging Analysis

| Name of the method | Function used | Principle involved | Limitation | Feature extracted |
| Scale-Invariant Feature Transform (SIFT) | Scale-space extrema detection, key point localisation, orientation assignment, key point descriptor, key point matching | Key point matching between nearest neighbours | Slow in speed | Scale invariant |
| Speeded-Up Robust Features (SURF) | Laplacian of Gaussian with Difference of Gaussian for finding scale-space | Three times faster than SIFT; compares the contrast of features | Not good at handling viewpoint change and illumination change | Faster matching without reducing the descriptor's performance; the feature descriptor has an extended 128-dimension version |
| Features from Accelerated Segment Test (FAST) | Non-maximum suppression | Stores 16 pixel points in a circle around the feature point as a vector | Not very robust to high-level noise | Corner detection at much faster speed |
| Binary Robust Independent Elementary Features (BRIEF) | Hamming distance to match the descriptors | Reduces memory usage by converting descriptors from floating-point numbers to binary strings | Less recognition under large in-plane rotation | Cannot do feature extraction by itself; can be used with SIFT, SURF, etc. |
Deep Learning Methods: involving deep learning convolutional neural networks for feature extraction

| Name of the method | Function used | Principle involved | Limitation | Feature extracted |
| SuperPoint: Self-Supervised Interest Point Detection and Description | VGG-style encoder for feature extraction, followed by two decoders, one for point detection and the other for point description | A CNN computes, like SIFT, in a single forward pass | Needs to be enhanced for semantic segmentation | Point detection and point description |
| D2-Net: A Trainable CNN for Joint Description and Detection of Local Features | VGG-16 architecture pretrained on ImageNet; for image matching, its baseline is RootSIFT | A CNN working on the detect-and-describe principle; 3D reconstruction and local visualisation | Needs to be enhanced for accuracy of key points | Joint description and detection of local features |
| LF-Net: Learning Local Features from Images | Feature map generation using ResNet; scale-invariant key point detection; orientation estimation; descriptor extraction | LF-Net is a high-density, multi-scale CNN that returns key point locations, scales and orientations | Needs to be improved for larger frame differences | LF-Net outperforms SURF by 39% |
| Deep Graphical Feature Learning for the Feature Matching Problem | A graph neural network converts weak local geometric features into rich local features for efficient feature matching | Compositional message passing neural networks are used, and the CNN is viewed as a graph neural network on a grid graph where each grid is a feature vector | Not mentioned | Local features obtained by the graph neural network, when used with different algorithms, can outperform traditional feature extraction methods |

Note: Summary of the table: involving deep learning convolutional neural networks for feature extraction has increased efficiency, accuracy and speed, and minimised human involvement in the process.
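The descriptor-matching step that Table 4 attributes to BRIEF can be sketched in a few lines: binary descriptors stored as integers are compared by Hamming distance, with brute-force nearest-neighbour matching (the same idea as OpenCV's BFMatcher with NORM_HAMMING). The random 256-bit descriptors below are stand-ins for descriptors computed from real image patches.

```python
import random

# BRIEF-style matching sketch: binary descriptors compared by Hamming
# distance. Real BRIEF builds each bit from an intensity comparison in a
# smoothed patch; here random 256-bit strings stand in for the patches.

DESC_BITS = 256

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two descriptors stored as ints."""
    return bin(a ^ b).count("1")

def match(query_descs, train_descs, max_dist=64):
    """Brute-force nearest-neighbour matching; returns
    (query_idx, train_idx, distance) triples below the distance cutoff."""
    matches = []
    for qi, q in enumerate(query_descs):
        ti, d = min(((i, hamming(q, t)) for i, t in enumerate(train_descs)),
                    key=lambda p: p[1])
        if d <= max_dist:
            matches.append((qi, ti, d))
    return matches

random.seed(2)
train_descs = [random.getrandbits(DESC_BITS) for _ in range(100)]

def perturb(desc, flips=5):
    """Simulate re-detecting the same keypoint with a little bit noise."""
    for _ in range(flips):
        desc ^= 1 << random.randrange(DESC_BITS)
    return desc

query_descs = [perturb(d) for d in train_descs[:10]]
matches = match(query_descs, train_descs)
```

Because two unrelated random 256-bit descriptors differ in about 128 bits while a noisy re-detection differs in only a handful, each query reliably finds its original, which is why binary descriptors match quickly and cheaply.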
(Table 5) contd.....

| Ref. | Year | Countries | Imaging Modality | AI Algorithm | Achievement |
| [98] | 2020 | India | Digital mammographs (BCDR-FC03, local mammographic data) | Harmony search and simulated annealing (HS-SA) algorithm | Accuracy for the local mammographic data set = 99.89% and for BCDR-FC03 = 99.76% |
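The HS-SA entry above is a hybrid metaheuristic; as a generic illustration of its simulated-annealing half, the sketch below minimises a toy objective by accepting worse moves with a temperature-controlled probability. The objective, temperature schedule and step size are illustrative assumptions, not the feature-selection setup of the cited work [98].

```python
import math
import random

# Toy simulated annealing: minimise f(x) = (x - 3)^2 over the reals.
# Illustrates the annealing component of hybrids such as HS-SA; it is
# not the objective or parameterisation used in the cited study.

def anneal(f, x0, t0=10.0, cooling=0.995, steps=5000):
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(steps):
        cand = x + random.gauss(0, 1)          # random neighbour
        fc = f(cand)
        # Always accept improvements; accept worse moves with
        # Boltzmann probability exp(-(fc - fx) / t).
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                            # cool the temperature
    return best_x, best_f

random.seed(3)
x, fval = anneal(lambda v: (v - 3) ** 2, x0=-20.0)
```

Harmony search contributes a population-based exploration step on top of this acceptance rule; the hybrid is what the cited paper applies to mammographic feature selection.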
Table 6. Various datasets used to train AI neural networks for breast cancer detection.
6. CO-EVOLUTIONARY COMPUTATION-BASED AI ALGORITHMS USED IN RECENT WORKS FOR RADIOLOGICAL ANALYSIS OF BREAST CANCER

6.1. Data Sets used to Train Algorithms for Early Breast Cancer Detection

Any neural network or artificial intelligence system needs an effective data set for its training. Many times, the features of an artificial intelligence system are not fully curated because of the inadequate data set on which it is trained. Thus, the data set plays a very vital role in making any artificial intelligence system accurate. Table 6 provides information on some datasets that are available for breast cancer detection using imaging methods enabled by deep learning convolutional neural networks. The dataset collected under HIPAA is a smaller dataset and lacks parameter optimisation [99]. The BreCaHAD dataset is a collection of 162 histopathological images that have limited pixel tonal value [100], and CAMELYON16, collected from RUMC, Netherlands, is a bigger dataset of 1322 annotated histopathological images [101]. Some more datasets are also mentioned in Table 6 with their key remarks.

…sickness is diagnosed in its early stages. False positives and negatives can be prevented by correctly diagnosing a breast lump (tumour). Mammograms are commonly employed as the initial step in diagnosing breast cancer and localising any suspicious lesions, according to the studies cited in this publication. MRIs are performed on high-risk patients or on individuals with denser breasts, as a needle-guiding facilitator, and during chemotherapy to determine the amount of recovery from previous therapy. Ultrasounds are used to determine if a suspected tumour is fluid-filled (less likely to be cancerous) or solid (more likely to be cancerous). Digital breast tomosynthesis provides a sectional image of hidden layers in the breast, potentially identifying tumours missed by mammograms. Patients undergoing biopsy have a sample removed from their breasts, yielding histopathology images, which are then subjected to image analysis to determine whether the suspected growth of tissue is malignant or not. Medical imaging systems outfitted with sophisticated AI capabilities, such as deep learning convolutional neural networks, have surpassed traditional imaging methods across all imaging modalities and phases. These approaches can be enhanced in the future for better performance and faster outcomes [107 - 110].
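Section 6.1's point that a well-curated data set drives model quality starts with disciplined bookkeeping. The sketch below performs a reproducible, stratified train/test split over labelled image paths; the file names, directory layout and 80/20 ratio are hypothetical, not taken from the datasets in Table 6.

```python
import random
from collections import defaultdict

# Reproducible stratified split of labelled image paths into train/test.
# The paths and the 80/20 ratio are illustrative assumptions only.

def stratified_split(items, label_of, test_frac=0.2, seed=42):
    """Split so each class keeps roughly the same proportion in both sets."""
    by_class = defaultdict(list)
    for it in items:
        by_class[label_of(it)].append(it)
    rng = random.Random(seed)          # fixed seed => reproducible split
    train, test = [], []
    for cls, group in sorted(by_class.items()):
        rng.shuffle(group)
        k = int(len(group) * test_frac)
        test.extend(group[:k])
        train.extend(group[k:])
    return train, test

# Hypothetical annotated histopathology slides, imbalanced 2:1.
paths = [f"slides/benign/case{i:03d}.png" for i in range(80)] + \
        [f"slides/malignant/case{i:03d}.png" for i in range(40)]
label = lambda p: p.split("/")[1]

train_paths, test_paths = stratified_split(paths, label)
```

Stratification matters for the imbalanced medical datasets discussed here: a naive random split can leave the rarer (malignant) class under-represented in the test set and inflate apparent accuracy.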
FUNDING

None.

CONFLICT OF INTEREST

Dr. Ayush Dogra is the editorial advisory board member of the journal The Open Neuroimaging Journal.

ACKNOWLEDGEMENTS

Declared None.

REFERENCES

[1] Galatzer-Levy IR, Karstoft KI, Statnikov A, Shalev AY. Quantitative forecasting of PTSD from early trauma responses: A machine learning application. J Psychiatr Res 2014; 59: 68-76. [http://dx.doi.org/10.1016/j.jpsychires.2014.08.017] [PMID: 25260752]
[2] Pizzoli SFM, Renzi C, Arnaboldi P, Russell-Edu W, Pravettoni G. From life-threatening to chronic disease: Is this the case of cancers? A systematic review. Cogent Psychol 2019; 6(1): 1577593. [http://dx.doi.org/10.1080/23311908.2019.1577593]
[3] Blakely T, Shaw C, Atkinson J, Cunningham R, Sarfati D. Social inequalities or inequities in cancer incidence? Repeated census-cancer cohort studies, New Zealand 1981–1986 to 2001–2004. Cancer Causes Control 2011; 22(9): 1307-18. [http://dx.doi.org/10.1007/s10552-011-9804-x] [PMID: 21717195]
[4] Kaur B, Goyal B, Daniel E. A survey on machine learning based medical assistive systems in current oncological sciences. Curr Med Imaging 2021; 17. [http://dx.doi.org/10.2174/1573405617666210217154446] [PMID: 33596810]
[5] Helms RL, O'Hea EL, Corso M. Body image issues in women with breast cancer. Psychol Health Med 2008; 13(3): 313-25. [http://dx.doi.org/10.1080/13548500701405509] [PMID: 18569899]
[6] Njor S, Nyström L, Moss S, et al. Breast cancer mortality in mammographic screening in Europe: A review of incidence-based mortality studies. J Med Screen 2012; 19(S1): 33-41. [http://dx.doi.org/10.1258/jms.2012.012080]
[7] El-Sharkawy A. Unravelling genetic modifiers of Incontinentia Pigmenti and their role in the NF-kB pathway View project. 2014. Available from: https://www.igb.cnr.it/incipit/supervisors/files/2017/Ursini.pdf
[8] Feuer EJ, Wun LM, Boring CC, Flanders WD, Timmel MJ, Tong T. The lifetime risk of developing breast cancer. J Natl Cancer Inst 1993; 85(11): 892-7. [http://dx.doi.org/10.1093/jnci/85.11.892] [PMID: 8492317]
[9] Ali MA, Czene K, Eriksson L, Hall P, Humphreys K. Breast tissue organisation and its association with breast cancer risk. Breast Cancer Res 2017; 19(1): 103. [http://dx.doi.org/10.1186/s13058-017-0894-6] [PMID: 28877713]
[10] Kerlikowske K, Carney PA, Geller B, et al. Performance of screening mammography among women with and without a first-degree relative with breast cancer. Ann Intern Med 2000; 133(11): 855-63. [http://dx.doi.org/10.7326/0003-4819-133-11-200012050-00009] [PMID: 11103055]
[11] Reeves GK, Pirie K, Green J, Bull D, Beral V. Comparison of the effects of genetic and environmental risk factors on in situ and invasive ductal breast cancer. Int J Cancer 2012; 131(4): 930-7. [http://dx.doi.org/10.1002/ijc.26460] [PMID: 21952983]
[12] Menarche, menopause, and breast cancer risk: Individual participant meta-analysis, including 118 964 women with breast cancer from 117 epidemiological studies. Lancet Oncol 2012; 13(11): 1141-51. [http://dx.doi.org/10.1016/S1470-2045(12)70425-4] [PMID: 23084519]
[13] Nelson HD, Zakher B, Cantor A, et al. Risk factors for breast cancer for women aged 40 to 49 years: A systematic review and meta-analysis. Ann Intern Med 2012; 156(9): 635-48. [http://dx.doi.org/10.7326/0003-4819-156-9-201205010-00006] [PMID: 22547473]
[14] Maas P, Barrdahl M, Joshi AD, et al. Breast cancer risk from modifiable and nonmodifiable risk factors among white women in the United States. JAMA Oncol 2016; 2(10): 1295-302. [http://dx.doi.org/10.1001/jamaoncol.2016.1025] [PMID: 27228256]
[15] Easton DF, Pharoah PDP, Antoniou AC, et al. Gene-panel sequencing and the prediction of breast-cancer risk. N Engl J Med 2015; 372(23): 2243-57. [http://dx.doi.org/10.1056/NEJMsr1501341] [PMID: 26014596]
[16] Kuchenbaecker KB, Hopper JL, Barnes DR, et al. Risks of breast, ovarian, and contralateral breast cancer for BRCA1 and BRCA2 mutation carriers. JAMA 2017; 317(23): 2402-16. [http://dx.doi.org/10.1001/jama.2017.7112] [PMID: 28632866]
[17] Garcia-Closas M, Gunsoy NB, Chatterjee N. Combined associations of genetic and environmental risk factors: Implications for prevention of breast cancer. J Natl Cancer Inst 2014; 106(11): dju305. [http://dx.doi.org/10.1093/jnci/dju305] [PMID: 25392194]
[18] Hamajima N, Hirose K, Tajima K, et al. Alcohol, tobacco and breast cancer – collaborative reanalysis of individual data from 53 epidemiological studies, including 58 515 women with breast cancer and 95 067 women without the disease. Br J Cancer 2002; 87(11): 1234-45. [http://dx.doi.org/10.1038/sj.bjc.6600596] [PMID: 12439712]
[19] Rivera-Franco MM, Leon-Rodriguez E. Delays in breast cancer detection and treatment in developing countries. Breast Cancer 2018; 12. [http://dx.doi.org/10.1177/1178223417752677] [PMID: 29434475]
[20] Rangayyan RM, Ayres FJ, Leo Desautels JE. A review of computer-aided diagnosis of breast cancer: Toward the detection of subtle signs. J Franklin Inst 2007; 344(3-4): 312-48. [http://dx.doi.org/10.1016/j.jfranklin.2006.09.003]
[21] Dinnes J, Moss S, Melia J, Blanks R, Song F, Kleijnen J. Effectiveness and cost-effectiveness of double reading of mammograms in breast cancer screening: Findings of a systematic review. Breast 2001; 10(6): 455-63. [http://dx.doi.org/10.1054/brst.2001.0350] [PMID: 14965624]
[22] Marmot M, Altman DG, Cameron DA, Dewar JA, Thompson SG, Wilcox M. The benefits and harms of breast cancer screening: An independent review. Lancet 2012; 380(9855): 1778-86. [http://dx.doi.org/10.1016/S0140-6736(12)61611-0] [PMID: 23117178]
[23] Gur D, Sumkin JH, Rockette HE, et al. Changes in breast cancer detection and mammography recall rates after the introduction of a computer-aided detection system. J Natl Cancer Inst 2004; 96(3): 185-90. [http://dx.doi.org/10.1093/jnci/djh067] [PMID: 14759985]
[24] Freer TW, Ulissey MJ. Screening mammography with computer-aided detection: Prospective study of 12,860 patients in a community breast center. Radiology 2001; 220(3): 781-6. [http://dx.doi.org/10.1148/radiol.2203001282] [PMID: 11526282]
[25] Ramadan SZ. Methods used in computer-aided diagnosis for breast cancer detection using mammograms: A review. J Healthc Eng 2020; 2020: 1-21. [http://dx.doi.org/10.1155/2020/9162464] [PMID: 32300474]
[26] Liu X, Zeng Z. A new automatic mass detection method for breast cancer with false positive reduction. Neurocomputing 2015; 152: 388-402. [http://dx.doi.org/10.1016/j.neucom.2014.10.040]
[27] Nelson HD, Tyne K, Naik A, Bougatsos C, Chan BK, Humphrey L. Screening for breast cancer: An update for the U.S. Preventive Services Task Force. Ann Intern Med 2009; 151(10): 727-737, W237-42. [http://dx.doi.org/10.7326/0003-4819-151-10-200911170-00009] [PMID: 19920273]
[28] Sirovich BE, Sox HC Jr. Breast cancer screening. Surg Clin North Am 1999; 79(5): 961-90. [http://dx.doi.org/10.1016/S0039-6109(05)70056-6] [PMID: 10572546]
[29] Kooi T, Litjens G, van Ginneken B, et al. Large scale deep learning for computer aided detection of mammographic lesions. Med Image Anal 2017; 35: 303-12. [http://dx.doi.org/10.1016/j.media.2016.07.007] [PMID: 27497072]
[30] Pavithra S, Vanithamani R, Justin J. Computer aided breast cancer detection using ultrasound images. Mater Today Proc 2020; 33: 4802-7. [http://dx.doi.org/10.1016/j.matpr.2020.08.381]
[31] Kelly KM, Dean J, Comulada WS, Lee SJ. Breast cancer detection using automated whole breast ultrasound and mammography in radiographically dense breasts. Eur Radiol 2010; 20(3): 734-42. [http://dx.doi.org/10.1007/s00330-009-1588-y] [PMID: 19727744]
[32] DeMartini W, Lehman C, Partridge S. Breast MRI for cancer detection and characterization: A review of evidence-based clinical applications. Acad Radiol 2008; 15(4): 408-16. [http://dx.doi.org/10.1016/j.acra.2007.11.006] [PMID: 18342764]
[33] Chong A, Weinstein SP, McDonald ES, Conant EF. Digital breast tomosynthesis: Concepts and clinical practice. Radiology 2019; 292(1): 1-14. [http://dx.doi.org/10.1148/radiol.2019180760] [PMID: 31084476]
[34] Debelee TG, Schwenker F, Ibenthal A, Yohannes D. Survey of deep learning in breast cancer image analysis. Evolving Sys 2020; 11: 143-63. [http://dx.doi.org/10.1007/s12530-019-09297-2]
[35] Swiecicki A, Konz N, Buda M, Mazurowski MA. A generative adversarial network-based abnormality detection using only normal images for model training with application to digital breast tomosynthesis. Sci Rep 2021; 11(1): 10276. [http://dx.doi.org/10.1038/s41598-021-89626-1] [PMID: 33986361]
[36] Song SY, Park B, Hong S, Kim MJ, Lee EH, Jun JK. Comparison of digital and screen-film mammography for breast-cancer screening: A systematic review and meta-analysis. J Breast Cancer 2019; 22(2): 311-25. [http://dx.doi.org/10.4048/jbc.2019.22.e24] [PMID: 31281732]
[37] Song SY, Hong S, Jun JK. Digital mammography as a screening tool in Korea. J Korean Soc Radiol 2021; 82(1): 2-11. [http://dx.doi.org/10.3348/jksr.2021.0004] [PMID: 36237465]
[38] Huang ML, Lin TY. Dataset of breast mammography images with masses. Data Brief 2020; 31: 105928. [http://dx.doi.org/10.1016/j.dib.2020.105928] [PMID: 32642525]
[39] Al-Dhabyani W, Gomaa M, Khaled H, Fahmy A. Dataset of breast ultrasound images. Data Brief 2019; 28: 104863. [http://dx.doi.org/10.1016/j.dib.2019.104863] [PMID: 31867417]
[40] Petrov AY, Docherty PD, Sellier M, Chase JG. Multi-frequency inversion in Rayleigh damped magnetic resonance elastography. Biomed Signal Process Control 2014; 13(1): 270-81. [http://dx.doi.org/10.1016/j.bspc.2014.04.006]
[41] Mohan J, Krishnaveni V, Guo Y. MRI denoising using nonlocal neutrosophic set approach of Wiener filtering. Biomed Signal Process Control 2013; 8(6): 779-91. [http://dx.doi.org/10.1016/j.bspc.2013.07.005]
[42] Zhou J, Zhang Y, Chang KT, et al. Diagnosis of benign and malignant breast lesions on DCE-MRI by using radiomics and deep learning with consideration of peritumor tissue. J Magn Reson Imaging 2020; 51(3): 798-809. [http://dx.doi.org/10.1002/jmri.26981] [PMID: 31675151]
[43] Teertstra HJ, Loo CE, van den Bosch MAAJ, et al. Breast tomosynthesis in clinical practice: Initial results. Eur Radiol 2010; 20(1): 16-24. [http://dx.doi.org/10.1007/s00330-009-1523-2] [PMID: 19657655]
[44] Jeong MP, Franken EA, Garg M, Fajardo LL, Niklason LT. Breast tomosynthesis: Present considerations and future applications. Radiographics 2007; 27(S1): S231-40. [http://dx.doi.org/10.1148/rg.27si075511] [PMID: 18180229]
[45] Spanhol FA, Oliveira LS, Petitjean C, Heutte L. A dataset for breast cancer histopathological image classification. IEEE Trans Biomed Eng 2016; 63(7): 1455-62. [http://dx.doi.org/10.1109/TBME.2015.2496264] [PMID: 26540668]
[46] Litjens G, Kooi T, Bejnordi BE, et al. A survey on deep learning in medical image analysis. Med Image Anal 2017; 42: 60-88.
[47] Giger ML. Machine learning in medical imaging. J Am Coll Radiol 2018; 15(3): 512-20.
[48] Schmidhuber J. Deep learning in neural networks: An overview. Neural Netw 2015; 61: 85-117. [http://dx.doi.org/10.1016/j.neunet.2014.09.003]
[49] Huang G, Huang G-B, Song S, et al. Trends in extreme learning machines: A review. Neural Netw 2015; 61: 32-48. [http://dx.doi.org/10.1016/j.neunet.2014.10.001] [PMID: 25462632]
[50] Govinda E, Indira Dutt VBSS. Artificial neural networks in UWB image processing for early detection of breast cancer. Int J Adv Sci Technol 2020; 29(5): 2717-30.
[51] Devi RR, Anandhamala GS. Recent trends in medical imaging modalities and challenges for diagnosing breast cancer. Biomed Pharmacol J 2018; 11(3): 1649-58. [http://dx.doi.org/10.13005/bpj/1533]
[52] Samala RK, Chan HP, Hadjiiski LM, Helvie MA, Cha KH, Richter CD. Multi-task transfer learning deep convolutional neural network: Application to computer-aided diagnosis of breast cancer on mammograms. Phys Med Biol 2017; 62(23): 8894-908. [http://dx.doi.org/10.1088/1361-6560/aa93d4] [PMID: 29035873]
[53] Saravanan D, Joseph D, Vaithyasubramanian S. Effective utilization of image information using data mining technique. In: Recent Trends and Advances in Artificial Intelligence and Internet of Things. Intelligent Systems Reference Library. Cham: Springer 2019; 172: pp. 207-15. [http://dx.doi.org/10.1007/978-3-030-32644-9_22]
[54] Wang Z, Li M, Wang H, et al. Breast cancer detection using extreme learning machine based on feature fusion with CNN deep features. IEEE Access 2019; 7: 105146-58. [http://dx.doi.org/10.1109/ACCESS.2019.2892795]
[55] Tajbakhsh N, Shin JY, Gurudu SR, et al. Convolutional neural networks for medical image analysis: Full training or fine tuning? IEEE Trans Med Imaging 2016; 35(5): 1299-312. [http://dx.doi.org/10.1109/TMI.2016.2535302] [PMID: 26978662]
[56] Gour M, Jain S, Sunil Kumar T. Residual learning based CNN for breast cancer histopathological image classification. Int J Imaging Syst Technol 2020; 30(3): 621-35. [http://dx.doi.org/10.1002/ima.22403]
[57] Gastounioti A, Oustimov A, Hsieh MK, Pantalone L, Conant EF, Kontos D. Using convolutional neural networks for enhanced capture of breast parenchymal complexity patterns associated with breast cancer risk. Acad Radiol 2018; 25(8): 977-84. [http://dx.doi.org/10.1016/j.acra.2017.12.025] [PMID: 29395798]
[58] Shen L. End-to-end training for whole image breast cancer diagnosis using an all convolutional design. arXiv 2017. [http://dx.doi.org/10.1038/s41598-019-48995-4]
[59] Turkki R, Byckhov D, Lundin M, et al. Breast cancer outcome prediction with tumour tissue images and machine learning. Breast Cancer Res Treat 2019; 177(1): 41-52. [http://dx.doi.org/10.1007/s10549-019-05281-1] [PMID: 31119567]
[60] Xie J, Liu R, Luttrell J IV, Zhang C. Deep learning based analysis of histopathological images of breast cancer. Front Genet 2019; 10: 80. [http://dx.doi.org/10.3389/fgene.2019.00080] [PMID: 30838023]
[61] Cao H, Pu S, Tan W, Tong J. Breast mass detection in digital mammography based on anchor-free architecture. Comput Methods Programs Biomed 2021; 205: 106033. [http://dx.doi.org/10.1016/j.cmpb.2021.106033] [PMID: 33845408]
[62] Thawkar S, Ingolikar R. Classification of masses in digital mammograms using biogeography-based optimization technique. J King Saud Univ Comp Info Sci 2020; 32(10): 1140-8. [http://dx.doi.org/10.1016/j.jksuci.2018.01.004]
[63] Cheng HD, Shi XJ, Min R, et al. Approaches for automated detection and classification of masses in mammograms. Pattern Recognit 2006; 39(4): 646-68.
[64] Rampun A, Morrow PJ, Scotney BW, Winder J. Fully automated breast boundary and pectoral muscle segmentation in mammograms. Artif Intell Med 2017; 79: 28-41. [http://dx.doi.org/10.1016/j.artmed.2017.06.001] [PMID: 28606722]
[65] Akbarisena SFM, Rachmawati E, Utama DQ. Leveraging textural features for mammogram classification. 2020 8th International Conference on Information and Communication Technology (ICoICT). Yogyakarta, Indonesia, 24-26 June 2020. [http://dx.doi.org/10.1109/ICoICT49345.2020.9166311]
[66] Wang J, Yang X, Cai H, Tan W, Jin C, Li L. Discrimination of breast cancer with microcalcifications on mammography by deep learning. Sci Rep 2016; 6(1): 27327. [http://dx.doi.org/10.1038/srep27327] [PMID: 27273294]
[67] Zhou L, Wei Q, Dietrich CF, et al. Lymph node metastasis prediction from primary breast cancer US images using deep learning. Radiology 2019; 294(1): 19-28.
[68] Conti A, Duggento A, Indovina I, et al. Radiomics in breast cancer classification and prediction. Semin Cancer Biol 2021; 72: 238-50.
[69] Zheng X, Yao Z, Huang Y, et al. Deep learning radiomics can predict axillary lymph node status in early-stage breast cancer. Nat Commun 2021; 12(1): 4370. [http://dx.doi.org/10.1038/s41467-020-15027-z]
[70] Naik N, Madani A, Esteva A, et al. Deep learning-enabled breast cancer hormonal receptor status determination from base-level H&E stains. Nat Commun 2020; 11(1): 5727. [http://dx.doi.org/10.1038/s41467-020-19334-3] [PMID: 33199723]
[71] Azour F, Boukerche A. Design guidelines for mammogram-based computer-aided systems using deep learning techniques. IEEE Access 2022; 10: 21701-26. [http://dx.doi.org/10.1109/ACCESS.2022.3151830]
[72] Rampun A, López-Linares K, Morrow PJ, et al. Breast pectoral muscle segmentation in mammograms using a modified holistically-nested edge detection network. Med Image Anal 2019; 57: 1-17.
[109] Kaushal C, Bhat S, Koundal D, Singla A. Recent trends in computer assisted diagnosis (CAD) system for breast cancer diagnosis using histopathological images. IRBM 2019; 40: 211-27.
[110] Shukla PK, Sandhu JK, Ahirwar A, Ghai D, Maheshwary P, Shukla PK. Multiobjective genetic algorithm and convolutional neural network based COVID-19 identification in chest X-ray images. Math Probl Eng 2021; 2021: 1-9.
This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International Public License (CC-BY 4.0), a copy of which is
available at: https://creativecommons.org/licenses/by/4.0/legalcode. This license permits unrestricted use, distribution, and reproduction in any medium, provided the
original author and source are credited.