
MALARIA DISEASE PREDICTION USING DEEP LEARNING

BY

WALSON CROSS

HA22/0101

SUBMITTED TO

THE DEPARTMENT OF COMPUTER SCIENCE, SCHOOL OF APPLIED SCIENCES

KENULE BEESON SARO-WIWA POLYTECHNIC, BORI

IN PARTIAL FULFILMENT OF THE REQUIREMENTS FOR THE AWARD OF HIGHER NATIONAL DIPLOMA (HND) IN COMPUTER SCIENCE

SEPTEMBER, 2024

DECLARATION

I hereby declare that this original research work was carried out by me,
WALSON CROSS, with Matriculation Number HA22/0101, and has not been
previously submitted to any institution for the award of a degree or certificate.

Sign:………………………………. Date:…………………………….

WALSON CROSS

CERTIFICATION
This is to certify that this project was carried out by WALSON CROSS with
Matriculation Number HA22/0101 in the Department of Computer Science,
School of Applied Sciences, Kenule Beeson Saro-Wiwa Polytechnic, Bori,
Nigeria, and that it is accepted in partial fulfilment of the requirements for the
award of Higher National Diploma (HND) in Computer Science.

Sign:…………………………….. Date:……………………..
DR. PIAH, Z.P.
(Supervisor)

Sign:…………………………….. Date:……………………..
DR. PIAH, Z.P.
(Head of Department)

Sign:…………………………….. Date:……………………..
DR. EZEKIEL, E. N.
(Dean, School of Applied Sciences)

DEDICATION
This project work is dedicated to God Almighty, the ultimate giver of
wisdom.

ACKNOWLEDGEMENT

I wish to express my heartfelt gratitude to God Almighty for life, wisdom and
knowledge.

To the Department of Computer Science, School of Applied Sciences,
Kenule Beeson Saro-Wiwa Polytechnic: thank you for your unwavering
support throughout my academic journey.

I extend my sincere appreciation to my dedicated supervisor, Dr. Piah, Z.P.,
who is also the Head of the Department of Computer Science. I thank you for
your mentorship, guidance, teaching and constructive feedback throughout this
project. Your leadership has been instrumental in shaping my academic and
professional growth.
To my lecturers Mr. Atu, Mr. Barawi, Dr. Kabari, Mrs. Angela, Mr. Kenule,
Mr. Bale, Dr. Igulu, Mr. Ebenezer, Mr. Shedrack, Mr. Stanley, Mr. Hilary,
Mrs. Duabari and Mrs. Etifit: I appreciate your expertise, guidance, teachings
and support throughout my program.

I am grateful to my sponsors, Walson, Ngozi Sonita, Umah Azubike, and
Onyiri Philip, for their financial and moral support throughout my program.

To my friends Goodnews, Chimax and Levi, thank you for helping me
complete this research project; I appreciate your support.

I also wish to thank my friends and course mates who have in one way or the
other contributed to my journey throughout my program; thank you all.

I pray God bless and reward you all richly in Jesus name. Amen.

ABSTRACT
Malaria remains a significant global health challenge, particularly in low-income regions where access to
diagnostic tools is limited. Traditional diagnostic methods, such as microscopic examination of blood
smears, are time-consuming, labor-intensive, and prone to human error. To address these limitations, this
study explores the application of deep learning, specifically convolutional neural networks (CNNs), to
develop an automated malaria detection system. By utilizing a CNN model trained on blood smear images,
the system aims to provide fast and accurate diagnosis of malaria, reducing the reliance on skilled
technicians and improving the scalability of detection efforts. The study employed a structured research
methodology, focusing on the development, training, and evaluation of a CNN model using synthetic data
sourced from publicly available datasets. The model was designed to classify blood smear images as either
malaria-positive or malaria-negative with high precision. Key metrics such as accuracy, sensitivity, and
specificity were used to evaluate the performance of the system, and the results were compared to traditional
diagnostic methods. The findings demonstrate that the proposed deep learning-based system can
significantly improve the speed and accuracy of malaria detection, offering a cost-effective and scalable
solution for deployment in resource-limited settings. This study highlights the potential of AI-driven tools to
enhance healthcare delivery and provides a foundation for future research into automated disease detection
using deep learning.

TABLE OF CONTENTS
TITLE PAGE i

DECLARATION ii

CERTIFICATION iii

DEDICATION iv

ACKNOWLEDGEMENT v

ABSTRACT vi

TABLE OF CONTENTS vii

CHAPTER ONE 1

INTRODUCTION 1

1.1 Background of the Study 1

1.2 Statement of the Problem 2

1.3 Aim and Objectives 3

1.4 Scope of the Study 3

1.5 Significance Of Study 4

1.6 Research methodology 5

1.7 Definition of Terms 6

CHAPTER 2 8

LITERATURE REVIEW 8

2.1 Conventional Malaria Image Analysis Techniques 8

2.2 Neural Networks and Malaria Image Analysis 11

2.2.1 Mathematical Principles 11

2.2.2 Neural Networks Applied To Malaria Image Data 17

2.3 Discussion of Automated Malaria Diagnosis Techniques 19


CHAPTER THREE 21

ANALYSIS AND DESIGN 21

3.1 Research Methodology 21

3.2 Analysis 21

3.2.1 Existing System Analysis 21

3.2.2 Proposed System Analysis 22

3.3 Design 22

3.3.1 Architecture 23

CHAPTER 4 31

IMPLEMENTATION AND DISCUSSION OF RESULTS 31

4.1 Reason for choice of programming language 31

4.2 Documentation 32

4.3 Discussion of Results 34

4.4 Program Code 34

4.5 Program Output 34

CHAPTER 5 35

SUMMARY, CONCLUSION AND RECOMMENDATION 35

5.1 Summary 35

5.2 Conclusion 35

5.3 Recommendation 36

REFERENCE 37

Appendix A 39

Appendix B 59

CHAPTER ONE

INTRODUCTION

1.1 Background of the Study


Malaria remains one of the most significant global health challenges,
particularly in tropical and subtropical regions. It is caused by Plasmodium
parasites, which are transmitted to humans through the bites of infected
female Anopheles mosquitoes. According to the World Health
Organization (WHO), there were approximately 229 million cases of
malaria worldwide in 2019, resulting in an estimated 409,000 deaths, with
the majority occurring in sub-Saharan Africa. Children under five and
pregnant women are especially vulnerable to severe malaria and its
complications.

Traditional methods for diagnosing malaria primarily involve microscopic


examination of blood smears, a technique that has been in use for over a
century. While effective, this method is labor-intensive, time-consuming,
and highly dependent on the skill and experience of the technician. The
accuracy of microscopic diagnosis can vary significantly, leading to both
false positives and false negatives. Rapid diagnostic tests (RDTs) have
been developed to provide quicker results, but they too have limitations,
including variability in sensitivity and specificity, and the need for a stable
supply chain for test kits.

In recent years, advancements in artificial intelligence (AI) and machine


learning, particularly deep learning, have opened new avenues for
improving medical diagnostics. Deep learning algorithms, especially
convolutional neural networks (CNNs), have demonstrated remarkable
success in various image recognition tasks, including medical imaging.
These algorithms can learn to identify patterns and features in images with
a high degree of accuracy, potentially surpassing human capabilities. The
application of deep learning to malaria diagnosis holds promise for
developing automated systems that can provide rapid, accurate, and
reliable results, thus enhancing malaria control and treatment efforts.

1.2 Statement of the Problem


Despite ongoing efforts to combat malaria, the disease continues to pose a
significant burden on public health systems, particularly in resource-
limited settings. Traditional diagnostic methods, while effective, have
inherent limitations that hinder their widespread and efficient application.
The reliance on skilled microscopists for accurate diagnosis means that
many areas with high malaria prevalence suffer from a lack of adequate
diagnostic capacity. This often leads to delayed diagnosis and treatment,
contributing to higher morbidity and mortality rates.

Moreover, the variability in the accuracy of microscopic diagnosis,


influenced by factors such as technician fatigue and varying levels of
expertise, can result in misdiagnosis. False negatives can lead to untreated
infections, allowing the disease to progress and potentially causing severe
health complications or death. On the other hand, false positives can result
in unnecessary treatment, increasing the risk of drug resistance and
wasting valuable medical resources.

Rapid diagnostic tests (RDTs) offer an alternative, but their performance


can be inconsistent, and they may not always be available in remote or
underserved areas. There is a critical need for a more reliable, efficient,
and scalable diagnostic solution that can be deployed widely, particularly
in regions with high malaria transmission rates.

This study seeks to address these challenges by developing a deep


learning-based malaria detection system that can provide accurate and

timely diagnosis, thereby improving patient outcomes and supporting
malaria control efforts.

1.3 Aim and Objectives


The aim of this study is to design and implement a malaria detection
system using deep learning techniques to automate the analysis of blood
smear images, providing fast and accurate diagnosis. The objectives are:

1. To review the existing malaria detection methods and their limitations.

2. To develop a convolutional neural network (CNN) model for detecting malaria in blood smear images.

3. To evaluate the performance of the developed model using various metrics such as accuracy, sensitivity, and specificity.

4. To compare the results of the proposed deep learning model with traditional diagnostic methods.

5. To recommend the practical applications of the system in real-world healthcare settings.

1.4 Scope of the Study


This study encompasses the development, implementation, and evaluation
of an automated malaria detection system using deep learning techniques.
The research will involve several key phases:

1. Data Collection and Preprocessing: The study will gather a


comprehensive dataset of blood smear images, which may include
both synthetic data generated for training purposes and real field
data obtained from medical institutions or publicly available sources.
Data preprocessing steps will include image normalization,
augmentation, and segmentation to enhance the quality and diversity
of the training dataset.

2. Model Development: A convolutional neural network (CNN) will
be designed and implemented specifically for the detection of
malaria parasites in blood smear images. The architecture of the
CNN will be tailored to optimize its performance in identifying
malaria-infected cells.

3. Performance Evaluation: The developed model will be evaluated


using standard metrics such as accuracy, sensitivity, specificity, and
the area under the receiver operating characteristic (ROC) curve.
The performance of the deep learning model will be compared to
that of traditional diagnostic methods, including microscopic
examination and rapid diagnostic tests.

4. Integration Study: The study will explore the potential for


integrating the developed malaria detection system into existing
healthcare infrastructure. This will involve assessing the feasibility
of deploying the system in various settings, from well-equipped
medical facilities to remote clinics with limited resources.

The ultimate goal is to create a robust, scalable, and accessible diagnostic


tool that can improve the accuracy and efficiency of malaria diagnosis,
thereby supporting efforts to control and eliminate the disease.

1.5 Significance Of Study


Computer vision methods aimed at automating malaria diagnosis are
currently an active area of research (Roy, Sharmin, Mufiz Mukta, &
Sen, 2018; Bibin, Nair, & Punitha, 2017; Pinkaew, Limpiti, & Trirat,
2015; Ghosh, Ghosh, & Kundu, 2014). However, so far, the focus of
automated malaria diagnosis has been on developing intelligent
systems for use in hospitals. There has been minimal effort at
developing machine learning methods for use in Plasmodium parasite
research laboratories.

1. Public Health Impact: accurate and timely malaria diagnosis
can improve patient outcomes and reduce mortality rates,
especially in resource-limited settings where access to diagnostic
tools is limited.
2. Technological Innovation: the development of a deep
learning-based system for malaria detection represents a
significant technological advancement in the fields of computer
vision and medical imaging.
3. Cost Savings: automated malaria detection could potentially
reduce the cost of diagnosis and treatment by eliminating the
need for expensive microscopes and highly trained technicians.
4. Research Opportunities: a successful malaria detection system
based on deep learning could pave the way for further research into
AI-based solutions for other healthcare challenges, such as drug
discovery and personalized medicine.

1.6 Research methodology

For this project, the Object Oriented Analysis Design Methodology


(OOADM) was chosen. The structured approach allows for a clear,
systematic breakdown of the malaria detection system, starting from the
requirements gathering, system analysis, design, development, and testing.
It ensures that each phase is well-documented and the system can be
developed in modular stages.

The OOADM focuses on improving the system’s reliability and ensures


that each component is rigorously tested before integrating into the final
product. Given the sensitive nature of the application, where the detection

of malaria needs to be accurate, a structured approach offers a clear path to
mitigate risks and errors.

1.7 Definition of Terms


Accuracy: A metric used to evaluate the performance of a classification
model, defined as the proportion of true positive and true
negative results among the total number of cases examined.

Anopheles Mosquito: A genus of mosquito that serves as the primary


vector for transmitting malaria parasites to humans.

Convolutional Neural Network (CNN): A type of deep learning


algorithm designed for processing structured grid data, such
as images. CNNs are particularly effective for image
recognition and classification tasks.

Data Augmentation: Techniques used to increase the size and diversity of


a training dataset by applying random transformations, such
as rotation, scaling, and flipping, to the original data.

Deep Learning: A subset of machine learning that involves neural


networks with many layers (deep neural networks). These
networks are capable of learning complex patterns and
representations from large amounts of data.

Malaria: A mosquito-borne infectious disease caused by Plasmodium


parasites, leading to symptoms such as fever, chills, and
anemia. Severe cases can result in complications such as
cerebral malaria and death.

Normalization: The process of scaling individual samples to have zero


mean and unit variance, which helps improve the performance
and stability of the learning algorithm.

Plasmodium: A genus of parasitic protozoa that are the causative agents
of malaria. There are several species of Plasmodium that
infect humans, with Plasmodium falciparum being the most
deadly.

Receiver Operating Characteristic (ROC) Curve: A graphical plot that


illustrates the diagnostic ability of a binary classifier system
as its discrimination threshold is varied. The area under the
ROC curve (AUC) provides a single measure of overall
performance.

Sensitivity: Also known as recall, sensitivity measures the proportion of


true positives correctly identified by the model (i.e., the
ability of the model to detect positive cases).

Specificity: A metric that measures the proportion of true negatives


correctly identified by the model (i.e., the ability of the model
to identify negative cases).
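
The three metrics defined above can be computed directly from the counts of true and false positives and negatives. The snippet below is a minimal illustrative sketch in Python with NumPy; the labels are made up for the example and are not data from this study.

import numpy as np

def diagnostic_metrics(y_true, y_pred):
    # Labels: 1 = malaria-positive, 0 = malaria-negative
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # recall on positive cases
    specificity = tn / (tn + fp)   # recall on negative cases
    return accuracy, sensitivity, specificity

# Example with dummy labels:
print(diagnostic_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 0]))  # (0.8, 0.667, 1.0)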

CHAPTER 2

LITERATURE REVIEW

2.1 Conventional Malaria Image Analysis Techniques


Most algorithms proposed in the literature focus on the classification of
thin-smear Giemsa-stained images, acquired through the procedure
described in section 1-1-1. They aim to automatically count all uninfected
and parasitized erythrocytes, and typically follow these steps:
(1) Preprocessing the blood smear image,

(2) Segmenting the erythrocytes from the background,

(3) Extracting parasite features and

(4) Mathematically dividing the erythrocytes into classes.

This approach is schematically depicted in figure 2-1. Examples of
techniques used for each step are given below.

Figure 2.1: Schematic representation of the basic image analysis


pipeline followed by most (traditional) automated malaria diagnosis
algorithms, the numbers underneath the arrows refer to the four
operations in this pipeline;

1) Preprocessing,

2) Segmentation,

3) Feature extraction and

4) Classification.

1. Preprocessing: Preprocessing is aimed at removing noise and
enhancing image quality, and is often the first step when performing
digital analysis on any type of image data. For noise removal, many
established filters exist, such as the median or Gaussian filter. In median
filters, each pixel value is simply replaced by the median of those in
a radius surrounding it. In Gaussian filters, a two-dimensional Gaussian
distribution function is used to determine a weighted average
of each pixel's neighborhood, which then replaces that pixel value.
These basic filters remove noise sufficiently and are often
implemented in proposed automated malaria diagnostic systems,
though more complicated filtering techniques have also been used.
Low contrast is also a common problem, which is most commonly
fixed through contrast stretching or histogram equalization
techniques.
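
As an illustration, the filtering and contrast operations described above map onto a few OpenCV calls. This is a minimal sketch; the file name is hypothetical and the kernel sizes are typical choices rather than values taken from this study.

import cv2

img = cv2.imread("blood_smear.png")          # hypothetical input image
median = cv2.medianBlur(img, 5)              # each pixel replaced by the median of a 5x5 neighbourhood
gauss = cv2.GaussianBlur(img, (5, 5), 0)     # weighted average with a 2D Gaussian
gray = cv2.cvtColor(gauss, cv2.COLOR_BGR2GRAY)
equalized = cv2.equalizeHist(gray)           # histogram equalization to correct low contrast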

2. Erythrocyte Segmentation: When the thin smear is of good quality,
meaning cells are separated completely and the image is in focus
and well-illuminated, segmenting the individual erythrocytes is
fairly straightforward. It can be achieved through basic thresholding
techniques, such as Otsu's method, which optimally divides pixel
values into two bins. This works well when the image is strongly
bimodal, which can partly be achieved through preprocessing. When
bimodality cannot be achieved through preprocessing, or when the
image is blurred, K-means clustering is a good alternative for
iteratively assigning pixels to foreground or background. Its
disadvantage is that it is more computationally complex than
thresholding techniques. Problems with both methods arise when
cells are touching or overlapping.
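
A sketch of both segmentation routes using OpenCV follows; the input file is hypothetical, and the two-cluster K-means set-up mirrors the foreground/background assignment described above.

import cv2
import numpy as np

gray = cv2.imread("blood_smear.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input

# Otsu's method: optimally divides the pixel values into two bins
_, otsu_mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# K-means alternative: iteratively assign each pixel to foreground or background
pixels = gray.reshape(-1, 1).astype(np.float32)
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 10, 1.0)
_, labels, _ = cv2.kmeans(pixels, 2, None, criteria, 10, cv2.KMEANS_RANDOM_CENTERS)
kmeans_mask = labels.reshape(gray.shape).astype(np.uint8) * 255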

3. Feature Extraction: In pattern recognition, feature extraction refers
to computing values from the raw (pixel) data that will optimally
provide information for the classification you want to perform,
without loss of information or redundancies. For diagnosis of blood
slides with stained parasites, the color values of pixels are obviously
informative features for determining infection. From these, features
such as co-occurrence matrices, local binary patterns and histograms of
oriented gradients can be computed. Some papers have proposed
extracting color features only from the green channel of
an image in RGB color space, as it provides the most contrast
between the erythrocyte and the stained parasite. Others have
suggested transforming the image to HSB space before extracting
color features, or using a combination of both. Morphological
features, such as granulometry and relative shape measurements,
can also be computed to aid in classification.
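
A minimal sketch of the colour-feature extraction described above (green channel plus a hue/saturation/value transformation) using OpenCV; the cell image file is hypothetical.

import cv2
import numpy as np

cell = cv2.imread("cell.png")                 # hypothetical segmented-cell image
green = cell[:, :, 1]                         # OpenCV loads BGR; index 1 is the green channel
hsv = cv2.cvtColor(cell, cv2.COLOR_BGR2HSV)   # hue/saturation/value colour space

# Simple colour histograms as feature vectors
green_hist = cv2.calcHist([green], [0], None, [32], [0, 256]).ravel()
hue_hist = cv2.calcHist([hsv], [0], None, [32], [0, 180]).ravel()  # hue spans 0-179 in OpenCV
features = np.concatenate([green_hist, hue_hist])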

4. Classification: When dividing objects over classes, the objective is
to minimize intra-class variance, based on the object features
supplied. Essentially, a classification algorithm approximates a
mapping $\hat{f}$ from the input features $x$ to the output class $y$, such that
$\hat{f}(x) \approx y$. An example of a simple classification method is the
earlier-mentioned thresholding, where objects are divided into classes
based on whether their value is above or below a certain threshold.
More complicated classification methods often use a training set of
pre-classified objects to find a classification strategy that minimizes
the error rate, which is called 'supervised learning'. 'Unsupervised
learning', where only the input data and the cost function are known
a priori, is also possible. A great number of learning algorithms
have been developed, such as Support Vector Machines (SVMs),
Bayesian classifiers, K-nearest neighbor classifiers, logistic
regression trees, artificial neural networks and many more.
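
As a sketch of the supervised-learning route, the snippet below trains a support vector machine on pre-classified feature vectors. The features and labels are random placeholders, not data from this study, and scikit-learn is assumed to be available.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X = np.random.rand(200, 64)        # placeholder: one feature vector per segmented cell
y = np.random.randint(0, 2, 200)   # placeholder: 1 = parasitized, 0 = uninfected

# Supervised learning: fit on a training set of pre-classified objects
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))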

2.2 Neural Networks and Malaria Image Analysis


As stated previously, for malaria image classification no standardized
comparison is currently available, making it very hard to quantify the
state of the art. However, in more general image classification research,
such comparisons are possible. Image classification contests such as the
ImageNet Large Scale Visual Recognition Challenge (ILSVRC) make
it very clear that the field has become dominated in recent years by deep
learning techniques that use Artificial Neural Networks (ANNs).
Therefore, these will be discussed more in-depth here.

2.2.1 Mathematical Principles


Basic Principle

An ANN is a type of classifier, inspired by biological neural networks, in
which feature extraction and classification are combined in one
algorithm. The most basic type of ANN is a feed-forward neural network,
or multilayer perceptron, which is schematically depicted in figure 2-2.

Figure 2.2: Schematic depiction of a feed forward neural network with three
inputs, two outputs and one hidden layer. On the right side, the general
architecture of a single neuron is depicted.

They consist of an input layer with all the data input points, an output layer
in which inputs are mapped to outputs, and (optionally) any number of
hidden layers. If all layers pass their outputs to every neuron in the next
layer, like in the network depicted here, the ANN is referred to as 'fully
connected'. When many hidden layers are incorporated into the
architecture, the network is referred to as a 'deep neural network', and the
training and application of the network are called 'deep learning'. The
hidden layers consist of artificial neurons. In each of these neurons, a
combination of an affine transformation and a non-linear activation
function is used to transform the inputs. Let the output of a single neuron
$k$ in layer $l$ be denoted $a_k^l$. Each neuron uses the vector of outputs of the
previous layer, $a^{l-1}$, as inputs; the first step is to compute a weighted sum
$z_k^l$ of these:

$z_k^l = \sum_{i=1}^{n} w_{ki}\, a_i^{l-1} + b_k$ (Equation 2.1)

where $n$ is the dimension of the previous layer and $w_{k1}, \dots, w_{kn}$ are the
weights of the neuron. A bias $b_k$ is added, and the output is then computed
by applying some non-linear activation function $g$:

$a_k^l = g(z_k^l)$ (Equation 2.2)

This output is then propagated to the neurons in the next layer, where it
undergoes the same type of transformation. The total mapping of the inputs
$x$ to outputs $\hat{y}$ is thus a function of all the weights and biases:
$\hat{y} = f(x, W, b)$. The correct mapping from the inputs to the outputs is
approximated such that $\hat{y} \approx y$ by adjusting the weights and biases during
learning. Neural networks have been proven to be universal function
approximators, meaning that any mapping can be approximated arbitrarily
well, given that enough hidden units are used.
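
Equations 2.1 and 2.2 translate directly into a few lines of NumPy. The sketch below computes the output of one artificial neuron; the weights and inputs are arbitrary illustrative values.

import numpy as np

def neuron_output(a_prev, w, b, g=np.tanh):
    # Eq. 2.1: weighted sum of the previous layer's outputs plus a bias
    z = np.dot(w, a_prev) + b
    # Eq. 2.2: non-linear activation function applied to the sum
    return g(z)

a_prev = np.array([0.5, -1.2, 3.0])   # outputs a^(l-1) of the previous layer
w = np.array([0.1, 0.4, -0.2])        # weights w_k1 ... w_kn
print(neuron_output(a_prev, w, b=0.05))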

Activation Functions

The activation function plays an important role in this approximating
ability; without it, only linear mappings could be approximated.
Commonly used activation functions are shown in figure 2.3.

Figure 2.3: Different activation functions used by artificial neurons.

The choice of activation function is an important design parameter.
Sigmoid and tanh both have smooth gradients and normalize the outputs of
the neuron. As opposed to the sigmoid function, tanh is zero-centered,
which makes it more suitable for inputs that can be strongly negative. The
disadvantage of these S-shaped activation functions is that their gradients
become very small for large values, which can slow down the learning of
the network significantly; this has been termed the vanishing gradient
problem. Furthermore, both activation functions are fairly computationally
demanding. This is why the Rectified Linear Unit (ReLU), a piecewise
linear function that outputs the input value when it is positive and 0
otherwise, has become a popular choice.
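
The three activation functions discussed above are one-liners in NumPy; a small sketch for comparison:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))   # smooth, outputs in (0, 1), not zero-centred

def relu(z):
    return np.maximum(0.0, z)         # piecewise linear: z when positive, 0 otherwise

z = np.array([-5.0, -0.5, 0.0, 0.5, 5.0])
print(sigmoid(z))   # saturates toward 0 and 1 at the extremes (vanishing gradients)
print(np.tanh(z))   # zero-centred S-shape
print(relu(z))      # cheap to compute, gradient 1 for all positive inputs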

Convolutional Layers

A Convolutional Neural Network (CNN) is a type of deep neural network
that was developed specifically with the goal of image classification in
mind. A core concept in the architecture of CNNs is the introduction of the
convolutional layer. Unlike in the previously described fully connected
layers, in convolutional layers the input of each neuron is a function of
only a small region of the outputs of the previous layer. This input is
produced by convolving the previous layer with a small matrix of weights
called a kernel. The kernel "slides" over the original image, and the
convolution of the kernel with the region surrounding the input pixel is
computed by:

$z_{ij} = \sum_{a=0}^{N-1} \sum_{b=0}^{N-1} w_{ab}\, x_{(i+a)(j+b)}$ (Equation 2.3)

where $w_{ab} \in w_{00} \dots w_{NN}$ are the weights in the kernel $W$ of size $N \times N$ and
$x_{ij} \in x_{00} \dots x_{nn}$ are the values of the input matrix $X$ of size $n \times n$. The
convolution $z_{ij}$ is then passed through an activation function to produce the
output $y_{ij}$:

$y_{ij} = g(z_{ij})$ (Equation 2.4)
Equations 2.3 and 2.4 replace equations 2.1 and 2.2. Besides this, the
convolutional layer is implemented in the same way as the standard
network layers described above. Note that the neurons in convolutional
layers are structured in a grid; this makes convolutional layers especially
suitable for the classification of structured data such as image data. The
kernel essentially acts as a feature extraction filter, where the learnable
weights converge towards features in the image. By using the same kernel
with the same weights on the entirety of the input, an activation map of
these features is produced. The output of the convolutional layer is
therefore called a feature map. The convolutional layer operation is
schematically depicted in figure 2.4.

Figure 2.4: Schematic depiction of the convolution of a 6×6 input image


with a 3×3 kernel. In order to produce a 6×6 feature map, padding is
used.

Given an $n \times n$ image $X$ as input, and an $N \times N$ kernel $W$ which slides over
the input matrix with stride 1 (meaning it moves 1 pixel for each
convolution), the size of the feature map will be $(n - N + 1) \times (n - N + 1)$.
When a feature map of equal size to the input is desired, padding can be
used around the input matrix; this is also depicted in figure 2.4. Often,
multiple kernels are used in one convolutional layer to produce multiple
feature maps. If $M$ kernels are used, the size of the output (with padding)
will be $n \times n \times M$. The convolution described above assumes a single-channel
input. It is also possible to have a multi-channel input to a
convolutional layer; in this case, the convolution can be described as:

$z_{ij} = \sum_{k=1}^{K} \sum_{a=0}^{N-1} \sum_{b=0}^{N-1} w_{kab}\, x_{k(i+a)(j+b)}$

Here, $x_{kij}$ refers to the pixels in the $k$-th input channel, and the total
number of input channels is $K$. The kernel in this case takes the size
$N \times N \times K$, but the output remains two-dimensional. Even though the kernel
is now 3D, this is still referred to as a 2D convolution, since the kernel
slides over the input only in the horizontal and vertical directions. It can be
thought of as a stack of filters, where each filter is convolved with one
input channel, and the outputs of the convolutions are summed.
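
The single-channel convolution of equations 2.3 and 2.4 can be written out explicitly in NumPy. The following is a naive illustrative sketch; real frameworks use heavily optimized implementations.

import numpy as np

def conv2d(x, w, g=lambda z: np.maximum(0.0, z)):
    # Eq. 2.3: slide the N x N kernel over the n x n input with stride 1,
    # producing an (n-N+1) x (n-N+1) map of weighted sums z_ij
    n, N = x.shape[0], w.shape[0]
    z = np.zeros((n - N + 1, n - N + 1))
    for i in range(z.shape[0]):
        for j in range(z.shape[1]):
            z[i, j] = np.sum(w * x[i:i + N, j:j + N])
    # Eq. 2.4: pass each z_ij through the activation function
    return g(z)

x = np.arange(36, dtype=float).reshape(6, 6)   # 6x6 single-channel "image"
w = np.ones((3, 3)) / 9.0                      # 3x3 averaging kernel
print(conv2d(x, w).shape)                      # (4, 4): 6 - 3 + 1 without padding
print(conv2d(np.pad(x, 1), w).shape)           # (6, 6): padding preserves the input size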

In order to reduce dimensionality and prevent over-fitting in CNNs,
pooling layers are often added after convolutional layers. In these, the
outputs of the convolutional layers are down-sampled: the $n \times n$ feature
map is reduced in size to $(n/p) \times (n/p)$ by dividing the feature map into
$p \times p$ patches and taking some function of the values in each patch as the
output. In average pooling layers, the average of the values is passed,
while max pooling layers pass the largest value.
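
A compact NumPy sketch of both pooling variants, assuming the feature-map size n is divisible by the patch size p:

import numpy as np

def pool2d(feature_map, p=2, reducer=np.max):
    # Split the n x n map into p x p patches and reduce each patch to one value
    n = feature_map.shape[0]
    patches = feature_map.reshape(n // p, p, n // p, p)
    return reducer(patches, axis=(1, 3))

fm = np.arange(16, dtype=float).reshape(4, 4)
print(pool2d(fm, reducer=np.max))    # max pooling: largest value per 2x2 patch
print(pool2d(fm, reducer=np.mean))   # average pooling: mean value per 2x2 patch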

It is also possible to up-sample through convolution, when a feature map
of a bigger size than the input is desired. This concept was introduced as
'deconvolution', but 'transposed convolution' has since been suggested as
a more accurate name. To understand the concept of transposed
convolution, first note that the convolution operation can be written as a
matrix multiplication, by rearranging the weights of the kernel into a
convolution matrix $\tilde{W}$ which represents all positions the kernel takes on
the input, and rearranging the input matrix into a vector. This is explained
visually for the convolution of a 3×3 input image $X$ with a 2×2 kernel $W$
in figure 2-5. Now note that if the transpose of the convolution matrix is
taken instead, to produce the feature map of an image $Z$ of size 2×2, so
that $\tilde{W}^T \tilde{Z} = Y$, this feature map will be of size 4×4, and has thus been
up-sampled by the size of the kernel.

Figure 2.5: Upper image: convolution of a 3×3 input matrix with a 2×2
kernel to create a 2×2 feature map, expressed as a matrix operation.
Lower image: transposed convolution of a 2×2 input image with that
kernel, to create a 4×4 feature map, expressed as a matrix operation.

2.2.2 Neural Networks Applied To Malaria Image Data


Some research has been published on the application of neural networks to
the classification of Giemsa-stained malaria-infected blood smears. Dong
et al. trained three different CNN architectures on a small dataset of
segmented erythrocyte objects, which they obtained by thresholding
and then applying a Hough circle transform to blood slide images. They
used this to create training and testing sets of equal size, both with 765
non-infected and 517 infected cells in them. No performance metrics for
the segmentation were given. The existing LeNet-5, AlexNet and
GoogLeNet architectures were trained on these images, and accuracies of
96.18%, 95.97% and 98.17% were reported for the three networks
respectively. This was compared with an SVM trained on the same data, in
a similar way as done by Das et al. (described in section 2-1), which
achieved an accuracy of 91.66% on the same data. Rajaraman et al.
also worked on the classification of Giemsa-stained thin films. They first
segmented the erythrocytes from blood slide images, using another
conventional cell segmentation algorithm as described in section 2-1. They
produced a database of 27,558 cell images with equal instances of parasitized
and uninfected cells, which they made publicly available, and went on to
develop a CNN-based classifier for it. They proposed a network architecture
consisting of three blocks of two convolutional layers: the first block
containing a max pooling layer, the second an average pooling layer,
and the third followed directly by three fully connected layers. On the
object level, they achieved a sensitivity and specificity of 93.11% and
95.12% respectively. The performance of their proposed network
architecture was later compared with the use of pre-existing network
architectures such as VGG-16 and ResNet-50, and was slightly outperformed
by these. Gopakumar et al. proposed training a network on a focus stack
of RGB cell images, instead of just a single image per cell object. This was
claimed to improve performance in distinguishing parasites from artefacts
such as dust specks. Segmented cells were acquired with a two-stage
threshold-based method. Details on the specific architecture of the CNN
used were not provided. A sensitivity and specificity of 96.98% and
98.50% were reported respectively. However, the estimated parasitaemia
produced by their total proposed algorithm was not very close to the
ground truth, at 173% of the actual parasitaemia. All research described
so far combined a CNN-based classifier with a simple segmentation
method. Erythrocyte segmentation with CNNs is also possible; Delgado-Ortet
et al. applied this to the classification of thin smear images.

2.3 Discussion of Automated Malaria Diagnosis Techniques
The methods discussed in this chapter, and their performance measures,
are summarized. Making an objective statement on the relative
performance of automated malaria image analysis techniques is difficult,
since performance measures are usually only reported on a small set of
(private) data. Often, no separate performance is reported for the
segmentation of the erythrocytes, and performance is only evaluated in
terms of segmented cells classified correctly, making it impossible to
assess the overall performance of the proposed method. Reporting specificity
and sensitivity at the object level only makes sense from an image
classification point of view; from a clinical point of view, these
performance metrics are not very informative, because the number of cells
identified correctly over an entire dataset gives no insight into whether the
classification is suitable for diagnosis at patient level. Furthermore, a
limited number of images is often used for testing, which are typically
acquired in exactly the same manner as the images used to develop and
train the algorithm. Especially in conventional image analysis
techniques, the extracted features that are used, such as size parameters
and staining colours, can vary heavily if the images are acquired with a
different method or even just a different camera; it is doubtful these
methods will perform well when tested on images from another source. In
practice, smear and image quality can be much lower than under
ideal lab conditions, but research that deals with the classification of
sub-standard microscopic images has thus far been limited.
In general, it is clear that for an automated classification technique to be
suitable for use in diagnostics, its performance would ideally be just as
high as or higher than that of a human expert. We can define the performance
of a human expert by looking at the requirements the World Health
Organization (WHO) sets for a 'level 1' microscopist, which are given in
Table 2-2 (Performance requirements for WHO microscopist competence
levels). In terms of parasite detection, the techniques discussed
in this section are generally claimed to perform above 90%, so as well as
a level 1 microscopist. Automated species classification has also been
attempted; results thus far have not been as good as those of classifiers that
only determine infection. Even though reported sensitivities and
specificities are high, the methods reported do not necessarily result in
accurate parasitaemia counts; Gopakumar et al. reported the highest
performance measures of all methods discussed, but their parasitaemia
estimate was 73% off. In the general field of image analysis, CNN-based deep
learning techniques have shown impressive performance on image
classification, and the research published so far on their application
for malaria diagnosis is promising. This research is, however, limited; it is
mostly focused on the classification of pre-segmented thin smear
erythrocytes, which means only part of the diagnostic process is performed
by the algorithm. Accurate cell segmentation with the help of CNNs has, for
this specific problem, been scarcely attempted. Furthermore, no attempts
were found to verify the performance of a classifier trained on a large set
of erythrocyte images, as was done by Rajaraman et al., on image data
that was acquired with a different set-up.

CHAPTER THREE

ANALYSIS AND DESIGN

3.1 Research Methodology


For this project, the Object Oriented Analysis Design Methodology
(OOADM) was chosen. The structured approach allows for a clear,
systematic breakdown of the malaria detection system, starting from the
requirements gathering, system analysis, design, development, and testing.
It ensures that each phase is well-documented and the system can be
developed in modular stages.

The OOADM focuses on improving the system’s reliability and ensures


that each component is rigorously tested before integrating into the final
product. Given the sensitive nature of the application, where the detection
of malaria needs to be accurate, a structured approach offers a clear path to
mitigate risks and errors.

3.2 Analysis

3.2.1 Existing System Analysis


The conventional method for malaria detection typically involves the
manual examination of blood samples under a microscope. A trained
medical professional observes the blood smear and looks for the presence
of Plasmodium parasites, which indicate malaria. While this method has
been in use for decades, it has several limitations:

1. Time-Consuming: The manual process can take hours or even days,


especially in understaffed or overburdened healthcare centers.

2. Human Error: The accuracy of the diagnosis relies heavily on the


experience and skills of the professional conducting the test. In
regions with limited access to trained professionals, this can lead to
misdiagnosis.

3. Limited Scalability: The system struggles to keep up with large
populations, especially during outbreaks when swift detection is
crucial.

4. Costly: Maintaining lab equipment and ensuring a steady supply of


skilled professionals can be financially burdensome, especially for
low-income regions.

3.2.2 Proposed System Analysis


The proposed malaria detection system aims to overcome the limitations
of the existing manual process by leveraging deep learning technology.
The system will use convolutional neural networks (CNNs) to analyze
blood smear images and detect the presence of Plasmodium parasites
automatically.

Advantages of the Proposed System:

1. High Accuracy: Deep learning models, particularly CNNs, have


shown remarkable success in image recognition tasks. With a well-
trained model, the system will be able to detect malaria with a high
degree of accuracy, reducing the risk of human error.

2. Speed: Once trained, the system can process and diagnose samples
in seconds, significantly reducing the time needed for detection.

3. Scalability: The system can process a large number of samples in


parallel, making it suitable for mass screening during malaria
outbreaks.

4. Cost-Effective: After the initial setup, the system will require less
maintenance than traditional microscopy and can be deployed in
low-resource settings without the need for a highly trained
workforce.

5. Automation: The entire process, from image capture to diagnosis,


can be automated, improving efficiency and freeing up medical staff
to focus on other tasks.

3.3 Design

3.3.1 Architecture
The U-Net architecture consists of a contracting path, which has a classic
CNN structure as described in section 2-2, and supplements this with a
symmetric expanding path, making the network U-shaped. During the
contraction, spatial information is reduced while feature information is
increased. The contracting path consists of 10 convolutional layers,
interspersed with max pooling layers after every second convolutional
layer. The convolutional layers all have kernel size 3×3 and use Rectified
Linear Unit (ReLU) activation functions. The pooling layers all have
window size 2×2, so the size of the input is halved in each of them. The
expanding path is nearly symmetrical to the contracting path, consisting of
another 10 convolutional layers, but feature maps are up-sampled instead
of down-sampled. The combined feature maps are the input to the first of
two successive convolutional layers, after which another up-sampling
operation follows. To create the final output, a convolution with one kernel
of size 1×1 and a sigmoid activation function is applied after the last
convolutional layer in the expanding path. This results in a grayscale
segmentation map of the input image.

Figure 3.1: Full network architecture used for creating a segmentation
map for a 256×256 RGB input image. The arrows denote convolutional
(conv) and sampling operations, and the blocks denote the output of each
operation. The size (width × height) is written next to the levels, and the
number of channels (depth) is written above each block.
A version of this network was implemented in Python using the Keras
neural network library with TensorFlow as backend. Some modifications
and improvements to the original architecture proposed by Ronneberger et
al. were made for our purpose, namely:

1. Their proposed architecture was built to segment 572×572×1
images; here, the number of input channels is extended to three
(RGB), and to speed up learning and predictions, the input images
are scaled to 256×256.
2. The number of kernels used in the convolutional layers is greatly
reduced. The original architecture contained 64 kernels in the first
convolutional layer, doubling after every pooling layer; here we
start with 16 kernels in the first convolutional layer. This reduces the
number of learnable parameters from 31,030,593 to 1,941,105,
making the model much more light-weight.
3. To ensure that robust features are learned and to prevent over-fitting
the training data, dropout with probability 0.5 is used in two
convolutions in the lowest layers of the contracting path, meaning
that some of the activations are randomly set to zero in each
iteration. This prevents complex co-adaptations, where high weights
are assigned to features that are only useful in the context of several
other specific features.
4. Padding was added to obtain an output segmentation mask of the
same size as the input. This was not the case in the original
architecture, where segmentation maps of size 388×388, containing
only the middle region of the input image, were predicted. For the
segmentation of full images, a tiling strategy with overlap was
proposed, where the missing context in the border regions is
extrapolated by mirroring the input image. This was found to add
unnecessary complexity, and did not result in smooth borders
between tiles. Here, padding is used to predict full segmentation
maps instead, and the image is tiled with overlapping border regions,
on which predictions are made twice to correct for any mistakes the
padding produced.
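
To make the design concrete, the sketch below builds a much shallower U-shaped model in the same spirit (two pooling levels rather than the full depth used in this project), assuming the tensorflow.keras API. It keeps the elements described above: 256×256×3 inputs, 3×3 ReLU convolutions starting at 16 kernels, 2×2 max pooling, dropout of 0.5 at the bottom, "same" padding, skip connections, and a final 1×1 sigmoid convolution.

from tensorflow.keras import layers, models

def conv_block(x, filters):
    # Two successive 3x3 ReLU convolutions with "same" padding
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return layers.Conv2D(filters, 3, padding="same", activation="relu")(x)

inputs = layers.Input((256, 256, 3))               # scaled RGB input
c1 = conv_block(inputs, 16)                        # contracting path, 16 kernels
p1 = layers.MaxPooling2D(2)(c1)                    # 2x2 pooling halves the size
c2 = conv_block(p1, 32)
p2 = layers.MaxPooling2D(2)(c2)
bottom = layers.Dropout(0.5)(conv_block(p2, 64))   # dropout in the lowest layers

u2 = layers.Conv2DTranspose(32, 2, strides=2, padding="same")(bottom)  # up-sample
c4 = conv_block(layers.concatenate([u2, c2]), 32)  # combine with contracting features
u1 = layers.Conv2DTranspose(16, 2, strides=2, padding="same")(c4)
c5 = conv_block(layers.concatenate([u1, c1]), 16)

outputs = layers.Conv2D(1, 1, activation="sigmoid")(c5)  # 1x1 conv, grayscale mask
model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy")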

Fig 3.2 Training Data

Training data is needed to train the network. In order to reduce
computational complexity during training, the network was not trained to
segment full images from the AiDx set, but rather square sections of the
images. This also increases the number of training objects available. The
images were first down-scaled to 1024×768, and 30 random squares with
limited overlap were cropped from two of the images. Binary
segmentation masks were manually drawn for these.
In order to increase the number of training samples, data augmentation is
used, which is needed to teach the network the desired invariance and
robustness properties. At each training step, a new batch of training images
is created by randomly applying some transformations to one of the
original training images and their corresponding masks. A combination of
the following transformations was used (a sketch of an equivalent
augmentation set-up is given after figure 3.3):
 Flips: images are randomly flipped horizontally and/or vertically.
 Zooms: images are randomly zoomed up to 105%.
 Shifts and rotations: images are shifted horizontally and/or
vertically, with a maximum of 5% of the image size, and randomly
rotated by a maximum of 10 degrees. Both operations create 'empty'
pixels, which are filled with the value of the nearest non-empty pixel.

Figure 3.3: Left: 256×256 tile cropped out of an image in the AiDx
dataset. Right: hand-drawn binary mask used for training.
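
A sketch of an equivalent augmentation set-up with the Keras ImageDataGenerator (assuming tensorflow.keras); using the same parameters and seed for the image and mask generators keeps the two transformed in lockstep.

from tensorflow.keras.preprocessing.image import ImageDataGenerator

aug = dict(
    horizontal_flip=True, vertical_flip=True,        # random flips
    zoom_range=0.05,                                 # zooms up to 105%
    width_shift_range=0.05, height_shift_range=0.05, # shifts up to 5% of the image size
    rotation_range=10,                               # rotations up to 10 degrees
    fill_mode="nearest",                             # fill empty pixels with the nearest value
)
image_gen = ImageDataGenerator(**aug)
mask_gen = ImageDataGenerator(**aug)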

Elastic Deformations
Small random elastic deformations are also applied. This is
implemented by first creating a displacement field, which defines a
direction and magnitude by which to move each pixel in an image, and
then using bilinear interpolation to apply these fields to the images. In
order to create the field, a matrix the size of the image (256×256 in this
application) is first filled with values randomly selected from a uniform
distribution [−1, 1]. This matrix is smoothed with a Gaussian filter, i.e.
each pixel is convolved horizontally and then vertically with a vector
kernel containing sampled values of a Gaussian distribution. Here, σ = 10
was chosen as an appropriate standard deviation. The filter was truncated
at 4σ + 1, so that most of the continuous distribution area (96%) is within
the discrete kernel. After smoothing, the displacement field is multiplied
by a scaling factor φ to achieve an appropriate distortion size. This scaling
factor was set at φ = 150, which resulted in images that were visibly
different from the input, but still had naturally shaped cells.
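
The displacement-field procedure above maps onto SciPy's ndimage utilities; below is a sketch for a single-channel 256×256 tile, with σ and φ as chosen above.

import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def elastic_deform(image, sigma=10.0, phi=150.0, rng=None):
    rng = rng or np.random.default_rng()
    shape = image.shape
    # Random field from U[-1, 1], smoothed with a Gaussian truncated near 4*sigma
    dx = gaussian_filter(rng.uniform(-1, 1, shape), sigma, truncate=4.0) * phi
    dy = gaussian_filter(rng.uniform(-1, 1, shape), sigma, truncate=4.0) * phi
    # Apply the scaled displacement field with bilinear interpolation (order=1)
    y, x = np.meshgrid(np.arange(shape[0]), np.arange(shape[1]), indexing="ij")
    return map_coordinates(image, np.array([y + dy, x + dx]), order=1)

tile = np.random.rand(256, 256)   # placeholder grayscale tile
warped = elastic_deform(tile)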

Fig 3.4 System Activity Diagram

Fig 3.5 System Class Diagram

Fig 3.6 System Sequence Diagram

CHAPTER 4

IMPLEMENTATION AND DISCUSSION OF RESULTS

4.1 Reason for choice of programming language

The model was developed using three main packages in Python: OpenCV,
Keras and TensorFlow. These packages provide highly optimized
implementations of the machine learning algorithms used in this
project.

OpenCV

OpenCV (Open Source Computer Vision Library) is a BSD-licensed
open-source computer vision and machine learning library. OpenCV was
mainly used for image preprocessing tasks such as conversion from RGB
images to grayscale, image downsampling and rescaling. Since OpenCV
was built to take advantage of multi-core processor systems, its functions
are highly optimized for matrix operations. OpenCV thus offered faster
implementations of the image processing tasks used in this
implementation.

Keras

"Keras is a high-level neural networks API, written in Python and capable
of running on top of TensorFlow, CNTK, or Theano. It was developed
with a focus on enabling fast experimentation" (Keras, 2018).

The extract above, taken from the Keras documentation, summarizes the main
reasons why Keras has emerged as a default library for computer vision
research. Keras works on both GPU and CPU, has many different
optimization and loss functions, and provides mechanisms for saving the
model at various checkpoints. The context-encoder and the main
classification models were developed using the Keras API. Keras also
provided the pretrained model as well as the data augmentation functions
used in this research.

TensorFlow was the back-end library for Keras. Although no code was
written directly in TensorFlow, all the Keras computations were executed
in TensorFlow. Thus, the next subsection discusses the motivation for
choosing TensorFlow as the back-end library.

TensorFlow
TensorFlow is a machine learning software library developed by the
Google Brain Team for research at Google. Since becoming an open-source
library in 2015 under the Apache License 2.0, many researchers and
software companies have adopted TensorFlow for their machine
intelligence research. TensorFlow uses data flow graphs for all its
computations: mathematical operations are represented as nodes, while
data is represented as edges (tensors), which are designed to handle
multidimensional data. This makes TensorFlow well suited for image
processing applications, since images are generally processed in their
spatial form. Finally, TensorFlow provides a seamless transition between
CPU and GPU, which Keras exploits to speed up training.

4.2 Documentation

All the models were trained using the Google Compute Engine Application
Programming Interface (API). This became necessary because the models
contained many parameters and could not be trained fast enough on CPUs.
One of the models experimented with for the semantic segmentation task,
for instance, had 31,095,763 trainable parameters. It takes a long time to
train such models on CPUs, since CPUs often execute instructions
sequentially. An interesting observation, however, is that most of the
operations in CNNs are computations on matrices. Since matrix operations
are mostly element-wise, they can be executed in parallel. Graphical
Processing Units (GPUs), designed for processing spatial data, have the
ability to execute many instructions in parallel. Hence, training on a GPU
can reduce training time from days to hours.

For comparison purposes, the binary classification model was run on both
the GPU and on an Intel Core i3 CPU with 4 GB of RAM. For twenty
epochs, it took the GPU only six (6) minutes to finish training the model,
while it took the CPU almost three days. Additionally, the CPU could train
on only a mini-batch size of 1, whilst the GPU trained on mini-batches of
42 images.

Besides the number of parameters, the input features had very high
resolution (764 × 1055 × 3). The high dimensionality of the data combined
with the large number of parameters meant that training the models
required large Random Access Memory (RAM). Personal computers,
being largely general-purpose machines, are not equipped with powerful
GPUs and often have relatively less RAM than the models needed.
Fortunately, the computational power required for training CNNs is
accessible through Google Cloud's API. All the models experimented with
in this research were trained on a virtual machine provided through the
Compute Engine API. The virtual machine was equipped with an NVIDIA
Tesla P100 GPU running the Compute Unified Device Architecture
(CUDA) Toolkit 9.0 from NVIDIA. Additionally, cuDNN 7.0.5 was
installed to enable Keras and TensorFlow to execute instructions on the
GPU. The machine had a 16-core CPU with 30 GB of RAM. Finally, the
virtual machine ran on the Ubuntu 16.04 LTS operating system. The
CUDA Toolkit required only a registration to use, but the virtual machine
charged $0.872 per hour of usage.

4.3 Discussion of Results

The main concern during training was how the trained model would fare
on unseen data. This meant that choices about hyper-parameter values,
model size and degree of regularization were based largely on the
validation accuracy and validation loss. The training accuracy and training
loss were also monitored in addition to the validation metrics.
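
Below is a sketch of how such validation-driven monitoring is typically wired up in Keras; model, train_images and train_masks are placeholders for the objects built earlier, and the callback settings are illustrative rather than the exact values used in this study.

from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint

callbacks = [
    # Keep only the weights that achieved the best validation loss so far
    ModelCheckpoint("best_model.h5", monitor="val_loss", save_best_only=True),
    # Stop training once validation loss stops improving (guards against over-fitting)
    EarlyStopping(monitor="val_loss", patience=5, restore_best_weights=True),
]
history = model.fit(train_images, train_masks,
                    validation_split=0.2, epochs=20, batch_size=42,
                    callbacks=callbacks)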

4.4 Program Code


{See Appendix A}

4.5 Program Output


{See Appendix B}

CHAPTER 5

SUMMARY, CONCLUSION AND RECOMMENDATION

5.1 Summary

This research explored the application of convolutional neural networks to
the detection of Plasmodium parasites. In particular, a semantic segmentation
model was built to classify the different growth-cycle stages of Plasmodium
parasites. Through the use of techniques such as transfer learning and
regularization, the model classified pixels belonging to normal
cells, trophozoites or gametocytes with a weighted accuracy of 85.86%. A
binary classification model was also built to classify Giemsa-stained thin
blood smear images into positive and negative classes. That model
achieved an accuracy of 98.78% during training but performed poorly at
test time because of large variations in the training data.

5.2 Conclusion
The aim of this work was to contribute to the development of malaria
diagnostic methods suitable for use in situ. Through a review of the
literature on diagnostic methods in chapter 1, a list of requirements for
such a diagnostic method was presented in section 1-2. By combining
requirements 2 and 3 (the ability to determine parasitaemia counts and
identify parasite species and stage) with requirement 5 (the wish for
minimal skill and labour needed to interpret the test), we arrived at
researching the possibilities of automating the interpretation of Giemsa-stained
microscopy of thin blood films. Through a review of previous work
on this subject, we arrived at the use of neural networks as a promising
technique for automated image interpretation. We chose to investigate
specifically the interpretation of low-magnification images, based on
image data produced by a microscope that is currently in development at
AiDx, which is portable and low cost and thus meets the fourth requirement
we set for a novel diagnostic test. This provided the motivation for our
main research question.

5.3 Recommendation

This section offers suggested extensions to this study. Besides employing
measures to address the limitations above, the findings of this study can
become more widely applicable if the following extensions are explored.

1. A refinement of the algorithm to count the number of the different
growth-cycle stages of the parasites per image. This feature will
make it possible to estimate the density of the parasites in the images.
This information is useful to health practitioners and researchers,
since it enables them to estimate the severity of the infection.

2. An improvement of the model to include all four major growth-cycle
stages of the Plasmodium parasite. This study considered only
trophozoites and gametocytes, in addition to normal cells, due to
constraints on image quality. Future extensions of this project should
source data good enough to discriminate between all the different
growth-cycle stages. The importance of classifying all life-cycle
stages cannot be overstated.

REFERENCE
Afridi, M. J., Ross, A., & Shapiro, E. M. (2018). On automated source selection
for transfer learning in convolutional neural networks. Pattern Recognition, 73,
65-75. doi: 10.1016/j.patcog.2017.07.019

Bibin, D., Nair, M. S., & Punitha, P. (2017). Malaria parasite detection from
peripheral blood smear images using deep belief networks. IEEE Access, 5,
9099-9108. doi: 10.1109/ACCESS.2017.2705642

Brynjolfsson, E., & Andrew, M. (2017). What's driving the machine learning
explosion? Retrieved from https://hbr.org/2017/07/whats-driving-the-machine-learning-explosion

Chakrabortya, K. (2015). A combined algorithm for malaria detection from thick
smear blood slides. Journal of Health & Medical Informatics, 06(01), 1-6.
doi: 10.4172/2157-7420.1000179

Chollet, F., et al. (2015). Keras. Retrieved from https://keras.io

Cires, D. C., & Giusti, A. (2012). Deep neural networks segment neuronal
membranes in electron microscopy images. NIPS, 2852-2860.

Delves, M., Plouffe, D., Scheurer, C., Meister, S., Wittlin, S., Elizabeth, A., &
Leroy, D. (2012). The activities of current antimalarial drugs on the life
cycle stages of Plasmodium: a comparative study with human and rodent
parasites. PLoS Med, 9(2), e1001169. doi: 10.1371/journal.pmed.1001169

Diaz, G., Gonzalez, F. A., & Romero, E. (2009). A semi-automatic method for
quantification and classification of erythrocytes infected with malaria
parasites in microscopic images. Journal of Biomedical Informatics, 42(2),
296-307. doi: 10.1016/j.jbi.2008.11.005

Ghosh, S., Ghosh, A., & Kundu, S. (2014). Estimating malaria parasitaemia in
images of thin smear of human blood. CSI Transactions on ICT, 2(1), 43-48.
doi: 10.1007/s40012-014-0043-7

Gonzales, G. (2016). Giemsa staining of malaria blood films (Tech. Rep.). World
Health Organization.

Goodfellow, I. J., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair,
S., ... Bengio, Y. (2014). Generative adversarial networks. Retrieved from
http://arxiv.org/abs/1406.2661

Gopakumar, G. P., Swetha, M., Sai Siva, G., & Sai Subrahmanyam, G. R. (2017).
Convolutional neural network-based malaria diagnosis from focus stack of
blood smear images acquired using custom-built slide scanner. Journal of
Biophotonics, 11, e201700003. doi: 10.1002/jbio.201700003

Gupta, S., Girshick, R., & Arbel, P. (2014). Learning rich features from RGB-D
images for object detection and segmentation. CoRR, abs/1407.5736.
Retrieved from http://arxiv.org/abs/1407.5736

He, K., Zhang, X., Ren, S., & Sun, J. (2015). Delving deep into rectifiers:
surpassing human-level performance on ImageNet classification. CoRR,
abs/1502.01852. Retrieved from http://arxiv.org/abs/1502.01852

Kahama-Maro, J., D'Acremont, V., Mtasiwa, D., Genton, B., & Lengeler, C.
(2011). Low quality of routine microscopy for malaria at different levels
of the health system in Dar es Salaam. Malaria Journal, 10, 332.
doi: 10.1186/1475-2875-10-332

38
Appendix A
SOURCE CODE

{"cells":[{"cell_type":"markdown","metadata":{"id":"zq-
cdzHww1WW"},"source":["#**Maleria Detection
System**"]},{"cell_type":"markdown","metadata":{"id":"gUd8ySMdWFFQ"},"so
urce":["#**Collecting
Dataset**"]},{"cell_type":"code","execution_count":null,"metadata":{"colab":{"b
ase_uri":"https://localhost:8080/"},"executionInfo":{"elapsed":60114,"status":"ok"
,"timestamp":1719924484802,"user":{"displayName":"O
Sender","userId":"03525772944662445972"},"user_tz":-
60},"id":"YBiSUVD5WJLm","outputId":"db602398-2a64-4ca5-953b-
bb83e154d5bc"},"outputs":[{"name":"stdout","output_type":"stream","text":["Mo
unted at /content/drive\n"]}],"source":["from google.colab import
drive\n","drive.mount('/content/drive')"]},{"cell_type":"code","execution_count":n
ull,"metadata":{"colab":{"base_uri":"https://localhost:8080/","height":55},"executi
onInfo":{"elapsed":6729,"status":"ok","timestamp":1719924538511,"user":{"displ
ayName":"O Sender","userId":"03525772944662445972"},"user_tz":-
60},"id":"yjA10M2O5P2m","outputId":"22dd1ec0-e1c1-4f8f-dffd-
c13b8b064d19"},"outputs":[{"data":{"text/html":["\n"," <input type=\"file\"
id=\"files-55b06ba0-5920-48ba-96cd-3608ae3ec96e\" name=\"files[]\" multiple
disabled\n"," style=\"border:none\" />\n"," <output id=\"result-55b06ba0-
5920-48ba-96cd-3608ae3ec96e\">\n"," Upload widget is only available when
the cell has been executed in the\n"," current browser session. Please rerun this
cell to enable.\n"," </output>\n"," <script>// Copyright 2017 Google
LLC\n","//\n","// Licensed under the Apache License, Version 2.0 (the
\"License\");\n","// you may not use this file except in compliance with the
License.\n","// You may obtain a copy of the License at\n","//\n","//
http://www.apache.org/licenses/LICENSE-2.0\n","//\n","// Unless required by
applicable law or agreed to in writing, software\n","// distributed under the License
is distributed on an \"AS IS\" BASIS,\n","// WITHOUT WARRANTIES OR
CONDITIONS OF ANY KIND, either express or implied.\n","// See the License
for the specific language governing permissions and\n","// limitations under the
License.\n","\n","/**\n"," * @fileoverview Helpers for google.colab Python
39
module.\n"," */\n","(function(scope) {\n","function span(text, styleAttributes = {})
{\n"," const element = document.createElement('span');\n"," element.textContent
= text;\n"," for (const key of Object.keys(styleAttributes)) {\n","
element.style[key] = styleAttributes[key];\n"," }\n"," return
element;\n","}\n","\n","// Max number of bytes which will be uploaded at a
time.\n","const MAX_PAYLOAD_SIZE = 100 * 1024;\n","\n","function
_uploadFiles(inputId, outputId) {\n"," const steps = uploadFilesStep(inputId,
outputId);\n"," const outputElement = document.getElementById(outputId);\n","
// Cache steps on the outputElement to make it available for the next call\n"," // to
uploadFilesContinue from Python.\n"," outputElement.steps = steps;\n","\n","
return _uploadFilesContinue(outputId);\n","}\n","\n","// This is roughly an async
generator (not supported in the browser yet),\n","// where there are multiple
asynchronous steps and the Python side is going\n","// to poll for completion of
each step.\n","// This uses a Promise to block the python side on completion of
each step,\n","// then passes the result of the previous step as the input to the next
step.\n","function _uploadFilesContinue(outputId) {\n"," const outputElement =
document.getElementById(outputId);\n"," const steps =
outputElement.steps;\n","\n"," const next =
steps.next(outputElement.lastPromiseValue);\n"," return
Promise.resolve(next.value.promise).then((value) => {\n"," // Cache the last
promise value to make it available to the next\n"," // step of the generator.\n","
outputElement.lastPromiseValue = value;\n"," return
next.value.response;\n"," });\n","}\n","\n","/**\n"," * Generator function which is
called between each async step of the upload\n"," * process.\n"," * @param {string}
inputId Element ID of the input file picker element.\n"," * @param {string}
outputId Element ID of the output display.\n"," * @return {!Iterable<!Object>}
Iterable of next steps.\n"," */\n","function* uploadFilesStep(inputId, outputId)
{\n"," const inputElement = document.getElementById(inputId);\n","
inputElement.disabled = false;\n","\n"," const outputElement =
document.getElementById(outputId);\n"," outputElement.innerHTML =
'';\n","\n"," const pickedPromise = new Promise((resolve) => {\n","
inputElement.addEventListener('change', (e) => {\n","
resolve(e.target.files);\n"," });\n"," });\n","\n"," const cancel =
document.createElement('button');\n","

40
inputElement.parentElement.appendChild(cancel);\n"," cancel.textContent =
'Cancel upload';\n"," const cancelPromise = new Promise((resolve) => {\n","
cancel.onclick = () => {\n"," resolve(null);\n"," };\n"," });\n","\n"," // Wait
for the user to pick the files.\n"," const files = yield {\n"," promise:
Promise.race([pickedPromise, cancelPromise]),\n"," response: {\n"," action:
'starting',\n"," }\n"," };\n","\n"," cancel.remove();\n","\n"," // Disable the input
element since further picks are not allowed.\n"," inputElement.disabled =
true;\n","\n"," if (!files) {\n"," return {\n"," response: {\n"," action:
'complete',\n"," }\n"," };\n"," }\n","\n"," for (const file of files) {\n"," const
li = document.createElement('li');\n"," li.append(span(file.name, {fontWeight:
'bold'}));\n"," li.append(span(\n"," `(${file.type || 'n/a'}) - ${file.size} bytes, `
+\n"," `last modified: ${\n"," file.lastModifiedDate ?
file.lastModifiedDate.toLocaleDateString() :\n"," 'n/a'} -
`));\n"," const percent = span('0% done');\n","
li.appendChild(percent);\n","\n"," outputElement.appendChild(li);\n","\n","
const fileDataPromise = new Promise((resolve) => {\n"," const reader = new
FileReader();\n"," reader.onload = (e) => {\n","
resolve(e.target.result);\n"," };\n","
reader.readAsArrayBuffer(file);\n"," });\n"," // Wait for the data to be
ready.\n"," let fileData = yield {\n"," promise: fileDataPromise,\n","
response: {\n"," action: 'continue',\n"," }\n"," };\n","\n"," // Use a
chunked sending to avoid message size limits. See b/62115660.\n"," let position
= 0;\n"," do {\n"," const length = Math.min(fileData.byteLength - position,
MAX_PAYLOAD_SIZE);\n"," const chunk = new Uint8Array(fileData,
position, length);\n"," position += length;\n","\n"," const base64 =
btoa(String.fromCharCode.apply(null, chunk));\n"," yield {\n"," response:
{\n"," action: 'append',\n"," file: file.name,\n"," data:
base64,\n"," },\n"," };\n","\n"," let percentDone = fileData.byteLength
=== 0 ?\n"," 100 :\n"," Math.round((position / fileData.byteLength) *
100);\n"," percent.textContent = `${percentDone}% done`;\n","\n"," } while
(position < fileData.byteLength);\n"," }\n","\n"," // All done.\n"," yield {\n","
response: {\n"," action: 'complete',\n"," }\n"," };\n","}\n","\n","scope.google
= scope.google || {};\n","scope.google.colab = scope.google.colab ||
{};\n","scope.google.colab._files = {\n"," _uploadFiles,\n","

41
_uploadFilesContinue,\n","};\n","})(self);\n","</script>
"],"text/plain":["<IPython.core.display.HTML
object>"]},"metadata":{},"output_type":"display_data"},{"data":{"text/plain":["{}
"]},"execution_count":3,"metadata":{},"output_type":"execute_result"}],"source":[
"from google.colab import
files\n","files.upload()\n"]},{"cell_type":"code","execution_count":null,"metadata"
:{"id":"H-xs2H0-fWDS"},"outputs":[],"source":["import os\n","import
zipfile\n","\n","# Make a directory for the kaggle.json
file\n","os.makedirs('/root/.kaggle/', exist_ok=True)\n","\n","# Move kaggle.json to
the correct location\n","!mv kaggle.json /root/.kaggle/\n","\n","# Set permissions
for the kaggle.json file\n","!chmod 600 /root/.kaggle/kaggle.json\n","\n","# Set the
working directory\n","os.makedirs('/content/drive/My
Drive/Projects/Maleria_Detections_System',
exist_ok=True)\n","os.chdir('/content/drive/My
Drive/Projects/Maleria_Detections_System')\n"]},{"cell_type":"code","execution_
count":null,"metadata":{"colab":{"base_uri":"https://localhost:8080/"},"executionI
nfo":{"elapsed":472942,"status":"ok","timestamp":1719850471946,"user":{"displa
yName":"O Sender","userId":"03525772944662445972"},"user_tz":-
60},"id":"v1iTrPd05Q2l","outputId":"a5f21cfc-9614-4714-e9e0-
35057404f844"},"outputs":[{"name":"stdout","output_type":"stream","text":["mv:
cannot stat 'kaggle.json': No such file or directory\n","chmod: cannot access
'/root/.kaggle/kaggle.json': No such file or directory\n","Dataset URL:
https://www.kaggle.com/datasets/iarunava/cell-images-for-detecting-
malaria\n","License(s): unknown\n","Downloading cell-images-for-detecting-
malaria.zip to /content/drive/My
Drive/Projects/Maleria_Detections_System\n","100% 675M/675M [00:33<00:00,
24.0MB/s]\n","100% 675M/675M [00:33<00:00,
21.0MB/s]\n"]}],"source":["\n","# Download the dataset\n","!kaggle datasets
download -d iarunava/cell-images-for-detecting-malaria\n","\n","# Unzip the
dataset\n","with zipfile.ZipFile(\"cell-images-for-detecting-malaria.zip\", 'r') as
zip_ref:\n","
zip_ref.extractall()\n"]},{"cell_type":"markdown","metadata":{"id":"Wovf3kO-
XMyl"},"source":["#**Preprocessing
Data**"]},{"cell_type":"code","execution_count":null,"metadata":{"id":"iFsA3yh

42
TXC_R"},"outputs":[],"source":["import glob\n","import numpy as np\n","from
PIL import Image\n","import cv2\n","from tqdm import
tqdm"]},{"cell_type":"code","execution_count":null,"metadata":{"colab":{"base_u
ri":"https://localhost:8080/"},"executionInfo":{"elapsed":2823,"status":"ok","times
tamp":1719924735131,"user":{"displayName":"O
Sender","userId":"03525772944662445972"},"user_tz":-
60},"id":"QCdrfHBRXcJF","outputId":"95ee3ad9-c6a3-4d30-cec7-
e61ba66a3a10"},"outputs":[{"name":"stdout","output_type":"stream","text":["PAR
ASITIZED: 3500\n","UNINFECTED: 0\n"]}],"source":["# Correct the file
paths\n","PARASITIZED = glob.glob('/content/drive/My
Drive/Projects/Maleria_Detections_System/cell_images/Parasitized/*.png')\n","U
NINFECTED = glob.glob('/content/drive/My
Drive/Projects/Maleria_Detections_System/cell_images/Uninfected/*.png')\n","\n"
,"print('PARASITIZED:', len(PARASITIZED[:3500]))\n","print('UNINFECTED:',
len(UNINFECTED[:3500]))"]},{"cell_type":"code","execution_count":null,"meta
data":{"colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"elapsed":1
223642,"status":"ok","timestamp":1719925967584,"user":{"displayName":"O
Sender","userId":"03525772944662445972"},"user_tz":-
60},"id":"Ow1S7kpDYaAY","outputId":"642caaa4-c62e-4337-e27a-
a6ed806e9b96"},"outputs":[{"name":"stdout","output_type":"stream","text":["[*]
1\n","[*] 2\n","[*] 3\n","[*] 4\n","[*] 5\n","[*] 6\n","[*] 7\n","[*] 8\n","[*]
9\n","[*] 10\n","[*] 11\n","[*] 12\n","[*] 13\n","[*] 14\n","[*] 15\n","[*]
16\n","[*] 17\n","[*] 18\n","[*] 19\n","[*] 20\n","[*] 21\n","[*] 22\n","[*]
23\n","[*] 24\n","[*] 25\n","[*] 26\n","[*] 27\n","[*] 28\n","[*] 29\n","[*]
30\n","[*] 31\n","[*] 32\n","[*] 33\n","[*] 34\n","[*] 35\n","[*] 36\n","[*]
37\n","[*] 38\n","[*] 39\n","[*] 40\n","[*] 41\n","[*] 42\n","[*] 43\n","[*]
44\n","[*] 45\n","[*] 46\n","[*] 47\n","[*] 48\n","[*] 49\n","[*] 50\n","[*]
51\n","[*] 52\n","[*] 53\n","[*] 54\n","[*] 55\n","[*] 56\n","[*] 57\n","[*]
58\n","[*] 59\n","[*] 60\n","[*] 61\n","[*] 62\n","[*] 63\n","[*] 64\n","[*]
65\n","[*] 66\n","[*] 67\n","[*] 68\n","[*] 69\n","[*] 70\n","[*] 71\n","[*]
72\n","[*] 73\n","[*] 74\n","[*] 75\n","[*] 76\n","[*] 77\n","[*] 78\n","[*]
79\n","[*] 80\n","[*] 81\n","[*] 82\n","[*] 83\n","[*] 84\n","[*] 85\n","[*]
86\n","[*] 87\n","[*] 88\n","[*] 89\n","[*] 90\n","[*] 91\n","[*] 92\n","[*]
93\n","[*] 94\n","[*] 95\n","[*] 96\n","[*] 97\n","[*] 98\n","[*] 99\n","[*]

43
100\n","[*] 101\n","[*] 102\n","[*] 103\n","[*] 104\n","[*] 105\n","[*]
106\n","[*] 107\n","[*] 108\n","[*] 109\n","[*] 110\n","[*] 111\n","[*]
112\n","[*] 113\n","[*] 114\n","[*] 115\n","[*] 116\n","[*] 117\n","[*]
118\n","[*] 119\n","[*] 120\n","[*] 121\n","[*] 122\n","[*] 123\n","[*]
124\n","[*] 125\n","[*] 126\n","[*] 127\n","[*] 128\n","[*] 129\n","[*]
130\n","[*] 131\n","[*] 132\n","[*] 133\n","[*] 134\n","[*] 135\n","[*]
136\n","[*] 137\n","[*] 138\n","[*] 139\n","[*] 140\n","[*] 141\n","[*]
142\n","[*] 143\n","[*] 144\n","[*] 145\n","[*] 146\n","[*] 147\n","[*]
148\n","[*] 149\n","[*] 150\n","[*] 151\n","[*] 152\n","[*] 153\n","[*]
154\n","[*] 155\n","[*] 156\n","[*] 157\n","[*] 158\n","[*] 159\n","[*]
160\n","[*] 161\n","[*] 162\n","[*] 163\n","[*] 164\n","[*] 165\n","[*]
166\n","[*] 167\n","[*] 168\n","[*] 169\n","[*] 170\n","[*] 171\n","[*]
172\n","[*] 173\n","[*] 174\n","[*] 175\n","[*] 176\n","[*] 177\n","[*]
178\n","[*] 179\n","[*] 180\n","[*] 181\n","[*] 182\n","[*] 183\n","[*]
184\n","[*] 185\n","[*] 186\n","[*] 187\n","[*] 188\n","[*] 189\n","[*]
190\n","[*] 191\n","[*] 192\n","[*] 193\n","[*] 194\n","[*] 195\n","[*]
196\n","[*] 197\n","[*] 198\n","[*] 199\n","[*] 200\n","[*] 201\n","[*]
202\n","[*] 203\n","[*] 204\n","[*] 205\n","[*] 206\n","[*] 207\n","[*]
208\n","[*] 209\n","[*] 210\n","[*] 211\n","[*] 212\n","[*] 213\n","[*]
214\n","[*] 215\n","[*] 216\n","[*] 217\n","[*] 218\n","[*] 219\n","[*]
220\n","[*] 221\n","[*] 222\n","[*] 223\n","[*] 224\n","[*] 225\n","[*]
226\n","[*] 227\n","[*] 228\n","[*] 229\n","[*] 230\n","[*] 231\n","[*]
232\n","[*] 233\n","[*] 234\n","[*] 235\n","[*] 236\n","[*] 237\n","[*]
238\n","[*] 239\n","[*] 240\n","[*] 241\n","[*] 242\n","[*] 243\n","[*]
244\n","[*] 245\n","[*] 246\n","[*] 247\n","[*] 248\n","[*] 249\n","[*]
250\n","[*] 251\n","[*] 252\n","[*] 253\n","[*] 254\n","[*] 255\n","[*]
256\n","[*] 257\n","[*] 258\n","[*] 259\n","[*] 260\n","[*] 261\n","[*]
262\n","[*] 263\n","[*] 264\n","[*] 265\n","[*] 266\n","[*] 267\n","[*]
268\n","[*] 269\n","[*] 270\n","[*] 271\n","[*] 272\n","[*] 273\n","[*]
274\n","[*] 275\n","[*] 276\n","[*] 277\n","[*] 278\n","[*] 279\n","[*]
280\n","[*] 281\n","[*] 282\n","[*] 283\n","[*] 284\n","[*] 285\n","[*]
286\n","[*] 287\n","[*] 288\n","[*] 289\n","[*] 290\n","[*] 291\n","[*]
292\n","[*] 293\n","[*] 294\n","[*] 295\n","[*] 296\n","[*] 297\n","[*]
298\n","[*] 299\n","[*] 300\n","[*] 301\n","[*] 302\n","[*] 303\n","[*]

44
304\n","[*] 305\n","[*] 306\n","[*] 307\n","[*] 308\n","[*] 309\n","[*]
310\n","[*] 311\n","[*] 312\n","[*] 313\n","[*] 314\n","[*] 315\n","[*]
316\n","[*] 317\n","[*] 318\n","[*] 319\n","[*] 320\n","[*] 321\n","[*]
322\n","[*] 323\n","[*] 324\n","[*] 325\n","[*] 326\n","[*] 327\n","[*]
328\n","[*] 329\n","[*] 330\n","[*] 331\n","[*] 332\n","[*] 333\n","[*]
334\n","[*] 335\n","[*] 336\n","[*] 337\n","[*] 338\n","[*] 339\n","[*]
340\n","[*] 341\n","[*] 342\n","[*] 343\n","[*] 344\n","[*] 345\n","[*]
346\n","[*] 347\n","[*] 348\n","[*] 349\n","[*] 350\n","[*] 351\n","[*]
352\n","[*] 353\n","[*] 354\n","[*] 355\n","[*] 356\n","[*] 357\n","[*]
358\n","[*] 359\n","[*] 360\n","[*] 361\n","[*] 362\n","[*] 363\n","[*]
364\n","[*] 365\n","[*] 366\n","[*] 367\n","[*] 368\n","[*] 369\n","[*]
370\n","[*] 371\n","[*] 372\n","[*] 373\n","[*] 374\n","[*] 375\n","[*]
376\n","[*] 377\n","[*] 378\n","[*] 379\n","[*] 380\n","[*] 381\n","[*]
382\n","[*] 383\n","[*] 384\n","[*] 385\n","[*] 386\n","[*] 387\n","[*]
388\n","[*] 389\n","[*] 390\n","[*] 391\n","[*] 392\n","[*] 393\n","[*]
394\n","[*] 395\n","[*] 396\n","[*] 397\n","[*] 398\n","[*] 399\n","[*]
400\n","[*] 401\n","[*] 402\n","[*] 403\n","[*] 404\n","[*] 405\n","[*]
406\n","[*] 407\n","[*] 408\n","[*] 409\n","[*] 410\n","[*] 411\n","[*]
412\n","[*] 413\n","[*] 414\n","[*] 415\n","[*] 416\n","[*] 417\n","[*]
418\n","[*] 419\n","[*] 420\n","[*] 421\n","[*] 422\n","[*] 423\n","[*]
424\n","[*] 425\n","[*] 426\n","[*] 427\n","[*] 428\n","[*] 429\n","[*]
430\n","[*] 431\n","[*] 432\n","[*] 433\n","[*] 434\n","[*] 435\n","[*]
436\n","[*] 437\n","[*] 438\n","[*] 439\n","[*] 440\n","[*] 441\n","[*]
442\n","[*] 443\n","[*] 444\n","[*] 445\n","[*] 446\n","[*] 447\n","[*]
448\n","[*] 449\n","[*] 450\n","[*] 451\n","[*] 452\n","[*] 453\n","[*]
454\n","[*] 455\n","[*] 456\n","[*] 457\n","[*] 458\n","[*] 459\n","[*]
460\n","[*] 461\n","[*] 462\n","[*] 463\n","[*] 464\n","[*] 465\n","[*]
466\n","[*] 467\n","[*] 468\n","[*] 469\n","[*] 470\n","[*] 471\n","[*]
472\n","[*] 473\n","[*] 474\n","[*] 475\n","[*] 476\n","[*] 477\n","[*]
478\n","[*] 479\n","[*] 480\n","[*] 481\n","[*] 482\n","[*] 483\n","[*]
484\n","[*] 485\n","[*] 486\n","[*] 487\n","[*] 488\n","[*] 489\n","[*]
490\n","[*] 491\n","[*] 492\n","[*] 493\n","[*] 494\n","[*] 495\n","[*]
496\n","[*] 497\n","[*] 498\n","[*] 499\n","[*] 500\n","[*] 501\n","[*]
502\n","[*] 503\n","[*] 504\n","[*] 505\n","[*] 506\n","[*] 507\n","[*]

45
508\n","[*] 509\n","[*] 510\n","[*] 511\n","[*] 512\n","[*] 513\n","[*]
514\n","[*] 515\n","[*] 516\n","[*] 517\n","[*] 518\n","[*] 519\n","[*]
520\n","[*] 521\n","[*] 522\n","[*] 523\n","[*] 524\n","[*] 525\n","[*]
526\n","[*] 527\n","[*] 528\n","[*] 529\n","[*] 530\n","[*] 531\n","[*]
532\n","[*] 533\n","[*] 534\n","[*] 535\n","[*] 536\n","[*] 537\n","[*]
538\n","[*] 539\n","[*] 540\n","[*] 541\n","[*] 542\n","[*] 543\n","[*]
544\n","[*] 545\n","[*] 546\n","[*] 547\n","[*] 548\n","[*] 549\n","[*]
550\n","[*] 551\n","[*] 552\n","[*] 553\n","[*] 554\n","[*] 555\n","[*]
556\n","[*] 557\n","[*] 558\n","[*] 559\n","[*] 560\n","[*] 561\n","[*]
562\n","[*] 563\n","[*] 564\n","[*] 565\n","[*] 566\n","[*] 567\n","[*]
568\n","[*] 569\n","[*] 570\n","[*] 571\n","[*] 572\n","[*] 573\n","[*]
574\n","[*] 575\n","[*] 576\n","[*] 577\n","[*] 578\n","[*] 579\n","[*]
580\n","[*] 581\n","[*] 582\n","[*] 583\n","[*] 584\n","[*] 585\n","[*]
586\n","[*] 587\n","[*] 588\n","[*] 589\n","[*] 590\n","[*] 591\n","[*]
592\n","[*] 593\n","[*] 594\n","[*] 595\n","[*] 596\n","[*] 597\n","[*]
598\n","[*] 599\n","[*] 600\n","[*] 601\n","[*] 602\n","[*] 603\n","[*]
604\n","[*] 605\n","[*] 606\n","[*] 607\n","[*] 608\n","[*] 609\n","[*]
610\n","[*] 611\n","[*] 612\n","[*] 613\n","[*] 614\n","[*] 615\n","[*]
616\n","[*] 617\n","[*] 618\n","[*] 619\n","[*] 620\n","[*] 621\n","[*]
622\n","[*] 623\n","[*] 624\n","[*] 625\n","[*] 626\n","[*] 627\n","[*]
628\n","[*] 629\n","[*] 630\n","[*] 631\n","[*] 632\n","[*] 633\n","[*]
634\n","[*] 635\n","[*] 636\n","[*] 637\n","[*] 638\n","[*] 639\n","[*]
640\n","[*] 641\n","[*] 642\n","[*] 643\n","[*] 644\n","[*] 645\n","[*]
646\n","[*] 647\n","[*] 648\n","[*] 649\n","[*] 650\n","[*] 651\n","[*]
652\n","[*] 653\n","[*] 654\n","[*] 655\n","[*] 656\n","[*] 657\n","[*]
658\n","[*] 659\n","[*] 660\n","[*] 661\n","[*] 662\n","[*] 663\n","[*]
664\n","[*] 665\n","[*] 666\n","[*] 667\n","[*] 668\n","[*] 669\n","[*]
670\n","[*] 671\n","[*] 672\n","[*] 673\n","[*] 674\n","[*] 675\n","[*]
676\n","[*] 677\n","[*] 678\n","[*] 679\n","[*] 680\n","[*] 681\n","[*]
682\n","[*] 683\n","[*] 684\n","[*] 685\n","[*] 686\n","[*] 687\n","[*]
688\n","[*] 689\n","[*] 690\n","[*] 691\n","[*] 692\n","[*] 693\n","[*]
694\n","[*] 695\n","[*] 696\n","[*] 697\n","[*] 698\n","[*] 699\n","[*]
700\n","[*] 701\n","[*] 702\n","[*] 703\n","[*] 704\n","[*] 705\n","[*]
706\n","[*] 707\n","[*] 708\n","[*] 709\n","[*] 710\n","[*] 711\n","[*]

46
712\n","[*] 713\n","[*] 714\n","[*] 715\n","[*] 716\n","[*] 717\n","[*]
718\n","[*] 719\n","[*] 720\n","[*] 721\n","[*] 722\n","[*] 723\n","[*]
724\n","[*] 725\n","[*] 726\n","[*] 727\n","[*] 728\n","[*] 729\n","[*]
730\n","[*] 731\n","[*] 732\n","[*] 733\n","[*] 734\n","[*] 735\n","[*]
736\n","[*] 737\n","[*] 738\n","[*] 739\n","[*] 740\n","[*] 741\n","[*]
742\n","[*] 743\n","[*] 744\n","[*] 745\n","[*] 746\n","[*] 747\n","[*]
748\n","[*] 749\n","[*] 750\n","[*] 751\n","[*] 752\n","[*] 753\n","[*]
754\n","[*] 755\n","[*] 756\n","[*] 757\n","[*] 758\n","[*] 759\n","[*]
760\n","[*] 761\n","[*] 762\n","[*] 763\n","[*] 764\n","[*] 765\n","[*]
766\n","[*] 767\n","[*] 768\n","[*] 769\n","[*] 770\n","[*] 771\n","[*]
772\n","[*] 773\n","[*] 774\n","[*] 775\n","[*] 776\n","[*] 777\n","[*]
778\n","[*] 779\n","[*] 780\n","[*] 781\n","[*] 782\n","[*] 783\n","[*]
784\n","[*] 785\n","[*] 786\n","[*] 787\n","[*] 788\n","[*] 789\n","[*]
790\n","[*] 791\n","[*] 792\n","[*] 793\n","[*] 794\n","[*] 795\n","[*]
796\n","[*] 797\n","[*] 798\n","[*] 799\n","[*] 800\n","[*] 801\n","[*]
802\n","[*] 803\n","[*] 804\n","[*] 805\n","[*] 806\n","[*] 807\n","[*]
808\n","[*] 809\n","[*] 810\n","[*] 811\n","[*] 812\n","[*] 813\n","[*]
814\n","[*] 815\n","[*] 816\n","[*] 817\n","[*] 818\n","[*] 819\n","[*]
820\n","[*] 821\n","[*] 822\n","[*] 823\n","[*] 824\n","[*] 825\n","[*]
826\n","[*] 827\n","[*] 828\n","[*] 829\n","[*] 830\n","[*] 831\n","[*]
832\n","[*] 833\n","[*] 834\n","[*] 835\n","[*] 836\n","[*] 837\n","[*]
838\n","[*] 839\n","[*] 840\n","[*] 841\n","[*] 842\n","[*] 843\n","[*]
844\n","[*] 845\n","[*] 846\n","[*] 847\n","[*] 848\n","[*] 849\n","[*]
850\n","[*] 851\n","[*] 852\n","[*] 853\n","[*] 854\n","[*] 855\n","[*]
856\n","[*] 857\n","[*] 858\n","[*] 859\n","[*] 860\n","[*] 861\n","[*]
862\n","[*] 863\n","[*] 864\n","[*] 865\n","[*] 866\n","[*] 867\n","[*]
868\n","[*] 869\n","[*] 870\n","[*] 871\n","[*] 872\n","[*] 873\n","[*]
874\n","[*] 875\n","[*] 876\n","[*] 877\n","[*] 878\n","[*] 879\n","[*]
880\n","[*] 881\n","[*] 882\n","[*] 883\n","[*] 884\n","[*] 885\n","[*]
886\n","[*] 887\n","[*] 888\n","[*] 889\n","[*] 890\n","[*] 891\n","[*]
892\n","[*] 893\n","[*] 894\n","[*] 895\n","[*] 896\n","[*] 897\n","[*]
898\n","[*] 899\n","[*] 900\n","[*] 901\n","[*] 902\n","[*] 903\n","[*]
904\n","[*] 905\n","[*] 906\n","[*] 907\n","[*] 908\n","[*] 909\n","[*]
910\n","[*] 911\n","[*] 912\n","[*] 913\n","[*] 914\n","[*] 915\n","[*]

47
916\n","[*] 917\n","[*] 918\n","[*] 919\n","[*] 920\n","[*] 921\n","[*]
922\n","[*] 923\n","[*] 924\n","[*] 925\n","[*] 926\n","[*] 927\n","[*]
928\n","[*] 929\n","[*] 930\n","[*] 931\n","[*] 932\n","[*] 933\n","[*]
934\n","[*] 935\n","[*] 936\n","[*] 937\n","[*] 938\n","[*] 939\n","[*]
940\n","[*] 941\n","[*] 942\n","[*] 943\n","[*] 944\n","[*] 945\n","[*]
946\n","[*] 947\n","[*] 948\n","[*] 949\n","[*] 950\n","[*] 951\n","[*]
952\n","[*] 953\n","[*] 954\n","[*] 955\n","[*] 956\n","[*] 957\n","[*]
958\n","[*] 959\n","[*] 960\n","[*] 961\n","[*] 962\n","[*] 963\n","[*]
964\n","[*] 965\n","[*] 966\n","[*] 967\n","[*] 968\n","[*] 969\n","[*]
970\n","[*] 971\n","[*] 972\n","[*] 973\n","[*] 974\n","[*] 975\n","[*]
976\n","[*] 977\n","[*] 978\n","[*] 979\n","[*] 980\n","[*] 981\n","[*]
982\n","[*] 983\n","[*] 984\n","[*] 985\n","[*] 986\n","[*] 987\n","[*]
988\n","[*] 989\n","[*] 990\n","[*] 991\n","[*] 992\n","[*] 993\n","[*]
994\n","[*] 995\n","[*] 996\n","[*] 997\n","[*] 998\n","[*] 999\n","[*]
1000\n","[*] 1001\n","[*] 1002\n","[*] 1003\n","[*] 1004\n","[*] 1005\n","[*]
1006\n","[*] 1007\n","[*] 1008\n","[*] 1009\n","[*] 1010\n","[*] 1011\n","[*]
1012\n","[*] 1013\n","[*] 1014\n","[*] 1015\n","[*] 1016\n","[*] 1017\n","[*]
1018\n","[*] 1019\n","[*] 1020\n","[*] 1021\n","[*] 1022\n","[*] 1023\n","[*]
1024\n","[*] 1025\n","[*] 1026\n","[*] 1027\n","[*] 1028\n","[*] 1029\n","[*]
1030\n","[*] 1031\n","[*] 1032\n","[*] 1033\n","[*] 1034\n","[*] 1035\n","[*]
1036\n","[*] 1037\n","[*] 1038\n","[*] 1039\n","[*] 1040\n","[*] 1041\n","[*]
1042\n","[*] 1043\n","[*] 1044\n","[*] 1045\n","[*] 1046\n","[*] 1047\n","[*]
1048\n","[*] 1049\n","[*] 1050\n","[*] 1051\n","[*] 1052\n","[*] 1053\n","[*]
1054\n","[*] 1055\n","[*] 1056\n","[*] 1057\n","[*] 1058\n","[*] 1059\n","[*]
1060\n","[*] 1061\n","[*] 1062\n","[*] 1063\n","[*] 1064\n","[*] 1065\n","[*]
1066\n","[*] 1067\n","[*] 1068\n","[*] 1069\n","[*] 1070\n","[*] 1071\n","[*]
1072\n","[*] 1073\n","[*] 1074\n","[*] 1075\n","[*] 1076\n","[*] 1077\n","[*]
1078\n","[*] 1079\n","[*] 1080\n","[*] 1081\n","[*] 1082\n","[*] 1083\n","[*]
1084\n","[*] 1085\n","[*] 1086\n","[*] 1087\n","[*] 1088\n","[*] 1089\n","[*]
1090\n","[*] 1091\n","[*] 1092\n","[*] 1093\n","[*] 1094\n","[*] 1095\n","[*]
1096\n","[*] 1097\n","[*] 1098\n","[*] 1099\n","[*] 1100\n","[*] 1101\n","[*]
1102\n","[*] 1103\n","[*] 1104\n","[*] 1105\n","[*] 1106\n","[*] 1107\n","[*]
1108\n","[*] 1109\n","[*] 1110\n","[*] 1111\n","[*] 1112\n","[*] 1113\n","[*]
1114\n","[*] 1115\n","[*] 1116\n","[*] 1117\n","[*] 1118\n","[*] 1119\n","[*]

48
1120\n","[*] 1121\n","[*] 1122\n","[*] 1123\n","[*] 1124\n","[*] 1125\n","[*]
1126\n","[*] 1127\n","[*] 1128\n","[*] 1129\n","[*] 1130\n","[*] 1131\n","[*]
1132\n","[*] 1133\n","[*] 1134\n","[*] 1135\n","[*] 1136\n","[*] 1137\n","[*]
1138\n","[*] 1139\n","[*] 1140\n","[*] 1141\n","[*] 1142\n","[*] 1143\n","[*]
1144\n","[*] 1145\n","[*] 1146\n","[*] 1147\n","[*] 1148\n","[*] 1149\n","[*]
1150\n","[*] 1151\n","[*] 1152\n","[*] 1153\n","[*] 1154\n","[*] 1155\n","[*]
1156\n","[*] 1157\n","[*] 1158\n","[*] 1159\n","[*] 1160\n","[*] 1161\n","[*]
1162\n","[*] 1163\n","[*] 1164\n","[*] 1165\n","[*] 1166\n","[*] 1167\n","[*]
1168\n","[*] 1169\n","[*] 1170\n","[*] 1171\n","[*] 1172\n","[*] 1173\n","[*]
1174\n","[*] 1175\n","[*] 1176\n","[*] 1177\n","[*] 1178\n","[*] 1179\n","[*]
1180\n","[*] 1181\n","[*] 1182\n","[*] 1183\n","[*] 1184\n","[*] 1185\n","[*]
1186\n","[*] 1187\n","[*] 1188\n","[*] 1189\n","[*] 1190\n","[*] 1191\n","[*]
1192\n","[*] 1193\n","[*] 1194\n","[*] 1195\n","[*] 1196\n","[*] 1197\n","[*]
1198\n","[*] 1199\n","[*] 1200\n","[*] 1201\n","[*] 1202\n","[*] 1203\n","[*]
1204\n","[*] 1205\n","[*] 1206\n","[*] 1207\n","[*] 1208\n","[*] 1209\n","[*]
1210\n","[*] 1211\n","[*] 1212\n","[*] 1213\n","[*] 1214\n","[*] 1215\n","[*]
1216\n","[*] 1217\n","[*] 1218\n","[*] 1219\n","[*] 1220\n","[*] 1221\n","[*]
1222\n","[*] 1223\n","[*] 1224\n","[*] 1225\n","[*] 1226\n","[*] 1227\n","[*]
1228\n","[*] 1229\n","[*] 1230\n","[*] 1231\n","[*] 1232\n","[*] 1233\n","[*]
1234\n","[*] 1235\n","[*] 1236\n","[*] 1237\n","[*] 1238\n","[*] 1239\n","[*]
1240\n","[*] 1241\n","[*] 1242\n","[*] 1243\n","[*] 1244\n","[*] 1245\n","[*]
1246\n","[*] 1247\n","[*] 1248\n","[*] 1249\n","[*] 1250\n","[*] 1251\n","[*]
1252\n","[*] 1253\n","[*] 1254\n","[*] 1255\n","[*] 1256\n","[*] 1257\n","[*]
1258\n","[*] 1259\n","[*] 1260\n","[*] 1261\n","[*] 1262\n","[*] 1263\n","[*]
1264\n","[*] 1265\n","[*] 1266\n","[*] 1267\n","[*] 1268\n","[*] 1269\n","[*]
1270\n","[*] 1271\n","[*] 1272\n","[*] 1273\n","[*] 1274\n","[*] 1275\n","[*]
1276\n","[*] 1277\n","[*] 1278\n","[*] 1279\n","[*] 1280\n","[*] 1281\n","[*]
1282\n","[*] 1283\n","[*] 1284\n","[*] 1285\n","[*] 1286\n","[*] 1287\n","[*]
1288\n","[*] 1289\n","[*] 1290\n","[*] 1291\n","[*] 1292\n","[*] 1293\n","[*]
1294\n","[*] 1295\n","[*] 1296\n","[*] 1297\n","[*] 1298\n","[*] 1299\n","[*]
1300\n","[*] 1301\n","[*] 1302\n","[*] 1303\n","[*] 1304\n","[*] 1305\n","[*]
1306\n","[*] 1307\n","[*] 1308\n","[*] 1309\n","[*] 1310\n","[*] 1311\n","[*]
1312\n","[*] 1313\n","[*] 1314\n","[*] 1315\n","[*] 1316\n","[*] 1317\n","[*]
1318\n","[*] 1319\n","[*] 1320\n","[*] 1321\n","[*] 1322\n","[*] 1323\n","[*]

49
1324\n","[*] 1325\n","[*] 1326\n","[*] 1327\n","[*] 1328\n","[*] 1329\n","[*]
1330\n","[*] 1331\n","[*] 1332\n","[*] 1333\n","[*] 1334\n","[*] 1335\n","[*]
1336\n","[*] 1337\n","[*] 1338\n","[*] 1339\n","[*] 1340\n","[*] 1341\n","[*]
1342\n","[*] 1343\n","[*] 1344\n","[*] 1345\n","[*] 1346\n","[*] 1347\n","[*]
1348\n","[*] 1349\n","[*] 1350\n","[*] 1351\n","[*] 1352\n","[*] 1353\n","[*]
1354\n","[*] 1355\n","[*] 1356\n","[*] 1357\n","[*] 1358\n","[*] 1359\n","[*]
1360\n","[*] 1361\n","[*] 1362\n","[*] 1363\n","[*] 1364\n","[*] 1365\n","[*]
1366\n","[*] 1367\n","[*] 1368\n","[*] 1369\n","[*] 1370\n","[*] 1371\n","[*]
1372\n","[*] 1373\n","[*] 1374\n","[*] 1375\n","[*] 1376\n","[*] 1377\n","[*]
1378\n","[*] 1379\n","[*] 1380\n","[*] 1381\n","[*] 1382\n","[*] 1383\n","[*]
1384\n","[*] 1385\n","[*] 1386\n","[*] 1387\n","[*] 1388\n","[*] 1389\n","[*]
1390\n","[*] 1391\n","[*] 1392\n","[*] 1393\n","[*] 1394\n","[*] 1395\n","[*]
1396\n","[*] 1397\n","[*] 1398\n","[*] 1399\n","[*] 1400\n","[*] 1401\n","[*]
1402\n","[*] 1403\n","[*] 1404\n","[*] 1405\n","[*] 1406\n","[*] 1407\n","[*]
1408\n","[*] 1409\n","[*] 1410\n","[*] 1411\n","[*] 1412\n","[*] 1413\n","[*]
1414\n","[*] 1415\n","[*] 1416\n","[*] 1417\n","[*] 1418\n","[*] 1419\n","[*]
1420\n","[*] 1421\n","[*] 1422\n","[*] 1423\n","[*] 1424\n","[*] 1425\n","[*]
1426\n","[*] 1427\n","[*] 1428\n","[*] 1429\n","[*] 1430\n","[*] 1431\n","[*]
1432\n","[*] 1433\n","[*] 1434\n","[*] 1435\n","[*] 1436\n","[*] 1437\n","[*]
1438\n","[*] 1439\n","[*] 1440\n","[*] 1441\n","[*] 1442\n","[*] 1443\n","[*]
1444\n","[*] 1445\n","[*] 1446\n","[*] 1447\n","[*] 1448\n","[*] 1449\n","[*]
1450\n","[*] 1451\n","[*] 1452\n","[*] 1453\n","[*] 1454\n","[*] 1455\n","[*]
1456\n","[*] 1457\n","[*] 1458\n","[*] 1459\n","[*] 1460\n","[*] 1461\n","[*]
1462\n","[*] 1463\n","[*] 1464\n","[*] 1465\n","[*] 1466\n","[*] 1467\n","[*]
1468\n","[*] 1469\n","[*] 1470\n","[*] 1471\n","[*] 1472\n","[*] 1473\n","[*]
1474\n","[*] 1475\n","[*] 1476\n","[*] 1477\n","[*] 1478\n","[*] 1479\n","[*]
1480\n","[*] 1481\n","[*] 1482\n","[*] 1483\n","[*] 1484\n","[*] 1485\n","[*]
1486\n","[*] 1487\n","[*] 1488\n","[*] 1489\n","[*] 1490\n","[*] 1491\n","[*]
1492\n","[*] 1493\n","[*] 1494\n","[*] 1495\n","[*] 1496\n","[*] 1497\n","[*]
1498\n","[*] 1499\n","[*] 1500\n","[*] 1501\n","[*] 1502\n","[*] 1503\n","[*]
1504\n","[*] 1505\n","[*] 1506\n","[*] 1507\n","[*] 1508\n","[*] 1509\n","[*]
1510\n","[*] 1511\n","[*] 1512\n","[*] 1513\n","[*] 1514\n","[*] 1515\n","[*]
1516\n","[*] 1517\n","[*] 1518\n","[*] 1519\n","[*] 1520\n","[*] 1521\n","[*]
1522\n","[*] 1523\n","[*] 1524\n","[*] 1525\n","[*] 1526\n","[*] 1527\n","[*]

50
1528\n","[*] 1529\n","[*] 1530\n","[*] 1531\n","[*] 1532\n","[*] 1533\n","[*]
1534\n","[*] 1535\n","[*] 1536\n","[*] 1537\n","[*] 1538\n","[*] 1539\n","[*]
1540\n","[*] 1541\n","[*] 1542\n","[*] 1543\n","[*] 1544\n","[*] 1545\n","[*]
1546\n","[*] 1547\n","[*] 1548\n","[*] 1549\n","[*] 1550\n","[*] 1551\n","[*]
1552\n","[*] 1553\n","[*] 1554\n","[*] 1555\n","[*] 1556\n","[*] 1557\n","[*]
1558\n","[*] 1559\n","[*] 1560\n","[*] 1561\n","[*] 1562\n","[*] 1563\n","[*]
1564\n","[*] 1565\n","[*] 1566\n","[*] 1567\n","[*] 1568\n","[*] 1569\n","[*]
1570\n","[*] 1571\n","[*] 1572\n","[*] 1573\n","[*] 1574\n","[*] 1575\n","[*]
1576\n","[*] 1577\n","[*] 1578\n","[*] 1579\n","[*] 1580\n","[*] 1581\n","[*]
1582\n","[*] 1583\n","[*] 1584\n","[*] 1585\n","[*] 1586\n","[*] 1587\n","[*]
1588\n","[*] 1589\n","[*] 1590\n","[*] 1591\n","[*] 1592\n","[*] 1593\n","[*]
1594\n","[*] 1595\n","[*] 1596\n","[*] 1597\n","[*] 1598\n","[*] 1599\n","[*]
1600\n","[*] 1601\n","[*] 1602\n","[*] 1603\n","[*] 1604\n","[*] 1605\n","[*]
1606\n","[*] 1607\n","[*] 1608\n","[*] 1609\n","[*] 1610\n","[*] 1611\n","[*]
1612\n","[*] 1613\n","[*] 1614\n","[*] 1615\n","[*] 1616\n","[*] 1617\n","[*]
1618\n","[*] 1619\n","[*] 1620\n","[*] 1621\n","[*] 1622\n","[*] 1623\n","[*]
1624\n","[*] 1625\n","[*] 1626\n","[*] 1627\n","[*] 1628\n","[*] 1629\n","[*]
1630\n","[*] 1631\n","[*] 1632\n","[*] 1633\n","[*] 1634\n","[*] 1635\n","[*]
1636\n","[*] 1637\n","[*] 1638\n","[*] 1639\n","[*] 1640\n","[*] 1641\n","[*]
1642\n","[*] 1643\n","[*] 1644\n","[*] 1645\n","[*] 1646\n","[*] 1647\n","[*]
1648\n","[*] 1649\n","[*] 1650\n","[*] 1651\n","[*] 1652\n","[*] 1653\n","[*]
1654\n","[*] 1655\n","[*] 1656\n","[*] 1657\n","[*] 1658\n","[*] 1659\n","[*]
1660\n","[*] 1661\n","[*] 1662\n","[*] 1663\n","[*] 1664\n","[*] 1665\n","[*]
1666\n","[*] 1667\n","[*] 1668\n","[*] 1669\n","[*] 1670\n","[*] 1671\n","[*]
1672\n","[*] 1673\n","[*] 1674\n","[*] 1675\n","[*] 1676\n","[*] 1677\n","[*]
1678\n","[*] 1679\n","[*] 1680\n","[*] 1681\n","[*] 1682\n","[*] 1683\n","[*]
1684\n","[*] 1685\n","[*] 1686\n","[*] 1687\n","[*] 1688\n","[*] 1689\n","[*]
1690\n","[*] 1691\n","[*] 1692\n","[*] 1693\n","[*] 1694\n","[*] 1695\n","[*]
1696\n","[*] 1697\n","[*] 1698\n","[*] 1699\n","[*] 1700\n","[*] 1701\n","[*]
1702\n","[*] 1703\n","[*] 1704\n","[*] 1705\n","[*] 1706\n","[*] 1707\n","[*]
1708\n","[*] 1709\n","[*] 1710\n","[*] 1711\n","[*] 1712\n","[*] 1713\n","[*]
1714\n","[*] 1715\n","[*] 1716\n","[*] 1717\n","[*] 1718\n","[*] 1719\n","[*]
1720\n","[*] 1721\n","[*] 1722\n","[*] 1723\n","[*] 1724\n","[*] 1725\n","[*]
1726\n","[*] 1727\n","[*] 1728\n","[*] 1729\n","[*] 1730\n","[*] 1731\n","[*]

51
1732\n","[*] 1733\n","[*] 1734\n","[*] 1735\n","[*] 1736\n","[*] 1737\n","[*]
1738\n","[*] 1739\n","[*] 1740\n","[*] 1741\n","[*] 1742\n","[*] 1743\n","[*]
1744\n","[*] 1745\n","[*] 1746\n","[*] 1747\n","[*] 1748\n","[*] 1749\n","[*]
1750\n","[*] 1751\n","[*] 1752\n","[*] 1753\n","[*] 1754\n","[*] 1755\n","[*]
1756\n","[*] 1757\n","[*] 1758\n","[*] 1759\n","[*] 1760\n","[*] 1761\n","[*]
1762\n","[*] 1763\n","[*] 1764\n","[*] 1765\n","[*] 1766\n","[*] 1767\n","[*]
1768\n","[*] 1769\n","[*] 1770\n","[*] 1771\n","[*] 1772\n","[*] 1773\n","[*]
1774\n","[*] 1775\n","[*] 1776\n","[*] 1777\n","[*] 1778\n","[*] 1779\n","[*]
1780\n","[*] 1781\n","[*] 1782\n","[*] 1783\n","[*] 1784\n","[*] 1785\n","[*]
1786\n","[*] 1787\n","[*] 1788\n","[*] 1789\n","[*] 1790\n","[*] 1791\n","[*]
1792\n","[*] 1793\n","[*] 1794\n","[*] 1795\n","[*] 1796\n","[*] 1797\n","[*]
1798\n","[*] 1799\n","[*] 1800\n","[*] 1801\n","[*] 1802\n","[*] 1803\n","[*]
1804\n","[*] 1805\n","[*] 1806\n","[*] 1807\n","[*] 1808\n","[*] 1809\n","[*]
1810\n","[*] 1811\n","[*] 1812\n","[*] 1813\n","[*] 1814\n","[*] 1815\n","[*]
1816\n","[*] 1817\n","[*] 1818\n","[*] 1819\n","[*] 1820\n","[*] 1821\n","[*]
1822\n","[*] 1823\n","[*] 1824\n","[*] 1825\n","[*] 1826\n","[*] 1827\n","[*]
1828\n","[*] 1829\n","[*] 1830\n","[*] 1831\n","[*] 1832\n","[*] 1833\n","[*]
1834\n","[*] 1835\n","[*] 1836\n","[*] 1837\n","[*] 1838\n","[*] 1839\n","[*]
1840\n","[*] 1841\n","[*] 1842\n","[*] 1843\n","[*] 1844\n","[*] 1845\n","[*]
1846\n","[*] 1847\n","[*] 1848\n","[*] 1849\n","[*] 1850\n","[*] 1851\n","[*]
1852\n","[*] 1853\n","[*] 1854\n","[*] 1855\n","[*] 1856\n","[*] 1857\n","[*]
1858\n","[*] 1859\n","[*] 1860\n","[*] 1861\n","[*] 1862\n","[*] 1863\n","[*]
1864\n","[*] 1865\n","[*] 1866\n","[*] 1867\n","[*] 1868\n","[*] 1869\n","[*]
1870\n","[*] 1871\n","[*] 1872\n","[*] 1873\n","[*] 1874\n","[*] 1875\n","[*]
1876\n","[*] 1877\n","[*] 1878\n","[*] 1879\n","[*] 1880\n","[*] 1881\n","[*]
1882\n","[*] 1883\n","[*] 1884\n","[*] 1885\n","[*] 1886\n","[*] 1887\n","[*]
1888\n","[*] 1889\n","[*] 1890\n","[*] 1891\n","[*] 1892\n","[*] 1893\n","[*]
1894\n","[*] 1895\n","[*] 1896\n","[*] 1897\n","[*] 1898\n","[*] 1899\n","[*]
1900\n","[*] 1901\n","[*] 1902\n","[*] 1903\n","[*] 1904\n","[*] 1905\n","[*]
1906\n","[*] 1907\n","[*] 1908\n","[*] 1909\n","[*] 1910\n","[*] 1911\n","[*]
1912\n","[*] 1913\n","[*] 1914\n","[*] 1915\n","[*] 1916\n","[*] 1917\n","[*]
1918\n","[*] 1919\n","[*] 1920\n","[*] 1921\n","[*] 1922\n","[*] 1923\n","[*]
1924\n","[*] 1925\n","[*] 1926\n","[*] 1927\n","[*] 1928\n","[*] 1929\n","[*]
1930\n","[*] 1931\n","[*] 1932\n","[*] 1933\n","[*] 1934\n","[*] 1935\n","[*]

52
3442\n","[*] 3443\n","[*] 3444\n","[*] 3445\n","[*] 3446\n","[*] 3447\n","[*]
3448\n","[*] 3449\n","[*] 3450\n","[*] 3451\n","[*] 3452\n","[*] 3453\n","[*]
3454\n","[*] 3455\n","[*] 3456\n","[*] 3457\n","[*] 3458\n","[*] 3459\n","[*]
3460\n","[*] 3461\n","[*] 3462\n","[*] 3463\n","[*] 3464\n","[*] 3465\n","[*]
3466\n","[*] 3467\n","[*] 3468\n","[*] 3469\n","[*] 3470\n","[*] 3471\n","[*]
3472\n","[*] 3473\n","[*] 3474\n","[*] 3475\n","[*] 3476\n","[*] 3477\n","[*]
3478\n","[*] 3479\n","[*] 3480\n","[*] 3481\n","[*] 3482\n","[*] 3483\n","[*]
3484\n","[*] 3485\n","[*] 3486\n","[*] 3487\n","[*] 3488\n","[*] 3489\n","[*]
3490\n","[*] 3491\n","[*] 3492\n","[*] 3493\n","[*] 3494\n","[*] 3495\n","[*]
3496\n","[*] 3497\n","[*] 3498\n","[*] 3499\n","[*]
3500\n"]}],"source":["import cv2\n","from PIL import Image\n","import numpy as
np\n","\n","# Initialize empty lists to hold the dataset images and their
corresponding labels\n","dataset = []\n","label = []\n","id = 0 # Counter to keep
track of the number of processed images\n","\n","# Loop through the first 3500
images in the PARASITIZED list\n","for img in PARASITIZED[:3500]:\n"," #
Read the image using OpenCV\n"," image = cv2.imread(img)\n","\n"," #
Convert the image to a PIL Image in RGB format\n"," image =
Image.fromarray(image, 'RGB')\n","\n"," # Resize the image to 150x150
pixels\n"," image = image.resize((150, 150))\n","\n"," # Append the processed
image (as a numpy array) to the dataset list\n","
dataset.append(np.array(image))\n","\n"," # Append the label '1' (indicating
'Parasitized') to the label list\n"," label.append(1)\n","\n"," # Increment the
image counter\n"," id += 1\n","\n"," # Print the current count of processed
images\n"," print('[*] ',
id)\n"]},{"cell_type":"code","execution_count":null,"metadata":{"id":"i0-
GIsTxZffH"},"outputs":[],"source":["# Reset the image counter\n","id =
0\n","\n","# Loop through the first 3500 images in the UNINFECTED list\n","for
img in UNINFECTED[:3500]:\n"," # Read the image using OpenCV\n","
image = cv2.imread(img)\n","\n"," # Convert the image to a PIL Image in RGB
format\n"," image = Image.fromarray(image, 'RGB')\n","\n"," # Resize the
image to 150x150 pixels\n"," image = image.resize((150, 150))\n","\n"," #
Append the processed image (as a numpy array) to the dataset list\n","
dataset.append(np.array(image))\n","\n"," # Append the label '0' (indicating
'Uninfected') to the label list\n"," label.append(0)\n","\n"," # Increment the

53
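Later cells reference X_test and a trained model, so the intermediate steps
(array conversion, train/test split, and the CNN itself) are sketched below
under stated assumptions: a simple Keras network with a single sigmoid
output unit, an 80/20 split, and ten training epochs. The layer sizes,
optimizer, and epoch count are illustrative choices, not the exact
configuration used in this project.

from sklearn.model_selection import train_test_split
from tensorflow import keras
from tensorflow.keras import layers

# Convert the image and label lists to numpy arrays
data = np.array(dataset)
labels = np.array(label)

# Hold out 20% of the cells for testing (the split ratio is an assumption)
X_train, X_test, y_train, y_test = train_test_split(
    data, labels, test_size=0.2, random_state=42)

# A small illustrative CNN; the Rescaling layer lets the model accept raw
# 0-255 pixel values, and the single sigmoid unit outputs P(parasitized)
model = keras.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=(150, 150, 3)),
    layers.Conv2D(32, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation='relu'),
    layers.Dense(1, activation='sigmoid'),
])

model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])
model.fit(X_train, y_train, validation_split=0.1, epochs=10, batch_size=32)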
Axes>"]},"metadata":{"tags":[]},"output_type":"display_data"}],"source":["n =
190\n","\n","img = X_test[n]\n","plt.imshow(img)\n","\n","input_img =
np.expand_dims(img, axis=0)\n","\n","prediction =
model.predict(input_img)\n","prediction = int(prediction)\n","if prediction ==
0:\n"," print(\"The Cell Is Not Infected So The Person Has Not Malaria.\")\n","elif
prediction == 1:\n"," print(\"The Cell Is Infected So The Person Has
Malaria.\")\n","else:\n","
pass"]},{"cell_type":"code","execution_count":null,"metadata":{"id":"Ait42I_JvLj
j"},"outputs":[],"source":[]}],"metadata":{"accelerator":"GPU","colab":{"gpuType
":"T4","toc_visible":true,"provenance":[]},"kernelspec":{"display_name":"Python
3","name":"python3"}},"nbformat":4,"nbformat_minor":0}

Appendix B
SAMPLE OUTPUT
