TABLE I
Comparison of Existing Solutions With RiceBioS
B. Research Gaps
The solutions compared above have, in their own specific ways, contributed to rice crop biotic stress monitoring and disease prediction, either by proposing an accurate classification model, by image processing analysis, or by fusing sensed data and images to make decisions. A feature-wise tabulated description of the prior art discussed in Subsection II-A is given in Table I. The major issues that were not diligently addressed in most of the prior art are:
• Few attempts have been made to perform AI-driven identification of biotic stress in crops.
• Automated biotic stress detection and classification methods were not provided.
• Edge-based solutions focusing on crop health are absent.
• High computational resources are required to deploy classification models.

III. RiceBioS: The Proposed Sensing System
We propose RiceBioS, a system that enables the rice-growing farmer to conveniently identify the current state of the cultivated rice crop. The algorithm segregates the state of the rice crop into three prominent categories: the crop can either be healthy or stressed, and in case the status predicted by the algorithm is stressed, the biotic stress is broadly categorised into two types, bacterial and fungal. This classification of a crop as healthy, biotic stressed (bacterial), or biotic stressed (fungal) helps the farmer preempt future consequences to the crop and take suitable action. These predictions are done on the device, without having to push any data to a centralized server. The functional representation of the process is given in Fig. 3.

Fig. 3. Conceptual view of the RiceBioS sensing system provisioning edge-as-a-service.

TABLE II
Rice Crop Image Dataset Distribution

Bacterial and fungal biotic stress conditions are two among the many stress conditions or diseases that normally infect any rice crop. To create the three-way classification image dataset for the above-mentioned rice diseases, images of the whole plant as well as of its infected parts were taken into consideration. The entire work of building and collecting the dataset was carried out with the help of our collaborator, Indira Gandhi Krishi Vishvavidyalaya (IGKV), Raipur. To ensure that the framework does not rely completely on the image-capturing capabilities of the device, the dataset has a mix of images taken with a smartphone camera (Samsung Galaxy M11) and a professional digital camera (Nikon D5600 DSLR) with different apertures, focusing abilities, and camera quality. The proposed framework is versatile enough to accommodate variations in image size and in the capturing environment, such as brightness and illumination values. The image dataset was validated by agriculture experts at IGKV, Raipur. A brief description of the image dataset is given in Table II.
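As a minimal sketch of how a three-class image dataset organized this way can be loaded for training, the snippet below uses TensorFlow/Keras directory loading; the directory name, folder labels, split ratio, and image size are illustrative assumptions and are not specified in the paper.

```python
import tensorflow as tf

# Hypothetical directory layout (not from the paper):
#   rice_dataset/
#     healthy/ ...    bacterial/ ...    fungal/ ...
train_ds = tf.keras.utils.image_dataset_from_directory(
    "rice_dataset",
    validation_split=0.2,     # assumed split, for illustration only
    subset="training",
    seed=42,
    image_size=(224, 224),    # images from both cameras resized to one shape
    batch_size=32,
)
class_names = train_ds.class_names  # e.g. ['bacterial', 'fungal', 'healthy']
```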
Fig. 4. Sample image preprocessing results: (a) original image, (b) grayscale-converted image, (c) 'a' channel of the LAB color space, (d) histogram equalization output, (e) dilated result, and (f) 2D Otsu adaptive threshold output.

Fig. 5. (a) Original image, (b) perspective transformation, (c) affine transformation, and (d) rotation.
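The preprocessing chain named in the Fig. 4 caption can be sketched with OpenCV as follows; the kernel size and the use of OpenCV's built-in one-dimensional Otsu threshold in place of a 2D Otsu variant are assumptions made for illustration.

```python
import cv2
import numpy as np

def preprocess(img_bgr):
    """Apply the preprocessing steps named in Fig. 4 to a BGR leaf image."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)                # (b) grayscale
    a_channel = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2LAB)[:, :, 1]   # (c) 'a' channel of LAB
    equalized = cv2.equalizeHist(a_channel)                         # (d) histogram equalization
    kernel = np.ones((3, 3), np.uint8)                              # assumed kernel size
    dilated = cv2.dilate(equalized, kernel, iterations=1)           # (e) dilation
    # (f) Otsu threshold; standard Otsu used here as a stand-in for the
    # 2D Otsu adaptive threshold referenced in the caption.
    _, mask = cv2.threshold(dilated, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return gray, mask

# Usage: gray, mask = preprocess(cv2.imread("leaf.jpg"))
```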
an algorithm based on deep learning techniques. It takes an image as input and relies on learned weights, which ultimately help the algorithm differentiate between different objects in a single frame. We have an RGB image that has been separated into its three color planes: red, green, and blue. The proposed network architecture is graphically demonstrated in Fig. 6.

The convolutional layer primarily helps in capturing low-level features such as edges, color, and gradient orientation. Here, it captures minute details such as rice crop leaf texture, image characteristics of stress-causing pests, and the texture and color coding of infected portions of the leaf. The initial convolution layers extract these low-level features, and subsequent layers combine them into higher-level features. A kernel matrix is slid over the input image to perform the convolution operation [20]. Forward propagation is the straightforward, input-driven pass, expressed in Eq. (2):

$$ q_{ij}^{(k)} = \sum_{i=1}^{m} \sum_{j=1}^{n} W_{r}^{k}\, y_{(i+r)(j+n)} + b \qquad (2) $$

where $q_{ij}$ is the result of adding the bias $b$ to the product of $W_{r}^{k}$, the propagation path, and $y_{(i+r)(j+n)}$, the input features.

A feedback-based, closed-loop backward propagation helps eliminate redundant irregularities in the network and makes it responsive to versatile inputs. The backward propagation path and gradient are calculated in Eq. (3) and Eq. (4):

$$ \frac{\partial V}{\partial W_{r}^{k}} = \sum_{i=1}^{M-m+1} \sum_{j=1}^{N-n+1} \frac{\partial V}{\partial q_{ij}^{k}} \frac{\partial q_{ij}^{k}}{\partial W_{r}^{k}} \qquad (3) $$

$$ \frac{\partial V}{\partial W_{r}} = \sum_{i=1}^{M-m+1} \sum_{j=1}^{N-n+1} \frac{\partial V}{\partial q_{ij}^{k}}\, y_{(i+r)(j+n)} \qquad (4) $$

where $V$ is the resultant vector (which is given as input to the pooling layer) with respect to the propagation path $W_{r}$, aggregated over a range of $M, N$ discrete time intervals, taking into consideration the forward propagation vector $q_{ij}$ [21].
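For concreteness, the following NumPy sketch spells out the operations that Eq. (2) and Eq. (4) denote, written as a plain valid cross-correlation and its kernel gradient; the variable names are illustrative, and the loops are kept explicit for readability rather than speed.

```python
import numpy as np

def conv2d_forward(y, W, b):
    """Valid 2-D cross-correlation as in Eq. (2): each output q[i, j] sums the
    kernel W over the input patch it currently overlaps, plus a scalar bias b."""
    m, n = W.shape                       # kernel height and width
    M, N = y.shape                       # input height and width
    q = np.zeros((M - m + 1, N - n + 1))
    for i in range(q.shape[0]):
        for j in range(q.shape[1]):
            q[i, j] = np.sum(W * y[i:i + m, j:j + n]) + b
    return q

def conv2d_grad_w(y, dV_dq):
    """Kernel gradient as in Eq. (4): each weight accumulates dV/dq[i, j]
    times the input value it multiplied during the forward pass."""
    M, N = y.shape
    Mo, No = dV_dq.shape                 # output (q) dimensions
    dW = np.zeros((M - Mo + 1, N - No + 1))
    for r in range(dW.shape[0]):
        for c in range(dW.shape[1]):
            dW[r, c] = np.sum(dV_dq * y[r:r + Mo, c:c + No])
    return dW
```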
The pooling layer shown in Fig. 6 takes care of reducing the features convolved in the previous layers. To enable edge computing, with low computational power as a requisite, reducing the dimensionality of the features becomes essential. Noise suppression is done through max pooling. Average pooling is used for this purpose in some cases, but after evaluating the initial results, max pooling was preferred. The above process designs a blueprint for the model to understand the features of the captured image, such as texture, color, shape, and size. Once the model is trained on the entire dataset, it is able to spot variations in the above-mentioned features and thus classifies the image into a particular kind of irregularity (biotic stress).

Algorithm 1 (Rice_BioS) describes, in order of precedence, image acquisition, transfer to the mobile application, preprocessing (explained in Section III-B), forward and

Algorithm 1 Rice_BioS: To Identify Biotic Stress and the Category of Stress
Input: Img_Leaf
Inference: Healthy, or Biotic Stressed - Bacterial or Fungal
1: Img_Leaf = Rice crop leaf image
2: OUTPUT = Final prediction
3: TEMP = Resultant image from the intermediate layers
*: Computation is done at the edge (smartphone)
**: Communication with the cloud
4: for each Img_Leaf do
5:   if ReadFromSmartphoneCamera(Img_Leaf) == 1 then
6:     SendToApp(Img_Leaf)*;
7:   end if
8:   Enhanced_Img_Leaf = Preprocess(Img_Leaf)*;
9:   TEMP = InitializeInputFeatures(Enhanced_Img_Leaf)*;
10:  for i in layer_n do
11:    TEMP = OUT(layer_n)*;
12:  end for
13:  if layer_n == FC or activation then
14:    Result = LabelBinarizer(TEMP)**;
15:  end if
16:  if Result == 2'b00 then
17:    OUTPUT = Healthy;
18:  else
19:    if Result == 2'b01 then
20:      OUTPUT = Biotic Stressed - Bacterial;
21:    else
22:      if Result == 2'b11 then
23:        OUTPUT = Biotic Stressed - Fungal;
24:      end if
25:    end if
26:  end if
27: end for
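A compact Keras sketch of the convolution-and-max-pooling pipeline described above (and traversed in lines 10-12 of Algorithm 1) is given below; the number of layers, filter counts, and input resolution are illustrative assumptions, since the actual RiceBioS topology is the one shown in Fig. 6.

```python
import tensorflow as tf

def build_classifier(input_shape=(224, 224, 3), num_classes=3):
    """Illustrative conv + max-pooling classifier; sizes are assumed, not the paper's."""
    return tf.keras.Sequential([
        # Early convolutions pick up low-level cues (edges, color, texture).
        tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same",
                               input_shape=input_shape),
        tf.keras.layers.MaxPooling2D(),   # max pooling: dimensionality reduction / noise suppression
        tf.keras.layers.Conv2D(64, 3, activation="relu", padding="same"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(128, 3, activation="relu", padding="same"),
        tf.keras.layers.GlobalAveragePooling2D(),
        # Fully connected head maps pooled features to the three classes:
        # healthy, biotic stressed (bacterial), biotic stressed (fungal).
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

model = build_classifier()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```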
TABLE III
Training Parameters of Different Models

TABLE IV
Test for Overfitting of Data
Fig. 10. RiceBioS mobile application real-time testing results with (a) varying lighting conditions: higher prediction accuracy in a well-lit environment, and (b) changing background noise: versatile in adapting to background noise in image samples.

The training parameters of the compared models (AlexNet, VGG16, Inception v3, MobileNet v2, and ResNet50) are given in Table III. The results show that, on the same dataset, the performance of RiceBioS is the best among AlexNet, VGG16, Inception v3, MobileNet v2, and ResNet50 [22].
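To illustrate how one of the baselines compared in Table III can be fine-tuned for the same three-class task, a minimal Keras transfer-learning sketch follows; the frozen backbone, head layers, and optimizer settings are assumptions for illustration and are not the training parameters reported in the table.

```python
import tensorflow as tf

# Illustrative fine-tuning of one baseline (MobileNetV2) on the 3-class task.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False                         # freeze the ImageNet feature extractor

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(3, activation="softmax"),   # healthy / bacterial / fungal
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```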
C. Prototype Level Deployment of Mobile Application
After training, the model was stored as a Keras saved model. To make it deployable in a mobile application environment, it was first frozen and compressed into a TensorFlow Lite saved model and then exported to the Android application development environment [23]. The RiceBioS model had to be frozen so that predictions could be made on the device itself rather than on a cloud server. The compressed, pruned, and lightweight model is embedded in the user interface of the developed application. The model, bundled with the mobile application to enable edge computing, makes it a complete product with minimum compromise in performance.

Figure 11 presents working schematics of the application that depict the real-time functionality of the RiceBioS framework on an Android smartphone. Chronologically, the app testing done by farmers and the launch and image-capturing interfaces shown in Fig. 11(a), (b), and (c) are followed by real-time predictions of biotic stress conditions. While Fig. 11(d) predicts that the rice crop is healthy, other samples, when tested, show infected or stressed status: bacterial biotic stress and fungal biotic stress, as presented in Fig. 11(e) and Fig. 11(f), respectively.
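The Keras-to-TensorFlow-Lite conversion described at the start of this subsection can be sketched as follows; the file paths and the choice of default optimization are illustrative assumptions, not the exact export settings used for the released application.

```python
import tensorflow as tf

# Load the trained Keras saved model (hypothetical path) and convert it to a
# compact TensorFlow Lite flatbuffer for on-device (edge) inference.
model = tf.keras.models.load_model("ricebios_keras_model")

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]   # post-training size/latency optimization
tflite_model = converter.convert()

# The resulting .tflite file is what the Android app bundles and queries locally.
with open("ricebios.tflite", "wb") as f:
    f.write(tflite_model)
```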
The application occupies 176 MB of ROM and approximately 512 MB of RAM while in use. For testing purposes, the application was run on a variety of smartphones: the Redmi 9 Prime, Samsung Galaxy M11, and OnePlus Nord. The processors on which the computational requirements of this application were prototyped were the MediaTek Helio G80, Qualcomm Snapdragon 450, and Qualcomm Snapdragon 765G. These processors are based on 14 nm FinFET technology with a 1.8 GHz octa-core CPU. The application was tested to examine the computational performance of RiceBioS (Table V), while the test accuracy of sample images is shown in Fig. 7. The application does not require an internet connection to compute the results, but approximately 160 MB of OBB files must be downloaded once it is installed. This helps the Android application to be well packaged and scalable to a larger number of low- and mid-range smartphones, which are expected to be affordable for the targeted rural farming audience. An extensive test of the mobile application illustrated in Fig. 11 was done, and the results (shown in Fig. 10) demonstrate its performance in varying image-capturing scenarios, thus giving the optimal usage conditions for attaining the best results. The application, not being heavy and requiring negligible background activity, does not slow down an already burdened low-range chipset. The performance of the application on low-end or budget smartphones shows that the end-user experience is reliable. The user interface of the application has been carefully designed to cater to the needs of farmers. A graphical user interface with less clutter on the screen and more intuitive, comprehensible information makes it a plug-and-play solution for any farmer, irrespective of the language or technological skill of the user.

D. Field Testing and Deployment
The experimental study and deployment of RiceBioS were done in Fulkarra, a remote village located in Gariyaband district, Chhattisgarh, India. The village does not have consistent LTE internet connectivity and is mainly covered by EDGE networks. Four rice crop fields of varied maturity levels were considered for testing the solution. In order to get the end users' (farmers') perspective, 35 farmer volunteers were
Fig. 11. (a) Farmer capturing an image using a smartphone camera to test the developed application; RiceBioS deployed in an Android application: (b) app launched from the home screen, (c) first or capturing screen, (d) result for sample 1, (e) result for sample 2, and (f) result for sample 3.

TABLE V
RiceBioS Mobile Application Testing Performance on Handheld Devices
[2] V. Udutalapally, S. P. Mohanty, V. Pallagani, and V. Khandelwal, "SCrop: A novel device for sustainable automatic disease prediction, crop selection, and irrigation in internet-of-agro-things for smart agriculture," IEEE Sensors J., vol. 21, no. 16, pp. 17525–17538, Aug. 2021.
[3] Y. Zhu et al., "Genetic diversity and disease control in rice," Nature, vol. 406, no. 6797, pp. 718–722, 2000.
[4] S. K. Roy, S. Misra, N. S. Raghuwanshi, and S. K. Das, "AgriSens: IoT-based dynamic irrigation scheduling system for water management of irrigated crops," IEEE Internet Things J., vol. 8, no. 6, pp. 5023–5030, Mar. 2021.
[5] S. Kaur, S. Pandey, and S. Goel, "Semi-automatic leaf disease detection and classification system for soybean culture," IET Image Process., vol. 12, no. 6, pp. 1038–1048, 2018.
[6] N. Goel, D. Jain, and A. Sinha, "Prediction model for automated leaf disease detection & analysis," in Proc. IEEE 8th Int. Advance Comput. Conf. (IACC), Dec. 2018, pp. 360–365.
[7] H. Wang, G. Li, Z. Ma, and X. Li, "Image recognition of plant diseases based on backpropagation networks," in Proc. 5th Int. Congr. Image Signal Process., Oct. 2012, pp. 894–900.
[8] G. Zhou, W. Zhang, A. Chen, M. He, and X. Ma, "Rapid detection of Rice disease based on FCM-KM and faster R-CNN fusion," IEEE Access, vol. 7, pp. 143190–143206, 2019.
[9] W.-L. Chen, Y.-B. Lin, F.-L. Ng, C.-Y. Liu, and Y.-W. Lin, "RiceTalk: Rice blast detection using Internet of Things and artificial intelligence technologies," IEEE Internet Things J., vol. 7, no. 2, pp. 1001–1010, Feb. 2020.
[10] M. J. Hasan, S. Mahbub, M. S. Alom, and M. Abu Nasim, "Rice disease identification and classification by integrating support vector machine with deep convolutional neural network," in Proc. 1st Int. Conf. Adv. Sci., Eng. Robot. Technol. (ICASERT), May 2019, pp. 1–6.
[11] N. Yang, Y. Qian, H. S. El-Mesery, R. Zhang, A. Wang, and J. Tang, "Rapid detection of Rice disease using microscopy image identification based on the synergistic judgment of texture and shape features and decision tree–confusion matrix method," J. Sci. Food Agricult., vol. 99, no. 14, pp. 6589–6600, 2019.
[12] Y. Kim, J.-H. Roh, and H. Kim, "Early forecasting of Rice blast disease using long short-term memory recurrent neural networks," Sustainability, vol. 10, no. 2, p. 34, Dec. 2017.
[13] M. Bach-Pages and G. M. Preston, "Methods to quantify biotic-induced stress in plants," in Host-Pathogen Interactions. New York, NY, USA: Humana Press, 2018, pp. 241–255.
[14] A. V. Zubler and J.-Y. Yoon, "Proximal methods for plant stress detection using optical sensors and machine learning," Biosensors, vol. 10, no. 12, p. 193, Nov. 2020.
[15] N. Mastrodimos, D. Lentzou, C. Templalexis, D. I. Tsitsigiannis, and G. Xanthopoulos, "Development of thermography methodology for early diagnosis of fungal infection in table grapes: The case of aspergillus carbonarius," Comput. Electron. Agricult., vol. 165, Oct. 2019, Art. no. 104972.
[16] E.-C. Oerke, P. Fröhling, and U. Steiner, "Thermographic assessment of scab disease on apple leaves," Precis. Agricult., vol. 12, no. 5, pp. 699–715, Oct. 2011.
[17] P. Pal, R. P. Sharma, S. Tripathi, C. Kumar, and D. Ramesh, "2.4 GHz RF received signal strength based node separation in WSN monitoring infrastructure for millet and Rice vegetation," IEEE Sensors J., vol. 21, no. 16, pp. 18298–18306, Aug. 2021.
[18] R. Dwivedi, S. Dey, C. Chakraborty, and S. Tiwari, "Grape disease detection network based on multi-task learning and attention features," IEEE Sensors J., vol. 21, no. 16, pp. 17573–17580, Aug. 2021.
[19] X. Xie, X. Zhang, B. He, D. Liang, D. Zhang, and L. Huang, "A system for diagnosis of wheat leaf diseases based on Android smartphone," Proc. SPIE, vol. 10155, Oct. 2016, Art. no. 1015526.
[20] S. Joshi, D. K. Verma, G. Saxena, and A. Paraye, "Issues in training a convolutional neural network model for image classification," in Proc. ICACDS. Singapore: Springer, 2019, pp. 282–293.
[21] N. Skatchkovsky, H. Jang, and O. Simeone, "Spiking neural networks—Part II: Detecting spatio-temporal patterns," IEEE Commun. Lett., vol. 25, no. 6, pp. 1741–1745, Jun. 2021.
[22] E. C. Too, L. Yujian, S. Njuki, and L. Yingchun, "A comparative study of fine-tuning deep learning models for plant disease identification," Comput. Electron. Agricult., vol. 161, pp. 272–279, Jun. 2019.
[23] J. Park, Y. Kwon, Y. Park, and D. Jeon, "Microarchitecture-aware code generation for deep learning on single-ISA heterogeneous multi-core mobile processors," IEEE Access, vol. 7, pp. 52371–52378, 2019.