Smart Crop Protection System Using Deep Learning
Article Info
Received: 24-10-2024 Revised: 04-11-2024 Accepted: 12-11-2024 Published: 22-11-2024
Abstract:
Animal damage greatly affects crop yield, causing some farmers to record large losses. Farm animals such as buffaloes, cows, goats, and birds trespass into fields and trample crops, which is destructive for farmers since they cannot constantly guard their land. Measures such as barriers, wire fences, or personal vigilance yield insufficient results most of the time, and scarecrows and human effigies are easily bypassed by many animals. To address these problems, we introduce an AI-based scarecrow system that uses real-time video processing for crop protection from wildlife. The system records video with a camera and analyzes it using YOLOv3, an object detection model, together with OpenCV and the COCO class names. If any animal or bird is identified, the system produces a sound to deter it from invading the field. Moreover, if an animal is sensed for more than one minute continuously, the system alerts the farmer by sending an e-mail and dialing the farmer's phone number. This approach thus provides a more efficient and automated way of protecting crops than conventional deterrent measures.
Keywords: Crop Protection, Animal Detection, AI-based Scarecrow, Object Detection, Agricultural Monitoring, Sound Alerts, Automated Notifications, Wildlife Deterrent
1. INTRODUCTION:
Livestock continues to pose a threat to crops through invasion and trespass, resulting in crop loss and huge financial losses. To protect crops adequately, the farming industry needs solutions that are effective, dependable, and preemptive. The objective of this project is to examine the potential of applying the disruptive technology of Deep Learning to meet these challenges.
Journal of Science and Technology, ISSN: 2456-5660, Volume 9, Issue 11 (November 2024), www.jst.org.in, DOI: https://doi.org/10.46243/jst.2024.v9.i11.pp01-20

The traditional methods of crop protection involve periodic inspection and/or the use of barriers that are either manually operated or mechanically activated; these have been found to be quite ineffective and inaccurate, and they cannot easily or effectively be scaled up to large fields. These methods involve a lot of labor and, in addition, do not offer real-time automatic reactions to threats, meaning crops remain at risk of damage and the farmer's ability to maximize yields and protection is severely diminished.
To overcome these obstacles, the project employs surveillance based on Deep Learning, combining computer vision and machine learning to detect threats on the fly. The system detects animals or intruders in the field using the YOLO (You Only Look Once) object detection algorithm. This guarantees fast, efficient, and autonomous responses, significantly reducing the time and effort spent on monitoring.
This paper highlights several benefits of the Deep Learning-based system over conventional means of protecting crops. First, the system runs intrusion detection and classification in real time, guaranteeing minimal crop loss from intrusions. Second, computer vision algorithms identify multiple types of threats with high accuracy, so actions are only taken when required. Third, the system supports smart alerting tools such as sound playback, email, and phone alerts, which help farmers attend to exigent circumstances promptly, whether or not they need to move to the field. Finally, the incorporation of camera monitoring provides visual proof, which increases farmers' confidence in the system.
The system also uses machine learning to identify potential threats more accurately over time: as new threats or more detailed scenarios appear, the system learns to recognize them more reliably. Real-time video processing, combined with the capability of sending instantaneous email and phone alerts, makes it a strong solution for guarding crops against possible perils.
This project seeks to redesign the way crops are protected by incorporating improved, autonomous, and efficient Deep Learning technology. Its features, such as real-time monitoring of the crops, smart alerts, and prompt detection of problem areas, not only reduce manual labor but also lower the threat to crops, leading to higher and more secure productivity for farmers.
2. LITERATURE SURVEY:
Adami, D., Ojo, M. O., & Giordano, S. (2021). Development of an Embedded Edge-AI-based Animal Repelling System for Crop Protection: Design and Evaluation. This paper proposes a smart agriculture system with emphasis on protection against animal trespass, particularly ungulates such as wild boar and deer, through edge computing. It uses YOLOv3 and the faster Tiny YOLO for real-time object identification on devices such as the Raspberry Pi and NVIDIA Jetson Nano. The system emits species-specific ultrasound signals to scare the animals away and uses IoT for connectivity and control. The work evaluates model performance, power consumption, and the cost-to-performance ratio of edge AI to help farmers in decision-making.
Redmon, J., & Farhadi, A. (2018). YOLOv3: An Incremental Improvement. Continuing the development of the YOLO family, this paper covers the third version of the model, which is widely used in real-time object detection. YOLOv3 splits the image into a grid and then predicts the coordinates of boxes that probably contain an object, along with the class probabilities for each object. It is well suited to real-time applications because it provides fast results without compromising accuracy; in agriculture, for instance, the algorithm can easily detect intruders on farmland and prompt defensive actions.
Viola, P., & Jones, M. (2001). Rapid Object Detection Using a Boosted Cascade of Simple Features. This study introduced the Haar Cascade classifier, widely employed in face detection. In this approach, positive and negative images are used to train a cascade of classifiers for online object detection, supporting tasks such as recognizing the farmer's face or identifying strangers. It can be connected to alarms that trigger when an intruder is sensed around secured agricultural areas.
Reddy, K. V., & Goutham, V. (2024). Edge AI in Sustainable Farming: An IoT Framework Based on Deep Learning to Protect Crops from Wildlife Predation. This paper proposes a real-time animal intrusion detection system based on TinyML with a lightweight deep learning model, namely EvoNet. It incorporates the Internet of Things for surveillance and prevention, with add-ons that allow the farmer to obtain a bird's-eye view of threats from a remotely controlled intelligent rover. The system combines high energy efficiency with high detection accuracy, making it a stable solution for crop protection.
Korche, M., Tokse, S., Shirbhate, S., & Jolhe, S. P. (2021). Smart Crop Protection System. This study proposes a microcontroller-based crop protection system utilizing sensors and GSM modules to alert farmers of nearby animal intrusions. Traditional approaches such as scarecrows and electric fences are complemented by modern technologies like motion sensors and wireless communication, offering real-time alerts and automated responses to protect farmlands from animals.
S.NO: 04
TITLE: IoT Based Crop Protection System Against Birds and Wild Animals Attacks
AUTHOR(S) & YEAR: P. Navaneetha et al.
ALGORITHMS / METHODS: PIR sensors, ultrasonic sensors
DESCRIPTION: This paper proposes an IoT-based system that detects animal movements using PIR and ultrasonic sensors. It activates sound and light to divert animals.
3. METHODOLOGY:
The proposed system applies Deep Learning to provide real-time protection of crops by detecting animals and intruders. The system aims to automatically recognize threats and inform the farmer using sound playback, emails, and phone calls. Real-time video processing, based on the YOLO (You Only Look Once) object detection algorithm, lies at the heart of the system.
The system is divided into two primary components: Object Recognition and Detection, and the Alert System.
In the Object Detection component, the system employs a camera to scan the crop field. The camera records live video streams, which are then analyzed with the YOLOv3 algorithm. YOLOv3 divides each video frame into cells and computes bounding boxes and class probabilities to recognize dangers such as animals or unfamiliar people. This makes it possible for the system to identify multiple objects in real time. When the system recognizes a known person, for example the farmer, no action is taken. However, whenever an unfamiliar person or an animal is sensed, the alerting process begins.
The Alert Mechanism consists of three steps:
1. Sound Playback: Once an intruder or animal is noticed, the system triggers an alarm tone loud enough to scare it away. The playsound library is used to play the sound file.
2. Email Notification: An email alert is sent to the farmer describing the detected object (animal or person), together with an image of it captured by the system, making it possible to quickly confirm the circumstances. The email is sent through SMTP and the image is taken by the camera.
3. Phone Call Alert: If the threat persists, the system dials the farmer automatically, guaranteeing a response. The system was initially designed to use Twilio for phone call alerts, but it can be migrated to Amazon Connect for a more elastic solution.
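As a sketch, the email step of the alert mechanism could be implemented with Python's standard smtplib and email modules; the addresses, credentials, server name, and file name below are placeholders, not details taken from the deployed system:

```python
import smtplib
from email.message import EmailMessage

def build_alert_email(sender, recipient, label, image_bytes):
    """Compose the intrusion-alert email with the captured frame attached."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = f"Crop alert: {label} detected"
    msg.set_content(f"A {label} was detected in the field. See attached image.")
    msg.add_attachment(image_bytes, maintype="image", subtype="jpeg",
                       filename="detection.jpg")
    return msg

def send_alert(msg, smtp_host="smtp.example.com", smtp_port=587,
               user=None, password=None):
    """Send the composed message over SMTP with STARTTLS (credentials assumed)."""
    with smtplib.SMTP(smtp_host, smtp_port) as server:
        server.starttls()
        server.login(user, password)
        server.send_message(msg)
```

Separating message construction from sending keeps the composition step testable without network access.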
Before activating the alert mechanisms, the system identifies any detected person through face detection using the Haar Cascade classifier submodule. If the person is unrecognized, the system activates the alert function.
Each detection event is logged with the date and time of occurrence, the type of threat detected (animal or person), and the action that the system has taken. This enables future analysis of trends and patterns in intrusions, which is relevant to protecting farmers' crops.
In summary, the Smart Crop Protection System combines Deep Learning-based object detection, real-time alerts, and automated notifications for reliable protection of crops. The design uses YOLOv3 for detection of intruders and a multimedia alert system, providing farmers with an efficient tool to prevent crop damage from people or animals.
Proposed Architecture:
[Fig 1: Proposed architecture. Input is taken from the camera and preprocessed; the model is trained and its weights and CFG are calculated. If an animal is found for more than a particular time, the alarm is turned on; if a person is detected, the system checks whether it is the owner or an unknown person.]
The Smart Crop Protection System ensures protection of crops from intruders and animals, using Deep Learning for surveillance and notification. The architecture comprises both the object detection component and the means of passing alerts, which together form the security system. By design, the system constantly watches the crop field and raises alarms any time it detects a stranger or an animal. User Modules include the capability to identify animals or intruders. When a potential threat is identified, the system plays a sound to repel the intruder; at the same time, an image of the intruder together with a notification email is sent to the farmer. In addition, if required, the system can place a phone call to the farmer notifying him of an ongoing intrusion.
Admin Modules address the management and monitoring aspects of the system's detection activity. For each detection event, the admin can view logs containing the time of detection, photos, and the actions taken concerning the event. This helps ensure that the system works according to plan, offering adequate crop protection security and the necessary alerts. The design allows seamless integration of the object detection component with the alert system, with tight security and control over the system.
STEP-1: Real-time camera feed: The system employs a camera that captures a live video feed of the field.
STEP-2: Video analysis with YOLOv3: The captured frames are analyzed with YOLOv3, which identifies the presence of any animals or unfamiliar persons entering the field.
STEP-3: Intruder detected: The system identifies, through its cameras, an intruder: an animal or a person.
STEP-4: Sound alert activated: The system plays a loud sound using the playsound library to caution the intruder instantly.
STEP-5: Email notification sent: On detecting an object, the system takes a picture of it and sends an email to the farmer containing the picture. The email is delivered through an SMTP server.
STEP-6: Phone call alert: If the threat continues, the system dials a phone number, preferably the farmer's, and informs him about the threat through Amazon Connect or another telephony alert service.
STEP-7: Event logging: The detection event is logged in the system with information about the detected object, the time of detection, and the response taken.
STEP-8: Detection history: Admins can view a list of all detection instances, including the detected images, timestamps, and the actions taken.
Convolutional Neural Networks learn the spatial hierarchy of features inherent in input images, such as edges, textures, and objects. Convolution Operation: the convolution operation applies a filter, or kernel, to extract features from the input image. Each filter slides across the image, performing a dot product of the filter with the input at each position, producing a feature map.
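The sliding dot product described above can be illustrated with a plain NumPy sketch (a "valid" 2-D convolution as a simplified illustration, not the network's actual layers; like most deep learning libraries, it is technically cross-correlation):

```python
import numpy as np

def conv2d(image, kernel):
    """Slide the kernel over the image, taking the dot product at each position."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            # Dot product of the kernel with the image patch under it.
            out[y, x] = np.sum(image[y:y+kh, x:x+kw] * kernel)
    return out
```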
Real-time Detection: For object detection, the system employs the YOLOv3 algorithm, which is very effective for identifying animals and intruders in the field. Each detection is rendered in real time, allowing a timely reaction to the problem.
Face Recognition for Identity Verification: For human detection, Haar Cascade classifiers are used to detect the farmer's face. If the detected person does not belong to the owner category, the alert process is initiated.
Multimedia Alerts: The system plays a sound, sends an email with a picture, and places a phone call, guaranteeing that the farmer is informed of an intrusion.
Transparency and Accountability: Every detection event is recorded, and farmers are able to study previous intrusions to better manage security threats.
PROPOSED ALGORITHM:
YOLOv3 (You Only Look Once, version 3) is a real-time object detection framework based purely on convolutional neural networks. It partitions the input image into a grid and predicts the parameters of each object's bounding box and the probability of the object belonging to a particular class. Compared with previous versions, YOLOv3 brings several enhancements in accuracy and in the detection of smaller objects and more complex environments.
One of the unique features of YOLOv3 is that the model can detect a wide range of objects simultaneously. This is attained by a detection procedure that divides the input image into a grid, where each cell predicts the positions of bounding boxes and the probability that an object exists within its boundaries. These predictions are then fine-tuned using anchor boxes, which boost detection accuracy. Such multi-class capability is vital in situations, such as traffic surveillance or robotics, where several objects must be recognized in a single frame. Another notable characteristic is that YOLOv3 is a very fast model, which makes it suitable for real-time projects.
Residual connections: x_{l+1} = x_l + f(x_l), where f(x_l) is the output of the convolution layers and x_l is the input to the layer.
Multi-scale Detection: YOLOv3 performs detection at three different scales to improve the
detection of small objects. It predicts bounding boxes at three different feature map sizes,
capturing objects of varying sizes by detecting on:
A 13x13 feature map (for large objects).
A 26x26 feature map (for medium objects).
A 52x52 feature map (for small objects).
Anchor Boxes: Each grid cell predicts multiple bounding boxes with pre-defined shapes known as anchor boxes. YOLOv3 refines these anchor boxes during training to better match the actual objects.
Anchor Box Loss (the standard YOLO coordinate-regression term):
L_box = λ_coord Σ_i Σ_j 1_{ij}^{obj} [(x_i − x̂_i)² + (y_i − ŷ_i)² + (w_i − ŵ_i)² + (h_i − ĥ_i)²]
Class Prediction: YOLOv3 predicts class probabilities using logistic regression. Each box predicts the probability of containing a particular object class, and this is done using binary cross-entropy loss rather than softmax, allowing multiple labels per box.
Class Loss (binary cross-entropy over classes):
L_class = − Σ_i 1_i^{obj} Σ_c [ŷ_i(c) log p_i(c) + (1 − ŷ_i(c)) log(1 − p_i(c))]
The Smart Crop Protection System uses the YOLOv3 algorithm to identify animals and intruders in real time. YOLOv3's architecture supports the system's objectives as follows:
Multi-object Detection: A feature that gives YOLOv3 an advantage over other object detection models is that it scans for several objects in a single pass, which is very important when looking for different animals and intruders in the field. This makes monitoring efficient and enables very prompt alerting on an intrusion.
Anchor Boxes and Real-time Capability: The anchor boxes and multi-scale detection in the model enable the system to detect animals of different sizes, which is crucial for distinguishing between small and large potential threats. This form of processing is important for real-time alerts, such as playing a sound to ward off animals or sending an e-mail to the farmer.
High Accuracy for Critical Tasks: YOLOv3's architecture makes it easy to distinguish the farmer, or any other familiar object, from a genuinely new threat. This means that instead of being flooded with alerts, the system highlights what is important to farmers by eliminating false alarms.
The proposed system incorporating YOLOv3 provides real-time surveillance, alert handling, and accurate detection of multiple objects within a dense environment, all of which are valuable for agricultural protection.
Fig 2: How the proposed algorithm works
Real-Time Video Feed: The system constantly records a real-time video feed from a camera tracking the crop field.
Frame Processing: The captured frames are passed to the YOLOv3 algorithm for analysis. Unlike detectors such as Fast R-CNN that take multiple passes over the image, YOLO achieves object detection in a single pass, in real time.
Grid Division: YOLO divides the image into an S×S grid. In each cell of the grid, bounding-box regression and class-probability prediction are performed for objects whose centers fall in that cell.
Bounding Box Prediction: For each grid cell, YOLO predicts a fixed number of bounding boxes.
Each bounding box includes:
Coordinates: center coordinates (x, y), width (w), and height (h).
Confidence Score: the probability that the box contains an object and how accurate the box is.
Class Probability Prediction: Each grid cell also computes class probabilities. These probabilities give the likelihood of each class, for example, animal or person.
Non-Maximum Suppression: After estimating bounding boxes and class probabilities, YOLO suppresses overlapping, redundant boxes using the non-maximum suppression method. This retains only the highest-confidence box for each object detected in the scene.
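Non-maximum suppression can be sketched in NumPy as follows (a greedy illustration of the idea, not YOLOv3's exact implementation):

```python
import numpy as np

def nms(boxes, scores, iou_threshold=0.4):
    """Greedy NMS: keep the highest-scoring box, drop boxes that overlap it
    above iou_threshold, then repeat on the remainder."""
    boxes = np.asarray(boxes, dtype=float)  # rows: [x1, y1, x2, y2]
    order = np.argsort(scores)[::-1]        # indices, best score first
    keep = []
    while order.size:
        i = order[0]
        keep.append(int(i))
        # Intersection of the best box with every remaining box.
        xx1 = np.maximum(boxes[i, 0], boxes[order[1:], 0])
        yy1 = np.maximum(boxes[i, 1], boxes[order[1:], 1])
        xx2 = np.minimum(boxes[i, 2], boxes[order[1:], 2])
        yy2 = np.minimum(boxes[i, 3], boxes[order[1:], 3])
        inter = np.maximum(0, xx2 - xx1) * np.maximum(0, yy2 - yy1)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = (boxes[order[1:], 2] - boxes[order[1:], 0]) * \
                (boxes[order[1:], 3] - boxes[order[1:], 1])
        iou = inter / (area_i + areas - inter)
        order = order[1:][iou <= iou_threshold]  # discard heavy overlaps
    return keep
```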
Object Detection: The resulting frames contain rectangular outlines drawn around detected animals or intruders, marked with their class and respective confidence levels.
Real-Time Alerts: When the system detects an unknown person or animal with a confidence score above a predefined threshold, it triggers the alert mechanisms. Sound Playback: to scare away the intruder. Email Notification: the farmer is notified through an email containing a picture of the captured object.
Phone Call Alert: If the threat is still active after 15 minutes, a call is automatically placed to the farmer.
Logging and Monitoring: Every triggering event is documented, including the object type detected, the time of detection, and the actions that followed. This makes it possible to watch and analyze intrusions over time.
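A minimal sketch of this event log as a CSV append (the file name is a placeholder; the actual system's log format is not specified in detail):

```python
import csv
from datetime import datetime

LOG_FILE = "detections.csv"  # hypothetical log location

def log_event(object_type, action, log_file=LOG_FILE):
    """Append one detection event: timestamp, what was detected, what was done."""
    with open(log_file, "a", newline="") as f:
        csv.writer(f).writerow([datetime.now().isoformat(), object_type, action])
```

An append-only flat file like this is enough for the admin's detection-history view and for later trend analysis.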
Advantages of Employing YOLO in our Project
1. Speed: Thanks to YOLO's single-pass architecture, identified threats can be processed in real time.
2. Accuracy: YOLO's high accuracy allows it to identify multiple objects correctly, reducing false positives so that alerts are raised only for genuine detections.
3. Scalability: The system can be loaded onto edge devices and used in different agricultural contexts.
Confusion matrix (rows: actual class; columns: predicted class):

                         Predicted: Animal    Predicted: Intruder   Predicted: Farmer (Owner)   Predicted: No Detection
Actual: Animal           True Positive (TP)   False Negative (FN)   False Negative (FN)         False Negative (FN)
Actual: Intruder         False Negative (FN)  True Positive (TP)    False Negative (FN)         False Negative (FN)
Actual: Farmer (Owner)   False Positive (FP)  False Positive (FP)   True Negative (TN)          False Negative (FN)
Actual: No Detection     False Positive (FP)  False Positive (FP)   False Positive (FP)         True Negative (TN)
Breakdown of Terms:
True Positive (TP): The system correctly detects an animal or intruder (correct classification).
True Negative (TN): The system correctly recognizes the farmer or correctly identifies that no
detection should be made.
False Positive (FP): The system mistakenly detects an animal or intruder when it is actually the
farmer, or there was no detection.
False Negative (FN): The system fails to detect the presence of an animal or intruder, or
mistakenly classifies the farmer as a threat.
To further evaluate the system, the following performance metrics can be calculated:
Precision = TP / (TP + FP)
Recall (also called sensitivity) = TP / (TP + FN)
Example Scenario:
Suppose the system runs 100 tests with the following results:
60 animals were correctly identified as animals (TP).
5 animals were wrongly classified as intruders (FN).
10 intruders were correctly identified as intruders (TP).
8 farmers were wrongly identified as animals or intruders (FP).
15 instances had no detection, and the system correctly did not raise an alert (TN).
The matrix might look like this:
Fig 3
Statistical measures:
Fig 4
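Using the counts from the example scenario above, the statistical measures can be computed directly (treating both animal and intruder detections as positives, and following the scenario's labeling of the five misclassified animals as false negatives):

```python
# Counts from the example scenario: 60 animal TPs + 10 intruder TPs,
# 8 false positives (farmers flagged), 5 false negatives (misclassified animals).
tp = 60 + 10
fp = 8
fn = 5

precision = tp / (tp + fp)   # 70 / 78 ≈ 0.897
recall = tp / (tp + fn)      # 70 / 75 ≈ 0.933

print(f"Precision: {precision:.3f}, Recall: {recall:.3f}")
```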
4. EXPERIMENTAL RESULTS:
A. Owner Training Image:
In the Smart Crop Protection System, the first step is to train the system to identify the owner of the crops. The training phase acquires images of the owner to develop an accurate identification model. Using machine learning, the system learns to recognize the owner, so that only a stranger entering the field makes it raise an alert. Owner training serves three purposes: it creates a zone of safety around the crop area, reduces instances of false alarms, and improves the performance of the system.
When an unfamiliar person or animal is detected, the system raises an alert, signifying a probable security infringement in the owner's compound. This feature is important for preventing theft or destruction of the crops, which strengthens the need for a protective system for farming produce.
Fig 10: Email notification
5. CONCLUSION:
The Smart Crop Protection System demonstrates how artificial intelligence can improve agricultural safeguarding. Using enhanced identification technologies, the system guarantees that alerts are raised only when an unfamiliar human and/or animal is detected. The objectives of the project were achieved by developing a method of tracking crop areas, reducing risks such as theft and damage.
Using real-time image processing, sound alarms, and notifications, the system presents a strong solution for crop protection. This concept not only strengthens security but also gives owners peace of mind, as they are assured of the security of their investments in the agricultural field. As agricultural technology develops further, the Smart Crop Protection System demonstrates the importance of adopting AI in farming activities to achieve smarter and more secure farming.
6. FUTURE SCOPE:
The Smart Crop Protection System Using AI offers immense potential for further innovation
and scalability. As AI and machine learning technologies continue to advance, future
improvements in the system can make it easier to use, help more farmers get access to
affordable crop protection tools, and open up new policy options for safeguarding crops.
1. Enhanced AI Models: With the continuous evolution of AI, the system can incorporate
more accurate object detection and image classification models. This could lead to more
precise identification of animals and people, reducing false alarms and enhancing
system reliability.
2. Integration with IoT: By integrating with the Internet of Things (IoT), the system can
interact with sensors, cameras, and automated devices in real-time. For example, if an
animal is detected, IoT devices could automatically trigger fencing mechanisms or
initiate protective measures in the field, offering a fully automated crop protection
solution.
3. Cloud-Based Monitoring: A cloud-based architecture could allow farmers to access live
feeds, notifications, and reports from remote locations. This would also enable AI
model updates to be pushed remotely, ensuring that the system stays up-to-date with
the latest improvements.
4. Data-Driven Farming Insights: The system could collect and analyze data over time to
provide insights into animal movement patterns, weather conditions, and potential
threats. These data-driven insights could help farmers make better decisions regarding
crop protection and farming practices.
5. Integration with Government Policies: In the future, systems like this could be
integrated with government initiatives to protect crops on a larger scale. Governments
could incentivize the adoption of such AI-based systems by offering subsidies or
linking them with insurance programs, ultimately enhancing food security.