
VISVESVARAYA TECHNOLOGICAL UNIVERSITY

Jnana Sangama, Belagavi - 590018

Project Report
On
Hand Gesture Controlled Drone
Submitted in partial fulfilment of the requirement for the award of degree of

BACHELOR OF ENGINEERING
IN
ELECTRONICS AND COMMUNICATION ENGINEERING
Submitted by

Keertana Vyas – 1AM20EC036


Mahesh Kiran Booruga – 1AM20EC043
Md Afroj Alam Ansari – 1AM20EC045
Naveen Kumar R S – 1AM20EC053

Under the guidance of


Dr. Jenitta J
Professor

DEPARTMENT OF ELECTRONICS & COMMUNICATION ENGINEERING

AMC ENGINEERING COLLEGE


Approved by AICTE, Permanently Affiliated to VTU, Belagavi, Accredited by NAAC & NBA
2023-2024
AMC ENGINEERING COLLEGE
Affiliated to VTU, Belagavi, Approved by AICTE, Accredited by NAAC, New Delhi

#18th Km, Bannerghatta Road, Kalkere, Bengaluru-83

DEPARTMENT OF ELECTRONICS & COMMUNICATION ENGINEERING
(NBA Accredited)

CERTIFICATE

This is to certify that the project work entitled “Hand Gesture Controlled Drone” is a bonafide work carried out by Keertana Vyas (1AM20EC036), Mahesh Kiran Booruga (1AM20EC043), Md Afroj Alam Ansari (1AM20EC045) and Naveen Kumar R S (1AM20EC053), in partial fulfilment for the award of Bachelor of Engineering in Electronics and Communication Engineering of the Visvesvaraya Technological University, Belagavi, during the year 2023-24. It is certified that all corrections/suggestions indicated for internal assessment have been incorporated in the report. The project report has been approved as it satisfies the academic requirements in respect of the project work prescribed for the said degree.

Signature of the Guide Signature of the HOD Signature of the Principal

Dr. Jenitta J Dr. G Shivakumar Dr. R Nagaraja


DECLARATION

We, the students of VII semester B.E. in Electronics and Communication Engineering, AMC Engineering College, Bengaluru, hereby declare that the project work entitled “Hand Gesture Controlled Drone”, submitted to the Visvesvaraya Technological University during the academic year 2023-24, is a record of original work done by us under the guidance of Dr. Jenitta J, Professor, Department of Electronics and Communication Engineering, AMC Engineering College, Bengaluru. This project work is submitted in partial fulfilment of the requirements for the award of the degree of Bachelor of Engineering in Electronics and Communication Engineering. The results embodied in this report have not been submitted to any other university or institute for the award of any degree.

Date: 19/12/2023 Keertana Vyas – 1AM20EC036

Place: Bengaluru Mahesh Kiran Booruga – 1AM20EC043

Md Afroj Alam Ansari – 1AM20EC045

Naveen Kumar R S – 1AM20EC053


ABSTRACT

Drones are conventionally controlled using joysticks, remote controllers, mobile applications and embedded computers. Significant issues with these approaches are that drone control is limited by the range of the radio link and is susceptible to interference noise. In this study we propose the use of hand gestures as a method to control drones. We investigate the use of computer vision methods to develop an intuitive way of agent-less communication between a drone and its operator. Computer vision-based methods rely on the ability of a camera to detect the hand gesture, recognise it, map it to the desired command and wirelessly send the corresponding signal to the drone. The drone then has to follow the received command and successfully perform the desired action. The drone is equipped with a camera that captures images of its surroundings for live navigation. The proposed framework involves a few key stages leading to the final action: segregating images from the front-camera video stream, building robust and reliable image recognition on the segregated images, and finally converting the classified gestures into actionable drone movements such as take-off, landing and hovering.

CHAPTER-1
INTRODUCTION
Drones, also known as unmanned aerial vehicles, are on the rise both in recreational use and in a wide range of industrial applications such as security, defence, agriculture, energy, insurance and hydrology. Drones are essentially specialised flying robots that perform functions such as capturing images, recording videos and sensing multimodal data from their environment. Based on shape, size, rotor count and take-off method, there are four types of drones: fixed-wing, fixed-wing hybrid, single-rotor and multirotor. Because of their versatility and small size, multirotor drones can operate where humans cannot, collect multimodal data and intervene when the occasion demands. Moreover, with the use of a guard hull, multirotor drones are very sturdy in collisions, which makes them even more valuable for exploring uncharted areas. Another important factor behind the popularity of drones is that they can be equipped with different payloads, not only a camera. For agricultural monitoring, for instance, multiple sensors such as video and thermal-infrared cameras are of benefit; for weather forecasting or fog and cloud monitoring, a suite of sensors such as a thermometer, gyroscope and barometer is fitted. Drones are especially useful in risky missions. For this project, the most important factor of all is the ability to program a drone.

The conventional way to operate a drone is by remote control: a set of pre-defined, pre-processed commands is stored in the drone's processor, and the drone is controlled remotely by the user. The remote can be either a joystick or a touch interface connected to a smartphone, PC or laptop. The major drawbacks of a remote control are the limited sequence of actions and the complexity of learning it. Some drones can operate automatically by being pre-programmed for a task. In this work, we investigate an alternative method of controlling multirotor drones using hand gestures as the main channel of communication. We propose a framework that maps segregated images from the video stream to one of a set of commands/gestures. The camera captures visual instructions from the operator, which eliminates the control device and leads the way to agent-less communication. Note: although we intend to use only the palm, for ease of understanding we use the term “hand gesture” throughout the project.


To completely eliminate the use of any control device, after thorough research and analysis we decided to use computer vision technology to build the agent-less communication between the user and the drone, technically known as HDI (Human-Drone Interaction). Computer vision is a trending field of computer science that focuses on enabling computers to identify and understand objects and people from images and videos. It is a field of AI in which a machine is trained to perform, with high accuracy, functions analogous to those of the human eye.
The system consists of four functional steps:
• Capturing the hand gestures with the help of a camera.
• Successful recognition of the hand gestures.
• Conversion of the recognized gesture into a specific command for the drone.
• Successful execution of the command by the drone.

The proposed system should function in real time, and accuracy in detecting and recognizing the gestures is essential.
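To make the four steps concrete, the following is a minimal sketch of the capture-recognise-command pipeline in Python with OpenCV; the gesture classifier, the command names and the transmitter function are hypothetical placeholders, not the actual implementation.

```python
import cv2

# Hypothetical mapping from recognised gesture labels to drone commands.
GESTURE_TO_COMMAND = {
    "open_palm": "takeoff",
    "fist": "land",
    "point_up": "ascend",
    "point_down": "descend",
}

def classify_gesture(frame):
    """Placeholder for the gesture recogniser (e.g. a trained classifier).
    Returns a gesture label or None when no gesture is seen."""
    return None  # to be replaced by the actual model

def send_to_drone(command):
    """Placeholder for the wireless link that transmits a command to the drone."""
    print("sending:", command)

cap = cv2.VideoCapture(0)                      # step 1: capture frames from the camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    label = classify_gesture(frame)            # step 2: recognise the gesture
    command = GESTURE_TO_COMMAND.get(label)    # step 3: convert gesture -> command
    if command is not None:
        send_to_drone(command)                 # step 4: the drone executes the command
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```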


HISTORY OF THE FIELD OF INTEREST

A drone, in technological terms, is an unmanned aircraft. Essentially, a drone is a flying robot that can be remotely controlled or fly autonomously through software-controlled flight plans in its embedded system, working in conjunction with onboard sensors and GPS. But what is the history of drones?

In 1898, Nikola Tesla tested his radio-controlled boat for the first time in a pond at Madison Square Garden in New York, to the amazement of the crowd. He was able even to control the lights on his RC boat, making people think he was either a magician or that a small monkey inside was controlling the craft.

This was the beginning of every radio-controlled aircraft as we know it today, thanks to the genius of Tesla.

The first operational drone, named the Aerial Target, was developed by the British in 1917. Within just one year, the American Kettering Bug was developed. Although both UAVs operated well, neither was ever used for military purposes. Later, in the mid-1920s, Étienne Oehmichen invented the first working quadcopter, the Oehmichen No. 2, which managed to fly 360 m and set a new world record. In the same year he flew a 1 km circle in 7 minutes and 40 seconds.

In 1935, the term “drone” came into use, inspired by the Queen Bee UAV model.

The first commercial drone permit was granted by the FAA (Federal Aviation Administration) in 2006, the same year the drone company DJI (Da-Jiang Innovations) was founded.

About four years later, the Parrot Group built the first drone that truly revolutionised drone technology: the AR.Drone introduced the idea of a drone that could be connected directly to a smartphone and controlled from it.


With each generation, the main aim of drone manufacturers was to increase the control range of the drone. This led to the DJI Phantom 3, introduced in 2015 with an outstanding control radius of about 5 km.

Gradually, the invention of autonomous drones changed drone technology completely. With autonomous technology, a drone is given a predefined task and uses GPS to carry it out. Autonomous drones have led to many innovative models for different applications.

The use of AI to make everyday systems smarter has been a trend for the past ten years, and computer vision is the branch of AI in which a machine classifies objects visually. The study of computer vision began in the early 1960s. One of the first mass-market consumer applications of computer vision was Google Lens, launched in October 2017 to provide information about objects simply by pointing a camera at them. Current research applies computer vision to object segregation, navigation and many other applications.


PROBLEM DEFINITION

The current and conventional method of controlling a drone is by remote control, but a remote control has the problems listed below:

• Limited actions
– Classical interaction tools such as a keyboard, mouse or touchscreen may limit the way we use the system.
– They may not support some desired actions.

• Cannot change or add controls
– Remotes come as fixed hardware or with pre-defined software.
– The software cannot be altered.

• Faulty hardware
– Any damage to the remote may require repair or replacement, which can be expensive.

After consideration and analysis, we arrived at the idea of hand-gesture control of a drone, which removes the problems of hardware portability and hardware faults while providing a fully functional system with good accuracy.


MOTIVATION OF THE PROJECT

This project created a platform to learn about unmanned aerial vehicles such as the quadcopter. It expands the scope of electrical engineering to include control and an understanding of the underlying mathematics. The quadcopter has many applications of interest, such as security systems, mapping and reconnaissance, especially in disaster-hit and dangerous areas. It also opens up possibilities to broaden the understanding and application of control systems, stabilisation, artificial intelligence and computer image processing as they apply to the quadcopter.

With that being said, the main motivation behind this project is our curiosity to explore the various control schemes for small-scale drones. The paper “Design and Development of Voice Control System for Micro Unmanned Aerial Vehicles” discusses drone-control methods such as radio, GCS, gesture, voice, joystick, PC, FPV and autonomous control. It observes that situational awareness is at a medium level for radio and gesture control, whereas it is high for voice control. In this project we work on gesture control, and later we plan to move on to voice control and other advanced control methods.

The motivation for this project also arose from the need to implement these different control methods on a low-cost, portable and scalable embedded platform with computation at the edge, without relying on external resources.


OBJECTIVE OF THE PROJECT


The objective of this project is to build a hand-gesture control system for a drone that satisfies the following functions:

• To capture the hand gestures within the frame of the camera.
• To successfully recognize the hand gestures.
• To computationally convert the recognized gesture into a specific command for the drone.
• To have the drone successfully execute the specific command.
• To perform all of the above functions smoothly, with minimal error and good accuracy.


CHAPTER-2
RELATED WORK

1. Ondřej Spinka, Štěpán Kroupa, Zdeněk Hanzálek, “Control System for Unmanned Aerial Vehicles”, 5th IEEE International Conference on Industrial Informatics, July 2007.
• This paper introduces an autopilot design for autonomous unmanned aerial vehicles.
• A networked, hierarchical, distributed control system is proposed, and its hardware and software structure is briefly described.
• We are introduced to RAMA (Remotely operated Aerial Model Autopilot), an open project whose purpose is to design a universal, lightweight and compact control system for small UAVs.
• The paper also gives us knowledge of the different control-system variables and measured vehicle variables.
• RAMA is intended to be a fully autonomous control system for any kind of UAV; thanks to its lightweight and compact structure, it can be used even in small UAVs.

2. Mirmojtaba Gharibi, Raouf Boutaba, Steven L. Waslander, “Internet of Drones”, in IEEE Access, vol. 4, pp. 1148-1162, January 2016.
• In this paper we are introduced to the Internet of Drones (IoD), a layered network control architecture designed mainly for coordinating the access of unmanned aerial vehicles to controlled airspace and for providing navigation services between locations referred to as nodes.
• IoD provides generic services for various drone applications such as package delivery, traffic surveillance, search and rescue, and more.
• The paper presents a conceptual model of how such an architecture can be organised.
• Its main contribution is formulating a complex and multifaceted problem and showing, at an abstract level, how it relates to the vast literature on three existing networks: air traffic control, the cellular network and the Internet.


• Many drone applications can benefit from a unified framework that coordinates their access
to the airspace and helps them navigate to the points of interest where they have to perform
a task.

3. Tuton Chandra Mallick, Mohammad Ariful Islam Bhuyan, Mohammed Saifuddin Munna, “Design & Implementation of an UAV (Drone) with Flight Data Record”, 2016 International Conference on Innovations in Science, Engineering and Technology (ICISET), Dhaka, Bangladesh, October 2016.
• This paper proposes the development of an autonomous unmanned aerial vehicle (UAV) controlled wirelessly through a graphical user interface (GUI).
• We are introduced to the IMU 9DOF (3-axis accelerometer, 3-axis gyroscope and 3-axis magnetometer), a 9-axis motion-tracking module that ensures smooth movement, graceful motion and trajectory tracing in UAVs.
• The authors aimed to design a quadcopter that tries to stabilise its position at a preferred altitude.
• Several PID loops were designed to obtain better stability and performance in the different modes (a minimal PID sketch is given after this list).
• Their work resulted in a system that is capable of flying in different modes without complexity.
• The main modes are manual mode, hover mode, auto mode and return-to-launch mode.
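For context on the PID loops mentioned above, the following is a minimal single-axis PID controller sketch; the gains, the loop rate and the altitude-hold example are illustrative assumptions of ours, not values from the paper.

```python
# Minimal single-axis PID controller (illustrative gains, not taken from the paper).
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # The output drives the actuators, e.g. a throttle correction for altitude hold.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

altitude_pid = PID(kp=0.8, ki=0.05, kd=0.3, dt=0.02)   # 50 Hz control loop (assumed)
correction = altitude_pid.update(setpoint=2.0, measurement=1.7)
print(correction)
```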

4. Kathiravan Natarajan, Truong-Huy D. Nguyen, Mutlu Mete, “Hand Gesture Controlled Drones: An Open-Source Library”, 1st International Conference on Data Intelligence and Security, 2018.
• The methodology proposed in this paper is the use of hand gestures as a method to control drones.
• The authors use computer vision methods to develop an intuitive way of agent-less communication between a drone and its operator.
• The proposed framework follows three key steps to achieve this:
– Image segregation from the video stream of the front camera
– Creating robust and reliable image recognition based on the segregated images
– Conversion of the classified gestures into actionable drone movement


• We are introduced to the Haar feature-based AdaBoost classifier, which is employed for gesture recognition (a usage sketch is given after this list).
• We also learn that the distance between the drone and its operator is the most important indicator of success.
• The framework was successfully tested using a mediocre drone, the Parrot AR.Drone 2.0.
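As an illustration of how a Haar feature-based cascade (trained with AdaBoost) is typically applied to a camera frame with OpenCV, a short sketch follows; the cascade file and input image names are hypothetical, since the paper's trained models are not reproduced here.

```python
import cv2

# Load a Haar cascade trained for a hand/palm gesture (file name is hypothetical).
cascade = cv2.CascadeClassifier("palm_gesture_cascade.xml")

frame = cv2.imread("frame.jpg")                      # one frame from the front camera
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# detectMultiScale slides the cascade over the image at several scales.
detections = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in detections:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("frame_with_detections.jpg", frame)
```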

5. Hadri Soubhi, “Hand Gesture Control of Drone Using Deep Learning,” Thesis,
University of Oklahoma, December, 2018.
• This thesis presents research detailing the use of hand gestures as an HDI method to control drones.
• The work consists of three main modules:
– Hand Detector
– Gesture Recognizer
– Drone Controller
• A deep learning method is incorporated in the first module to detect and track the hands in real time, with high accuracy, from a single RGB image.
• Here we are introduced to the Single Shot MultiBox Detector (SSD) network, which was used to detect the hands.
• The Gesture Recognizer utilised image processing methods and was fully dynamic, meaning new gestures could be added without re-training the neural network. The Drone Controller depends on ArduPilot, one of the most popular autopilot systems (see the sketch after this list).
• The whole system was tested using simulators: the ArduPilot SITL (Software In The Loop) simulator and the Gazebo 3D robot simulator.
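To make the Drone Controller idea concrete, here is a minimal sketch of how a recognised gesture could be turned into an ArduPilot command via the DroneKit-Python API against a SITL instance; the connection string, gesture labels and take-off altitude are illustrative assumptions, not details from the thesis.

```python
from dronekit import connect, VehicleMode

# Connect to an ArduPilot SITL instance (address and port are illustrative).
vehicle = connect("udp:127.0.0.1:14550", wait_ready=True)

def handle_gesture(label):
    """Map a recognised gesture label (hypothetical names) to a flight command."""
    if label == "takeoff":
        vehicle.mode = VehicleMode("GUIDED")
        vehicle.armed = True
        vehicle.simple_takeoff(2)        # climb to 2 m above the launch point
    elif label == "land":
        vehicle.mode = VehicleMode("LAND")

handle_gesture("takeoff")
```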

6. W. Wu, M. Shi, T. Wu, D. Zhao, S. Zhang and J. Li, "Real-time Hand Gesture
Recognition Based on Deep Learning in Complex Environments," 2019 Chinese
Control and Decision Conference (CCDC), Nanchang, China, 2019.
• This paper proposes a method of multi-frame recursion to minimize the influences of
redundant frames and error frames.


• This paper takes hand-gesture control of an unmanned vehicle as the application background and focuses on gesture detection and recognition from video streams based on deep learning in complex environments.
• The hand is detected in a complex environment by training an SSD MobileNet model. We are introduced to the Kalman filter, which is used to initialise tracking (a small OpenCV sketch follows this list).
• The hand key points are detected by the Convolutional Pose Machines (CPMs) architecture in order to obtain belief maps for all key points.
• These are used as the training sets for Convolutional Neural Networks (CNNs).
• The proposed algorithm achieves an accuracy of more than 96% for gesture recognition in complex environments and effectively reduces false-alarm rates.
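The Kalman filter used to initialise tracking can be sketched with OpenCV's built-in implementation as below; the constant-velocity state model and the noise covariances are generic assumptions of ours, not the paper's exact configuration.

```python
import numpy as np
import cv2

# Four state variables (x, y, dx, dy) and two measurements (x, y): a constant-velocity model.
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1

# Each frame: predict the hand-box centre, then correct with the detector's measurement.
prediction = kf.predict()
measured_centre = np.array([[120.0], [95.0]], dtype=np.float32)  # e.g. from the SSD detector
estimate = kf.correct(measured_centre)
```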

7. M R Dileep, A V Navaneeth, Savitha Ullagaddi, Ajit Danti, “A Study and Analysis on various types of Agricultural Drones and its Applications,” 5th International Conference on Research in Computational Intelligence and Communication Networks (ICRCICN), Bengaluru, India, November 2020.
• In this paper, a detailed study is made of various types of agricultural drones based on their features, capacity, range and cost and on the area of agriculture where they suit best, together with a statistical analysis of drone usage in agriculture.
• The paper also gives us knowledge of the different types of drones, namely:
– Multi-Rotor
– Fixed-Wing
– Single-Rotor
– Fixed-wing hybrid VTOL (Vertical Take-off and Landing)
• The paper finds that many types of drones are suitable for performing various activities in agricultural fields, as well as in associated activities such as poultry, sericulture and fisheries.
• In agriculture, drones are mainly used for information gathering, providing reports, some physical activities, animal surveillance, crop monitoring, spraying pesticides and so on.
• The authors argue that if drones were utilised across all agricultural and related activities, the country would undoubtedly gain a substantial contribution to GDP (Gross Domestic Product) from agriculture.


8. N. Mohamed, M. B. Mustafa and N. Jomhari, "A Review of the Hand Gesture Recognition System: Current Progress and Future Directions," in IEEE Access, vol. 9, pp. 157422-157436, November 2021.
• This paper reviews sign-language research on vision-based hand gesture recognition systems from 2014 to 2020.
• Its objective is to identify the progress made and what needs more attention.
• The paper also reviews the performance of vision-based hand gesture recognition systems in terms of recognition accuracy.
• It is learnt that data acquisition, features and the environment of the training data are very important factors that play a main role in a hand gesture recognition system.
• It was also noted that the majority of the databases used in hand gesture recognition research come from restricted environments, signalling the need for sign-language databases that are less restrictive and contain different environments.
• The paper concludes that, to make vision-based gesture recognition ready for real-life application, more attention needs to be focused on uncontrolled environment settings, as this gives researchers the opportunity to improve the system's ability to recognise hand gestures in any environment.

9. J. Qiu, J. Liu and Y. Shen, "Computer Vision Technology Based on Deep Learning,"
IEEE 2nd International Conference on Information Technology, Big Data and
Artificial Intelligence (ICIBA), Chongqing, China, December 2021.
• This paper outlines the development of deep learning models and identifies the introduction of convolutional neural networks as the inflection point in that development.
• The structure and development process of the convolutional neural network are also
analyzed.
• Here the basic method of Computer Vision Technology is also discussed, namely:
– Deep learning overview
– CNN (Convolutional Neural Network).
• This paper also discusses the core tasks of computer vision:
– Image classification
– Object detection
– Image segmentation


• It also introduces the classic open data sets in the field of computer vision (a small CNN classification sketch follows this list):
– MNIST
– ImageNet
– The PASCAL VOC data set
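As a small illustration of the CNN-based image classification this paper analyses, the following sketch defines a minimal convolutional network for MNIST-sized inputs in PyTorch; the layer sizes and the dummy input are generic choices of ours, not the models discussed in the paper.

```python
import torch
import torch.nn as nn

# Minimal CNN for 28x28 grayscale images (e.g. MNIST); layer sizes are illustrative.
class SmallCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.features(x)               # two conv+pool stages: 28x28 -> 14x14 -> 7x7
        return self.classifier(x.flatten(1))

model = SmallCNN()
logits = model(torch.randn(1, 1, 28, 28))  # one dummy image -> scores for 10 classes
print(logits.shape)
```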

10. Aleksandar Petrovski, Marko Radovanović, “Application of Drones with Artificial Intelligence for Military Purposes,” in Proceedings of the 10th International Scientific Conference on Defensive Technologies (OTEH), Belgrade, Serbia, 2022.
• This paper presents drones that learn on their own, i.e. possess artificial intelligence, and that can be used for military purposes.
• It proposes a real-time system for detecting vital military objects, based on a convolutional neural network using YOLOv5.
• The paper points out the possibility of autonomous use of drones with artificial intelligence in combat and non-combat operations of the army.
• Here we are introduced to the YOLOv5 method, used for enhanced military drone surveillance (a generic inference sketch follows this list).
• The use of unmanned aerial vehicles equipped with AI and modern technical devices integrated into the C5IRS system significantly increases the efficiency of units engaged in combat operations.
• The use of this sophisticated technology enables timely and accurate information about events in real time, as well as engagement of targets without risk to the operators, while transmitting the battlefield situation to the command centre.
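For reference, generic YOLOv5 inference of the kind this paper builds on can be run from the public ultralytics model hub as sketched below; the pretrained "yolov5s" weights and the input file name are placeholders, not the paper's own military-object model.

```python
import torch

# Load a generic pretrained YOLOv5 model from the public hub (not the paper's weights).
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

results = model("aerial_frame.jpg")      # path to a drone camera frame (illustrative)
detections = results.pandas().xyxy[0]    # bounding boxes, confidences and class names
print(detections[["name", "confidence"]])
```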


COMPARISON OF LITERATURE SURVEY

Ref. [19] — Design and Implementation of an UAV (Drone) with Flight Data Record (Tuton Chandra Mallick, Mohammad Ariful Islam Bhuyan, Mohammed Saifuddin Munna)
• Proposed method: GUI; IMU 9DOF; GPS and barometric systems; PID loops; ESC
• Performance parameters: autonomous flight action; trajectory tracking; graceful motion; altitude hold; stability
• Advantages: high response time; very good accuracy; easy dual-mode switching; highly stable
• Disadvantages: safety measures are not met; speed is quite low

Ref. [9] — Hand Gesture Control of Drone Using Deep Learning (Thesis) (Soubhi Hadri)
• Proposed method: HDI (Human-Drone Interaction) using hand gestures
• Performance parameters: accuracy; frame rate; gesture classification
• Advantages: high efficiency; simple
• Disadvantages: lacks safety; use of the SSD algorithm makes the network complex

Ref. [18] — A Study and Analysis on Various Types of Agricultural Drones and its Applications (M R Dileep, A V Navaneeth, Savitha Ullagaddi, Ajit Danti)
• Proposed method: detailed discussion of different types of drones for Indian agriculture
• Performance parameters: area of coverage; payload capacity; endurance; efficiency
• Advantages: high coverage area; multirotor suitability
• Disadvantages: all the mentioned drones require precise manual control

Ref. [17] — Computer Vision Technology Based on Deep Learning (Jin Qiu, Jian Liu, Yunyi Shen)
• Proposed method: brief analysis of computer vision; deep learning methods; CNN-based network model
• Performance parameters: accuracy; learning rate; sensitivity; precision
• Advantages: low cost; high accuracy; strong robustness
• Disadvantages: high learning rate; many difficulties; classifying data sets is complex

Ref. [16] — Application of Drones with AI for Military Purposes (Aleksandar Petrovski, Marko Radovanović)
• Proposed method: artificial intelligence; C5IRS
• Performance parameters: field and range of vision; confidence value; accuracy and precision
• Advantages: lightweight; fast to deploy; minimum delay; confidence value is quite high
• Disadvantages: error occurrence is quite high; can be unsuitable if the target is far away

Ref. [15] — Control System for Unmanned Aerial Vehicles (Ondřej Spinka, Štěpán Kroupa, Zdeněk Hanzálek)
• Proposed method: RAMA – Remotely operated Aerial Model Autopilot
• Performance parameters: measured vehicle variables; control system variables
• Advantages: RAMA is open source and can be accessed by anyone
• Disadvantages: noise amplitude is high for all the control systems

Ref. [14] — Internet of Drones (Mirmojtaba Gharibi, Raouf Boutaba, Steven L. Waslander)
• Proposed method: an architecture for generic services covering both navigation and airspace management for all current and future applications
• Performance parameters: error detection and correction for networking; traffic control for air-traffic navigation; many other application-dependent parameters
• Advantages: coordination of drones in airspace navigation; strategic utilisation of the airspace structure
• Disadvantages: prone to noise; lack of security in the system

Ref. [13] — Hand Gesture Controlled Drones: An Open-Source Library (Kathiravan Natarajan, Truong-Huy D. Nguyen, Mutlu Mete)
• Proposed method: image segregation; conversion of classified gestures into actionable drone movement; Haar feature-based AdaBoost classifier
• Performance parameters: accuracy; range of control; illumination (effect of lighting)
• Advantages: provides a way of agent-less communication with the drone; maximum possible accuracy even on mediocre drones
• Disadvantages: accuracy is maximal only in the 3 ft and 5 ft ranges; each gesture is made by either the right or the left hand; cannot be used in all types of environments

Ref. [12] — Real-time Hand Gesture Recognition Based on Deep Learning in Complex Environments (Weixin Wu, Meiping Shi, Tao Wu, Dawei Zhao, Shuai Zhang, Junxiang Li)
• Proposed method: multi-frame recursion; Convolutional Pose Machines (CPMs); Convolutional Neural Networks (CNNs)
• Performance parameters: average recognition speed; redundant frame rate generated by gesture switching; number of misclassified frames
• Advantages: accuracy of more than 96% even in complex environments; effectively reduces false-alarm rates
• Disadvantages: far-range detection is slightly difficult; the system must be properly trained; no instant error-correction method

Ref. [11] — A Review of the Hand Gesture Recognition System: Current Progress and Future Directions (Noraini Mohamed, Mumtaz Begum Mustafa)
• Proposed method: reviews the current issues, progress and potential future directions of vision-based hand gesture recognition research
• Performance parameters: time complexity; adaptability; illumination
• Advantages: easier compared to traditional methods that require an agent to communicate
• Disadvantages: time complexity is high; difficulty adapting to unanticipated conditions; issue of handling non-gesture movements; limited number of standard signs


PROPOSED BLOCK DIAGRAM

[Block diagram: Input → Hand Detector → Gesture Recognizer → Drone Controller → To Drone, with a “Read from stored Data / Store new Data” block.]

Fig.1 Block Diagram of the proposed method (Courtesy: Reference no. [10])

After detailed analysis, and in line with our objectives, we arrived at the above block diagram. It consists of the following:
• An image or video input to the camera mounted in front of the user. This input must first be recognised by the system, which is why a Hand Detector module is required. The Hand Detector's main objective is to detect where the hand is placed in the frame. Only after successful detection does the next block start functioning.
• After detecting where the hand is in the frame, the system has to recognise the gesture, for example whether it indicates up, down, right, left, and so on. This is the most crucial part of the entire system and carries the most weight.


• After successful recognition of the gesture, the system needs to send a computed signal to the drone, which requires a transmitter module. Since our project aims to show that gestures can be sent over a longer range, a very powerful active transmitter is used. The drone controller is essentially the computation stage before the signal is transmitted. Notice that the controller and the recogniser form a feedback loop: after every recognition, the recogniser needs an acknowledgement so that it can prepare for the next gesture, like a cooldown (a minimal sketch of this handshake follows this list).

• After transmission, the process is completed when the transmitted signal is received by the drone's receiver. But reception alone does not conclude the output: the drone has to perform the gesture accurately and smoothly.
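The following is a minimal sketch of the recogniser-controller handshake described above; the cooldown duration and the acknowledgement mechanism are our own illustrative assumptions about how such a feedback loop could be realised.

```python
import time

COOLDOWN_S = 1.5          # illustrative pause between accepted gestures

class DroneController:
    def __init__(self, transmit):
        self.transmit = transmit          # function that sends a command to the drone
        self.last_accept = 0.0

    def handle(self, gesture):
        """Accept a recognised gesture only after the cooldown has elapsed,
        then acknowledge so the recogniser can prepare for the next gesture."""
        now = time.monotonic()
        if now - self.last_accept < COOLDOWN_S:
            return False                  # still cooling down: gesture ignored
        self.transmit(gesture)
        self.last_accept = now
        return True                       # acknowledgement back to the recogniser

controller = DroneController(transmit=lambda cmd: print("tx:", cmd))
controller.handle("takeoff")   # accepted and transmitted
controller.handle("land")      # ignored: still inside the cooldown window
```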


SUMMARY OF THE PROJECT

In this project, a system that controls the motion of a drone based on the user's hand gestures was discussed. After a detailed analysis of conventional, autonomous and other methods of controlling the navigation of an unmanned aerial vehicle, we examined the issues with the different methods and proposed an easy, secure and safe way to control the drone. The project contains a fully functional, agent-less system for drone control. It uses computer vision technology to control an unmanned aerial vehicle, specifically a drone, which is commanded with slight motions of the palm to perform complex actions. The entire system is divided into three simple blocks. The input to the system is an image frame, specifically an image of a hand. A detection block locates the placement and position of the hand. The second major block is the recogniser, where, after the position of the hand has been detected, the computer analyses the gesture given as input and determines the exact command assigned to that gesture. After the gesture has been processed, the required signal is sent to the drone, and finally the drone performs the gesture.

The project is designed to work accurately with minimal errors and ensures that the system remains smooth and undisturbed during operation. This system opens the door to new innovations and uses computer vision technology to its full potential.


REFERENCES

[1]. A. R. Mathew, A. Al Hajj and A. Al Abri, "Human-Computer Interaction (HCI): An overview," in IEEE International Conference on Computer Science and Automation Engineering, Shanghai, China, 2011.
[2]. J. R. Cauchard, J. E, K. Y. Zhai and J. Landay, "Drone & me: an exploration into natural human-drone interaction," in ACM International Joint Conference, Osaka, Japan, 2015.
[3]. Ma, Lu & Lung Cheng, Lee, “Studies of AR Drone on Gesture Control,” in 3rd
International Conference on Materials Engineering, Manufacturing Technology and Control
(ICMEMTC), Taiyuan, China, 2016.
[4]. A. Sarkar, K. A. Patel, R. K. G. Ram and G. K. Kapoor, "Gesture control of drone using
a motion controller," in International Conference on Industrial Informatics and Computer
Systems (CIICS), Sharjah, UAE, 2016.
[5]. Kathiravan Natarajan, Truong-Huy D. Nguyen, Mutlu Mete, “Hand Gesture Controlled
Drones: An Open-Source Library,” in 1st International Conference on Data Intelligence and
Security (ICDIS), Texas, USA, 2018.
[6]. Liu, Wei & Anguelov, Dragomir & Erhan, Dumitru & Szegedy, Christian & Reed, Scott,
“SSD: Single Shot Multi-Box Detector,” in the 14th European Conference on Computer
Vision (ECCV), Amsterdam, Netherlands, 2016.
[7]. J. Redmon, S. Divvala, R. Girshick and A. Farhadi, "You Only Look Once: Unified, Real-Time Object Detection," in IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 2016.
[8]. K. He, X. Zhang, S. Ren, and J. Sun, “Deep residual learning for image recognition,” in
IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA,
2016.
[9]. K. Simonyan and A. Zisserman, “Very deep convolutional networks for large-scale
image recognition,” in ICLR, San Diego, CA, USA, 2015.
[10]. S. Hadri, "Hand Gesture Control of Drone Using Deep Learning," Thesis, University of Oklahoma, December 2018, doi: 10.13140/RG.2.2.15939.02089.


[11]. N. Mohamed, M. B. Mustafa and N. Jomhari, "A Review of the Hand Gesture
Recognition System: Current Progress and Future Directions," in IEEE Access, vol. 9, pp.
157422-157436, 2021, doi: 10.1109/ACCESS.2021.3129650.

[12]. W. Wu, M. Shi, T. Wu, D. Zhao, S. Zhang and J. Li, "Real-time Hand Gesture
Recognition Based on Deep Learning in Complex Environments," 2019 Chinese Control
and Decision Conference (CCDC), Nanchang, China, 2019, pp. 5950-5955, doi:
10.1109/CCDC.2019.8833328.
[13]. K. Natarajan, T. -H. D. Nguyen and M. Mete, "Hand Gesture Controlled Drones: An
Open-Source Library," 2018 1st International Conference on Data Intelligence and Security
(ICDIS), South Padre Island, TX, USA, 2018, pp. 168-175, doi: 10.1109/ICDIS.2018.00035.
[14]. M. Gharibi, R. Boutaba and S. L. Waslander, "Internet of Drones," in IEEE Access, vol.
4, pp. 1148-1162, 2016, doi: 10.1109/ACCESS.2016.2537208.
[15]. O. Spinka, S. Kroupa and Z. Hanzalek, "Control System for Unmanned Aerial
Vehicles," 2007 5th IEEE International Conference on Industrial Informatics, Vienna, Austria,
2007, pp. 455-460, doi: 10.1109/INDIN.2007.4384800.
[16]. A. Petrovski and M. Radovanović, "Application of Drones with Artificial Intelligence for Military Purposes," in Proceedings of the 10th International Scientific Conference on Defensive Technologies (OTEH), Belgrade, Serbia, 2022.
[17]. J. Qiu, J. Liu and Y. Shen, "Computer Vision Technology Based on Deep Learning,"
2021 IEEE 2nd International Conference on Information Technology, Big Data and Artificial
Intelligence (ICIBA), Chongqing, China, 2021, pp. 1126-1130, doi:
10.1109/ICIBA52610.2021.9687873.
[18]. M. R. Dileep, A. V. Navaneeth, S. Ullagaddi and A. Danti, "A Study and Analysis on
Various Types of Agricultural Drones and its Applications," 2020 Fifth International
Conference on Research in Computational Intelligence and Communication Networks
(ICRCICN), Bangalore, India, 2020, pp. 181-185, doi:
10.1109/ICRCICN50933.2020.9296195.
[19]. T. C. Mallick, M. A. I. Bhuyan and M. S. Munna, "Design & implementation of an UAV
(Drone) with flight data record," 2016 International Conference on Innovations in Science,
Engineering and Technology (ICISET), Dhaka, Bangladesh, 2016, pp. 1-6, doi:
10.1109/ICISET.2016.7856519.
