Development of Autonomous Drones for Adaptive
Obstacle Avoidance in Real World Environments
Arne Devos
Faculty of Engineering Technology, KU Leuven, Belgium
[email protected]

Emad Ebeid
SDU UAS Center, MMMI, University of Southern Denmark, Denmark
[email protected]

Poramate Manoonpong
Embodied AI and Neurorobotics Lab, MMMI, University of Southern Denmark, Denmark
[email protected]

Abstract—Recently, drones have been involved in several critical tasks such as infrastructure inspection, crisis response, and search and rescue operations. Such drones mostly use sophisticated computer vision techniques to avoid obstacles effectively and therefore require high computational power. This work tuned and tested a computationally inexpensive algorithm, previously developed by the authors, for adaptive obstacle avoidance control of a drone. The algorithm aims at keeping the drone from getting trapped in complex situations such as deadlocks and corners. It has been validated through simulation and implemented on a newly developed drone platform for infrastructure inspection. The design of the drone platform and the experimental results are presented in this study.

Fig. 1: The adaptive obstacle avoidance control acts as a closed-loop control system which can be directly applied to simulated and real drones

Index Terms—Autonomous drone system, adaptive obstacle avoidance, simulation, implementation.
I. INTRODUCTION

Unmanned Aerial Vehicles (UAVs), or drones, have been a subject of great interest over the last decades. Applications range from first response in crisis situations [1] to agriculture [2], [3] and safety inspections [4], [5]. The environment in which they operate is mostly known and free of obstacles. When moving to more complex surroundings, such as indoors, a qualified pilot is often required to operate these drones. However, it is not always possible or desirable to have a pilot controlling the drone. With the market for drones growing rapidly [6], there is a need for drones that can operate autonomously. An important challenge for such a drone is to autonomously and adaptively avoid obstacles it may encounter in complex environments.

To address this challenge, a lot of research has explored different methods. Some rely on computer vision algorithms to identify objects [7]. This provides the distance to the object so that the drone can move to ensure a safe flight trajectory. Although this approach works well in simple environments, it does not perform well in surroundings with complex structures or cramped indoor spaces. A way to resolve this is to build a map of the surroundings with the SLAM (Simultaneous Localization and Mapping) technique [8], [9]. Once the map has been obtained, a path can be planned for the drone. Nevertheless, this control strategy uses cameras, laser scanners, or complex sensor arrays to create the map and perform navigation with obstacle avoidance. Such sensor equipment is heavy and usually needs complex algorithms, which require high computational and processing power, for signal processing.

To achieve an autonomous drone with less complex sensors and low computational and processing power for adaptive obstacle avoidance in real-world environments, a drone development and its adaptive obstacle avoidance control are presented here. This study continues previous work [10]. The control is based on a simple two-neuron recurrent network with synaptic plasticity [11] and uses only two small and lightweight LiDAR (Light Detection and Ranging) sensors to detect obstacles and enable the drone to autonomously avoid them. Due to the neural dynamics and synaptic plasticity of the network, the drone can also effectively adapt its obstacle avoidance behavior to successfully navigate in complex environments with obstacles, corners, and deadlocks.

The paper is structured as follows: Section II describes the adaptive obstacle avoidance control. Section III presents the complete design of a LiDAR-based obstacle avoidance drone system. Section IV presents the simulation results of adaptive obstacle avoidance and navigation behaviors of a simulated drone in complex environments.
II. ADAPTIVE OBSTACLE AVOIDANCE CONTROL

The adaptive obstacle avoidance control for a drone was developed and presented in [10] (see Fig. 1). It is based on a simple two-neuron recurrent network with synaptic plasticity [11]. Here we briefly introduce the key components and features of the control network; its details can be found in [10]. The network consists of three main discrete-time non-spiking neurons. It receives obstacle detection signals from two LiDAR sensors installed at the front of the drone. The range of each sensor is set to 50 cm. The raw sensory signals are mapped to the range [-1, 1], where -1 means there is no obstacle in range and 1 means that an obstacle is near (at about 20 cm distance). The output of the control network is converted to a yaw command for steering the drone. According to this setup, a positive yaw value (meaning that the drone detects an obstacle on the left) will steer the drone to turn right, and a negative yaw value (meaning that the drone detects an obstacle on the right) will steer it to turn left.

By exploiting short-term memory in the neural dynamics of this recurrent control network, the drone can keep turning to successfully avoid an obstacle even after the sensors no longer detect it. In other words, the turning behavior is guided first by LiDAR sensory feedback and later by the short-term memory. Such memory-driven turning behavior is important for dealing with corner and deadlock situations. By applying synaptic plasticity [11] to the control network, the short-term memory is regulated online during the interaction between the drone and the environment. This temporal memory regulation leads to optimal turning behavior for avoiding obstacles, corners, and deadlocks in different environments. As a result, the drone can successfully navigate complex environments with obstacles without getting stuck.
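To make this concrete, the following is a minimal Python sketch of such a discrete-time two-neuron recurrent controller. The weight values, the linear sensor mapping, and the output scaling are illustrative assumptions rather than the tuned parameters of [10], and the online synaptic plasticity that regulates the connections is omitted:

import math

def map_sensor(distance_cm, max_range_cm=50.0):
    # Map a raw LiDAR distance to [-1, 1]: -1 = nothing in range,
    # +1 = obstacle very close. A simple linear map is assumed here;
    # the paper only fixes the two endpoints.
    d = min(max(distance_cm, 0.0), max_range_cm)
    return 1.0 - 2.0 * d / max_range_cm

class TwoNeuronAvoidance:
    # Self-connections keep the neuron outputs active for a while
    # (short-term memory), so a turn can continue briefly after the
    # obstacle has left the sensor range.
    def __init__(self, w_self=1.5, w_cross=-0.5, w_in=5.0):
        self.o = [0.0, 0.0]     # neuron outputs, kept between time steps
        self.w_self = w_self    # self-excitation (assumed value)
        self.w_cross = w_cross  # mutual inhibition (adapted online in [10])
        self.w_in = w_in        # sensor input gain (assumed value)

    def step(self, left, right):
        # left/right are mapped sensor signals in [-1, 1].
        a0 = self.w_self * self.o[0] + self.w_cross * self.o[1] + self.w_in * left
        a1 = self.w_self * self.o[1] + self.w_cross * self.o[0] + self.w_in * right
        self.o = [math.tanh(a0), math.tanh(a1)]
        # An obstacle on the left drives neuron 0 high: positive yaw, turn right.
        return 0.5 * (self.o[0] - self.o[1])

net = TwoNeuronAvoidance()
yaw = net.step(map_sensor(20.0), map_sensor(50.0))  # obstacle ~20 cm on the left -> yaw > 0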
III. LIDAR-BASED OBSTACLE AVOIDANCE DRONE DESIGN

The most important requirement for a drone, especially when flying beyond visual line of sight, is safety, including an ability to autonomously and adaptively avoid obstacles [12]. The drone should be reliable so the mission can be completed without endangering people or damaging property. The battery has to be large enough to ensure a sufficiently long flight time, the control strategy should be easy to implement in the controller, and the drone should be able to lift the necessary equipment. Since the end goal is to implement the control strategy in small drones, the weight and dimensions of the parts should be considered as well. The first subsection presents the basic mechanical components that make up the drone so that it fits these preconditions. The second subsection presents the electronic components, such as the controller and the sensors for autonomous obstacle avoidance, as well as how to set up these sensors correctly and calculate the minimum width of detectable obstacles. The third subsection presents the software design of the drone system.

A. Mechanical components

The first decision to be made is how many rotors the drone should have. In this work, a quadcopter has been chosen as a study platform since this meets the design requirements in regard to weight and power usage. The next step is to find a suitable frame. It has to be strong enough to carry sensors and other equipment (e.g., an inspection camera). Carbon fiber is commonly used since it is rigid enough to withstand the forces of the motors along its arms without adding too much weight. In this work, the Tarot 650 Sport frame has met these requirements (Fig. 2).

Fig. 2: The drone with 1) Pixhawk, 2) telemetry, 3) receiver, 4) LIDAR sensor, 5) GPS module, 6) Raspberry Pi

When dimensioning the motors, the type and length of the propellers have to be chosen based on the amount of thrust that the drone needs to lift its payload. For example, if the propellers are too small, the drone will not have enough thrust to lift its equipment. To get a first idea of how the drone will perform, a software tool like eCalc [13] can be used. eCalc offers a large database of common parts, and it is easy to see which parameters can be changed.

To meet the requirements of lifting power, the drone is equipped with 370 KV motors together with 14" carbon fiber propellers. The carbon fiber gives extra stiffness to the propeller so it does not bend when rotating at high speed. This helps to dampen unwanted vibrations.

Every motor needs to be driven by an Electronic Speed Controller (ESC) coupled with the flight controller, which tells each motor how much power it should receive to give the correct steering. For our configuration, each motor should be equipped with an ESC of maximum 40 A. This is enough to cover the power demanded by the motors.

To decide on the size of the battery, it is important to find a balance between weight and flight time. A battery of 22.2 V with a capacity of 8000 mAh was chosen for the drone. This allows for a flight time of 25 minutes without extra payload.
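A quick back-of-envelope check of these numbers (not a calculation from the paper, which relies on eCalc [13], and ignoring that a LiPo pack is normally not fully discharged):

capacity_ah = 8.0                          # 8000 mAh battery
flight_time_h = 25.0 / 60.0                # 25 minutes of flight
avg_current = capacity_ah / flight_time_h  # ~19.2 A total average draw
per_motor = avg_current / 4.0              # ~4.8 A per motor at hover
avg_power = 22.2 * avg_current             # ~426 W average electrical power
print(avg_current, per_motor, avg_power)

The hover-level current per motor thus sits far below the 40 A ESC rating, leaving headroom for maneuvering and extra payload.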
B. Hardware components

To be able to add other components and implement the developed control methods on the drone, the controller should be flexible to use [14]. Accordingly, an open-source flight controller named PIXHAWK 2 [15], [16] is used (see Fig. 2-1). This controller is equipped with a 3D gyroscope, an accelerometer, and a barometer. Other installed equipment includes telemetry for data exchange with a ground station (Fig. 2-2), a receiver for manual control (Fig. 2-3), a GPS module with an internal compass for outdoor navigation (Fig. 2-5), and a Raspberry Pi Zero to run the algorithm (Fig. 2-6). The RPi is connected to the flight controller through a serial link. Details about the software components are presented in Section III-C.
Classical obstacle avoidance strategies for controlling the drone are based on computer vision [7], [8], which needs a great deal of processing power. The proposed neural-based control method (see Section II) uses two simple sensors, which can be ultrasonic or LIDAR. This reduces the amount of high-dimensional signal processing compared to image processing for computer vision. Here, two LIDAR-Lite v3 sensors from Garmin are used (Fig. 3 and Fig. 2-4). This sensor is low-cost, lightweight, easy to interface with the PIXHAWK module of the drone via I2C, and has a viewing angle of 2° (Fig. 3).

Fig. 4: Dimensions of the drone and the sensors in regard to the width of the obstacle

It is assumed that the sensors are mounted at the center of the frame with an orientation angle of Θ. This angle depends on the sensor range L_s and the safety margin L_m, as shown in Fig. 4. These two parameters are considered in order to make sure that the drone does not hit an obstacle with its propellers. The minimum width of a detectable obstacle (W_obj) can be determined by the following equation:

$W_{obj} \geq 2 \tan\left(\frac{\Theta}{2}\right)\left(L_x + \frac{L_d}{2}\right)$,  (1)
where L_x is the distance between the drone and the obstacle. This distance depends on the speed of the drone as well as the response time within which the drone can safely turn. The drone's response time varies with the selected mechanical and electronic parts. However, the worst-case response time of a drone can be estimated from its in-flight behavior and empirical data.
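As a worked check of equation (1), using the values adopted later in Section IV (Θ = 40°, L_d = 34 cm, L_x = 15 cm):

import math

theta = math.radians(40.0)  # orientation angle between the sensors
L_d = 34.0                  # drone width (cm)
L_x = 15.0                  # distance at which the drone can still turn safely (cm)
W_obj = 2 * math.tan(theta / 2) * (L_x + L_d / 2)
print(round(W_obj, 1))      # ~23.3 cm, i.e., the 24 cm quoted in Section IV after rounding up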
Fig. 3: Schematics to connect the sensors and Raspberry Pi with the controller

C. Software components

In order to control the drone through the RPi, the RPi first has to communicate with the drone's flight controller through the MAVLink [17] (Micro Air Vehicle Link) protocol. To do so, MAVProxy [18], a MAVLink ground station written in Python, is installed on the RPi OS (Raspbian) to translate readable commands into MAVLink command messages. To streamline and simplify automation, a Python-based module named DroneKit [19], a collection of drone APIs with underlying functionality to control a drone, is used on top of MAVProxy to gain easier access to the command library of the drone. Thus, the controlling algorithm can run onboard and send commands to the drone through the DroneKit framework.
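As an illustration of this software stack, the sketch below connects to the flight controller from the RPi with DroneKit and sends a relative yaw command over MAVLink. The serial device, baud rate, and yaw parameters are assumptions for illustration; the paper does not specify them:

from dronekit import connect
from pymavlink import mavutil

# Serial device and baud rate are assumed values for a typical RPi-Pixhawk link.
vehicle = connect('/dev/serial0', baud=57600, wait_ready=True)

def send_yaw(angle_deg, yaw_speed_deg_s=30, clockwise=1):
    # Build a MAV_CMD_CONDITION_YAW command relative to the current
    # heading (honored in guided flight modes).
    msg = vehicle.message_factory.command_long_encode(
        0, 0,                                   # target system, target component
        mavutil.mavlink.MAV_CMD_CONDITION_YAW,  # command
        0,                                      # confirmation
        angle_deg,                              # param 1: yaw angle (deg)
        yaw_speed_deg_s,                        # param 2: yaw speed (deg/s)
        clockwise,                              # param 3: 1 = clockwise, -1 = counterclockwise
        1,                                      # param 4: 1 = relative offset, 0 = absolute
        0, 0, 0)                                # params 5-7: unused
    vehicle.send_mavlink(msg)

send_yaw(20)  # e.g., yaw 20 degrees to the right, away from an obstacle on the left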
IV. EXPERIMENTAL RESULTS

In this study, preliminary results on the performance of the adaptive obstacle avoidance control in simulation are shown. The V-REP simulator [20] was used to simulate a drone, and the controller was implemented in C++. The communication between the simulated drone and the neural control network is based on the Robot Operating System (ROS). With this implementation, the controller can easily be transferred to the real drone hardware.
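The ROS coupling can be pictured as a small control-loop node that subscribes to the two mapped sensor signals and publishes the network's yaw command. The sketch below is in Python, whereas the paper's controller is written in C++; the topic names and loop rate are assumptions, and it reuses the TwoNeuronAvoidance sketch from Section II:

import rospy
from std_msgs.msg import Float64

rospy.init_node('avoidance_controller')
yaw_pub = rospy.Publisher('/drone/cmd_yaw', Float64, queue_size=1)  # assumed topic
state = {'left': -1.0, 'right': -1.0}   # mapped signals; -1 = no obstacle

rospy.Subscriber('/drone/lidar_left', Float64,
                 lambda msg: state.update(left=msg.data))    # assumed topic
rospy.Subscriber('/drone/lidar_right', Float64,
                 lambda msg: state.update(right=msg.data))   # assumed topic

net = TwoNeuronAvoidance()   # the controller sketch from Section II
rate = rospy.Rate(20)        # 20 Hz control loop (assumed)
while not rospy.is_shutdown():
    yaw_pub.publish(net.step(state['left'], state['right']))
    rate.sleep()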
The width of the drone in the simulation is 34 cm (L_d, shown in Fig. 4). Together with a safety margin of 6 cm (L_m) and a sensor range of 50 cm (L_s), this gives an orientation angle Θ of 40° between the sensors. In this test, the speed of the drone was set to 0.1 m/s.

Two complex environments with different obstacles, corners, and deadlocks (see Fig. 1) were created. The first one was made to look like a room that has to be explored. The second environment has more sharp turns and deadlocks. The minimum width of the obstacles was calculated with equation (1): if the drone can safely turn when the object is 15 cm away, the minimum width of the obstacles has to be 24 cm. All objects in the maps are larger than this value. Figure 5 shows the results of the experiments. Each environment was run for about 30 minutes with the drone having no knowledge of the map. The drone successfully navigated both environments without getting stuck or colliding with the obstacles. It can also explore small corridors. The runs have been recorded and can be watched at [21]. Other quantitative tests in different environments, with a low density of obstacles, a high density of obstacles, and a high density of obstacles plus corners and deadlocks, can also be found in [10]. There, we observed that the drone had a 100% success rate navigating without getting stuck or crashing in the environment with a low density of obstacles, 90% success in the environment with a high density of obstacles, and 90% success in the environment with a high density of obstacles as well as corners and deadlocks.
Fig. 5: Results of adaptive autonomous obstacle avoidance. (a) Environment 1. (b) Environment 2.

V. CONCLUSION

In this paper, an adaptive neural control with synaptic plasticity for autonomous obstacle avoidance and exploration was explained. The V-REP simulator was used to simulate a drone and different environments as well as to evaluate the performance of the developed controller and demonstrate the obstacle avoidance behavior. The complete design of a drone platform was presented. The drone was developed to fit the requirements of the adaptive obstacle avoidance algorithm. Therefore, two lightweight LiDAR sensors were selected, and mathematical equations were formulated to find the exact locations of the LiDARs that satisfy the predefined design requirements, such as the minimum width of detectable obstacles. Future work aims to test the drone in environments similar to the simulated ones and compare the results.

ACKNOWLEDGMENT

The research leading to these results has received funding from the SDU Strategic Focus Drones for Energy project.

REFERENCES

[1] L. Apvrille, T. Tanzi, and J.-L. Dugelay, "Autonomous drones for assisting rescue services within the context of natural disasters," in General Assembly and Scientific Symposium (URSI GASS), 2014 XXXIth URSI. IEEE, 2014, pp. 1–4.
[2] T. Pobkrut, T. Eamsa-ard, and T. Kerdcharoen, "Sensor drone for aerial odor mapping for agriculture and security services," in Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON), 2016 13th International Conference on. IEEE, 2016, pp. 1–5.
[3] V. Duggal, M. Sukhwani, K. Bipin, G. S. Reddy, and K. M. Krishna, "Plantation monitoring and yield estimation using autonomous quadcopter for precision agriculture," in Robotics and Automation (ICRA), 2016 IEEE International Conference on. IEEE, 2016, pp. 5121–5127.
[4] R. Ashour, T. Taha, F. Mohamed, E. Hableel, Y. A. Kheil, M. Elsalamouny, M. Kadadha, K. Rangan, J. Dias, L. Seneviratne et al., "Site inspection drone: A solution for inspecting and regulating construction sites," in Circuits and Systems (MWSCAS), 2016 IEEE 59th International Midwest Symposium on. IEEE, 2016, pp. 1–4.
[5] J. Irizarry, M. Gheisari, and B. N. Walker, "Usability assessment of drone technology as safety inspection tools," Journal of Information Technology in Construction (ITcon), vol. 17, no. 12, pp. 194–212, 2012.
[6] D. Floreano and R. J. Wood, "Science, technology and the future of small autonomous drones," Nature, vol. 521, no. 7553, p. 460, 2015.
[7] H. Sedaghat-Pisheh, A. R. Rivera, S. Biaz, and R. Chapman, "Collision avoidance algorithms for unmanned aerial vehicles using computer vision," Journal of Computing Sciences in Colleges, vol. 33, no. 2, pp. 191–197, 2017.
[8] R. Mur-Artal and J. D. Tardós, "Visual-inertial monocular SLAM with map reuse," IEEE Robotics and Automation Letters, vol. 2, no. 2, pp. 796–803, April 2017.
[9] S. Moon, W. Eom, and H. Gong, "Development of large-scale 3D map generation system for indoor autonomous navigation flight — work in progress," Procedia Engineering, vol. 99, pp. 1132–1136, 2015.
[10] C. K. Pedersen and P. Manoonpong, "Neural control and synaptic plasticity for adaptive obstacle avoidance of autonomous drones," Lecture Notes in Artificial Intelligence, 2018.
[11] E. Grinke, C. Tetzlaff, F. Wörgötter, and P. Manoonpong, "Synaptic plasticity in a recurrent neural network for versatile and adaptive behaviors of a walking robot," Frontiers in Neurorobotics, vol. 9, p. 11, 2015.
[12] FreeD, "Free the Drones," https://www.sdu.dk/freed.
[13] Markus Mueller, Solution for All, "eCalc," https://www.ecalc.ch/xcoptercalc.php.
[14] E. Ebeid, M. Skriver, K. H. Terkildsen, K. Jensen, and U. P. Schultz, "A survey of open-source UAV flight controllers and flight simulators," Microprocessors and Microsystems, vol. 61, pp. 11–20, 2018.
[15] PX4, "Pixhawk," https://www.pixhawk.org.
[16] Dronecode Project, "PX4: the professional autopilot," https://www.px4.io.
[17] MAVLink, "MAVLink," https://github.com/mavlink.
[18] ArduPilot, "MAVProxy," https://github.com/ArduPilot/MAVProxy.
[19] 3DR, "DroneKit," https://dronekit.io/.
[20] E. Rohmer, S. P. Singh, and M. Freese, "V-REP: A versatile and scalable robot simulation framework," in Intelligent Robots and Systems (IROS), 2013 IEEE/RSJ International Conference on. IEEE, 2013, pp. 1321–1326.
[21] "Recording of testings in different environments," https://tinyurl.com/ya7thupd.
