
Autonomous drone control system for object tracking

Flexible system design with implementation example

Paweł Smyczyński, Łukasz Starzec, Grzegorz Granosik


Institute of Automatic Control
Lodz University of Technology
Łódź, POLAND

Abstract– This paper presents a flexible control system for an autonomous UAV (unmanned aerial vehicle). A complete description of the hardware and software solutions used to realize autonomous flight is given. The main objective of the research was to develop software that is easy to adjust and extend for drone systems with different equipment. The presented system can be deployed on various hardware platforms and can carry out different missions with minimal adjustments. The described concept significantly simplifies the design of complex systems by introducing a modular architecture; dividing the software components into single-functionality modules minimizes the amount of work needed to adjust the system when the hardware changes.

The general concept of the system architecture is backed up by a real, working model designed for tracking and landing on a moving target, and this paper gives a detailed description of the algorithms used in the project. Landing area detection is accomplished with a vision system: Canny's edge detection algorithm combined with contour shape analysis is used for marker detection, and the Lucas-Kanade optical flow algorithm is applied to track the detected pattern. Mission planning is realized as a dedicated state machine developed for this particular task.

The system is built with ROS (Robot Operating System) and uses its subscriber-publisher method of data exchange between separate software units. A specially designed frame is used as the hardware platform. The exemplary system is realized with a Raspberry Pi 3 as the onboard computer and a Pixhawk flight controller.

This concept and the exemplary system are the result of preparation for the Mohamed Bin Zayed International Robotic Challenge 2017 in Abu Dhabi. Results from experiments performed as trials for the competition, as well as future prospects, are presented in this paper.

Keywords—autonomous flight; drone control; flexible system design

I. INTRODUCTION

Demand for autonomous robots is currently rising. With the rapid increase in microcomputer performance it is possible to create very advanced solutions. Much work has been done on autonomous vehicles; in the media there is a lot of talk about autonomous cars, but robotics is not limited to them. For instance, there are multiple works on autonomous rescue robots [1], [2].

One of the greatest advantages of autonomous drones is the possibility of conducting synchronized operations in cooperation with other units. A fine example of such functionality is cooperative localization and mapping [3]. Operating and navigating outdoors can be very precise thanks to the availability of the GPS signal, and cooperative flight in open areas can be performed well due to the lack of obstacles that could degrade communication. Flying in a more obstructed area, e.g. inside a building, is more demanding: it requires very precise movement in narrow spaces where the GPS signal is obstructed or unavailable. Despite those problems, it is possible to perform cooperative flight inside buildings with a vision-based system [4].

Creating an autonomous system is highly challenging because it requires the integration of multiple elements, e.g. mission planning, navigation, or terrain mapping. Many solutions exist for each of those problems separately; however, integrating them can be problematic. The development of a software platform with reusable components is necessary to simplify the design of autonomous drones [5].

The presented system was designed to participate in the Mohamed Bin Zayed International Robotic Challenge (MBZIRC) in Abu Dhabi. The competition included two scenarios for an autonomous UAV, i.e. a drone:

• Landing on top of a moving vehicle.
• Cooperation of multiple UAVs in collecting and transporting objects.

Creating a universal system capable of performing well in both of those tasks would be very difficult, as they are very unlike each other and require different approaches. Moreover, each task demanded different characteristics from the machine, i.e. high agility for landing on a vehicle and the ability to fly with a payload for collecting objects, so using a different machine for each challenge was a possible option. However, some components, e.g. the flight controller, could be used in both challenges, since the code handling those functions could be reused with little or no change. The best solution was to design a highly adjustable, multiplatform system in which parts of the software can easily be reused or replaced.

978-1-5386-2402-9/17/$31.00 ©2017 IEEE 734

Authorized licensed use limited to: Chandigarh University. Downloaded on November 03,2024 at 14:34:29 UTC from IEEE Xplore.
II. SYSTEM DESIGN

The system consists of two main components: the flight controller and the onboard computer. The flight controller is responsible for executing the movements of the UAV according to the provided control signal; it is also directly connected to some of the sensors, e.g. GPS. The onboard computer is the main processing unit of the whole system: its role is to manage and process sensor data and send the correct control signal to the flight controller.

We decided to build our software in a modular way. Each module should be responsible for a single task, and the communication between modules had to be fast and stable. An important factor was the scalability of the system, allowing new devices and sensors to be connected without large changes to already written code.

Because some sensors are connected directly to the flight controller, their data has to be propagated to the computing unit. The best solution is to make the data coming from each sensor available to every software module at any time.

All of the above requirements are fulfilled by ROS (Robot Operating System). Software developed with ROS can consist of numerous modules (called nodes) connected with each other through a publisher-subscriber system. Another big advantage of using ROS is the availability of a vast number of components (ROS packages), in particular the MAVROS package (MAVLink extendable communication node for ROS with a proxy for the Ground Control Station), which implements the MAVLink (Micro Air Vehicle Communication) protocol and provides the means of communication between the flight controller and the onboard computer.

III. SOFTWARE MODULES

In general, the design modules can be divided into several categories based on different criteria. At the highest level, two main classes can be distinguished:

• Data handling nodes – responsible for handling and analyzing sensor data and providing it in the correct form to the mission control nodes.
• Mission control nodes – responsible for executing missions according to the sensor data and the control law.

Each of those contains a few subcategories. The data handling nodes comprise:

• Hardware handle nodes – modules responsible for acquiring raw data from sensors and/or controlling additional equipment.
• Data processing nodes – responsible for analyzing and filtering sensor data to provide the information necessary for the UAV control algorithm.

A hardware handle node can handle multiple sensors, e.g. all those connected through the GPIO (General Purpose Input/Output) pins of the Raspberry Pi board, as well as other devices. Each sensor should have its own publisher to make the acquired data available to other nodes. If sensor data needs filtering or processing, this should be done by one or more data processing nodes.

The mission control nodes can be divided into three subcategories:

• Mission planning nodes – responsible for optimizing the mission execution scheme based on the provided sensor data, according to the programmed mission scenario.
• Control law nodes – responsible for computing control signals using the programmed regulators and control laws.
• Flight controller interface node – responsible for the communication with the flight controller.

The flight controller interface node does not have to be a single node; it can actually be a set of nodes, but this category was separated because of its unique role in the system. It is also one of the elements that does not need any changes when the same UAV is adjusted to different missions. The diagram of our design is presented in Fig. 1.

Fig. 1. General system design (data handling nodes: sensors feeding data processing nodes; mission control nodes: mission planning, control law, flight control interface)

Depending on the complexity of the task, there might be a need to apply different control laws. Mission planning can also be divided into multiple submodules depending on the situation, e.g. a state machine for controlling the scenario execution and an optimization engine for cooperative operation of multiple drones.
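The decoupling that the publisher-subscriber pattern gives these modules can be illustrated with a minimal in-process sketch. This is plain Python, not ROS: the `Bus` class and the topic names are hypothetical stand-ins for ROS topics, used only to show why a data processing node and a control law node never need to know about each other directly.

```python
# Minimal in-process sketch of the publisher-subscriber pattern the
# design relies on. Illustration only: the real system uses ROS topics;
# the Bus class and topic names here are hypothetical.
from collections import defaultdict

class Bus:
    """Routes published messages to subscriber callbacks by topic name."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

bus = Bus()
received = []

# A "data processing node" consumes raw sensor data and republishes a
# filtered value; a "control law node" consumes the filtered stream.
# Neither knows the other exists; only the topic names are shared.
bus.subscribe("sensor/raw", lambda v: bus.publish("sensor/filtered", v * 0.5))
bus.subscribe("sensor/filtered", received.append)

bus.publish("sensor/raw", 10.0)
print(received)  # [5.0]
```

Replacing a sensor or a filter then only means swapping the node that publishes on a given topic, which is exactly the adjustability the design above aims for.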

IV. EXAMPLE OF SYSTEM

A. Hardware platform

For the competition in Abu Dhabi a custom hexacopter was constructed. It was equipped with a Pixhawk flight controller and a Raspberry Pi 3 as the onboard computer. The set of sensors providing data about the drone's position was connected directly to the Pixhawk; data from those sensors were tunneled to the Raspberry Pi through MAVROS. Additional sensors were needed to enable autonomous flight, i.e. a camera and a limit switch in the gripper to provide feedback. The camera, together with the image processing algorithm, allowed detection and tracking of the landing pattern placed on top of the moving vehicle. The limit sensor was used to ensure a successful landing before turning the motors off, and also to confirm the gripping of an object. The hexacopter constructed for the competition is shown in Fig. 2.

Fig. 2. Hardware platform designed for competition

The Pixhawk was chosen due to our previous experience with this platform and its compatibility with ROS through MAVROS.

The Raspberry Pi 3 provides good computing power and is easily connected through ROS to the other components used in this project. One important factor is the existence of dedicated Raspberry Pi camera modules, which are small and provide good image quality. The Raspberry Pi is also relatively cheap and easily available, which matters in case of damage during experiments, and it has a wide community of engaged users and good support.

B. Vision system

According to the presented general design, the vision system was separated into three nodes:

• Image acquisition node.
• Image processing node.
• Calibration node.

Image deformations were negligible (good quality optics and a narrow angle of view). The calibration node was used to calculate the relative position of the tracked object. With such a design it was possible to exchange the existing camera for any other with different parameters without additional changes in the image processing node.

During tests, a great additional CPU load occurred due to the separation of the image acquisition and processing nodes. To remove this problem, those nodes were merged into one. The greatest difference between the two approaches is the accessibility of the raw image: in the final solution it is available to only one node. In this particular case only one node needed access to the image; however, this problem should be investigated further and solved to enable a more flexible solution.

C. Image processing

The whole image processing algorithm was developed using the OpenCV library. The role of the vision system was to provide information about the position of the landing platform. The platform was marked with the pattern shown in Fig. 3.

Fig. 3. Landing pattern

The image processing algorithm was composed of two stages:

• Pattern detection
• Following the detected pattern

The main problem was creating an algorithm capable of working in real time. Because of that, it was necessary to limit the amount of data as much as possible and to avoid complex calculations. The landing pattern shown in Fig. 3 was detected by finding a square with a circle inside it in the captured image. The first stage of image analysis was contour detection on the grayscale image with the Canny edge detection algorithm [6]. In the next step, contours that were too small or whose bounding rectangle had an incorrect width-to-height ratio were excluded from the analysis, in order to lower the computing time of the following operations. Using the bounding rectangle dimension ratio criterion (created for this task) was possible because, independently of rotation, the bounding rectangles of both the square and the circle should be close to square. As the third step, all remaining shapes were compared to a square using a function based on the moment invariants introduced by Hu [7]. If the result of the comparison was positive, all shapes inside the square were similarly compared to a circle. If both checks returned positive results, the square and the circle were qualified as parts of the marker. As the last part of marker detection, the corners in the center of the cross were found and the algorithm switched to the second stage.

In the second stage, the detected corners were tracked with the Lucas-Kanade optical flow algorithm [8]. Optical flow calculation takes significantly more time for images of higher resolution, so two methods were used to increase image processing speed: image scaling and ROI (region of interest) selection. Switching between those methods was based on the marker size. Image scaling risks losing some details and can result in tracking errors; however, during flight at low altitude the marker occupies a great part of the image and this risk is nonexistent.

When the UAV was flying at high altitude, the marker was smaller in the image; image scaling in that case would lower the accuracy and could make it impossible to detect the marker correctly.
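The bounding-rectangle prefilter described earlier in this section can be sketched as a plain function. The thresholds below are assumptions chosen for illustration, not the values used in the authors' system.

```python
# Sketch of the contour prefilter for marker detection: a candidate is
# kept only if its bounding rectangle is large enough and close to
# square (both the square and the circle of the marker have near-square
# bounding boxes regardless of rotation). Thresholds are illustrative.
def is_marker_candidate(w, h, min_side=20, max_ratio=1.3):
    """Return True if a w x h bounding rectangle may contain the marker."""
    if min(w, h) < min_side:        # too small: reject to save compute
        return False
    ratio = max(w, h) / min(w, h)   # >= 1.0 by construction
    return ratio <= max_ratio       # elongated box: cannot be the marker

print(is_marker_candidate(100, 95))  # True: large and nearly square
print(is_marker_candidate(100, 30))  # False: elongated bounding box
print(is_marker_candidate(10, 10))   # False: below minimum size
```

Only candidates passing this cheap test proceed to the more expensive Hu-moment comparison, which is what keeps the detection stage fast enough for real time.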

To reduce the amount of data it was therefore essential to choose the ROI properly. The size of the region was constant, and the position of the next one was based on the current position of the marker in the image and its displacement. The two-element vector representing the position of the center of the ROI for the (n+1)-th frame can be calculated as (1):

C_{n+1} = M_n + ΔM_n  (1)

where C_{n+1} is the position of the center of the ROI calculated for the next frame, M_n is the position of the marker center in the image in the n-th frame, and ΔM_n is the displacement of the marker center between two consecutive frames, calculated as (2):

ΔM_n = M_n − M_{n−1}  (2)

It is important that all values are expressed in the coordinates of the original image, which means that values obtained from a scaled image or from a ROI must be corrected. Such a correction is also needed when switching from scaling to ROI selection and vice versa.

D. Control system

For landing on the moving platform, all details of the mission were well defined: the start position, the platform's movement path, and its height were provided in the task description. To realize this mission, a simple algorithm in the form of a state machine was used. The control law consisted of three PID controllers of drone velocity, one for each axis. Communication with the flight controller was realized with MAVROS.

The mission plan was simple and well defined, and only one control law was used, which did not need to be very sophisticated. To simplify the software structure, mission planning and the control law were implemented as a single node.

E. Mission plan

The mission plan was realized as a state machine. The platform was moving along a predefined 8-shaped path, and the main factor in the task evaluation was execution time. Instead of actively searching for the landing pattern, a strategy of waiting for the platform to appear in the camera's field of view was adopted. The best place to wait for the platform was the crossroad in the center of the path, because the platform appeared at this point with twice the frequency of any other point on the path.

When the pattern was detected, the drone started to follow it without lowering altitude. If the marker was close to the center of the image, the UAV's altitude was lowered. Finally, the closing of the limit switch signaled that the landing was complete. This strategy can be described as a set of states: starting, hovering over position, following, landing, and finished. The machine and the triggers of state changes are presented in Fig. 4. The start procedure consisted of takeoff and flying to the defined position in the center of the eight-shaped track. When the designated position was reached, the drone hovered over it until the landing marker was found. Detection of the landing pattern triggered the following procedure, which switched to the landing procedure (tracking the marker and lowering altitude) when the center of the marker was within a defined range from the center of the image. If during landing the marker left this region, the algorithm switched back to following. If the marker was detected during the start procedure, the system switched directly to the following procedure. If the marker was lost during either the following or the landing procedure, the drone returned to the center of the path and hovered over it until the marker was detected again.

Fig. 4. Mission state machine (states: Start, Hover, Follow marker, Land, Finished; transitions: position achieved, marker found, marker lost, marker near center, limit switch closed)

V. TESTS AND RESULTS

The design of the system used for the competition is shown in Fig. 5. As described above, some components of the presented general design were merged.

For the data handling nodes there are the two categories described before. There are two Hardware handle nodes. The first one is the GPIO handle node, which is responsible for managing the GPIO pins on the Raspberry Pi board; its only role is to provide information about the current status of the pins. The second one is the Image processing node. In the final design, image acquisition and processing were merged into one image processing node. The reason was related to system performance: while ROS provides packages to handle image acquisition and share the data with other nodes, this generates too much CPU load, resulting in a great decrease (about 50%) in image processing speed. As the Image processing node is responsible for both image acquisition and data analysis, it can also be placed in the Data processing nodes category.

Including the Image processing node, there are three Data processing nodes in the final design. The Camera calibration node converts data from pixels to meters based on the camera model and other sensor data. The Limit sensor node takes raw data from the GPIO handle node and uses a simple algorithm to check whether the drone has successfully landed: it checks whether at least two limit sensors detected landing at the same time, within some time threshold, to avoid a false detection if the drone bounced off the landing platform's surface.

The processed data is provided to the Mission planning node, which was merged with the Control law node. This node calculates control signals based on the mission plan and sensor data; the control signals are published and sent through MAVROS to the Pixhawk.

Fig. 5. Design of the system used for competition (data handling nodes: GPIO handle, Image processing, Camera calibration, Limit sensor; mission control nodes: merged Mission planning and control law, MAVROS flight controller interface)

The result achieved with this software configuration so far is the following of an object moving with a velocity of up to 5 km/h while lowering the altitude to around 1 m above the surface. Exemplary results of an experiment are shown in the sequence of pictures in Fig. 6.

Fig. 6. Following and landing attempt on moving marker

The sequence presented in Fig. 6 shows that building a system according to the presented design was successful: the drone is attempting to land while following the moving landing pattern. During this experiment the minimal altitude above the target was limited for safety reasons, and the marker was moved at the speed of a walking person.

The UAV's performance was not as good as expected; however, the results proved the presented general design rules to be correct. One reason for the worse-than-expected performance was the use of the Raspberry Pi as the onboard computer: it provided too little computing power for image processing and mission control, resulting in a performance decrease in both areas.

Using PID regulators for following the platform was successful only at movement speeds below 5 km/h. In order to achieve satisfactory results at higher speeds, other types of regulators should be tested.

VI. FUTURE PERSPECTIVES

In the near future there is a plan to change the onboard computer to a more powerful one, so that it allows more complex operations without introducing additional delay into the control loop. This should enable further work on the development of a more efficient control law.

For the general design no immediate changes are needed; however, introducing a new set of nodes for controlling additional equipment, e.g. gimbal movements, is being considered.

REFERENCES

[1] N. Suzuki and Y. Yamazaki, "Basic research on the driving performance of an autonomous rescue robot with obstacles," in 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 982–987, 2015.
[2] P. Mandal, R. K. Barai, M. Maitra, S. Roy, and S. Ghosh, "Autonomous robot coverage in rescue operation," in 2013 International Conference on Computer Communication and Informatics, pp. 1–5, 2013.
[3] C. Forster, S. Lynen, L. Kneip, and D. Scaramuzza, "Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles," in IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 3963–3970, 2013.
[4] J. Pestana, J. L. Sanchez-Lopez, P. de la Puente, A. Carrio, and P. Campoy, "A Vision-based Quadrotor Multi-robot Solution for the Indoor Autonomy Challenge of the 2013 International Micro Air Vehicle Competition," J. Intell. Robot. Syst., vol. 84, no. 1, pp. 601–620, 2016.
[5] J. L. Sanchez-Lopez, M. Molina, H. Bavle, C. Sampedro, R. A. Suárez Fernández, and P. Campoy, "A Multi-Layered Component-Based Approach for the Development of Aerial Robotic Systems: The Aerostack Framework," J. Intell. Robot. Syst., 2017.
[6] J. Canny, "A Computational Approach to Edge Detection," IEEE Trans. Pattern Anal. Mach. Intell., vol. PAMI-8, no. 6, pp. 679–698, 1986.
[7] M.-K. Hu, "Visual pattern recognition by moment invariants," IRE Trans. Inf. Theory, vol. 8, pp. 179–187, 1962.
[8] B. D. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision," in International Joint Conference on Artificial Intelligence, pp. 674–679, 1981.

