
Received October 14, 2020, accepted October 27, 2020, date of publication October 30, 2020, date of current version November 18, 2020.

Digital Object Identifier 10.1109/ACCESS.2020.3034948

Development of an Automated Camera-Based Drone Landing System

MALIK DEMIRHAN1 AND CHINTHAKA PREMACHANDRA2, (Member, IEEE)
1 Department of Electronic Engineering, Shibaura Institute of Technology, Tokyo 135-8548, Japan
2 Department of Electronic Engineering, School of Engineering/Graduate School of Engineering and Science, Shibaura Institute of Technology, Tokyo 135-8548, Japan

Corresponding author: Malik Demirhan ([email protected])
This work was supported in part by the Branding Research Fund of Shibaura Institute of Technology.

ABSTRACT Digital image processing serves as a multifunctional tool for measurement and positioning tasks in robotics. The present paper deals with the development of a camera-based positioning system for quadrocopters in order to automate their landing process. In this regard, a quadrocopter equipped with classical radio-control components is upgraded with applicable hardware, such as a Raspberry Pi 3B+ and a wide-angle camera. Subsequently, black-box system identifications are executed to attain the relevant plants of the attitude control performed by the flight controller. Based on these, a PID controller for the altitude control, including a back-calculation anti-windup, as well as two PD controllers for the horizontal plane are designed using a pole placement method. The controller gains are validated by simulating the respective closed loops. Since the camera functions as a position sensor, an image processing algorithm is then implemented to detect a distinctive landing symbol in real time while converting its image position into compliant feedback errors (pixel-to-physical-distance conversion). Ultimately, the developed system allows for the robust detection of and successful landing on the landing spot by a position control operating at 20 Hz.

INDEX TERMS Autonomous drone, quadrocopter, landing, PID, control engineering, system identification,
image processing, wide angle camera, region of interest.

I. INTRODUCTION
Over the last years, the popularity of unmanned aerial vehicles has increased enormously together with their use cases. While in the beginning they were mainly used for military purposes, they are now being introduced more and more in economic as well as academic sectors. Especially automated functions such as flying through specific coordinates autonomously are gaining importance. In this matter, our research group has been conducting studies on various possible applications of (camera-equipped) quadrocopters [1]–[8]. Rescue and aid missions after natural disasters are among these applications. Critical for such missions are the autonomous and safe landings of the drones. However, random objects and uneven grounds can have a negative impact on the landing process itself. Therefore, an automated landing system based on a depth camera is desired within a bigger project. Such cameras can namely provide information about the terrain evenness, which allows for the finding of appropriate landing spots in various environments.

But beforehand, a framework, or rather a useful base, is to be developed from the beginning. This includes the design and validation of an automated landing system based on a classical 2D-camera combined with a symbolized landing pad. For this purpose, a quadrocopter equipped with radio-control (RC) components must be upgraded through additional hardware and software. We explicitly settled on the usage of RC/''plug and play'' components as well as easily accessible sensors/hardware, since this accelerates the development process and enables other researchers to conveniently apply the presented methods of this paper to their own systems. Later, a position control must be implemented in order to execute suitable landings. Other works rely on relatively complex approaches for generating controller feedback, respectively estimating the drone's position and carrying out the control process, e.g., the implementation of Kalman filters as in [9], [10] and the implementation of an adaptive PID controller as in [11]. In this work however, we kept the system complexity rather low by relying only on a Time-of-Flight (ToF) sensor, a camera and innovative software solutions regarding the control system. The challenges in this project, among others, include the development

The associate editor coordinating the review of this manuscript and approving it for publication was Wei Wei.

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.


VOLUME 8, 2020 For more information, see https://creativecommons.org/licenses/by-nc-nd/4.0/ 202111
M. Demirhan, C. Premachandra: Development of an Automated Camera-Based Drone Landing System

of fast and robust image processing algorithms to provide the desired position feedback while maintaining the desired loop frequency. Instead of detecting the landing symbol through methods based on a neural network or deep learning (e.g., [12]), we focused on classical image processing algorithms, as these can be highly efficient when designed deliberately. Another major challenge is the design of an innovative fail-safe switch logic allowing the pilot to abort the autonomous mode and control the drone manually at any time, hence preventing accidents during tests.

Chapter 2 covers the basics of quadrocopter dynamics as well as the upgrade process of the used model. In Chapter 3, the implementation of the quadrocopter's altitude control through a PID controller with a back-calculation anti-windup is presented. The design of two PD controllers for the horizontal plane based on black-box system identifications is then described in Chapter 4. Subsequently, Chapter 5 covers the development and optimization of the image processing algorithm for generating horizontal feedback. Chapter 6 deals with the implementation and testing of the horizontal control and the overall landing process. Finally, Chapter 7 concludes the paper.

II. QUADROCOPTER DYNAMICS
A quadrocopter is a multiple-input-multiple-output system with 6 degrees of freedom and 4 motors/inputs (Fig. 1). Generally, a body-fixed coordinate system (index B) as well as an inertial one (index I) are defined. Converting the body-fixed accelerations into the latter, assuming the quadrocopter is a point mass, is possible by applying the rotation matrix as in [13]–[15]:

$$
\begin{bmatrix} m\ddot{x}_I \\ m\ddot{y}_I \\ m\ddot{z}_I \end{bmatrix}
= R \begin{bmatrix} 0 \\ 0 \\ \Sigma F_i \end{bmatrix}
+ \begin{bmatrix} 0 \\ 0 \\ -mg \end{bmatrix}
= \begin{bmatrix} (c\psi\, s\theta\, c\phi + s\psi\, s\phi)\,\Sigma F_i \\ (s\psi\, s\theta\, c\phi - c\psi\, s\phi)\,\Sigma F_i \\ (c\theta\, c\phi)\,\Sigma F_i - mg \end{bmatrix}
\quad (1)
$$

$$
R(\phi,\theta,\psi) =
\begin{bmatrix}
c\psi\, c\theta & c\psi\, s\theta\, s\phi - s\psi\, c\phi & c\psi\, s\theta\, c\phi + s\psi\, s\phi \\
s\psi\, c\theta & s\psi\, s\theta\, s\phi + c\psi\, c\phi & s\psi\, s\theta\, c\phi - c\psi\, s\phi \\
-s\theta & c\theta\, s\phi & c\theta\, c\phi
\end{bmatrix}
\quad (2)
$$

FIGURE 1. Quadrocopter Euler angles and coordinate systems.

Here F_i describes the uplift force of the rotors and φ, θ and ψ the roll, pitch and yaw Euler angles (s = sin, c = cos).

A. MICROCONTROLLER AND SIGNAL GENERATION
A quadrocopter based on RC components is usually maneuvered through the roll, pitch, yaw and throttle flight commands coming from the transmitter operated by the pilot. At first, our system only consisted of a FUTABA T6L Sport transmitter, a FUTABA R3106GF receiver with six channels (using the PWM protocol) and an SP Racing F3 flight controller. In order to replace the pilot and enable autonomous control, we need a microcontroller to issue the necessary commands to the flight controller. Since the desired position control will be camera-based, CPU-intensive image processing must be considered while choosing the right hardware. Hence, a Raspberry Pi Model 3B+ (RPi) is used, which offers high performance in addition to fast development by using available libraries such as OpenCV. Lastly, Fig. 2 illustrates the implemented signal routing/switch logic for achieving autonomous signal generation. By using a 4 × 2:1 multiplexer (TC74HC157A [16]), the drone can be controlled either manually by the pilot/receiver or by the RPi. Therefore, the RPi reads in the channel 6 receiver signal, which is linked to a shaft encoder on the transmitter. Depending on this PWM signal (pulse width ranging between 1-2 ms) and a defined threshold, the RPi sends out either a Low or a High signal to the multiplexer. The multiplexer then switches its inputs accordingly, whereby only the flight commands of either the receiver or the RPi are passed to the flight controller. Additionally, a pull-down resistor always provides a defined Low signal in case the RPi should have a defect and turn off randomly. Hence, we can still control the quadrocopter manually after mid-flight failures of the single-board computer (fail-safe system). Since this setup allows for a continuous switching between the two described modes, severe accidents through unwanted behaviors of the autonomous mode can be prevented.

FIGURE 2. Switch logic between autonomous control (Raspberry Pi) and manual/RC control.
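The switching decision itself reduces to a comparison of the channel-6 pulse width against a threshold. A minimal sketch of this logic in Python; the 1.5 ms threshold and the out-of-range handling are our illustrative assumptions, as the paper only states a pulse range of 1-2 ms:

```python
# Fail-safe mode selection: map the channel-6 PWM pulse width to the
# multiplexer select signal. The threshold is an assumption for
# illustration; the paper only states a pulse range of 1-2 ms.
AUTONOMOUS_THRESHOLD_US = 1500  # midpoint of the 1000-2000 us pulse range

def select_autonomous(pulse_width_us):
    """Return True (High to the multiplexer) to route the RPi commands,
    False (Low) to route the receiver commands to the flight controller."""
    # Out-of-range pulses are treated as manual mode, mirroring the
    # pull-down resistor that defaults the select line to Low.
    if not 1000 <= pulse_width_us <= 2000:
        return False
    return pulse_width_us > AUTONOMOUS_THRESHOLD_US
```

In the real system the RPi would measure the pulse width on a GPIO pin and drive the multiplexer select pin with the returned level.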


B. GIMBAL, CAMERA AND HEIGHT SENSOR
Fig. 3 illustrates the necessity of a camera stabilization. By combining two servos with 3D-printed parts, we built a cheap and light-weight gimbal (Fig. 4).

FIGURE 3. Necessity of a gimbal for camera usage.

FIGURE 4. 3-D printed servo gimbal.

To compensate the current roll and pitch angles of the quadrocopter, the servo shafts must move accordingly in the opposite direction. For this purpose, the CAMSTAB function of the flight controller is employed.

As for the camera, a modified Raspberry Pi v1.3 camera with a fisheye lens is used (Longrunner LC20). While it provides a wide field of view with a diagonal angle of 160°, the resulting distortions must be mentioned as a downside. However, it can flexibly be set to different frame rates and resolutions.

In addition, a VL53L1X Time-of-Flight (ToF) module serves as the feedback sensor of the quadrocopter's height. It offers millimeter resolution and is operated at 33 Hz, allowing for height measurements of up to 3 m. Finally, Fig. 5 depicts the upgraded quadrocopter.

FIGURE 5. The used quadrocopter with its components.

III. ALTITUDE CONTROL
Next, the height control operating at 20 Hz will be implemented by designing a PID controller (parallel form):

$$
G_{PID}(s) = \frac{K_D s^2 + K_P s + K_I}{s} \quad (3)
$$

The integral part of the controller will help us in compensating the loss of uplift force due to the continuous discharge of the battery mid-air. Using the black-box measurement feature of the flight controller, we can measure the vertical acceleration (output) of the quadrocopter together with the throttle flight command (input) over time. After collecting enough data, we can deduce the throttle command for hovering/compensating gravity as well as a PT1 element to approximate the acceleration-throttle relation (Fig. 6).

FIGURE 6. Acceleration characteristics of the quadrocopter.

The overall plant for the height z can thus be described as:

$$
\frac{Z(s)}{CM_{throttle}(s)} = \frac{\ddot{Z}(s)}{CM_{throttle}(s)} \cdot \frac{1}{s^2} = \frac{0.0248\,\mathrm{m/s^2}}{(0.08s + 1)\,s^2} \quad (4)
$$

The controller gain values are then calculated by pole placement, i.e., coefficient comparison between the closed-loop transfer function and a desired transfer function according to [17], [18]. However, the pole of the throttle PT1 element as well as the zeros of the closed loop are neglected for this purpose:

$$
\frac{\dfrac{\omega_{0,w}^2}{T_w}}{s^3 + \left(2D_w\omega_{0,w} + \dfrac{1}{T_w}\right)s^2 + \left(\dfrac{2D_w}{T_w} + \omega_{0,w}\right)\omega_{0,w}\,s + \dfrac{\omega_{0,w}^2}{T_w}}
= \frac{K_{S,T}K_I}{s^3 + K_{S,T}K_D\,s^2 + K_{S,T}K_P\,s + K_{S,T}K_I} \quad (5)
$$

At first, we calculated the controller gains for a closed-loop simulation in MATLAB Simulink by choosing a desired damping of D_w = 0.85, a frequency of ω_{0,w} = 1.95 rad/s and a time constant of T_w = 1 s, yielding K_P = 2.87, K_D = 1.74 and K_I = 1.53 (Fig. 7).
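Equating the denominator coefficients in (5) yields the gains in closed form. The sketch below reproduces the stated values; expressing the plant gain as K_{S,T} = 2.48 cm/s² per throttle unit (the 0.0248 m/s² from (4) converted to centimeters) is our reading, not an explicitly stated value:

```python
def altitude_pid_gains(Dw, w0, Tw, Ks):
    """Coefficient comparison for equation (5):
    s^3 + (2*Dw*w0 + 1/Tw)*s^2 + (2*Dw/Tw + w0)*w0*s + w0**2/Tw
      = s^3 + Ks*KD*s^2 + Ks*KP*s + Ks*KI."""
    KD = (2 * Dw * w0 + 1 / Tw) / Ks
    KP = (2 * Dw / Tw + w0) * w0 / Ks
    KI = w0 ** 2 / (Tw * Ks)
    return KP, KD, KI

# Dw = 0.85, w0 = 1.95 rad/s, Tw = 1 s with Ks = 2.48 (assumed, cm/s^2 per
# throttle unit) reproduce the paper's KP = 2.87, KD = 1.74, KI = 1.53.
KP, KD, KI = altitude_pid_gains(0.85, 1.95, 1.0, 2.48)
```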


FIGURE 9. Back-calculation anti-windup principle.
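The principle of Fig. 9 can be sketched as a discrete-time altitude PID: PT1-filtered derivative error (cf. (7)), output saturation, back-calculation of the saturation excess into the integrator, and the tilt correction of the throttle command (cf. (6)). Gains, limits and the back-calculation factor Kb below are illustrative placeholders, not the tuned values of Table 1, and the first-sample handling is our own choice:

```python
import math

class AltitudePID:
    """Parallel-form PID with back-calculation anti-windup (sketch)."""

    def __init__(self, kp, ki, kd, kb, dt, sat=120.0, alpha=0.6):
        self.kp, self.ki, self.kd, self.kb = kp, ki, kd, kb
        self.dt, self.sat, self.alpha = dt, sat, alpha
        self.integral = 0.0
        self.e_filt_prev = 0.0
        self.first = True

    def step(self, error, hover_cmd, roll=0.0, pitch=0.0):
        # PT1 filter on the derivative-path error (equation (7))
        e_filt = (error if self.first
                  else (1 - self.alpha) * self.e_filt_prev + self.alpha * error)
        deriv = 0.0 if self.first else (e_filt - self.e_filt_prev) / self.dt
        self.first = False

        u = self.kp * error + self.ki * self.integral + self.kd * deriv
        u_sat = max(-self.sat, min(self.sat, u))

        # Back-calculation: feed the saturation excess back into the
        # integrator, "discharging" it during saturation.
        self.integral += (error + self.kb * (u_sat - u)) * self.dt
        self.e_filt_prev = e_filt

        # Tilt correction of the throttle command (equation (6))
        return (u_sat + hover_cmd) / (math.cos(roll) * math.cos(pitch))
```

With zero error and level attitude the command equals the hover throttle; a large error is clipped to the saturation band around it.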

TABLE 1. Height Control PID Gains.


FIGURE 7. The Simulink simulation model for the height control.

The resulting controller gains combined with a ±150 output saturation of the throttle command around its hovering-state value (control ratio) and a classic clamping anti-windup resulted in a rather unsatisfying overshoot not suited for landings (Fig. 8). Nevertheless, the simulation's step response was compared to experiments using the real system with the same control parameters.

FIGURE 8. Height control step responses - real system and simulation with a clamping anti-windup.

The high similarity between the curves' progression serves as a sufficient validation of the built simulation and system parameters. Nonetheless, manipulating the step response of the real system to make it suitable for landings by simply varying the controller gains poses a difficult task. The reason is that the PID controller with its two zeros can only compensate two poles of the existing third-order plant [19]. Therefore, we replaced the existing anti-windup method with a back-calculation logic according to [20] (Fig. 9). This anti-windup method allows for an effective change of the system dynamics by ''discharging'' the integrator against its natural direction of error-integration during saturation. Hence it can be useful for preventing high overshoots. Eventually, the final controller gains were determined by tuning the desired transfer function (5) and the back-calculation factor K_b empirically. Table 1 includes the corresponding values based on D_w = 0.95, ω_{0,w} = 1.5 rad/s and T_w = 2 s. Also, we decreased the already mentioned throttle saturation to ±120 around the hover value.

The final throttle command to the flight controller contains the saturated PID controller output y_{PID,sat}(k) added to the hovering-throttle command CM_{throttle,hover} and is corrected by the current roll and pitch angles:

$$
y_{throttle}(k) = \frac{y_{PID,sat}(k) + CM_{throttle,hover}}{\cos(\phi(k))\,\cos(\theta(k))} \quad (6)
$$

Furthermore, a PT1 filter with α = 0.6 is applied to the control error of the derivative part, decreasing the controller's sensitivity to sensor noise and improving the performance of the back-calculation anti-windup [21]:

$$
e_{z,filt}(k) = (1-\alpha)\,e_{z,filt}(k-1) + \alpha\,e_z(k) \quad (7)
$$

With this configuration we can achieve the sufficiently damped step response required for adequate landings (Fig. 10).

FIGURE 10. Step response of the height control - real system with a back-calculation anti-windup.

IV. CONTROLLER DESIGN FOR POSITIONING IN THE HORIZONTAL PLANE

A. BLACK-BOX SYSTEM IDENTIFICATION
Horizontal movements can be achieved through roll and pitch maneuvers without needing the yaw command. In order to


design PD controllers, we first simplify (1) with the following ideas:
• Linearization of the trigonometric functions by approximating sin x ≈ x and cos x ≈ 1 for small angles.
• The height setpoint during the horizontal control/approach of the landing spot will not change. Hence, we can assume an almost hovering state, so that the uplift force of the rotors roughly equals the gravitational force (ΣF = mg).
• No yaw movements during the landing process allow us to set the yaw angle to ψ = 0°.

By that, the accelerations in the horizontal plane can be approximated as:

$$
\begin{bmatrix} \ddot{x}_I \\ \ddot{y}_I \end{bmatrix} \approx \begin{bmatrix} \theta g \\ -\phi g \end{bmatrix} \quad (8)
$$

At this point, the Euler angles are unknown to us, since they are being controlled by the flight controller as outputs depending on the flight commands. Hence, we deduce models for the roll and pitch plants of the quadrocopter by applying black-box system identifications. First, measurements are prepared by letting the RPi handle the height control at a specific setpoint, while the pilot solely focuses on exciting the quadrocopter with roll and pitch maneuvers respectively. In this regard, the roll, pitch and yaw inputs of the flight controller are temporarily connected to the receiver directly instead of being passed through the multiplexer. With the resulting 4 min long data sets, suitable models are identified by applying the instrumental variable method (IV) [22]. We therefore used MATLAB's system identification toolbox to determine PT1 elements for the controller design as well as models of higher order for accurate simulation purposes. Cross validations have been executed through additionally prepared data sets. Furthermore, different resampling rates have been analyzed regarding their effect on the fit values of the resulting models as in [23]. The following transfer functions have resulted, with 50 Hz being the most beneficial sampling rate during the identification process (Table 2):

TABLE 2. Identified black-box models of the roll and pitch system plant.

Fig. 11 shows the graphical comparison between the measured and simulated system outputs of the stated models using the validation data sets.

FIGURE 11. Comparing the measured system output with the ones simulated through the identified black-box models.

B. PD CONTROLLER DESIGN
By integrating the determined PT1 roll and pitch models twice, we can obtain the plants describing the horizontal movement of the quadrocopter:

$$
\frac{X(s)}{CM_{pitch}(s)} = \frac{K_{S,P} \cdot \frac{\pi}{180} \cdot g}{\left(T_{S,P}\,s + 1\right) s^2}
\quad \text{and} \quad
\frac{Y(s)}{CM_{roll}(s)} = \frac{-K_{S,R} \cdot \frac{\pi}{180} \cdot g}{\left(T_{S,R}\,s + 1\right) s^2} \quad (9)
$$

The PD gains for the roll and pitch commands (R/P) are then again calculated by pole placement through the following equation (neglecting the plant's PT1 element and the zero of the closed loop, respectively):

$$
\frac{\omega_{0,w}^2}{s^2 + 2D_w\omega_{0,w}\,s + \omega_{0,w}^2}
= \frac{g K_P K_{S,R/P}}{s^2 + g K_D K_{S,R/P}\,s + g K_P K_{S,R/P}} \quad (10)
$$

Based on D_w = 0.8, ω_{0,w} = 1.2 rad/s, g = 981 cm/s² and K_{S,R/P} = (π/180°)·mean(K_{S,R}, K_{S,P}), the gains are K_{Dx/y} = 0.88 and K_{Px/y} = 0.66. Additionally, an output saturation of ±30 will be introduced to limit the Euler angles (<3°). The step responses of the simulation model in Fig. 12 can be seen in Fig. 13. Even with a feedback delay of 30 ms, representing a rough/temporary estimate of the image processing time consumption, the results are satisfactory.

V. DETECTION OF THE LANDING SYMBOL

A. ALGORITHM TO DETECT THE H-SYMBOL
To test the horizontal control, camera-based feedback is needed. We used a landing pad with high color contrasts and a unique symbol for easy segmentation. Initially, basic operations like the transformation of the RGB image into grayscale and its binarization through a simple threshold method after applying a Gaussian filter are executed [24].


FIGURE 12. The Simulink simulation model of the horizontal control.

FIGURE 13. Step responses of the horizontal control – simulation.
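The PD gains used in these simulations follow from the coefficient comparison in equation (10). Since the identified plant gain K_{S,R/P} from Table 2 is not reproduced in this excerpt, the sketch below treats the combined gain g·K_{S,R/P} as a parameter and only checks the gain ratio K_P/K_D = ω_{0,w}/(2D_w), which is independent of it:

```python
def pd_gains_pole_placement(Dw, w0, gKs):
    """Coefficient comparison for equation (10): the desired second-order
    denominator s^2 + 2*Dw*w0*s + w0^2 versus s^2 + g*Ks*KD*s + g*Ks*KP,
    where gKs stands for the combined plant gain g*K_{S,R/P} (assumed)."""
    KD = 2 * Dw * w0 / gKs
    KP = w0 ** 2 / gKs
    return KP, KD

# The ratio KP/KD = w0/(2*Dw) does not depend on the plant gain; with
# Dw = 0.8 and w0 = 1.2 rad/s it matches the paper's 0.66/0.88 = 0.75.
KP, KD = pd_gains_pole_placement(Dw=0.8, w0=1.2, gKs=1.0)
```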

A slight dilation by a 3×3 mask increases the recognizability of the 20 cm large H-symbol from further distances (Fig. 14).

FIGURE 14. Binarizing and dilating the RGB image.

FIGURE 15. Flowchart of detecting the landing symbol (H).

The flowchart of Fig. 15 describes the successive algorithm to detect the H-symbol in the image. First, the closed contours are identified through the cv.findContours function from the OpenCV library based on [25]. Hereafter, the H-symbol is searched for in the resulting contour vectors inside a loop. Only those contours which neither include a child contour nor are described by fewer than 30 points are passed on for further examination. Subsequently, the current contour is approximated to such a degree that ideally 12 corner points should remain if it is the H-symbol (by cv.approxPolyDP). The next condition requires the contour to have a total length of at least 200 pixels. Ultimately, the H-symbol can be robustly determined by the final two conditions. They verify whether the contour has 5 lines between the two longest ones and whether those two are at least four times longer than their direct neighbours, respectively. Small tolerances in steps 3 and 5 of the flowchart allow the detection of the H-symbol even if it is approximated by 13 corner points or has up to 7 lines between its two longest ones. A positive detection results in the immediate abortion of the loop (we presume that at most one H-symbol will appear in the image). By that, we can successfully identify the landing symbol at heights of 15-165 cm, as shown in Fig. 16. The green center point is essentially the desired landing spot and can be deduced by averaging all the corner points of the


FIGURE 16. Detection of the H-symbol from 15 cm (left) and 165 cm (right).

contour. In this form the algorithm needs about 12-22 ms from the start to the end of the flowchart in Fig. 15. The computing time varies with the image content and tends to be higher when the H-symbol is in the picture.
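The corner and edge conditions of the flowchart can be captured as a small geometric predicate. The following sketch uses pure NumPy and our own function names; in the real pipeline the polygon would come from cv.findContours followed by cv.approxPolyDP, and the stated thresholds are applied without the mentioned tolerances:

```python
import numpy as np

def side_lengths(poly):
    # Edge lengths of a closed polygon given as an (N, 2) array of corners.
    diffs = np.roll(poly, -1, axis=0) - poly
    return np.hypot(diffs[:, 0], diffs[:, 1])

def looks_like_h(poly, min_perimeter=200.0, neighbour_ratio=4.0):
    """Geometric predicate for the H-symbol: 12 corner points, perimeter
    of at least 200 px, exactly 5 edges between the two longest edges,
    and the two longest edges at least 4x longer than their neighbours."""
    poly = np.asarray(poly, dtype=float)
    if len(poly) != 12:
        return False
    lengths = side_lengths(poly)
    if lengths.sum() < min_perimeter:
        return False
    i, j = np.argsort(lengths)[-2:]       # indices of the two longest edges
    if abs(int(i) - int(j)) % 12 != 6:    # 5 edges strictly in between
        return False
    n = len(lengths)
    for k in (i, j):
        for nb in ((k - 1) % n, (k + 1) % n):
            if lengths[k] < neighbour_ratio * lengths[nb]:
                return False
    return True
```

A 12-corner outline of an "H" (two long vertical strokes with five shorter edges between them on each side) satisfies the predicate, while a square does not.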

B. INTRODUCING AN ADAPTIVE REGION OF INTEREST
After the very first detection of the landing symbol, pixels distant to the tracked object lose their relevance. The reason for this is that between two frames the object cannot change its position in the image too much, due to the quadrocopter's inertia and limited speed. Hence, a Region of Interest (ROI) can be introduced, which allows us to speed up the algorithm by only processing the relevant parts of the image. Fig. 17 depicts the implementation of the ROI. After reading in the image, the detection algorithm only processes the ROI determined in the last loop. Therefore, a square window spanned by the points P1 and P2 is cut out of the whole/full frame (Fig. 17, top). However, the ROI is regarded as a new image and hence makes us lose the global coordinates of the H-symbol in the full image (Fig. 17, edge coordinates of the ROI, top). Thereby, we would not know in which part of the frame we must place the ROI for the detection in the next loop. Thus, we require a (recursive) transformation of the locally determined center point of the H-symbol CP_ROI(k) back into global coordinates CP_GB(k). We know the ROI has been placed around the global H-symbol coordinates CP_GB(k−1) in the last loop (k−1). Due to the physical inertia of the drone, the H-symbol can only move away slightly from the ROI's center in between two loops. This displacement is described by Δx_p and Δy_p (Fig. 17, bottom). Accordingly, only these two values are of interest in order to update the global coordinates of the H-symbol CP_GB(k). For this purpose, we simply add the displacement to the global center point from the last loop CP_GB(k−1). Δx_p and Δy_p can be deduced by subtracting the ROI's center coordinates from the newly acquired local center point of the H-symbol CP_ROI(k).

The ROI's center lies at half of its height respectively width, i.e., at S_max(k−1), which represents the longest line of the H detected in the last loop (if the ROI wasn't cropped). After calculating CP_GB(k), the ROI needs to be placed around these new global coordinates. Its height and width are updated to be twice as large as the H-symbol's current longest line S_max(k), making it adaptive to the size of the landing symbol in the image (Fig. 18 and attached video). In the case of P1 or P2 being out of the image boundaries, the ROI is cropped appropriately (e.g., H-symbol too large or nearly at the edge of the frame). Finally, Fig. 19 shows the effectivity of the ROI by depicting the computing time over the camera's vertical distance to the landing symbol, respectively the size of the ROI. As seen, the computing time drops from about 12-14 ms down to 3 ms with steadily increasing vertical distance, respectively shrinking ROI size. Starting with a bisection above 50 cm, we obtain high savings of time, which justifies the usage of the ROI and could even allow us to use higher loop frequencies.

FIGURE 17. Implementation and theoretical operation of the ROI.

C. GENERATING COMPLIANT FEEDBACK ERRORS
Within the horizontal control, the displacement of the landing symbol to the image's center point, and hence to the quadrocopter's physical center, is to be minimized. Currently, we only attain position values in pixels, while the PD controllers are configured for physical feedback (cm). Thus, a continuous conversion of the landing symbol's image position into compliant feedback errors is needed. Deducing a reliable regularity for this task was possible by conducting multiple measurements of the H-symbol's image position with respect to the image center at equally spaced physical distances (both vertically and horizontally). Later, we reversed the data so that the image position (in pixels) at constant heights (cm) serves as the input, whereas the horizontal distance (cm) represents the output (Fig. 20).


FIGURE 20. Inverted measurements representing the relationship between the horizontal physical distance as a function of the height and the distance of the landing symbol to the image center in pixels.

The slope-function m(z) can again be approximated by a regression line through the slopes derived from the individual lines at constant heights (Fig. 21).

FIGURE 18. Adaptation of the ROI.
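The recursive center-point update and the adaptation of the ROI shown in Fig. 18 reduce to a few lines of bookkeeping. A sketch with our own function names, for the uncropped square-ROI case (the clamping handles the border cases):

```python
def update_global_center(cp_gb_prev, cp_roi, s_max_prev):
    """Recover the H-symbol's global image coordinates from its local
    position inside the ROI. The ROI of size 2*s_max_prev was centered
    on the previous global center cp_gb_prev (uncropped case)."""
    # Displacement of the symbol from the ROI center (Fig. 17, bottom)
    dx = cp_roi[0] - s_max_prev
    dy = cp_roi[1] - s_max_prev
    return (cp_gb_prev[0] + dx, cp_gb_prev[1] + dy)

def next_roi(cp_gb, s_max, width, height):
    """Place the next ROI around the updated global center, twice as
    large as the symbol's current longest line, cropped to the image."""
    p1 = (max(0, cp_gb[0] - s_max), max(0, cp_gb[1] - s_max))
    p2 = (min(width, cp_gb[0] + s_max), min(height, cp_gb[1] + s_max))
    return p1, p2
```

For example, with the previous center at (320, 240), a previous longest line of 50 px, and a local detection at (55, 48), the new global center becomes (325, 238).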

FIGURE 21. Determining the slope function m(z) by regressing a line through the deduced slopes at different heights (from Fig. 20).

With that, the final functions converting the pixel values into physical ones are:

$$
x(m(z), p_x) \approx \left(0.00326\,\tfrac{\mathrm{cm}}{\mathrm{pixel}}\; z + 0.0541\,\tfrac{\mathrm{cm}}{\mathrm{pixel}}\right) p_x
$$
$$
y(m(z), p_y) \approx \left(0.00326\,\tfrac{\mathrm{cm}}{\mathrm{pixel}}\; z + 0.0541\,\tfrac{\mathrm{cm}}{\mathrm{pixel}}\right) p_y \quad (12)
$$
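With the regression coefficients from (12), the conversion becomes a one-liner per axis; a sketch (function names are ours):

```python
def slope_cm_per_pixel(z_cm):
    # Regression line m(z) from Fig. 21 / equation (12)
    return 0.00326 * z_cm + 0.0541

def pixel_error_to_cm(px, py, z_cm):
    """Convert the landing symbol's pixel offset from the image center
    into physical feedback errors (cm) for the PD controllers."""
    m = slope_cm_per_pixel(z_cm)
    return m * px, m * py
```

At a height of 100 cm, m(100) ≈ 0.38 cm/pixel, so an offset of 50 px corresponds to roughly 19 cm of physical displacement.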
FIGURE 19. Computing time of the detection algorithm using the ROI.
Furthermore, regression lines have been laid through the measurements at constant heights. Generally, the relation between the horizontal distances and the image position at a constant height can be approximated as linear. As expected, the slopes of those lines increase together with the height of the camera, since zooming out results in fewer pixels describing the same physical length. Hence the conversion should be a function of both the pixel distances p_x and p_y as well as the height z:

$$
x(m(z), p_x) \approx m(z)\,p_x, \qquad y(m(z), p_y) \approx m(z)\,p_y \quad (11)
$$

VI. IMPLEMENTING THE POSITION CONTROL

A. HORIZONTAL CONTROL
With the controller design and the feedback generation being finished, we are now able to test the horizontal control. Fig. 22 illustrates the orientation of the camera mounted onto the quadrocopter and the defined axes, while Fig. 23 describes the general program flow. First, the current camera frame is read, followed by the detection algorithm and the update of the current height (ToF sensor value). A successful detection of the H-symbol only leads to the activation of the autonomous mode, respectively the horizontal control, if the pilot enables it through a switch on the remote control. For that matter, a successful detection is defined as follows:


additionally decided on slightly tuning the controller gains in order to get a smoother/more damped step response and limit the drone's oscillation around its setpoint. With D_w = 0.87 and ω_{0,w} = 1 rad/s, the final gains result in K_{Dx/y} = 0.8 and K_{Px/y} = 0.46. Additionally, the pitch and roll command saturations were further decreased to ±25. Fig. 24 lastly illustrates the step responses of such an experiment with the final configuration by plotting the variation of all axes with time as well as including the H-symbol's image position over time. While the damping of the system is high enough to prevent significant overshoots, the quadrocopter slightly oscillates around the setpoint with amplitudes of maximally ±8 cm, respectively ±18 pixels. These could result from the quadrocopter being not perfectly balanced (causing continuous drifts) as well as from the servo gimbal compensating the roll and pitch angles in a rather inaccurate way. Nevertheless, the horizontal control possesses satisfying precision and is hence usable for the overall landing process.

FIGURE 22. Orientation of the camera/image on the drone.

FIGURE 23. Flowchart of the horizontal control test.

• 3 consecutive, respectively uninterrupted, detections, if the autonomous mode was previously deactivated. The first takes place in the full image, while the last two only process the emerging ROI.
• 2 consecutive failed detections at most during the autonomous mode. After the third, the manual mode is reactivated and the ROI reset.
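These two rules form a small state machine; a sketch with our own names (the pilot-enable switch from the remote control is omitted here):

```python
class DetectionGate:
    """Activation logic for the autonomous mode: 3 consecutive successful
    detections switch it on, 3 consecutive failures switch it back off."""

    def __init__(self, on_after=3, off_after=3):
        self.on_after, self.off_after = on_after, off_after
        self.hits = 0
        self.misses = 0
        self.autonomous = False

    def update(self, detected):
        if detected:
            self.hits += 1
            self.misses = 0
            if not self.autonomous and self.hits >= self.on_after:
                self.autonomous = True
        else:
            self.misses += 1
            self.hits = 0
            if self.autonomous and self.misses >= self.off_after:
                self.autonomous = False  # back to manual mode, ROI reset
        return self.autonomous
```

Up to two isolated misses are tolerated during the autonomous mode, which matches the robustness argument in the text.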
These precautions allow for a safer start of the horizontal control as well as a higher robustness against randomly failed detections during said process. Moreover, the horizontal control should take place during the hovering state of the quadrocopter. Hence, the height setpoint is set one time to the last measured vertical distance z at the beginning of the autonomous mode. In the same way, the slope m(z) for the pixel-into-cm conversion is fixed uniquely. While this prevents adaptations of the horizontal feedback to slight changes in the real height, it also decouples said feedback from the ToF sensor's noise. Finally, the RPi calculates the feedback errors and sends flight commands to the flight controller accordingly. Due to the low mechanical accuracy of the servo gimbal (low angle resolution) as well as general noise in the image processing, we also applied PT1 filters for the x-axis and y-axis respectively:

$$
e_{x/y,filt}(k) = 0.5\,e_{x/y,filt}(k-1) + 0.5\,e_{x/y}(k) \quad (13)
$$

The tests were then conducted by flying the quadrocopter manually towards the landing symbol until the RPi takes over the control by itself. After several experiments we

FIGURE 24. Step response of the horizontal control (physical and image plane).

B. THE OVERALL LANDING PROCESS
The overall landing process can be divided into two sub-steps, as shown in Fig. 25. After detecting the landing symbol, the quadrocopter flies towards it while keeping its height. In order to activate the second sub-step, it must be airborne within a predefined horizontal tolerance zone (±10 cm radius) for 4 s. This ensures that the vertical landing starts after the transient effect, respectively any possible overshoots. Subsequently, the height setpoint is set to −3 cm instead of 0 cm. Even though this is not physically possible, it allows for a more precise landing behavior, which is otherwise disturbed by the ground effect. The latter arises from the quadrocopter being too near to the ground, hindering the downwards airflow and hence


longer executed under approximately 15 cm. Additionally, the slope m(z) for the pixel-into-cm conversion is adapted actively to the real height; otherwise the horizontal feedback would scale to unrealistic errors. Fig. 26 and Fig. 27 finally show the plot of such a landing process (the video material also includes a test). The 3-axis position control is sufficiently functional and hence conceptually transferable to other systems.

VII. CONCLUSION
In the present paper, an automated camera-based landing system for quadrocopters is proposed. It relies on a commercial flight controller handling the attitude control (inner loop), while the 3-axis position control (outer loop, operating at 20 Hz) is taken care of by a Raspberry Pi 3B+. Therefore, we upgraded a standard radio-control model with additional hardware, such as a wide-angle camera for horizontal feedback (modified Raspberry Pi Cam v1.3), a servo gimbal, a ToF height sensor and a multiplexer. The latter serves as a switch, allowing the pilot to either fly the quadrocopter manually through his remote control or activate the developed autonomous mode, in which the Raspberry Pi generates the flight commands. A PID controller is designed for the height control, including a back-calculation anti-windup. In the horizontal plane however, two PD controllers are implemented. Beforehand, several black-box system identifications have been conducted regarding the flight controller's attitude control in order to deduce the relevant system plants and simulations for each controller design. The image processing algorithms generate horizontal feedback by detecting a distinct landing symbol and converting its image coordinates into compliant physical distances. Implementing a region of interest helps to speed up the computing time for that matter. Generally, the resulting step responses as well as the performance regarding the overall landing process make up a functional autonomous system.

FIGURE 25. Flowchart of the overall landing process test.

FIGURE 26. Step responses in the overall landing process (separate axes).

In the future, we plan on replacing the 2D-camera with a depth camera, allowing us to obtain information about the evenness of the ground. Thereby, it would be possible to locate flat surfaces to land on without depending on a landing symbol/pad. This can be especially helpful for any kind of rescue mission after natural disasters such as earthquakes. Nevertheless, depth cameras require high com-
putational power and suitable algorithms with respect to a
sufficiently fast position control. Choosing the right hardware
(camera and single-board computer) will have a huge effect
FIGURE 27. Step responses in the overall landing process (3D view).
on the functionality of the desired system.

REFERENCES
resulting in an increasing uplift in the general case [26], [27].
[1] Y. Yamazaki, M. Tamaki, C. Premachandra, C. J. Perera, S. Sumathipala,
However, boosting the integrator gain of the PID controller and B. H. Sudantha, ‘‘Victim detection using UAV with on-board voice
under a characteristic height of 20 cm proves to be an even recognition system,’’ in Proc. 3rd IEEE Int. Conf. Robotic Comput. (IRC),
more effective (whilst simple) measurement against said Feb. 2019, pp. 555–559.
problem. During the vertical landing, the horizontal control [2] C. Premachandra, M. Otsuka, R. Gohara, T. Ninomiya, and K. Kato,
‘‘A study on development of a hybrid aerial terrestrial robot system for
stays active until the H-symbol can no longer be detected due avoiding ground obstacles by flight,’’ IEEE/CAA J. Automatica Sinica,
to too low heights. Thereby, pitch and roll maneuvers are no vol. 6, no. 1, pp. 327–336, Jan. 2019.

[3] C. Premachandra, D. Ueda, and K. Kato, "Speed-up automatic quadcopter position detection by sensing propeller rotation," IEEE Sensors J., vol. 19, no. 7, pp. 2758–2766, Apr. 2019.
[4] K. Nakajima, C. Premachandra, and K. Kato, "3D environment mapping and self-position estimation by a small flying robot mounted with a movable ultrasonic range sensor," J. Electr. Syst. Inf. Technol., vol. 4, no. 2, pp. 289–298, Sep. 2017.
[5] C. Premachandra and M. Otsuka, "Development of hybrid aerial/terrestrial robot system and its automation," in Proc. IEEE Int. Syst. Eng. Symp., Oct. 2017, pp. 1–3.
[6] C. Premachandra, S. Takagi, and K. Kato, "Flying control of small-type helicopter by detecting its in-air natural features," J. Electr. Syst. Inf. Technol., vol. 2, no. 1, pp. 58–74, May 2015.
[7] Y. Yamazaki, C. Premachandra, and C. J. Perea, "Audio-processing-based human detection at disaster sites with unmanned aerial vehicle," IEEE Access, vol. 8, pp. 101398–101405, Jun. 2020.
[8] C. Premachandra, D. N. H. Thanh, T. Kimura, and H. Kawanaka, "A study on hovering control of small aerial robot by sensing existing floor features," IEEE/CAA J. Automatica Sinica, vol. 7, no. 4, pp. 1016–1025, Jul. 2020.
[9] H. Beck, J. Lesueur, G. Charland-Arcand, O. Akhrif, S. Gagne, F. Gagnon, and D. Couillard, "Autonomous takeoff and landing of a quadcopter," in Proc. Int. Conf. Unmanned Aircr. Syst. (ICUAS), Jun. 2016, pp. 475–484.
[10] N. Xuan-Mung, S. K. Hong, N. P. Nguyen, L. N. N. T. Ha, and T.-L. Le, "Autonomous quadcopter precision landing onto a heaving platform: New method and experiment," IEEE Access, vol. 8, pp. 167192–167202, Sep. 2020.
[11] K. T. Putra, R. O. Wiyagi, and M. Y. Mustar, "Precision landing system on H-octocopter drone using complementary filter," in Proc. Int. Conf. Audio, Lang. Image Process. (ICALIP), Jul. 2018, pp. 283–287.
[12] N. Q. Truong, P. H. Nguyen, S. H. Nam, and K. R. Park, "Deep learning-based super-resolution reconstruction and marker detection for drone landing," IEEE Access, vol. 7, pp. 61639–61655, May 2019.
[13] W. Dong, G.-Y. Gu, and X. D. H. Zhu, "Modeling and control of a quadrotor UAV with aerodynamic concepts," in World Academy of Science, Engineering and Technology, 2013, pp. 901–906.
[14] A. E. V. Moreno, "Machine learning techniques to estimate the dynamics of a slung load multirotor UAV system," Univ. Glasgow, Glasgow, U.K., Tech. Rep., 2017.
[15] A. Reizenstein, "Position and trajectory control of a quadrocopter using PID and LQ controllers," Linköping Univ., Linköping, Sweden, Tech. Rep., 2017.
[16] Toshiba. (2019). TC74HC157AP Datasheet. [Online]. Available: https://www.alldatasheet.com/datasheet-pdf/pdf/214509/TOSHIBA/TC74HC157AP_07.html
[17] C. Bohn, "Regelungstechnik (control theory) 1—Lecture script," TU Clausthal, Clausthal-Zellerfeld, Germany, Tech. Rep., 2017.
[18] C. Bohn, "Parameteridentifikation DC-Motor und Regelung—Laboratory script," TU Clausthal, Clausthal-Zellerfeld, Germany, Tech. Rep., 2018.
[19] S. Jung and R. C. Dorf, "Analytic PIDA controller design technique for a third order system," in Proc. 35th IEEE Conf. Decis. Control, vol. 3, Dec. 1996, pp. 2513–2518.
[20] X. Li, J. Park, and H. Shin, "Comparison and evaluation of anti-windup PI controllers," Gyeongsang Nat. Univ., Gyeongsangnam-do, South Korea, Tech. Rep., 2010.
[21] S. W. Smith, The Scientist and Engineer's Guide to Digital Signal Processing. California Technical Pub., 1997.
[22] C. Bohn and H. Unbehauen, Identifikation Dynamischer Systeme. Wiesbaden, Germany: Springer Vieweg-Verlag, 2016.
[23] I. Kugelberg, "Black-box modeling and attitude control of a quadrocopter," Linköping Univ., Linköping, Sweden, Tech. Rep., 2016.
[24] B. Jaehne, Digitale Bildverarbeitung und Bildgewinnung. Berlin, Germany: Springer Vieweg-Verlag, 2012.
[25] S. Suzuki and K. Abe, "Topological structural analysis of digitized binary images by border following," Comput. Vis., Graph., Image Process., vol. 30, no. 1, pp. 32–46, 1985.
[26] S. Aich, C. Ahuja, T. Gupta, and P. Arulmozhivarman, "Analysis of ground effect on multi-rotors," in Proc. Int. Conf. Electron., Commun. Comput. Eng. (ICECCE), Nov. 2014, pp. 236–241.
[27] P. Wei, S. N. Chan, and S. Lee, "Mitigating ground effect on mini quadcopters with model," Univ. California Davis, Davis, CA, USA, Tech. Rep., 2019.

MALIK DEMIRHAN received the B.Eng. degree in mechanical engineering (dual study degree program) from the Ostfalia University of Applied Sciences and Volkswagen AG, Wolfsburg/Wolfenbüttel, Germany, in 2017, and the M.Sc. degree in mechanical engineering from the Clausthal University of Technology, Clausthal-Zellerfeld, Germany, in 2020. From 2019 to 2020, he was a Research Assistant with the Laboratory for Image Processing and Robotics, Shibaura Institute of Technology, Tokyo, Japan. He is currently a Calibration Engineer in DCT transmissions with Volkswagen AG. His research interests include control theory, mechatronic applications, calibration, measurement technology, and image processing.

CHINTHAKA PREMACHANDRA (Member, IEEE) was born in Sri Lanka. He received the B.Sc. and M.Sc. degrees from Mie University, Tsu, Japan, in 2006 and 2008, respectively, and the Ph.D. degree from Nagoya University, Nagoya, Japan, in 2011. From 2012 to 2015, he was an Assistant Professor with the Department of Electrical Engineering, Faculty of Engineering, Tokyo University of Science, Tokyo, Japan. From 2016 to 2017, he was an Assistant Professor with the Department of Electronic Engineering, School of Engineering, Shibaura Institute of Technology, Tokyo. In 2018, he became an Associate Professor with the Department of Electronic Engineering, School of Engineering/Graduate School of Engineering and Science, Shibaura Institute of Technology, where he is currently a Manager of the Image Processing and Robotics Laboratory. His laboratory conducts research in two main fields: image processing and robotics. His research interests include computer vision, pattern recognition, speed-up image processing, camera-based intelligent transportation systems, terrestrial robotic systems, flying robotic systems, and the integration of terrestrial and flying robots.

Dr. Premachandra was a steering committee member of many international conferences. He is a member of IEICE, Japan, SICE, Japan, and SOFT, Japan. He received the FIT Best Paper Award from IEICE in 2009 and the FIT Young Researchers Award from IPSJ, Japan, in 2010. He serves as the Founding Chair of the International Conference on Image Processing and Robotics (ICIPRoB). He has served as an editor for journals.