Development of an Automated Camera-Based Drone Landing System
Corresponding author: Malik Demirhan ([email protected])
This work was supported in part by the Branding Research Fund of Shibaura Institute of Technology.
ABSTRACT Digital image processing serves as a multifunctional tool for measurement and positioning tasks in robotics. The present paper deals with the development of a camera-based positioning system for quadrocopters in order to automate their landing process. For this purpose, a quadrocopter equipped with classical radio-control components is upgraded with suitable hardware, such as a Raspberry Pi 3B+ and a wide-angle camera. Subsequently, black-box system identifications are carried out to obtain the relevant plants of the attitude control performed by the flight controller. Based on these models, a PID controller for the altitude control, including a back-calculation anti-windup, as well as two PD controllers for the horizontal plane are designed using a pole placement method. The controller gains are verified by simulating the respective closed loops. Since the camera functions as a position sensor, an image processing algorithm is implemented to detect a distinctive landing symbol in real time and to convert its image position into compliant feedback errors (pixel-to-physical distance conversion). Ultimately, the developed system allows for the robust detection of the landing spot and a successful landing through a position control operating at 20 Hz.
INDEX TERMS Autonomous drone, quadrocopter, landing, PID, control engineering, system identification,
image processing, wide-angle camera, region of interest.
of fast and robust image processing algorithms to provide the desired position feedback while maintaining the desired loop frequency. Instead of detecting the landing symbol through methods based on a neural network or deep learning (e.g., [12]), we focused on classical image processing algorithms, as these can be highly efficient when designed deliberately. Another major challenge includes the design of an innovative fail-safe switch logic allowing the pilot to abort the autonomous mode and control the drone manually at any time, hence preventing accidents during tests.

Chapter 2 covers the basics of quadrocopter dynamics as well as the upgrade process of the used model. In Chapter 3, the implementation of the quadrocopter's altitude control through a PID controller with a back-calculation anti-windup is presented. The design of two PD controllers for the horizontal plane based on black-box system identifications is then described in Chapter 4. Subsequently, Chapter 5 includes the development and optimization of the image processing algorithm for generating horizontal feedback. Chapter 6 then deals with the implementation and testing of the horizontal control and the overall landing process. Finally, Chapter 7 concludes the paper.

II. QUADROCOPTER DYNAMICS
A quadrocopter is a multiple-input multiple-output system with 6 degrees of freedom and 4 motors/inputs (Fig. 1). Generally, a body-fixed coordinate system (index B) as well as an inertial one (index I) are defined. Converting the body-fixed accelerations into the latter, assuming the quadrocopter is a point mass, is possible by applying the rotation matrix as in [13]–[15]:

$$\begin{bmatrix} m\ddot{x}_I \\ m\ddot{y}_I \\ m\ddot{z}_I \end{bmatrix} = R \begin{bmatrix} 0 \\ 0 \\ \sum F_i \end{bmatrix} + \begin{bmatrix} 0 \\ 0 \\ -mg \end{bmatrix} = \begin{bmatrix} (c\psi\,s\theta\,c\phi + s\psi\,s\phi)\sum F_i \\ (s\psi\,s\theta\,c\phi - c\psi\,s\phi)\sum F_i \\ (c\theta\,c\phi)\sum F_i - mg \end{bmatrix} \tag{1}$$

$$R(\phi,\theta,\psi) = \begin{bmatrix} c\psi\,c\theta & c\psi\,s\theta\,s\phi - s\psi\,c\phi & c\psi\,s\theta\,c\phi + s\psi\,s\phi \\ s\psi\,c\theta & s\psi\,s\theta\,s\phi + c\psi\,c\phi & s\psi\,s\theta\,c\phi - c\psi\,s\phi \\ -s\theta & c\theta\,s\phi & c\theta\,c\phi \end{bmatrix} \tag{2}$$

Fi describes the uplift force of each rotor, and φ, θ and ψ the roll, pitch and yaw Euler angles (s = sin, c = cos).

A. MICROCONTROLLER AND SIGNAL GENERATION
A quadrocopter based on RC components is usually maneuvered through the roll, pitch, yaw and throttle flight commands coming from the transmitter operated by the pilot. At first, our system only consisted of a FUTABA T6L Sport transmitter, a FUTABA R3106GF receiver with six channels (using the PWM protocol) and an SP Racing F3 flight controller. In order to replace the pilot and enable autonomous control, we need a microcontroller to send the necessary commands to the flight controller. Since the desired position control is camera-based, CPU-intensive image processing must be considered when choosing the hardware. Hence, a Raspberry Pi Model 3B+ (RPi) is used, which offers high performance in addition to fast development through available libraries such as OpenCV. Fig. 2 illustrates the implemented signal routing and switch logic for achieving autonomous signal generation. By using a 4 × 2:1 multiplexer (TC74HC157A [16]), the drone can be controlled either manually by the pilot/receiver or by the RPi. For this purpose, the RPi reads in the channel 6 receiver signal, which is linked to a shaft encoder on the transmitter. Depending on this PWM signal (ranging between 1–2 ms) and a defined threshold, the RPi sends out either a Low or a High signal to the multiplexer. The multiplexer then switches its inputs accordingly, whereby only the flight commands of either the receiver or the RPi are passed to the flight controller. Additionally, a pull-down resistor always provides a defined Low signal in case the RPi has a defect and turns off mid-flight. Hence, we can still control the quadrocopter manually after mid-flight failures of the single-board computer (fail-safe system). Since this setup allows for continuous switching between the two described modes, severe accidents through unwanted behavior of the autonomous mode can be prevented.
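To make the described switch logic concrete, the following is a minimal sketch of how the RPi side of it could look in Python. It assumes the pigpio library with its daemon running, hypothetical GPIO numbers for the channel-6 input and the multiplexer select line, and a 1.5 ms switching threshold; none of these details are taken from the paper, only the logic (measure the channel-6 pulse width and drive the select line High or Low accordingly, defaulting to manual mode).

```python
# Minimal sketch of the mode-switch logic (assumptions: pigpio daemon running,
# channel-6 PWM on GPIO 17, multiplexer select pin on GPIO 27, 1.5 ms
# threshold; all pin numbers and the threshold are hypothetical).
import time
import pigpio

CH6_GPIO = 17          # receiver channel 6 (PWM input to the RPi)
MUX_SELECT_GPIO = 27   # multiplexer select line (RPi output)
THRESHOLD_US = 1500    # pulse widths above this enable the autonomous mode

pi = pigpio.pi()
pi.set_mode(CH6_GPIO, pigpio.INPUT)
pi.set_mode(MUX_SELECT_GPIO, pigpio.OUTPUT)
pi.write(MUX_SELECT_GPIO, 0)          # default: manual mode via the receiver

_rising_tick = None

def _edge(gpio, level, tick):
    """Measure the channel-6 pulse width and drive the multiplexer select line."""
    global _rising_tick
    if level == 1:                                   # rising edge: store timestamp
        _rising_tick = tick
    elif level == 0 and _rising_tick is not None:    # falling edge: evaluate width
        width_us = pigpio.tickDiff(_rising_tick, tick)
        # High passes the RPi flight commands to the flight controller,
        # Low passes the receiver's commands (manual mode).
        pi.write(MUX_SELECT_GPIO, 1 if width_us > THRESHOLD_US else 0)

cb = pi.callback(CH6_GPIO, pigpio.EITHER_EDGE, _edge)

try:
    while True:
        time.sleep(1)                 # callbacks run in a background thread
except KeyboardInterrupt:
    cb.cancel()
    pi.stop()
```

The pull-down resistor described above covers the case where this program, or the RPi itself, stops running: the select line is then held Low by the resistor, so the flight controller keeps receiving the manual commands from the receiver.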
FIGURE 6. Acceleration characteristics of the quadrocopter.

To compensate the current roll and pitch angles of the

III. ALTITUDE CONTROL
Next, the height control operating at 20 Hz will be implemented by designing a PID controller (parallel form):

$$G_{\mathrm{PID}}(s) = \frac{K_D s^2 + K_P s + K_I}{s} \tag{3}$$

The integral part of the controller will help us in compensating the loss of uplift force due to the continuous discharge of the battery mid-air. Using the black-box measurement

$$\frac{\dfrac{\omega_{0,w}^2}{T_w}}{s^3 + \left(2D_w\omega_{0,w} + \dfrac{1}{T_w}\right)s^2 + \left(\dfrac{2D_w\omega_{0,w}}{T_w} + \omega_{0,w}^2\right)s + \dfrac{\omega_{0,w}^2}{T_w}} = \frac{K_{S,T}K_I}{s^3 + K_{S,T}K_D s^2 + K_{S,T}K_P s + K_{S,T}K_I} \tag{5}$$

At first, we calculated the controller gains for a closed-loop simulation in MATLAB Simulink by choosing a desired damping of Dw = 0.85, a frequency of ω0,w = 1.95 rad/s and a time constant of Tw = 1 s, which yields KP = 2.87, KD = 1.74 and KI = 1.53 (Fig. 7).
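To illustrate the coefficient comparison behind (5) and the anti-windup mentioned above, the following Python sketch computes the PID gains from the chosen Dw, ω0,w and Tw and implements one discrete update step of the parallel PID with back-calculation anti-windup at 20 Hz. Comparing the two denominators term by term gives KD = (2Dwω0,w + 1/Tw)/KS,T, KP = (ω0,w² + 2Dwω0,w/Tw)/KS,T and KI = ω0,w²/(TwKS,T), assuming the altitude plant has the form KS,T/s², which is consistent with the closed-loop denominator in (5). The numerical value of KS,T below, the output limits and the back-calculation gain Kb are placeholders rather than values from the paper (KS,T = 2.48 merely reproduces the printed gains).

```python
# Sketch: pole-placement gains for the altitude PID (coefficient comparison in (5))
# and a discrete parallel PID with back-calculation anti-windup at 20 Hz.
# K_ST, the output limits and the back-calculation gain K_b are placeholders.

# --- controller gains by comparing the two denominators in (5) ---
D_w, w0, T_w = 0.85, 1.95, 1.0   # desired damping, frequency [rad/s], time constant [s]
K_ST = 2.48                      # assumed gain of a plant K_ST / s^2 (placeholder)

K_D = (2 * D_w * w0 + 1 / T_w) / K_ST
K_P = (w0 ** 2 + 2 * D_w * w0 / T_w) / K_ST
K_I = (w0 ** 2 / T_w) / K_ST
print(round(K_P, 2), round(K_D, 2), round(K_I, 2))   # -> 2.87 1.74 1.53

# --- parallel PID with back-calculation anti-windup (one update per 50 ms) ---
class PidBackCalculation:
    def __init__(self, kp, ki, kd, kb, dt, u_min, u_max):
        self.kp, self.ki, self.kd, self.kb = kp, ki, kd, kb
        self.dt, self.u_min, self.u_max = dt, u_min, u_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        u_unsat = self.kp * error + self.ki * self.integral + self.kd * derivative
        u = min(max(u_unsat, self.u_min), self.u_max)   # actuator saturation
        # back-calculation: the saturation excess is fed back into the integrator,
        # so the integral state stops growing while the output is limited
        self.integral += (error + self.kb * (u - u_unsat)) * self.dt
        return u

height_pid = PidBackCalculation(K_P, K_I, K_D, kb=1.0, dt=1 / 20,
                                u_min=-1.0, u_max=1.0)
```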
$$\begin{bmatrix} \ddot{x}_I \\ \ddot{y}_I \end{bmatrix} \approx \begin{bmatrix} \theta g \\ -\phi g \end{bmatrix} \tag{8}$$
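Equation (8) is the small-angle relation that links the commanded pitch and roll angles to the horizontal accelerations. Reading it together with (1) for ψ = 0 and a thrust that just cancels gravity (our interpretation of the omitted derivation, which is not spelled out in this excerpt), a quick numerical check of the approximation looks as follows.

```python
# Numerical check of the small-angle relation (8) against the exact point-mass
# model (1), assuming yaw = 0 and hover thrust sum(F_i)/m = g/(c(theta)*c(phi)).
import math

g = 9.81
for deg in (2, 5, 10):
    phi = theta = math.radians(deg)
    thrust_per_mass = g / (math.cos(theta) * math.cos(phi))  # keeps z-acceleration at zero
    x_acc_exact = math.sin(theta) * math.cos(phi) * thrust_per_mass
    y_acc_exact = -math.sin(phi) * thrust_per_mass
    x_acc_lin, y_acc_lin = g * theta, -g * phi               # equation (8)
    print(f"{deg:2d} deg:  x {x_acc_exact:6.3f} ~ {x_acc_lin:6.3f}   "
          f"y {y_acc_exact:6.3f} ~ {y_acc_lin:6.3f}")
```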
resulting in an increasing uplift in the general case [26], [27]. However, boosting the integrator gain of the PID controller below a characteristic height of 20 cm proves to be an even more effective (whilst simple) measure against said problem. During the vertical landing, the horizontal control stays active until the H-symbol can no longer be detected due to too low heights. Thereby, pitch and roll maneuvers are no

FIGURE 25. Flowchart of the overall landing process test.
FIGURE 26. Step responses in the overall landing process (separate axes).
FIGURE 27. Step responses in the overall landing process (3D view).

VII. CONCLUSION
In the present paper, an automated camera-based landing system for quadrocopters is proposed. It relies on a commercial flight controller handling the attitude control (inner loop), while the 3-axis position control (outer loop operating at 20 Hz) is taken care of by a Raspberry Pi 3B+. To this end, we upgraded a standard radio-control model with additional hardware, such as a wide-angle camera for horizontal feedback (modified Raspberry Pi Cam v1.3), a servo gimbal, a ToF height sensor and a multiplexer. The latter serves as a switch, allowing the pilot to either fly the quadrocopter manually through the remote control or activate the developed autonomous mode, in which the Raspberry Pi generates the flight commands. A PID controller including a back-calculation anti-windup is designed for the height control, whereas two PD controllers are implemented for the horizontal plane. Beforehand, several black-box system identifications were conducted regarding the flight controller's attitude control in order to deduce the relevant system plants and simulations for each controller design. The image processing algorithm generates horizontal feedback by detecting a distinct landing symbol and converting its image coordinates into compliant physical distances; implementing a region of interest speeds up the computation. Overall, the resulting step responses as well as the performance in the overall landing process make up a functional autonomous system.
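As a rough illustration of this feedback path (not the authors' exact algorithm), the sketch below searches for the landing symbol inside a region of interest with OpenCV, takes the centroid of the largest dark contour as the symbol position, and converts the pixel offset from the image center into a metric offset using the measured height and an assumed horizontal field of view. The field of view, ROI size and thresholding method are placeholders, and an OpenCV 4.x return signature is assumed for findContours.

```python
# Illustrative sketch of the camera-based horizontal feedback: detect the landing
# symbol inside a region of interest and convert its pixel offset into a metric
# offset using the measured height. The field of view, ROI size and thresholding
# are placeholder assumptions; OpenCV 4.x is assumed for findContours.
import math
import cv2

FOV_DEG = 120.0   # assumed horizontal field of view of the wide-angle lens

def horizontal_error(frame_gray, height_m, roi_center, roi_half=80):
    """Return (dx, dy) in meters from the image center to the detected symbol."""
    h, w = frame_gray.shape
    cx, cy = roi_center
    x0, x1 = max(cx - roi_half, 0), min(cx + roi_half, w)
    y0, y1 = max(cy - roi_half, 0), min(cy + roi_half, h)
    roi = frame_gray[y0:y1, x0:x1]                    # restrict the search area

    # binarize and look for the largest dark blob (stand-in for the H-symbol)
    _, mask = cv2.threshold(roi, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None                                   # symbol not found in the ROI

    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    u = x0 + m["m10"] / m["m00"]                      # symbol center in full-image pixels
    v = y0 + m["m01"] / m["m00"]

    # pixel-to-physical conversion: meters per pixel from height and field of view
    meters_per_px = 2 * height_m * math.tan(math.radians(FOV_DEG) / 2) / w
    return (u - w / 2) * meters_per_px, (v - h / 2) * meters_per_px
```

In practice, the ROI would typically be re-centered on the previous detection so that only a small window has to be processed in each 20 Hz cycle, which is the speed-up effect mentioned above.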
In the future, we plan on replacing the 2D camera with a depth camera, allowing us to obtain information about the evenness of the ground. Thereby, it would be possible to locate flat surfaces to land on without depending on a landing symbol/pad. This can be especially helpful for rescue missions after natural disasters such as earthquakes. Nevertheless, depth cameras require high computational power and suitable algorithms with respect to a sufficiently fast position control. Choosing the right hardware (camera and single-board computer) will therefore have a huge effect on the functionality of the desired system.
REFERENCES
[1] Y. Yamazaki, M. Tamaki, C. Premachandra, C. J. Perera, S. Sumathipala, and B. H. Sudantha, "Victim detection using UAV with on-board voice recognition system," in Proc. 3rd IEEE Int. Conf. Robotic Comput. (IRC), Feb. 2019, pp. 555–559.
[2] C. Premachandra, M. Otsuka, R. Gohara, T. Ninomiya, and K. Kato, "A study on development of a hybrid aerial terrestrial robot system for avoiding ground obstacles by flight," IEEE/CAA J. Automatica Sinica, vol. 6, no. 1, pp. 327–336, Jan. 2019.
[3] C. Premachandra, D. Ueda, and K. Kato, "Speed-up automatic quadcopter position detection by sensing propeller rotation," IEEE Sensors J., vol. 19, no. 7, pp. 2758–2766, Apr. 2019.
[4] K. Nakajima, C. Premachandra, and K. Kato, "3D environment mapping and self-position estimation by a small flying robot mounted with a movable ultrasonic range sensor," J. Electr. Syst. Inf. Technol., vol. 4, no. 2, pp. 289–298, Sep. 2017.
[5] C. Premachandra and M. Otsuka, "Development of hybrid aerial/terrestrial robot system and its automation," in Proc. IEEE Int. Syst. Eng. Symp., Oct. 2017, pp. 1–3.
[6] C. Premachandra, S. Takagi, and K. Kato, "Flying control of small-type helicopter by detecting its in-air natural features," J. Electr. Syst. Inf. Technol., vol. 2, no. 1, pp. 58–74, May 2015.
[7] Y. Yamazaki, C. Premachandra, and C. J. Perea, "Audio-processing-based human detection at disaster sites with unmanned aerial vehicle," IEEE Access, vol. 8, pp. 101398–101405, Jun. 2020.
[8] C. Premachandra, D. N. H. Thanh, T. Kimura, and H. Kawanaka, "A study on hovering control of small aerial robot by sensing existing floor features," IEEE/CAA J. Automatica Sinica, vol. 7, no. 4, pp. 1016–1025, Jul. 2020.
[9] H. Beck, J. Lesueur, G. Charland-Arcand, O. Akhrif, S. Gagne, F. Gagnon, and D. Couillard, "Autonomous takeoff and landing of a quadcopter," in Proc. Int. Conf. Unmanned Aircr. Syst. (ICUAS), Jun. 2016, pp. 475–484.
[10] N. Xuan-Mung, S. K. Hong, N. P. Nguyen, L. N. N. T. Ha, and T.-L. Le, "Autonomous quadcopter precision landing onto a heaving platform: New method and experiment," IEEE Access, vol. 8, pp. 167192–167202, Sep. 2020.
[11] K. T. Putra, R. O. Wiyagi, and M. Y. Mustar, "Precision landing system on H-octocopter drone using complementary filter," in Proc. Int. Conf. Audio, Lang. Image Process. (ICALIP), Jul. 2018, pp. 283–287.
[12] N. Q. Truong, P. H. Nguyen, S. H. Nam, and K. R. Park, "Deep learning-based super-resolution reconstruction and marker detection for drone landing," IEEE Access, vol. 7, pp. 61639–61655, May 2019.
[13] W. Dong, G.-Y. Gu, and X. D. H. Zhu, "Modeling and control of a quadrotor UAV with aerodynamic concepts," in World Academy of Science, Engineering and Technology, 2013, pp. 901–906.
[14] A. E. V. Moreno, "Machine learning techniques to estimate the dynamics of a slung load multirotor UAV system," Univ. Glasgow, Glasgow, U.K., Tech. Rep., 2017.
[15] A. Reizenstein, "Position and trajectory control of a quadrocopter using PID and LQ controllers," Linköping Univ., Linköping, Sweden, Tech. Rep., 2017.
[16] Toshiba. (2019). TC74HC157AP Datasheet. [Online]. Available: https://www.alldatasheet.com/datasheet-pdf/pdf/214509/TOSHIBA/TC74HC157AP_07.html
[17] C. Bohn, "Regelungstechnik (control theory) 1—Lecture script," TU Clausthal, Clausthal-Zellerfeld, Germany, Tech. Rep., 2017.
[18] C. Bohn, "Parameteridentifikation DC-Motor und Regelung—Laboratory script," TU Clausthal, Clausthal-Zellerfeld, Germany, Tech. Rep., 2018.
[19] S. Jung and R. C. Dorf, "Analytic PIDA controller design technique for a third order system," in Proc. 35th IEEE Conf. Decis. Control, vol. 3, Dec. 1996, pp. 2513–2518.
[20] X. Li, J. Park, and H. Shin, "Comparison and evaluation of anti-windup PI controllers," Gyeongsang Nat. Univ., Gyeongsangnam-do, South Korea, Tech. Rep., 2010.
[21] S. W. Smith, The Scientist and Engineer's Guide to Digital Signal Processing. California Technical Pub., 1997.
[22] C. Bohn and H. Unbehauen, Identifikation Dynamischer Systeme. Wiesbaden, Germany: Springer Vieweg-Verlag, 2016.
[23] I. Kugelberg, "Black-box modeling and attitude control of a quadrocopter," Linköping Univ., Linköping, Sweden, Tech. Rep., 2016.
[24] B. Jaehne, Digitale Bildverarbeitung und Bildgewinnung. Berlin, Germany: Springer Vieweg-Verlag, 2012.
[25] S. Suzuki and K. Abe, "Topological structural analysis of digitized binary images by border following," Comput. Vis., Graph., Image Process., vol. 30, no. 1, pp. 32–46, 1985.
[26] S. Aich, C. Ahuja, T. Gupta, and P. Arulmozhivarman, "Analysis of ground effect on multi-rotors," in Proc. Int. Conf. Electron., Commun. Comput. Eng. (ICECCE), Nov. 2014, pp. 236–241.
[27] P. Wei, S. N. Chan, and S. Lee, "Mitigating ground effect on mini quadcopters with model," Univ. California Davis, Davis, CA, USA, Tech. Rep., 2019.

MALIK DEMIRHAN received the B.Eng. degree in mechanical engineering (dual study degree program) from the Ostfalia University of Applied Sciences and Volkswagen AG, Wolfsburg/Wolfenbüttel, Germany, in 2017, and the M.Sc. degree in mechanical engineering from the Clausthal University of Technology, Clausthal-Zellerfeld, Germany, in 2020. From 2019 to 2020, he was a Research Assistant with the Laboratory for Image Processing and Robotics, Shibaura Institute of Technology, Tokyo, Japan. He is currently a Calibration Engineer for DCT transmissions with Volkswagen AG. His research interests include control theory, mechatronic applications, calibration, measurement technology, and image processing.

CHINTHAKA PREMACHANDRA (Member, IEEE) was born in Sri Lanka. He received the B.Sc. and M.Sc. degrees from Mie University, Tsu, Japan, in 2006 and 2008, respectively, and the Ph.D. degree from Nagoya University, Nagoya, Japan, in 2011. From 2012 to 2015, he was an Assistant Professor with the Department of Electrical Engineering, Faculty of Engineering, Tokyo University of Science, Tokyo, Japan. From 2016 to 2017, he was an Assistant Professor with the Department of Electronic Engineering, School of Engineering, Shibaura Institute of Technology, Tokyo. In 2018, he became an Associate Professor with the Department of Electronic Engineering, School of Engineering/Graduate School of Engineering and Science, Shibaura Institute of Technology, where he is currently the Manager of the Image Processing and Robotics Laboratory. His laboratory conducts research in two main fields: image processing and robotics. His research interests include computer vision, pattern recognition, speed-up image processing, camera-based intelligent transportation systems, terrestrial robotic systems, flying robotic systems, and the integration of terrestrial and flying robots. Dr. Premachandra was a steering committee member of many international conferences. He is a member of IEICE, Japan; SICE, Japan; and SOFT, Japan. He received the FIT Best Paper Award from IEICE in 2009 and the FIT Young Researchers Award from IPSJ, Japan, in 2010. He serves as the Founding Chair of the International Conference on Image Processing and Robotics (ICIPRoB). He has also served as an Editor for journals.