(A9) Outdoor Autonomous Landing On A Moving Platform For Quadrotors Using An Omnidirectional Camera PDF
Abstract — This paper proposes a vision-based target following and landing system for a quadrotor vehicle on a moving platform. The system consists of a vision-based landing site detection and locating algorithm using an omnidirectional lens. A recent smartphone was attached to the UAV and served as an on-board image acquisition and processing unit. Measurements from the omnidirectional camera are combined with a proper dynamic model in order to estimate the position and velocity of the moving platform. An adaptive control scheme was implemented on the flight computer to deal with unknown disturbances in the outdoor environment. The system was validated on a quadrotor UAV, and the vehicle successfully landed on the moving platform in outdoor flight tests.

I. INTRODUCTION

In recent years, Unmanned Aerial Vehicles (UAVs) have become important not only in the military, but also in civilian practice. There are many examples where UAVs are successfully applied to help human beings. [1] describes various civilian applications for UAVs, including search and rescue, border surveillance, communication relay, wildfire suppression, disaster and emergency handling, research, and agricultural and industrial applications. The most recent and noticeable contribution was the exploration of the damaged nuclear reactors in Fukushima in March 2011 by flying UAVs into the air and monitoring the dangerous parts of the reactors.

The GPS receiver is the most popular sensor for navigation of UAVs. A stand-alone GPS receiver, however, cannot provide the precise position information required for the landing sequence of some miniature UAVs. Thus, the vehicle can hardly land on the landing site precisely even if it has the coordinates of the site in the earth frame. Differential GPS (DGPS) can provide very accurate position information at the centimeter level, but the receiver is too bulky to attach to miniature UAVs.

Vision sensors serve as the eyes of unmanned vehicles assigned to reconnaissance missions. Not only can the cameras shoot pictures of the mission area, but they are also able to support vehicle autonomy using visual information. One of the most representative usages of visual information in the UAV industry is indication of the landing site with painted patterns. In [2], a simulation of the landing logic for a quadrotor UAV is provided.

There were also several trials to guide Vertical Take-Off and Landing (VTOL) UAVs to a moving target through flight tests. [3] describes a landing algorithm for helicopters outdoors; the experimental results in that paper were acquired by manual flights. In [4], a visual servoing scheme using optical flow of the landing pad helped a multi-rotor UAV land on a vertically moving pad indoors. [5-6] introduce the full landing sequence of a quadrotor UAV indoors; the authors used a visual marker for the moving landing site.

Landing control of a quadrotor UAV outdoors, however, is difficult due to the absence of the precise position and velocity measurements from an external motion capture system. Moreover, wind and gusts disturb the landing sequence of the quadrotor.

The objective of this paper is to develop a system capable of following and landing on predefined targets with quadrotor platforms. The main sensor to be used is the bottom-facing camera, in order to detect the target and calculate the relative position with respect to it. Image processing algorithms are required to be light enough – and the target itself is required to be simple – to maneuver the drone with proper performance. The whole procedure, including processing of the image and control of the drone, should robustly deal with various situations, such as following a moving target or landing on a moving platform.

J. Kim, Y. Jung and D. Lee are with the Department of Aerospace Engineering, KAIST, Daejeon, South Korea (phone: +82-42-350-3764; fax: +82-42-350-3710; e-mail: [email protected]).
D.H. Shim is with the Department of Aerospace Engineering, KAIST, Daejeon, South Korea (e-mail: [email protected]).
978-1-4799-2376-2/14/$31.00 ©2014 IEEE
II. TARGET DETECTION

In this study, we consider a red target for vision-based tracking using a quadrotor. The method applied for finding the red target in an image frame in this part uses the same target detection algorithm as in [8].

A color filter is an algorithm that collects red pixels from the original RGB image and reforms the image into a binary format. Let us assume that the acquired image has an 8-bit, 3-channel RGB format. Algorithm 1 filters out the pixels where one major color is dominant over the other two colors.

Algorithm 1.
Input: I1, I2, I3, where each is a matrix of an 8-bit, 1-channel image frame.
Output: A filtered binary image frame matrix, B.

getBinaryImageofColor(I1, I2, I3)
  For every pixel value p(i, j) of the i-th row and j-th column in the matrix:
    If I1(i, j) > K · I2(i, j) and I1(i, j) > C · I3(i, j) then
      B(i, j) ← 1
    else
      B(i, j) ← 0
    endIf
  endFor
  return B

Algorithm 1. Pseudocode for producing a 1-channel binary image from a 3-channel image with respect to a specific color.

Algorithm 2 is the whole procedure of the color filter, which is based on Algorithm 1. First, run Algorithm 1 three times, one repetition for each of the three colors. The resultant three images are subjected to binary operations that find the "reddish" pixels in the RGB image and produce the final binary image, B_R.

Algorithm 2.
Input: I_R, I_G, I_B, where I_R, I_G and I_B are split matrices of 8-bit, 1-channel image frames of the red, green and blue colors from the 3-channel RGB image.
Output: A filtered binary image frame matrix, B_R.

colorFilter(I_R, I_G, I_B)
  B_1 ← getBinaryImageofColor(I_R, I_G, I_B)
  B_2 ← getBinaryImageofColor(I_G, I_R, I_B)
  B_3 ← getBinaryImageofColor(I_B, I_R, I_G)
  For every pixel value p(i, j):
    If 0.6 · I_R(i, j) > I_G(i, j) and 0.6 · I_R(i, j) > I_B(i, j) then
      M(i, j) ← 1
    else
      M(i, j) ← 0
    endIf
    B_R(i, j) ← M(i, j) ∧ B_1(i, j) ∧ ¬B_2(i, j) ∧ ¬B_3(i, j)
  endFor
  return B_R

Algorithm 2. Pseudocode for filtering out the most "reddish" pixels in an input RGB image frame.

In the next step, contours in B_R are retrieved using the method proposed in [9]. The biggest contour is finally regarded as our target.

Fig. 1. The detected target in an image frame. The detection algorithm indicates the target with a yellow contour.

Figure 1 depicts the target that is detected in an image frame collected from a flight test.
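As a rough illustration, the two-stage color filter above can be sketched in NumPy. The 0.6 factor comes from the pseudocode; the dominance thresholds `k` and `c` and all function names are our own assumptions, not values from the paper:

```python
import numpy as np

def get_binary_image_of_color(i1, i2, i3, k=1.5, c=1.5):
    """Algorithm 1 sketch: mark pixels where channel i1 dominates i2 and i3.
    k and c are illustrative dominance thresholds (not given in the paper)."""
    i1 = i1.astype(np.int32)
    return ((i1 > k * i2) & (i1 > c * i3)).astype(np.uint8)

def color_filter(img_rgb):
    """Algorithm 2 sketch: binary mask of the most 'reddish' pixels."""
    r = img_rgb[:, :, 0].astype(np.int32)
    g = img_rgb[:, :, 1].astype(np.int32)
    b = img_rgb[:, :, 2].astype(np.int32)
    b1 = get_binary_image_of_color(r, g, b)  # red-dominant pixels
    b2 = get_binary_image_of_color(g, r, b)  # green-dominant pixels
    b3 = get_binary_image_of_color(b, r, g)  # blue-dominant pixels
    # Strong-red per-pixel test using the 0.6 factor from the pseudocode.
    m = ((0.6 * r > g) & (0.6 * r > b)).astype(np.uint8)
    # Combine the masks with binary operations to keep only 'reddish' pixels.
    return m & b1 & (1 - b2) & (1 - b3)

# A 2x2 test image: one strong red pixel, one green, one gray, one muddy pixel.
img = np.array([[[200, 30, 20], [10, 200, 10]],
                [[120, 120, 120], [90, 80, 85]]], dtype=np.uint8)
mask = color_filter(img)  # only the strong red pixel survives
```

In a full pipeline, the biggest connected contour of this mask would then be extracted (e.g. with a contour-retrieval routine) and taken as the target.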
Fig. 3. Geometry between the UAV and the target in the navigation frame.

Fig. 4. Horizontal geometry between the UAV and the target in the navigation frame.
A. Dynamic model

We assume that our target is moving on the earth while maintaining a constant velocity. Therefore, in the continuous time domain we can say:

  dv_T/dt = 0                                  (10)

where v_T is the velocity of the target in the navigation frame. The above equation can also be expressed in the discrete time domain:

  v_{T,k} = v_{T,k-1}                          (11)

where v_{T,k} denotes the velocity of the target at a specific moment k. Considering that our geolocation algorithm provides the relative position between the target and the camera (or UAV), we need to rewrite our state equation accordingly. By substituting v_{T,k} = v_{C,k} + v_{CT,k} into the equation, we get:

  v_{CT,k} = v_{CT,k-1} - Δv_C                 (12)

where v_{C,k} is the velocity of the camera at the specific moment k and Δv_C indicates the velocity difference between time k and k-1.

B. Nonlinear Discrete Kalman Filter Formulation

Now we can formulate the discretized version of the nonlinear Kalman filter:

  x_k = f(x_{k-1}, u_k)
  y_k = h(x_k, H_k)                            (13)

where x is the state vector of our filter, u is the input vector of the state prediction equation and y is the measurement vector. Let us define x as:

  x = [x_CT^T  v_CT^T  η_C^T]^T = [x  y  v_x  v_y  φ_C  θ_C  ψ_C]^T     (14)

where x_CT = [x  y]^T and v_CT = [v_x  v_y]^T are the horizontal position difference between the target and the UAV and its derivative. η_C = [φ_C  θ_C  ψ_C]^T is the Euler angle vector of the camera frame with respect to the navigation frame. Next, the input of the state propagation equation is defined:

  u = [0_{2×1}^T  -Δv_C^T  η_C^T]^T            (15)

where Δv_C is the horizontal velocity difference between time k and k-1.

According to Equation (12), the state prediction equation is finalized as below:

  x_k = f(x_{k-1}, u_k) = A x_{k-1} + u_k      (16)

where

      [ I_{2×2}   Δt·I_{2×2}   0_{2×3} ]
  A = [ 0_{2×2}   I_{2×2}      0_{2×3} ]
      [ 0_{3×2}   0_{3×2}      0_{3×3} ]

Now let us define the measurement vector as:

  y_k = h(x_k, H_k)                            (17)

Considering that the state update equation needs to update the horizontal position of the target in the NED frame, x_CT = [x  y]^T, using the fish-eye lens model and the measurement vector y = [u  v]^T, the nonlinear function h(x_k, H_k) can be simplified as:

  y_k = h(x_{CT,k}, H_k)                       (18)
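The prediction step of Equation (16) can be sketched as follows. The seven-state layout [x, y, v_x, v_y, φ, θ, ψ] and the structure of A and u follow the text; the numerical values are purely illustrative:

```python
import numpy as np

def predict(x_prev, dvc, eta_c, dt):
    """State prediction of Eq. (16): x_k = A @ x_{k-1} + u_k.
    x_prev: 7-state vector [x, y, vx, vy, phi, theta, psi].
    dvc:    camera horizontal velocity change between k-1 and k (2-vector).
    eta_c:  current camera Euler angles w.r.t. the navigation frame (3-vector)."""
    A = np.zeros((7, 7))
    A[0:2, 0:2] = np.eye(2)       # relative position keeps its previous value...
    A[0:2, 2:4] = dt * np.eye(2)  # ...plus dt times the relative velocity
    A[2:4, 2:4] = np.eye(2)       # constant-velocity model, Eqs. (11)-(12)
    # The Euler-angle block of A is zero: attitude enters purely through u_k.
    u = np.concatenate([np.zeros(2), -np.asarray(dvc), np.asarray(eta_c)])
    return A @ x_prev + u

# Example propagation over one 0.1 s step with illustrative numbers.
x_prev = np.array([1.0, 2.0, 0.5, -0.5, 0.0, 0.0, 0.0])
x_pred = predict(x_prev, dvc=[0.1, 0.0], eta_c=[0.01, 0.02, 1.0], dt=0.1)
```

Note that because the lower-right block of A is zero, the filter does not propagate the camera attitude; it is simply overwritten each step by the measured Euler angles carried in u_k.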
From Equation (5),

  [u  v  f(u, v)]^T = s R_CN x_CT,  with  f(u, v) = Σ_i a_i ρ^i

The formula for the four roots of a quartic equation is well-known. Therefore, we can evaluate the four roots ρ_i (0 ≤ i ≤ 3, i ∈ ℤ). Among them, we need to
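The four quartic roots can equivalently be evaluated numerically. Below is a sketch with illustrative polynomial coefficients; in practice the coefficients would come from the omnidirectional calibration polynomial of [7]:

```python
import numpy as np

# Illustrative quartic a4*rho^4 + a3*rho^3 + a2*rho^2 + a1*rho + a0 = 0.
# The real coefficients would come from the fish-eye calibration model of [7].
coeffs = [1.0, 0.0, -5.0, 0.0, 4.0]   # rho^4 - 5*rho^2 + 4 = 0
roots = np.roots(coeffs)              # the four roots: +-1 and +-2
# Keep only real, non-negative candidates for the image radius rho.
candidates = sorted(r.real for r in roots if abs(r.imag) < 1e-9 and r.real >= 0)
```

Among the numerical roots, only real non-negative values are admissible as an image radius, which mirrors the root-selection step described in the text.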
contain trials of landing on a moving pad. This chapter describes the tests and their results.

A. Quadrotor Testbed

We organized a quadrotor UAV system using a customized avionics box. The base quadrotor platform is the DJI NAZA F450. It has an attitude stabilization board inside to feed back angular rates. The FCC (Flight Control Computer) provides PWM (Pulse Width Modulation) commands for control surfaces such as the aileron, elevator and rudder to the stabilization board. The FCC of the UAV is a Gumstix Verdex Pro composed of an ARMv5 600 MHz processor with 128 MB DDR RAM. GPS and IMU sensors are equipped and packed in a box in a simple manner. The quadrotor can be manually operated with a controller using a 72 MHz RF signal.

Fig. 5. DJI NAZA F450 equipped with a customized FCC box and a smartphone.

In order to provide a nearly real-time target detection result, a smartphone is equipped on the bottom of the quadrotor. The smartphone takes real-time video pictures using its onboard camera and processes them with its own processor. Results from the video processing algorithm are passed to the FCC to generate guidance commands.

Fig. 6. Illustration of the test environment.

B. Test Environment

Due to the extremely large FOV of the fish-eye lens, the shape of the target becomes unidentifiable above a certain altitude. Hence, the target detection is done only by the color filter algorithm from [8], which finds the biggest red contour in the camera frame.

A rover platform drags the red target and is controlled manually. The quadrotor UAV is commanded to maintain its horizontal position right above the target while descending.

The UAV faces certain interruptions, including wind disturbances and model uncertainty, while it is in flight. Therefore, we use an L1 adaptive augmentation loop based on an output feedback method to improve accurate path tracking. The adaptive scheme supplements the linear controller to guarantee the flight performance in the presence of the uncertainties [9].

C. Flight Test Results

We tried several landing tests on the moving pad in the test environment and got successful results. The rover maintained its movement direction, so the landing pad only moved along a straight line while maintaining nearly constant velocity.
Figure 8 shows the two-dimensional horizontal trajectory of the UAV and the landing pad in the NED frame. The horizontal position difference showed about a half-meter error from 20 s to 45 s, but it eventually converged to zero.

Fig. 8. 2D trajectory of the UAV and the estimated position of the landing pad.

Figure 9 illustrates the vehicle attitude and body velocity plots. The attitude angle graph showed about a 1-degree offset error throughout the whole flight. The body velocity has some amount of high-frequency noise, but it almost converged to the reference.
Fig. 7. Time history of the UAV, the estimated position and velocity of the landing pad.

Fig. 9. Time history of the attitude angles and body velocity with their reference inputs.
Fig. 10. Still Pictures from the Landing Test on a Moving Pad
VI. CONCLUSION

This paper proposed a landing algorithm for a quadrotor UAV. A set of algorithms was developed for detecting the target based on the color appearing in the image. In order to land on the specific visual landing pad outdoors, the fish-eye lens and its calibration model helped mitigate the shrinking-FOV problem while descending above the visual pad. The nonlinear observation model and the constant velocity model in the NED coordinate frame were concatenated to formulate the nonlinear estimation model. The model was properly evaluated and the state vector was estimated using the Unscented Kalman Filter. The geolocation filter and the rotorcraft controller were integrated and implemented into one single system, and the system was validated by several flight tests. In particular, the UAV successfully landed on the moving pad outdoors without the help of an external motion capture system. The resultant flight test graphs showed satisfactory performance.

The proposed target observation and tracking algorithm can be used for following or landing on specific platforms, such as ground carrier vehicles or shipboards with visual markers. Furthermore, the possibility of using smartphones as a viable on-board image acquisition and computation platform, and as a potent computing platform for realistic applications, was demonstrated.

ACKNOWLEDGMENT

The authors gratefully acknowledge the financial support by the Agency for Defense Development funded by the Korean government (No. UE124026JD).

REFERENCES

[1] Z. Sarris, "Survey of UAV applications in civil markets," 9th Mediterranean Conference on Control and Automation, Dubrovnik, Croatia, June 27-29, 2001.
[2] H. Voos, H. Bou-Ammar, "Nonlinear tracking and landing controller for quadrotor aerial robots," 2010 IEEE International Conference on Control Applications, pp. 2136-2141, Sep. 8-10, 2010.
[3] S. Saripalli, G. Sukhatme, "Landing a Helicopter on a Moving Target," 2007 IEEE International Conference on Robotics and Automation, pp. 2030-2035, April 10-14, 2007.
[4] B. Herissé, T. Hamel, R. Mahony, F-X. Russotto, "Landing a VTOL Unmanned Aerial Vehicle on a Moving Platform Using Optical Flow," IEEE Transactions on Robotics, Vol. 28, No. 1, pp. 77-89, Feb. 2012.
[5] D. Lee, T. Ryan, H.J. Kim, "Autonomous landing of a VTOL UAV on a moving platform using image-based visual servoing," 2012 IEEE International Conference on Robotics and Automation (ICRA), pp. 971-976, May 14-18, 2012.
[6] K.E. Wenzel, A. Masselli, A. Zell, "Tracking and Landing of a Miniature UAV on a Moving Carrier Vehicle," Journal of Intelligent and Robotic Systems, Vol. 61, Issue 1-4, pp. 221-238, Jan. 2011.
[7] D. Scaramuzza, A. Martinelli, R. Siegwart, "A Flexible Technique for Accurate Omnidirectional Camera Calibration and Structure from Motion," Proceedings of the IEEE International Conference on Computer Vision Systems, Jan. 5-7, 2006.
[8] J. Kim, D.H. Shim, J.R. Morrison, "Tablet PC-based Visual Target-Following System for Quadrotors," Journal of Intelligent and Robotic Systems, Vol. 74, Issue 1-2, pp. 85-95, Apr. 2014.
[9] Y.D. Jung, D.H. Shim, "Development and Application of Controller for Transition Flight of Tail-Sitter UAV," Journal of Intelligent and Robotic Systems, Vol. 65, pp. 137-152, 2012.