
2014 UKACC International Conference on Control
9th–11th July 2014, Loughborough, U.K.

Development of a Generic Network Enabled Autonomous Vehicle System

Matthew Coombes, William Eaton, Owen McAree, Wen-Hua Chen
Department of Automotive and Aeronautical Engineering
Loughborough University
Loughborough, LE11 3TU, UK
Email: {[email protected], [email protected], [email protected], [email protected]}

Abstract—This paper describes the development of a system for autonomous vehicle testing, utilising conventional network infrastructure for communication and control, allowing simultaneous control of multiple vehicles of differing vehicle types. A basic level of autonomy is achieved through the use of an Arduino based commercial autopilot (ArduPilot), which also allows for remote vehicle control via MAVLink protocol commands given through serial communication. Traditionally, messages are sent using point-to-point wireless serial modems. As these are restricted in terms of bandwidth and flexibility, an improved set-up is suggested, where an embedded computer system is attached to each vehicle. A custom written Node.js program (MAVNode) is then used to encode and decode MAVLink messages onboard, allowing communication over a Local Area Network via Wi-Fi. A selection of hardware configurations is discussed, including the use of conventional Wi-Fi and long range Ubiquiti airMAX wireless routers. Both software and hardware in the loop testing is discussed, in addition to the ability to perform control from Matlab/Simulink. With all the infrastructure in place, algorithms can be rapidly prototyped. As an example use of the system, a quad-rotor visually tracks a robot while using a remote Matlab installation for image processing and control.

Index Terms—Autopilot; rapid prototyping; algorithm development; hardware in the loop; software in the loop; invariant object recognition; Ethernet networks.

I. INTRODUCTION

While developing test environments for autonomous vehicle research, it has been observed that despite the focus of the research being on the software algorithms, it is often the development of the physical hardware that takes the longest time. For example, multi-vehicle control algorithms often require testing on ground vehicles before moving to aircraft, yet this migration between vehicles can increase development time significantly.

As such, a generic baseline system that can be used on a range of development platforms has been developed. When implemented, this should lower costs and reduce development time, while also improving reliability and safety. For this system to satisfy the majority of research needs, the following were considered requirements:

• A standardised embedded computer structure on board each vehicle, providing a low level of autonomy and the ability to interface with a range of sensors.
• Compatibility with multiple vehicle types; for example fixed wing and rotary wing aircraft (both multi rotor and single rotor) and ground vehicles.
• Simultaneous control of multiple vehicles.
• Control signals and data to be sent using high bandwidth network communication (between any vehicles or connected devices).
• Compatibility with common development environments, i.e. MATLAB and Simulink.

The final point highlights that the main aim of this system is to provide a capability for testing high level control algorithms without the usual time investment in hardware-specific code adaptation. As the intended users will be of various skill levels, the system was designed for use with MATLAB and Simulink, as many people are familiar with this software environment.

A set of accompanying functions and blocks was also developed for Matlab/Simulink. These functions act as an interface, providing vehicle data and accepting commands as appropriate. Once a vehicle is set up to use the system, it can be given to an interested party as a 'black box', leaving them to concentrate on algorithm development. To enhance safety and reduce development time, all testing can be done under both Software In the Loop (SIL) and Hardware In the Loop (HIL) conditions before moving on to real world testing. The virtues of going through this testing procedure are discussed in [3].

The rest of this paper discusses the development of the system in stages. The chosen autopilot and its functions are described in Section II, with the additional system components described in Section III. Section IV explains how a program called MAVNode enables the use of the system over the network, while Section V discusses the network infrastructure, its advantages, and alternate configurations. Section VI presents SIL and HIL testing, and finally in Section VII an example use of the system is presented.

II. AUTOPILOT SELECTION

There are already several examples of autopilot systems developed by research organisations, usually designed to assist with their own research goals, such as [3], [5], [6], and [7]. Producing a bespoke autopilot offers the chance to tailor it to the research requirements exactly, adding only the functionality required. However, there are many drawbacks to this approach. Developing control hardware requires specialist knowledge in electronic engineering, systems integration, and software engineering, with the additional requirements of aerospace engineering when developing an autopilot.

978-1-4799-5011-9/14/$31.00 ©2014 IEEE 621


This can result in a long development time and considerable expense to reach a functional state.

In addition, continued development will be required to ensure the systems remain current and compatible with ongoing work. If an autopilot is designed as software running on a single embedded system, any change in hardware type would require complete redevelopment of the autopilot software for compatibility with new hardware.

Alternatively, there are also several Commercial Off The Shelf (COTS) autopilots available for small scale aircraft. By purchasing a COTS autopilot system, the time and effort that would be spent in development can be saved and directed at primary research instead.

The system was built to utilise ArduPilot Mega (APM), an Arduino [4] based commercial autopilot intended for use on remote control vehicles. Although originally designed for fixed wing aircraft, continuous updates to APM have extended the functionality to include control schemes for ground vehicles, as well as various rotary wing platforms, including heli-, tri-, quad-, hexa-, and octa-copters. The autopilot software is written in C++, and is completely open source with an active development community, ensuring that it will continue to be developed and improved. It is due to this flexibility, reliability, and multi platform capability that APM was chosen as the basis of the system.

Although APM is usually referred to as an autopilot, the degree of authority it has over the vehicle can be varied at any point through an input command. Under manual override mode, any vehicle can be controlled directly by a human pilot or external program, providing direct Pulse Width Modulation (PWM) commands to the APM, which in turn relays them to the servos.

Levels of increasing autonomy can then be applied; for example, when APM is onboard an aircraft, Fly by Wire (FBW) mode enables a human pilot to give pitch and roll angles rather than directly controlling servos. As fixed limits on the maximum roll and pitch angles can be set within APM's inner loop control, this makes the aircraft much easier to fly, with the aircraft returning to straight and level flight should control be relinquished. Finally, in guided mode APM will handle all high level control itself, allowing a vehicle to fly or drive a series of predefined waypoints autonomously. This variable degree of control authority makes it an extremely useful basis upon which higher order control strategies can be built.

Fig. 1: ArduPilot 2.6 (APM) in a protective case alongside an external GPS receiver and magnetometer

III. SYSTEM CONFIGURATION

The APM system hardware consists of a main processing board and an additional external board which houses a GPS module and a triple axis magnetometer, as shown in Fig. 1. The main board combines an ATmega2560 processor with several different sensors, including a full set of Microelectromechanical Systems (MEMS) sensors (triple axis accelerometers and a triple axis gyroscope), a barometric pressure sensor, and the ability to add an external pitot static sensor to measure airspeed. APM's inbuilt Attitude Heading Reference System (AHRS) code conducts sensor fusion on the sensor data to give the autopilot sufficient data for realtime control.

Within the context of the overall system, APM is also intended to provide the hardware interface for controlling motors and servomechanisms, such as the control surfaces on an aircraft or the steering assembly in a car. A major benefit of using APM is its ability to be 're-flashed' with different firmware, appropriate for different vehicle types. This allows the same hardware to be used on any vehicle, with the hardware set-up for a quad-rotor being very similar to that used on a ground robot.

The biggest limitation of APM is the ATmega2560 processor, which is relatively slow and only capable of executing 256 KB of code. While algorithms could be tested by altering the APM code, it is difficult, unreliable, and perhaps even unsafe to do so. Furthermore, APM is difficult to interface with MATLAB, as the existing autopilot software is written in C++. When used alone, APM is not an effective system for research.

Fortunately, ArduPilot can communicate with external devices over a serial connection from its telemetry port, and any level of its control structure can be directed via the relevant MAVLink message [1]. MAVLink is a very lightweight, header-only message marshalling library for Micro Aerial Vehicles (MAVs), written in C/C++. It encodes data structures into high efficiency data packets which use binary instead of ASCII encoding, yielding faster data transfer and higher data integrity. Any device that can communicate in MAVLink can talk to ArduPilot.

Using this messaging protocol, vehicle data can be extracted (such as AHRS data or GPS position) and commands given to the autopilot. These can either be direct control commands (such as motor speeds), attitude control, or position control. Each can be commanded by simply using a different MAVLink message. The open source software used in APM allows for minor customisation, should it be required.

The primary goal of this system is to gain access to all the base functionality of APM remotely, so as to allow a vehicle to be controlled externally by MATLAB/Simulink.
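The binary framing behind this efficiency can be illustrated with a short sketch. This is a simplified decoder for the MAVLink 1.0 wire format documented in [1]; checksum verification, which also folds a per-message CRC_EXTRA byte into the CRC, is deliberately omitted here:

```python
import struct

def parse_mavlink_v1(frame: bytes) -> dict:
    """Split a MAVLink 1.0 frame into header fields and payload.

    Wire layout: 0xFE start byte, payload length, sequence number,
    system ID, component ID, message ID, payload bytes, and a
    2-byte checksum. Checksum verification is omitted in this
    sketch; a real decoder must validate it (including CRC_EXTRA).
    """
    if len(frame) < 8 or frame[0] != 0xFE:
        raise ValueError("not a MAVLink 1.0 frame")
    length = frame[1]
    if len(frame) < 8 + length:
        raise ValueError("truncated frame")
    seq, sysid, compid, msgid = frame[2:6]
    payload = frame[6:6 + length]
    (crc,) = struct.unpack("<H", frame[6 + length:8 + length])
    return {"seq": seq, "sysid": sysid, "compid": compid,
            "msgid": msgid, "payload": payload, "crc": crc}
```

The fixed six-byte header plus two-byte checksum is the entire framing overhead, which is why MAVLink suits low bandwidth serial telemetry links.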

Fig. 2: The internal layout of a ground robot

Fig. 3: Quad-rotor with gyro-stabilised camera

However, hardware communication with APM is performed via serial port. For normal consumer use, a wireless serial modem would be used to relay these messages to a Ground Control Station (GCS) running Mission Planner (GCS software), where a mission can be monitored and various commands sent. However, as this method of communication is usually point to point, multi vehicle communications become much more difficult and require a centralised distribution point, as discussed in [2].

Instead, this system has been designed to re-broadcast MAVLink via Local Area Network (LAN). This is achieved by adding an Ethernet-equipped embedded system to the vehicle, which communicates with APM via serial cable. As with the autopilot, to ensure that the system is affordable and easy to replicate, it was decided that COTS components should be used throughout. Therefore, an established system such as the BeagleBone or the Raspberry Pi is used on each vehicle to encode and decode the MAVLink messages.

The MAVLink message is then broadcast via wireless router to the wider LAN. This enables any other vehicle or connected device to communicate with the onboard system, provided they are both on the same network. To decode and re-broadcast MAVLink via LAN, a program called MAVNode (written in Node.js) is used to format the data into packets for network transmission. This program runs on the embedded system's start up, and will be described in greater detail in a later section.

Additional benefits of using embedded systems on board include the ability to interface with a multitude of sensors and peripherals normally outside the scope of APM, such as microphones, cameras or LIDAR sensors. These can be connected via I/O ports (such as serial I2C, CAN, and ADC) or through conventional computer ports, such as USB. In addition to the MAVLink messages, the data from these sensors can also be sent via LAN for processing on an external system, as demonstrated in Section VII. As the messages are already reformatted for use on a network, it also gives the option for the vehicle to be connected to the Internet.

The embedded system can also provide extra levels of autonomy, giving commands to APM in cases of data loss. Fig. 2 shows a practical layout inside a ground robot.

IV. MAVNODE

As MAVLink is a protocol originally designed for serial communication, a method of encoding and decoding MAVLink data packets is required for network communication. This is achieved using MAVNode, a Node.js module which is present on all vehicles and communication devices on the network. MAVNode utilises the asynchronous, event driven nature of Node.js to deal with messages from multiple systems via multiple communication channels efficiently. In addition to handling MAVLink messages generated by systems such as APM, MAVNode exposes a modified RESTful API to enable packets to be encoded/decoded over HTTP.

All API interactions are handled via HTTP GET strings so as to be directly compatible with the MATLAB urlread command. A typical interaction between an application and MAVNode is shown in Fig. 4. In this example, an application requests a data stream from MAVNode by issuing the HTTP request shown. This requests data from System and Component IDs 1, since MAVNode is able to handle multiple systems. MAVNode responds with a plain text port number, which corresponds to the UDP port over which the requested data will be sent. The application can now listen on this port for incoming data until it no longer requires it, informing MAVNode to terminate the stream.

An application can also send MAVLink messages through MAVNode by using a similar HTTP GET string:

http://send/1/1/ROLL_PITCH_THRUST_SETPOINT?&roll=1&pitch=0&thrust=0.5

This string would send the ROLL_PITCH_THRUST_SETPOINT message to System and Component ID 1, with the parameter values shown.

For any message sent in this way, all parameter values must be set or the message will not be delivered. In this example message, a pitch angle of 0 rad, a roll angle of 1 rad, and a throttle setting of 0.5 are demanded, and would be used to control the FBW mode on a fixed wing aircraft.

Fig. 4: Application requesting data stream from MAVNode

V. NETWORK ARCHITECTURE

The most generic and reconfigurable part of the system is the network architecture. Depending on the range and bandwidth required, different implementations of Wi-Fi communications can be used. For short range testing, such as within a single room, a generic wireless router can be used as the gateway and wireless access point, with small scale USB Wi-Fi adaptors used aboard the vehicles. This is sufficient to provide full 150 Mbit/s network capability, with the additional benefit of a minimal payload increase.

For long range testing, such as when using aircraft outdoors, more specialised equipment is required. After experimentation with multiple COTS Wi-Fi systems, the airMAX range of equipment from Ubiquiti Networks was found to be the most practical solution, offering long range, high integrity wireless communications.

To avoid adding additional complexity, airMAX devices connect directly to an Ethernet port on the embedded system and operate as a wireless bridge. As the Wi-Fi configuration is dealt with by the device, no additional drivers or software are required, allowing the devices to be both plug-and-play and interchangeable. For example, the airMAX PicoStation is small, has an omni-directional antenna, and offers full bandwidth Wi-Fi up to 300 m range, making it suitable for use onboard an aircraft. The airMAX NanoStation is larger and is restricted by its 60° sector antenna, but offers greater range and bandwidth options, making it useful as the connection point on the ground. When used together, this setup provides a very reliable connection.

An example network configuration is shown in Fig. 5, where a network is based around a gateway router. Two GCS are connected to the gateway; one is monitoring the vehicles through the Mission Planner software, while the other is running the high level control algorithms for vehicle control. An airMAX NanoStation is connected to the gateway through an Ethernet cable and communicates with a ground robot and a fixed wing aircraft via airMAX PicoStations on each vehicle. A short range quad-rotor is also connected via USB Wi-Fi dongle, directly to the wireless router.

Many networking devices support additional features useful for research, such as the ability to extend the wireless network beyond the range of a single access point; for example, Wireless Distribution System (WDS) and mesh networks. Provided that a communication route can be established, it is possible to send a message to a device that is beyond Wi-Fi range by 'bouncing' the signal off one that is closer. For example, a series of ground robots stationed a few hundred metres apart would allow the furthest robot to communicate with the GCS, despite being well beyond Wi-Fi range. The downside to this approach is the added latency, as each redirection requires additional processing. A detailed summary of mesh networking is given in [11].

An additional benefit of using Wi-Fi is that the MAVLink data packets only constitute a tiny percentage of the total available bandwidth. This allows other data, such as video footage or sensor readings, to be sent back to the ground station, or data from the ground station to be sent to a vehicle. If the GCS is given access to the Internet, this connection can be shared throughout the network, bringing a number of new possibilities, such as streaming webcam footage, gaining live updates on predicted weather data, or even remote controlling vehicles from extreme range. If the entire network is outside and there is no local wired internet connection, a mobile network 3G/4G dongle could be attached to any device to enable web access for the whole network from the cell towers. Rather than the GCS providing internet to the attached vehicles, the vehicles equipped with mobile broadband can be used to provide internet to the GCS.

VI. SOFTWARE AND HARDWARE IN THE LOOP

Due to the inherent risks in testing novel algorithms on vehicles, it is important to verify code functionality before performing real world testing. Risk can be significantly reduced by putting algorithms through the full development cycle of SIL and HIL testing, which enables systematic debugging to occur. SIL testing allows the flaws in the software implementation to be resolved, whilst HIL brings to light any issues which are caused by the software's interaction with the hardware or networking setup. Eventual real-world testing should then validate the system's capabilities, only testing the functionality of the communications system at range and the system's ability to handle real world interference, such as weather conditions.

This SIL and HIL method of testing is easy to implement using the chosen system arrangement, especially when using APM. In addition to being open source, it is possible to build the ArduPilot autopilot code to run on a conventional desktop computer running Linux. By interfacing this software-only ArduPilot with a simulated environment, it is possible to have any control strategy tested on a virtual vehicle beforehand. As ArduPilot is already well established, many software titles, such as the X-Plane flight simulator, already support this form of interaction.

Fig. 5: Example Network Layout (gateway router with Internet access, netmask 255.255.0.0; GCS 1 at 10.0.255.2 and GCS 2 at 10.0.255.3 under a human supervisor; long range airMAX Wi-Fi at 10.0.255.21 and a short range Wi-Fi receiver at 10.0.255.20; quad-rotor, ground robot, and fixed wing vehicles at 10.0.2.1, 10.0.3.1, and 10.0.4.1, each carrying an ArduPilot linked by serial MAVLink to an embedded system that relays MAVLink, camera streams, and sensor data)
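The flat addressing in Fig. 5 is a deliberate design choice: with the 255.255.0.0 netmask, the ground stations and every vehicle fall inside the single 10.0.0.0/16 network, so any device can reach any other without routing. A short standard-library check of the addresses shown in the figure:

```python
import ipaddress

# Addresses taken from the example layout in Fig. 5.
lan = ipaddress.ip_network("10.0.0.0/16")  # netmask 255.255.0.0
devices = {
    "GCS 1": "10.0.255.2",
    "GCS 2": "10.0.255.3",
    "vehicle A": "10.0.2.1",
    "vehicle B": "10.0.3.1",
    "vehicle C": "10.0.4.1",
}
# Every host sits on the one subnet, so MAVNode traffic between any
# pair of devices needs no gateway hops beyond the wireless bridges.
on_lan = {name: ipaddress.ip_address(a) in lan for name, a in devices.items()}
```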

Therefore, the model vehicle within the simulation software is controlled by a desktop build of ArduPilot, and in turn ArduPilot is controlled by an external system (such as Matlab/Simulink) running high level control algorithms. This SIL method enables development to be undertaken without any ArduPilot hardware, or the associated risks.

HIL is much the same, but with the actual ArduPilot hardware physically connected to the computer running the simulator using a USB cable. As APM is usually the source of vehicle data (attitude and position), the same simulated data as used in SIL must be sent out to the ArduPilot hardware for processing, before its response can be assessed. The communication between the simulator and ArduPilot is now done over a virtual serial connection. For actual flight tests, APM is once more used as the source of sensor data; however, there is no need to adjust the software onboard the autopilot or the embedded system. ArduPilot and MAVLink have both been abstracted to such a high level that the same code can be used unchanged throughout the whole development cycle.

VII. VISUAL HELIPAD DETECTION AND TRACKING

As a demonstration of the capabilities of this system, an image processing algorithm was developed where the quad-rotor visually tracks a moving ground vehicle. The intention of the demonstration was to produce similar results to [12], but using off board processing, an alternative helipad detection algorithm, and a moving target on the ground.

Using the network connection, a video image from the quad-rotor was transmitted to a ground station for processing, before commands were relayed back via MAVLink. The ground vehicle moved independently, and a helipad symbol was displayed on its roof for the quad-rotor to track. This was intended to represent a scenario such as a helicopter landing on a moving ship, where the general coordinates are known via GPS, and the precision navigation can be achieved through visual means.

A. Requirements

As the task was to locate and manoeuvre the aircraft to remain above the helipad (although the landing stage was not considered for this demonstration), the process needed to be both robust and quick enough for real time control. The main disadvantage of this approach is that it limits the helipad detection to only occurring directly within the field of view below the aircraft. Therefore, prior knowledge of the helipad's location, such as a rough GPS position, is required for the aircraft to be in the right place for detection to occur. However, the downward view allows for a great reduction in complexity in the detection process itself. If the aircraft only achieves small angles in roll and pitch, or if the camera itself is stabilised via a gimbal to look downwards, no reverse affine or perspective transformation is required. Therefore, the helipad will appear as a shape on a plane, and two dimensional template matching can be employed. As the height of the quad-rotor and its orientation compared to the helipad will vary, the template recognition algorithm must be scale, rotation and translation invariant. This is known as Invariant Object Recognition (IOR).
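The rotation invariance requirement can be illustrated with a toy, pure-Python sketch (this is only an illustration of the circular sampling idea, not the implementation used in the system): averaging pixel values around concentric rings centred on a candidate point produces a signature that is unchanged when the shape rotates about that point.

```python
import math

def ring_signature(img, cx, cy, radii):
    """Mean pixel value sampled on circles of the given radii around
    (cx, cy). Because each ring is averaged over all angles, the
    signature is unchanged when the image content rotates about the
    centre, which is the idea behind circular sampling filters."""
    sig = []
    for r in radii:
        samples = []
        for k in range(360):
            a = math.radians(k)
            x = int(round(cx + r * math.cos(a)))
            y = int(round(cy + r * math.sin(a)))
            if 0 <= y < len(img) and 0 <= x < len(img[0]):
                samples.append(img[y][x])
        sig.append(sum(samples) / len(samples))
    return sig
```

Comparing a candidate's ring signature against the template's therefore rejects most locations regardless of the helipad's orientation; scale invariance is then handled by evaluating the signature at several radius sets.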
Fig. 6: Colourspace reduction and template matching

B. Colourspace Reduction

Under most algorithms, the level of data present in the template will indicate the amount of data required in the captured image. As helipads are traditionally a single bright colour, a simple binary image of a letter H is sufficient, provided that the dimensions are correct. Therefore, a reduction in colourspace was determined to have little effect on the recognisability of the helipad, and an initial binarization was employed to simplify the problem.

C. Template Matching

The template matching algorithm used to detect the helipad is based on the work undertaken in [13], known as the Ciratefi technique. The original algorithm is designed to work on greyscale images and is capable of detecting template matches in very busy images, in far less time than other techniques. However, although the technique is extremely robust, it can take many seconds for a single image to process. As the technique is designed to work in very cluttered and complex images, it was determined that simplifications could be made to enable the software to run fast enough for real time vehicle tracking. In order to achieve this, the technique must be separated into its three constituent parts: a circular sampling filter, a radial sampling filter, and a final brute force template matcher.

The highest circular filter correlation is used to determine shapes that have a high probability of being the desired shape, in addition to estimating the size of the shape within the image. Radial sampling is then used to confirm that the shape matches the template image, in addition to estimating its angular orientation. Within [13], the purpose of the first two filters is to reduce the number of potential candidate pixels before a traditional 'brute force' algorithm is applied. This greatly reduces the number of pixels to be identified and therefore speeds up the entire process. However, it is still far too slow for real time usage.

Fig. 7: Circular and Radial Sampling Filters

Ciratefi [13] builds upon work undertaken in [14] and [15]. Through experimentation and the results shown in [15], it was found that the technique can also be applied to binary images, which justifies the aforementioned reduction in colourspace data.

As a binary image is being used instead of a greyscale image, it is possible to use blob analysis in both the template stage and at runtime. As the helipad shape should match the template, the relative centroids will be the same. Therefore, using a blob detection algorithm to determine the centre of the shapes reduces the number of candidate pixels down from every pixel in the image to only the number of detected blobs. This can produce a speed increase of several thousand fold, at the expense of being restricted to binary images. In addition, the correlation coefficients gained through the two sampling filters proved to be sufficient to determine if the shape is a match. The final 'brute force' template match reduced the run speed of the algorithm without offering a significant improvement in detection rate. Although this final check does help in false positive rejection, this was outweighed by the inability to run the algorithm fast enough for real time control.

D. Vision Processing Results

In detecting a helipad of 0.25 m by 0.25 m, using a 640x480 pixel video stream, the detection algorithm could detect the helipad consistently up to a height of 10 m. During this time, the average processing time per frame was 0.0918 seconds, giving around ten frames per second, which was found to be adequate for accurate position holding.

Obviously, this method is highly affected by the amount of bright coloured objects on screen. Therefore, when operating over a highly cluttered environment, it was found that the maximum processing time per frame was 0.326 seconds. This was still felt to be sufficiently fast for providing waypoint updates, as detailed below. For scenarios in which there is likely to be much clutter in the image, a different thresholding algorithm may be applied, for example one dependent more on the hue of the pixels than their intensity.

E. Ground Vehicle Following

To get the quad-rotor to follow a helipad on top of a ground robot, the relative position of the helipad from the quad-rotor is used with the position hold controller on APM.

The outer loop position hold function is sent the robot's latitude and longitude as a hold position at 5 Hz via the MISSION_ITEM MAVLink message. This example shows how quickly and easily complex functions can be prototyped using Matlab/Simulink and the native functions of the autopilot. Compared to the system in [12], where a similar visual tracking task is performed, the development and testing time will be much shorter, as the inner and outer loop control is performed by the autopilot.

The location of the helipad in the image is used to approximate the actual position of the helipad relative to the quad-rotor in metres. After the coordinates of the helipad were visually estimated, they were converted from a local coordinate system into global GPS coordinates using an inverse haversine conversion:

latH = asin(s(lat) ∗ c(d/R) + c(lat) ∗ s(d/R) ∗ c(brng))

lngH = lng + atan( s(brng) ∗ s(d/R) ∗ c(lat) / (c(d/R) − s(lat) ∗ s(latH)) )    (1)

where cos and sin are abbreviated to c and s respectively; R is the radius of the Earth (6378.1 km); lat and lng are the latitude and longitude of the quad-rotor; latH and lngH are the latitude and longitude of the helipad; brng is the bearing of the helipad from the quad-rotor; and d is the distance of the helipad from the quad-rotor.

Shown in Fig. 8 is the 25 m by 20 m rectangular path of the ground robot, and the path of the tracking quad-rotor.

Fig. 8: Ground robot and quad-rotor's paths (North and East position in metres)

VIII. CONCLUSION

For autonomous vehicle research, a system has been developed that enables high level control algorithms to be quickly developed and tested. As each vehicle is network enabled, any device on the network can read data or send commands. By using COTS components, the system is cheap and easy to integrate into a range of vehicles. As APM is well developed and extremely robust, using the system makes real world testing easier, quicker, and safer.

The flexibility of the system is demonstrated in the above example. Despite the quad-rotor following the helipad well, there is no code specific to the aircraft type. As the helipad tracking algorithm only sends outer loop position control commands, very little needs to be known about the vehicle, and in fact the exact same code could be used on a helicopter. The example also clearly illustrates the flexibility of this system, as an external sensor combined with high bandwidth network communication enables off board image processing. While the network structure does ensure maximum communications flexibility, it also adds latency to the system, proportional to the complexity of the network layout. This latency means the system is incapable of performing very low level control on a high bandwidth system such as a quad-rotor; however, as the system is meant for more high level control, such as path planning or multi-vehicle swarming, this is not expected to be an issue.

REFERENCES

[1] [Online]. Available: http://qgroundcontrol.org/mavlink/start
[2] M. Coombes, O. McAree, W.-H. Chen, and P. Render, "Development of an autopilot system for rapid prototyping of high level control algorithms," in Control (CONTROL), 2012 UKACC International Conference on. IEEE, 2012, pp. 292–297.
[3] G. Cai, B. M. Chen, T. H. Lee, and M. Dong, "Design and implementation of a hardware-in-the-loop simulation system for small-scale uav helicopters," Mechatronics, vol. 19, no. 7, pp. 1057–1066, 2009.
[4] [Online]. Available: http://www.arduino.cc/
[5] D. Kingston, R. Beard, A. Beard, T. McLain, M. Larsen, and W. Ren, "Autonomous vehicle technologies for small fixed wing uavs," in AIAA Journal of Aerospace Computing, Information, and Communication, 2003, pp. 2003–6559.
[6] Y. C. Paw and G. J. Balas, "Development and application of an integrated framework for small uav flight control development," Mechatronics, vol. 21, pp. 789–802, 2011.
[7] A. Mehta and K. Pister, "Warpwing: A complete open source control platform for miniature robots," in Intelligent Robots and Systems (IROS), 2010 IEEE/RSJ International Conference on, Oct. 2010, pp. 5169–5174.
[8] [Online]. Available: http://paparazzi.enac.fr/wiki
[9] [Online]. Available: http://www.micropilot.com/
[10] P. Brisset, A. Drouin, M. Gorraz, P.-S. Huard, and J. Tyler. (2006, October) The Paparazzi solution. ENAC.
[11] I. Akyildiz and X. Wang, "A survey on wireless mesh networks," Communications Magazine, IEEE, vol. 43, no. 9, pp. S23–S30, 2005.
[12] S. Lee, J. won Jang, and K.-R. Baek, "Implementation of vision-based real time helipad detection system," in Control, Automation and Systems (ICCAS), 2012 12th International Conference on, 2012, pp. 191–194.
[13] H. Kim and S. A. de Araujo, "Grayscale template-matching invariant to rotation, scale, translation, brightness and contrast," in Advances in Image and Video Technology, D. Mery and L. Rueda, Eds., vol. 4872. Springer Berlin Heidelberg, 2007, pp. 100–113.
[14] L. A. Torres-Mendez, J. C. Ruiz-Suarez, L. E. Sucar, and G. Gomez, "Translation, rotation, and scale-invariant object recognition," IEEE Trans. on Systems, Man and Cybernetics Part C, vol. 30, pp. 125–130, 2000.
[15] W.-Y. Kim and P. Yuan, "A practical pattern recognition system for translation, scale and rotation invariance," in Computer Vision and Pattern Recognition, 1994. Proceedings CVPR '94., 1994 IEEE Computer Society Conference on, 1994, pp. 391–396.
