A LabVIEW-based Autonomous Vehicle Navigation System Using Robot Vision and Fuzzy Control
Ingeniería Investigación y Tecnología. Vol. XII, Núm. 2, 2011, 129-136
ISSN 1405-7743 FI-UNAM
(refereed article)
Martínez-Carballido J.
Coordinación de Electrónica
Instituto Nacional de Astrofísica, Óptica y Electrónica
E-mail: [email protected]

Gómez-Gil P.
Coordinación de Electrónica
Instituto Nacional de Astrofísica, Óptica y Electrónica
E-mail: [email protected]

López-Larios F.
Coordinación de Computación
Instituto Nacional de Astrofísica, Óptica y Electrónica
E-mail: [email protected]
Abstract
This paper describes a navigation system for an autonomous vehicle using machine vision techniques applied to real-time captured images of the track, for academic purposes. The experiment consists of the automatic navigation of a remote control car through a closed circuit. Computer vision techniques are used to sense the environment through a wireless camera. The received images are captured into the computer through the NI USB-6009 acquisition card and processed in a system developed under the LabVIEW platform, taking advantage of its toolkits for acquisition and image processing. Fuzzy logic control techniques are incorporated for the intermediate control decisions required during the car navigation. An efficient approach based on state-machine logic is used to implement the changes required by the fuzzy logic control. Results and concluding remarks are presented.
Keywords
fuzzy control, robot vision, autonomous navigation
Resumen
This paper presents a navigation system for an autonomous vehicle using robot vision techniques, developed in LabVIEW for academic purposes. The system acquires images of the road ahead in real time. These images are sent wirelessly to a computer, where a control system based on fuzzy control rules makes the corresponding movement decisions. The computer wirelessly sends the appropriate signals to the remote control vehicle, thus closing the control loop. The images are captured into the computer through the NI USB-6009 acquisition card and processed in a system developed under the LabVIEW platform and its toolkits for acquisition, image processing, and fuzzy control. An efficient design scheme based on state machines is incorporated for navigating through the different scenes detected by the camera. Results and conclusions of this work are presented.
Introduction
An autonomous navigation system consists of a self-piloted vehicle that does not require an operator to navigate and accomplish its tasks. The aim of an autonomous vehicle is to have self-sufficiency and a decision-making heuristic installed within it, which allows it to move automatically in the corresponding environment, as well as to accomplish the required tasks (Armingol et al., 2007). Some of the areas where autonomous vehicles have been successfully used are space rovers, rice planting and agricultural vehicles (Tunstel et al., 2007, Nagasaka et al., 2004, Zhao et al., 2007), autonomous driving in urban areas (De la Escalera et al., 2003), security and surveillance (Srini, 2006, Flan et al., 2004, Micheloni et al., 2007), and the exploration of any place where human life may be at risk, such as a mine with toxic gases or a nuclear plant disaster (Isozaki et al., 2002). There is a wide variety of autonomous vehicles, with an extensive classification into categories depending on their characteristics (Bertozzi et al., 2000). Some of the characteristics described in the literature are: autonomy level, methods of data acquisition, methods of localization, goal tasks, displacement techniques, control methods, and so on. The project presented in this paper is restricted to the autonomous navigation of a small remote control car through a closed circuit. Although it is a very specific task in a controlled environment, it is intended to provide a platform for academic purposes, in which several navigation control approaches can be tried and different image pre-processing schemes can be used in educational experiments. The software package LabVIEW and its available toolboxes for image analysis, computer vision, and fuzzy logic control (NI 2002, 2005),
have been found to be an excellent platform for experimentation purposes, allowing the quick design, implementation, and testing of prototypes. It is expected to continue experimenting with this prototype in order to explore further tasks, which would require more sophisticated control heuristics in the field of artificial intelligence or neural network techniques.
Hardware description
The implemented system basically consists of the wireless control of a small remote control car from a laptop computer, as shown in figure 1. The vehicle is equipped with a wireless camera which sends in real time the video signal corresponding to the path. Once the streaming data corresponding to the video signal of the path enters the computer, it is processed by a LabVIEW application, which generates the control signals to be applied to the remote control of the vehicle. A block diagram of the system is presented in figure 1. The hardware used is the following: a data acquisition card (DAQ) NI USB-6009, a JMK wireless analog mini-camera, a Dazzle USB video capture card, and a small digital remote control car.

The wireless camera transmits a video signal with a horizontal resolution of 380 TV lines at a frequency of 1.2 GHz in the ISM (Industrial, Scientific, and Medical) radio band. Due to the limitations on the maximum current provided by the output port of the I/O card, a simple optocoupler-based interface was included between the NI card and the remote control as a signal-conditioning stage. The digital signals obtained from the interface are applied to the remote control, which sends the movement commands to the 27 MHz radio-controlled car, and the cycle is closed.
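For readers who prefer a text-based picture of this output stage, the following sketch shows how a 4-bit command word could be written to the USB-6009 digital lines using the Python nidaqmx package; the paper itself drives these lines from LabVIEW through the DAQ Assistant, and the device name Dev1, the port0/line0:3 wiring and the example code value are assumptions made only for illustration.

# Minimal sketch (not the paper's implementation): driving the 4-bit
# command word for the remote control through the NI USB-6009 digital
# outputs, using the nidaqmx Python package. Device name and line
# assignment are assumptions for illustration only.
import nidaqmx
from nidaqmx.constants import LineGrouping

STEERING_LINES = "Dev1/port0/line0:3"  # assumed wiring of the 4 code bits

def send_steering_code(code: int) -> None:
    """Write a 4-bit steering code (0-15) to the optocoupler interface."""
    if not 0 <= code <= 15:
        raise ValueError("code must fit in 4 bits")
    with nidaqmx.Task() as task:
        task.do_channels.add_do_chan(
            STEERING_LINES, line_grouping=LineGrouping.CHAN_FOR_ALL_LINES)
        task.write(code)  # the integer is placed as a bit pattern on the 4 lines

if __name__ == "__main__":
    send_steering_code(0b1000)  # an assumed mid-range (near straight) code

An analogous write would be performed for the 4-bit speed code.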
Fuzzy control

The fuzzy controller produces two output signals, corresponding to the speed and the steering mechanism of the vehicle. Each signal allows sixteen values coded in 4 bits. In the case of the speed, it covers positive and negative values for movements in reverse, and in the case of the steering wheel angle it covers an angle range of 45°. Figure 2 shows the partition of the variable "Driving wheel angle" into five fuzzy sets. In a similar way, the output variable "Speed" is partitioned into five fuzzy sets: BM: back medium, BS: back small, SS: straight small, SM: straight medium, SH: straight high.
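As a concrete, text-based illustration of such a partition, the sketch below defines five triangular membership functions for the steering output using the labels of figure 2; the breakpoints over the 45° range are assumed values, since the membership functions are only given graphically in the paper.

# Sketch of a five-set triangular partition for the steering output.
# The breakpoints over an assumed -45..45 degree range are illustrative,
# not the values used in the paper.

def trimf(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# BLT: big left turn, SLT: small left turn, R: straight,
# SRT: small right turn, BRT: big right turn
STEERING_SETS = {
    "BLT": (-60.0, -45.0, -20.0),
    "SLT": (-45.0, -20.0, 0.0),
    "R":   (-20.0, 0.0, 20.0),
    "SRT": (0.0, 20.0, 45.0),
    "BRT": (20.0, 45.0, 60.0),
}

def steering_memberships(angle_deg: float) -> dict:
    """Degree of membership of a steering angle in each fuzzy set."""
    return {name: trimf(angle_deg, *abc) for name, abc in STEERING_SETS.items()}

print(steering_memberships(-10.0))  # partially SLT, partially R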
The input information needed to locate the relative position of the vehicle relies on the image sequence detected by the wireless camera, which is a stream of 30 images per second. The program automatically segments and marks the frontal part of the car, used as the image reference, with a yellow box, and the path lanes are segmented and marked with two red lines. The marking is presented on the screen as a visual representation and simultaneously registered in the image file. The numerical information obtained from both markings is further used by the tracking algorithm. The fuzzy input variables used in this work are the lateral distance to the nearest lane and the inclination angle of the incoming curve, obtained as the average of the two angles detected from the lateral borders of the road. These variables are represented by membership functions derived from the partition of each variable into five fuzzy sets. Figure 3 shows the partition of the input variable "Lateral displacement". In a similar way, the input variable "Angle of the incoming curve" is partitioned as: LTC: left tight curve, LSC: left soft curve, S: straight, RSC: right soft curve, RTC: right tight curve.
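In the paper this segmentation is carried out with the LabVIEW IMAQ Vision tools; purely as an illustration of the same idea in text form, the sketch below estimates the two fuzzy inputs from a single frame with OpenCV. The thresholds, the Hough transform parameters and the assumption that the front of the car sits at the image center are not taken from the paper.

# Rough OpenCV sketch (not the IMAQ code used in the paper) of deriving
# the two fuzzy inputs from one camera frame: the lateral distance to the
# nearest lane border and the inclination angle of the incoming curve.
import math
import cv2
import numpy as np

def lane_inputs(frame_bgr: np.ndarray) -> tuple:
    """Return (lateral_displacement_px, curve_angle_deg) for one frame."""
    h, w = frame_bgr.shape[:2]
    center_x = w / 2.0                      # front of the car assumed at image center

    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)        # edge map of the track borders
    lines = cv2.HoughLinesP(edges, 1, math.pi / 180, threshold=40,
                            minLineLength=40, maxLineGap=10)
    if lines is None:
        return 0.0, 0.0                     # no borders found: assume centered, straight

    left, right, angles = [], [], []
    for x1, y1, x2, y2 in lines[:, 0]:
        angles.append(math.degrees(math.atan2(y2 - y1, x2 - x1)))
        (left if (x1 + x2) / 2.0 < center_x else right).append((x1 + x2) / 2.0)

    # Lateral distance to the nearest lane border, signed w.r.t. the center
    nearest = min(left + right, key=lambda x: abs(x - center_x))
    lateral = float(nearest - center_x)

    # Incoming-curve angle as the average inclination of the detected borders
    curve_angle = float(np.mean(angles))
    return lateral, curve_angle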
The inference rules of the fuzzy control system, in IF-THEN form, are stored in a database which is accessed in each iteration. The database was constructed considering the actions that a human driver would perform in every situation along the trajectory, within the restrictions of the range designated for each variable. This information is further analyzed in order to make a decision concerning the car position with respect to the path and the action required to keep the car on the track.
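The rule base itself is not listed in the paper; the following self-contained sketch only illustrates the general shape of such an IF-THEN database with min-max (Mamdani) inference and centroid defuzzification, a common configuration for this kind of controller. The rules, set names and breakpoints below are examples, not the contents of the paper's database.

# Illustrative Mamdani-style rule base: the rules and breakpoints are
# examples of the IF-THEN format described in the text, not the paper's
# actual database.
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Input partitions (breakpoints assumed)
LATERAL = {"L": (-160, -80, 0), "C": (-80, 0, 80), "R": (0, 80, 160)}   # pixels
CURVE   = {"LC": (-60, -30, 0), "S": (-30, 0, 30), "RC": (0, 30, 60)}   # degrees
# Output partition for the steering angle (degrees, assumed)
STEER   = {"BLT": (-60, -45, -20), "R": (-20, 0, 20), "BRT": (20, 45, 60)}

# IF lateral is X AND curve is Y THEN steering is Z
RULES = [("L", "S", "BRT"), ("R", "S", "BLT"), ("C", "S", "R"),
         ("C", "LC", "BLT"), ("C", "RC", "BRT")]

def steering_angle(lateral_px, curve_deg):
    """Min-max inference followed by centroid defuzzification."""
    xs = np.linspace(-45, 45, 181)
    aggregated = np.zeros_like(xs)
    for lat_set, cur_set, out_set in RULES:
        strength = min(trimf(lateral_px, *LATERAL[lat_set]),
                       trimf(curve_deg, *CURVE[cur_set]))          # AND = min
        clipped = np.minimum(strength, [trimf(x, *STEER[out_set]) for x in xs])
        aggregated = np.maximum(aggregated, clipped)               # OR = max
    if aggregated.sum() == 0:
        return 0.0
    return float(np.sum(xs * aggregated) / np.sum(aggregated))     # centroid

print(steering_angle(-70.0, 0.0))  # displaced toward the left border: steer right (illustrative sign convention)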
Programming approach
Figure 2. Partition of the output variable "Driving wheel angle" into five fuzzy sets: BLT: big left turn, SLT: small left turn, R: straight, SRT: small right turn, BRT: big right turn
The control program was developed in LabVIEW following a state-machine logic design approach. According to the scene detected, the vehicle can be located in one of six states, as described in the state diagram of figure 4. Once the system is turned on, the car stays in the state "init" until the user presses the button "Start". Pressing this button initializes the route and moves the car to the state "straight". In each state, the system waits for the camera to sense the road, derives the numerical information on the relative position of the car, and makes a decision according to the fuzzy logic inference rules stored in the database. After defuzzification, the output signals are sent to the car through the wireless remote control, which closes the control loop by making the corresponding movement.
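A text-based sketch of this state-machine logic is given below. Only the two states named in the text, "init" and "straight", are spelled out; the camera, controller, remote and ui objects are hypothetical placeholders standing in for the corresponding LabVIEW sub-VIs and front-panel controls, so the sketch only conveys the structure of the loop.

# Sketch of the state-machine control loop described above. Only the two
# states named in the text ("init" and "straight") are shown; the other
# states of figure 4 follow the same pattern with different fuzzy decisions.
from enum import Enum, auto

class State(Enum):
    INIT = auto()
    STRAIGHT = auto()
    # ... remaining states of figure 4 omitted here

def control_loop(camera, controller, remote, ui):
    """One possible shape of the per-frame control cycle (illustrative).

    camera, controller, remote and ui are hypothetical placeholder objects.
    """
    state = State.INIT
    while ui.running():
        if state is State.INIT:
            # Wait in "init" until the user presses Start
            if ui.start_pressed():
                state = State.STRAIGHT
            continue
        frame = camera.grab()                        # sense the road
        lateral, curve = controller.measure(frame)   # numerical position information
        steering, speed = controller.infer(lateral, curve)   # fuzzy rules + defuzzification
        remote.send(steering, speed)                 # close the control loop
        state = controller.next_state(lateral, curve, state) # next state from the detected scene
        if ui.stop_pressed():
            state = State.INIT                       # Stop returns to "init"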
Figure 5 shows the code, in the LabVIEW graphical language, corresponding to the state "straight". In this diagram there is an input signal corresponding to the image of the road obtained from the wireless camera, and an output signal connected through the DAQ assistant, which generates the electric signal required by the remote control. The main operation relies on three sub-virtual instruments named "straight", "fuzzy", and "wheel", which are designed to evaluate the video signal. The code for the other states is basically the same, with small differences according to the corresponding position, so for the purposes of this paper only the state "straight" will be explained.
Inside the main block in figure 5, we can distinguish a sub-virtual instrument called "straight", which has the purpose of obtaining the numerical representation derived from the visual information of the road. Figure 6 corresponds to the code used to derive two values, named "max" and "min", with respect to the center of the vehicle, from the right and left lines obtained from the input image of the road.

The obtained values max, min, and center are entered into the next stage, which applies the fuzzy logic rules to obtain the output value used to control the steering wheel angle and, in consequence, the displacement of the vehicle. The output value is converted to the 4-bit digital word required by the remote control through a table containing the corresponding codes, as shown in figure 7. Once all the operations are completed, the process starts again in a new state, depending on the position of the vehicle. When the button "stop" in the main display is pressed, the program goes to the state "init", where it waits until the user decides to resume the car movement or to finish the process.
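The contents of the figure 7 table are not reproduced in the text, but its effect can be pictured as a quantization of the defuzzified steering angle into one of the sixteen available 4-bit codes; the 45° full-scale value and the code ordering in the sketch below are assumptions.

# Illustrative version of the figure-7 lookup: quantize the defuzzified
# steering angle into one of sixteen 4-bit codes. The +/-45 degree range
# and the code ordering are assumptions, not the paper's actual table.

def steering_to_code(angle_deg: float, full_scale: float = 45.0) -> int:
    """Map an angle in [-full_scale, +full_scale] to a 4-bit code 0..15."""
    clipped = max(-full_scale, min(full_scale, angle_deg))
    # 0 = full left, 15 = full right, codes near 7-8 = close to straight
    return int(round((clipped + full_scale) / (2 * full_scale) * 15))

assert steering_to_code(-45.0) == 0    # full left
assert steering_to_code(45.0) == 15    # full right
print(steering_to_code(10.0))          # a small right-turn code

A second table of the same kind would handle the 4-bit speed code.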
Results
Figure 7. Block code to obtain the steering angle code from the max and min values
Two views of the main display are shown: the first one corresponds to the moment when the system is just turned on. The second one shows the case in which the vehicle detects a curve to the right and the steering wheel is conditioned to make the turn.
A simple experiment aimed at testing the response of the vehicle was implemented. The car was placed on a linear track of 4 meters, with an initial position 70 pixels away from the center. The car was expected to correct its position until the center was reached. The experiment was carried out several times, using partitions of the variables into three and five fuzzy sets. After averaging the trajectories, the curves shown in figure 9 were obtained. It can be seen that the vehicle stabilizes, after some oscillations, in approximately 1.5 meters. Additional experiments and results can be found in the complete project report (Lopez, 2007).
Conclusions
An autonomous vehicle navigation system based on fuzzy logic control techniques with robot vision capabilities has been presented. This experiment was designed for academic purposes on a LabVIEW platform, taking advantage of the fuzzy logic control toolbox and the IMAQ-VISION acquisition library. These resources were found to be an excellent tool-set for developing, in a short time and with very good flexibility and excellent performance, the design, implementation and testing of a control system such as the one described in this paper. In particular, the control paradigm based on the use of state machines represents an interesting approach for automatic vehicle navigation, as well as a didactic case study. This prototype is ready to support further experimentation in different tasks, including different control heuristics using artificial intelligence techniques in different environments. It is also worth pointing out that a rigid camera like the one used in this project is a considerable limitation. The use of a more sophisticated camera with options such as pan, tilt, or zoom could be a very good improvement, since it would make it possible to anticipate trajectories and plan a movement strategy in advance, with a reasonable increase in the cost of the prototype.
Acknowledgments
The authors would like to thank the anonymous reviewers for their detailed and helpful comments.
References
9162.
Tunstel E., Anderson G.T., Wilson E.W. Autonomous Mobile Surveying for Science Rovers Using in Situ Distributed Remote Sensing. In: IEEE International Conference on Systems, Man and Cybernetics. Montreal, Canada. October 2007, pp. 2348-2353.
Armingol J.M., De la Escalera A., Hilario C., Collado J.M., Carrasco J.P., Flores M.J., Pastor J.M., Rodríguez J. IVVI: Intelligent
Zhao B., Zhu Z., Mao E.R., Song Z.H. Vision System Calibration of Agricultural Wheeled-Mobile Robot Based on BP Neural Network.