2004 IEEE Intelligent Vehicles Symposium
University of Parma
Parma, Italy, June 14-17, 2004
[email protected], [email protected]
Computer Vision and Robotics Research Laboratory
University of California, San Diego
La Jolla, CA, 92093-0434
http://cvrr.ucsd.edu
Comparative studies of the state of the art in lane detection have been conducted [7]. While these methods are effective at solving the problems they target, they tend to be specific to particular road types or conditions. Robust lane detection remains an unsolved problem: to be robust, a system must be invariant to all manner of road markings, road conditions, lighting changes, shadowing, and occlusion.
where
\[ A = \sqrt{G_{xx}^2 - 2G_{xx}G_{yy} + G_{yy}^2 + 4G_{xy}^2} \]
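Formulas 4, 5, and 6 are referenced below but do not survive in this text; the following is a hedged reconstruction from the standard second-derivative steerable-filter algebra [1], consistent with the definition of A above, where R(θ) is the response of the filter steered to angle θ:

\[ R(\theta) = G_{xx}\cos^2\theta + 2G_{xy}\sin\theta\cos\theta + G_{yy}\sin^2\theta \]
\[ R_{\min}, R_{\max} = \frac{(G_{xx} + G_{yy}) \mp A}{2} \]
\[ \theta_{\text{extremum}} = \frac{1}{2}\arctan\!\left(\frac{2G_{xy}}{G_{xx} - G_{yy}}\right) \]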
Figure 2 A basis set for steerable filters based on the second derivatives of a two-dimensional Gaussian

2.2 Application of steerable filters to road marking detection
Using formulas 4, 5, and 6, we can find the values and angles of the minimum and maximum responses, as well as the response at a given angle. This is useful for detecting Botts' dots because, for circular objects, the minimum and maximum responses will be very similar. For detecting lanes, the response in the direction of the lane should be near the maximum, and the minimum response should be low.
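A concrete sketch of this response analysis follows; the filter scale, function names, and image handling are illustrative assumptions, not the paper's implementation:

```python
import numpy as np
from scipy import ndimage

def basis_responses(image, sigma=2.0):
    """Second-derivative-of-Gaussian basis responses Gxx, Gyy, Gxy."""
    gxx = ndimage.gaussian_filter(image, sigma, order=(0, 2))  # d2/dx2
    gyy = ndimage.gaussian_filter(image, sigma, order=(2, 0))  # d2/dy2
    gxy = ndimage.gaussian_filter(image, sigma, order=(1, 1))  # d2/dxdy
    return gxx, gyy, gxy

def min_max_responses(gxx, gyy, gxy):
    """Per-pixel extrema of the steered response over all angles."""
    # A = sqrt(Gxx^2 - 2*Gxx*Gyy + Gyy^2 + 4*Gxy^2), as in the text.
    a = np.sqrt(gxx**2 - 2.0 * gxx * gyy + gyy**2 + 4.0 * gxy**2)
    r_min = 0.5 * (gxx + gyy) - 0.5 * a
    r_max = 0.5 * (gxx + gyy) + 0.5 * a
    return r_min, r_max

def response_at(gxx, gyy, gxy, theta):
    """Response of the filter steered to angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return gxx * c * c + 2.0 * gxy * s * c + gyy * s * s
```

For a circular marking the two extrema are close in magnitude, so thresholding the minimum response (or its absolute value, depending on the sign convention for bright markings on dark road) isolates Botts' dot candidates, in line with the detection rule described next.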
By applying a threshold to the minimum filter response, we can find the Botts' dots within an image. Similarly, by applying a threshold to the difference between the response in the direction of the lane marking and the minimum response,
we can detect lanes of a specific angle. Figure 3 shows a typical California highway scene with lane markings consisting of both Botts' dots and lines. Figure 4 shows the image after being filtered and thresholded by the minimum value. Figure 5 shows the response to lines in the orientation of the current lane parameters.

These results show the usefulness of the steerable filter set for relatively normal highway conditions. This filtering technique is also very useful for dealing with shadowed regions of road. Figure 6 shows a road section that is shadowed by trees and the corresponding filter response.

These inputs are then fed into the tracking system to determine the state of the vehicle and road. The lane angles in image coordinates are then fed back into the filtering algorithm in order to tune the filters for specific lanes.

3.1 Road Modeling

The road model used in our system is similar in form to that used in [5]. The state variables include the vehicle's offset from the center of the lane, the vehicle's heading with respect to the lane, the rate of change of the lane heading, the steering angle of the vehicle, and the velocity of the vehicle. Currently, lane curvature is estimated using the steering angle and the rate of change of the lane heading, and it is estimated only at the vehicle's location. The camera pitch and lane width are assumed constant in this implementation. The vehicle state is updated in time each frame as well as from the measurements via a Kalman filter, which is described at the end of the next section.
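As a sketch, the tracked state can be summarized as follows; the field names are illustrative, since the paper does not name its variables:

```python
from dataclasses import dataclass

@dataclass
class RoadVehicleState:
    lateral_offset: float     # vehicle offset from the lane center (m)
    heading: float            # vehicle heading relative to the lane (rad)
    lane_heading_rate: float  # rate of change of the lane heading
    steering_angle: float     # steering angle of the vehicle (rad)
    velocity: float           # vehicle velocity (m/s)
```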
Because the system searches near the predicted lanes for candidates, it requires initialization. In testing, it was sufficient to initialize the lane tracker position and trajectory to zero (corresponding to the center of the lane).
Figure 9 Results during a lane change. (Frame 258 of the test sequence)

These computed headings and positions in image space are then transformed into real-world coordinates via an inverse perspective calculation. The state variables are then updated using these measurements as well as measurements of steering angle and wheel velocity provided by the vehicle's CAN bus.
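A minimal sketch of the inverse perspective step, assuming a flat road plane; the four point correspondences are hypothetical examples of calibration data, not the paper's values:

```python
import numpy as np
import cv2

# Image pixels of four known road-plane points (illustrative numbers).
img_pts = np.float32([[280, 400], [360, 400], [220, 470], [420, 470]])
# Corresponding road-plane coordinates in meters (x lateral, y ahead).
road_pts = np.float32([[-1.8, 20.0], [1.8, 20.0], [-1.8, 8.0], [1.8, 8.0]])

# Flat-road homography taking image coordinates to road-plane meters.
H = cv2.getPerspectiveTransform(img_pts, road_pts)

def image_to_ground(u, v):
    """Inverse perspective: pixel (u, v) -> (x, y) on the road plane."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

print(image_to_ground(320, 430))  # a point near the lane center ahead
```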
These measurements are then fed into a discrete-time Kalman filter for the road and vehicle state as described in section 3.1. The system and measurement equations, as well as the Kalman update equations at time k, are shown below.
\[ x_{k+1|k} = Ax_{k|k} + Bu_k \tag{7} \]
\[ y_k = Cx_k \tag{8} \]
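A minimal sketch of the corresponding predict and update steps; the matrices A, B, C and the noise covariances Q, R are placeholders, with the state dimensions following section 3.1:

```python
import numpy as np

def kf_predict(x, P, A, B, u, Q):
    x_pred = A @ x + B @ u    # eq. (7): time update of the state
    P_pred = A @ P @ A.T + Q  # propagate the state covariance
    return x_pred, P_pred

def kf_update(x_pred, P_pred, y, C, R):
    S = C @ P_pred @ C.T + R             # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)  # Kalman gain
    x = x_pred + K @ (y - C @ x_pred)    # measurement update
    P = (np.eye(len(x)) - K @ C) @ P_pred
    return x, P
```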
Figure 8 Results with clutter from nearby cars and tree shadows. (Frame 13 of the test sequence)

Figure 10 Results with lighting changes from going through an overpass. (Frame 1657 of the test sequence)

Figure 12 Lane position output from the lane tracking system
Figure 13 Steering angle output from the lane tracking system

Lane changes occur around frames 500, 2800, and 5200 and are successfully detected by the system. The road is straight in frames 0 through 3700; after that, the road curves, as can be seen in the sustained steering angles.

5 Summary and Conclusions

Lane detection is often complicated by varying road markings, clutter from other vehicles, complex shadows, lighting changes from overpasses, occlusion from vehicles, and varying road conditions. In this paper we have presented a solution to the lane detection problem that is robust to these conditions. We have shown that using a steerable filter bank provides robustness to lighting changes, road marking variation, and shadowing. Further post-processing based on the statistics of the road marking candidates increases robustness to occlusion by other vehicles and changing road conditions. Future work will include expanding the road model to incorporate piecewise estimates of trajectory and curvature. This system can also be combined with a vehicle surround analysis system such as that described in Huang et al. [9] to create an intelligent vehicle driver assistance system.

Acknowledgments

References

[1] W. T. Freeman and E. H. Adelson, "The design and use of steerable filters," IEEE Trans. Pattern Analysis and Machine Intelligence, 13(9):891-906, 1991.
[2] J. McDonald, "Detecting and tracking road markings using the Hough transform," Proc. Irish Machine Vision and Image Processing Conference, 2001.
[3] D. Pomerleau, "RALPH: Rapidly adapting lateral position handler," Proc. IEEE Symposium on Intelligent Vehicles, September 25-26, 1995.
[4] S. Gehrig, A. Gern, S. Heinrich and B. Woltermann, "Lane recognition on poorly structured roads - the Bots Dots problem in California," Proc. 5th IEEE Conference on Intelligent Transportation Systems, 2002.
[5] J. B. Southall and C. J. Taylor, "Stochastic road shape estimation," Proc. International Conference on Computer Vision, pp. 205-212, June 2001.
[6] M. Bertozzi and A. Broggi, "GOLD: a parallel real-time stereo vision system for generic obstacle and lane detection," IEEE Transactions on Image Processing, 1997.
[7] J. Kosecka, R. Blasi, C. Taylor and J. Malik, "A comparative study of vision-based lateral control strategies for autonomous highway driving," Proc. IEEE Int. Conf. on Robotics and Automation, pp. 1903-1908, May 1998.
[8] C. J. Taylor, J. Malik and J. Weber, "A real-time approach to stereopsis and lane-finding," Proc. IEEE Intelligent Vehicles Symposium, 1996, pp. 207-212.
[9] K. Huang, M. M. Trivedi and T. Gandhi, "Driver's view and vehicle surround estimation using omnidirectional video stream," Proc. IEEE Intelligent Vehicles Symposium, Columbus, OH, pp. 444-449, June 9-11, 2003.
[10] H. Furusho, R. Shirato and M. Shimakage, "A lane recognition method using the tangent vectors of white lane markers," 6th International Symposium on Advanced Vehicle Control, Sept. 9-13, 2002.
[11] J. McCall, O. Achler and M. M. Trivedi, "The LISA-Q human-centered intelligent vehicle test bed," to appear in Proc. IEEE Intelligent Vehicles Symposium, Parma, Italy, June 14-17, 2004.