High Definition Map-Based Localization Using ADAS
Featured Application: High definition (HD) map, advanced driver assistance systems (ADASs),
localization, iterative closest point (ICP), automated driving vehicle.
Abstract: This paper presents high definition (HD) map‐based localization using advanced driver
assistance system (ADAS) environment sensors for application to automated driving vehicles. A
variety of autonomous driving technologies are being developed using expensive and high‐
performance sensors, but limitations exist due to several practical issues. With a view toward the application of autonomous driving cars in the near future, it is necessary to ensure autonomous driving performance by effectively utilizing sensors that are already installed for ADAS purposes.
Additionally, the most common localization algorithms usually use lane information only and are therefore highly unstable in the absence of that information. It is thus essential to ensure localization performance with other road features, such as guardrails, when there are no lane markings. In this study, we propose a localization algorithm that could be
implemented in the near future by using low‐cost sensors and HD maps. The proposed localization
algorithm consists of several sections: environment feature representation with low‐cost sensors,
digital map analysis and application, position correction based on map‐matching, designated
validation gates, and extended Kalman filter (EKF)‐based localization filtering and fusion. Lane
information is detected by monocular vision in front of the vehicle. Guardrails are perceived by radar by distinguishing low-speed object measurements and accumulating them over several steps to extract wall features. This lane and guardrail information is used to correct the host vehicle position with the iterative closest point (ICP) algorithm: the rigid transformation between the digital high definition (HD) map and the environment features is calculated through ICP matching. The position corrections from each matching method are then selected and merged in an EKF with double updating. The proposed algorithm was verified through simulation based on actual driving log data.
Keywords: high definition (HD) map; advanced driver assistance systems (ADASs); localization;
iterative closest point (ICP); automated driving vehicle
1. Introduction
Recently, vehicles with partially automated driving capabilities developed from advanced
driver assistance systems (ADASs) have been competitively introduced by major carmakers [1,2].
Most vehicle manufacturers aim to commercialize fully automated driving cars in the near future,
and they have concentrated on autonomous driving research to satisfy both safety and marketability.
In order to realize urban autonomous driving with full navigation, it is considered important to have
accurate and robust localization without the use of an expensive Global Positioning System (GPS).
A comprehensive overview of previous localization studies is shown in [3]. Adam et al. [4]
proposed a data fusion approach that utilizes both stationary and dynamic radar detections for
estimating relevant road parameters by using only radar. This paper not only utilizes radar for the
detection of low‐speed objects but also implements HD maps to estimate the current vehicle position
with a variety of road information such as road curvature.
Lately, a great deal of attention has been paid to vehicle localization based on maps [5–7]. A digital HD map contains abundant place information with which to estimate the current location of the autonomous vehicle. Consequently, the digital map can be used as a strong additional sensor to improve vehicle localization performance. As the speed of the ego-vehicle increases, however, the position estimation performance degrades, which is a limitation and source of error for precise localization [8].
In this research, we take into account the three major issues in vehicle position estimation based
on map information: environment features, correction method of vehicle position, and filtering with
information fusion. Localization algorithms can be categorized according to the road features used for matching. Lane markings and road boundaries are generally found on most roads
and are easy to use as they are standardized. Therefore, many studies have taken lane marking [6,7]
and road boundary [6,9] for map‐building. According to previous research, the effective utilization
of road features can contribute to improving location performance. In order to correct the position of
the host vehicle using environment features and a digital map, a proper map‐matching algorithm is
necessary. The most well‐known way of map‐matching is the iterative closest point (ICP) algorithm
initially proposed in [10]. The point‐to‐plane approach is known to be quite fast and accurate among
the different ICP algorithms [11]. This is why we apply this method to the matching part of our algorithm in this research.
The key contribution of this paper is a thorough experimental evaluation of a vehicle localization
algorithm for automated driving on real highway roads. The localization algorithm is based on well‐
understood approaches, including lane and guardrail detection, ICP point‐to‐plane matching, and
an extended Kalman filter. We analyze the accuracy of the integrated system and show that the proposed
algorithm can robustly localize the vehicle for proper control on real expressways at high speeds.
The rest of this paper is composed as follows: Section 2 gives the architecture of the overall
automated driving system, the vehicle localization structure, and a description of the test vehicle for
automated driving. Section 3 contains road environment information and the attributed HD map for
map‐matching. Section 4 explains the map‐matching method, fusion process, and extended Kalman filter (EKF)‐based position estimation. Section 5 includes simulation results of the proposed position
estimation of the host vehicle by using real driving data acquired on the Korean Expressway,
including junctions (JC). Finally, Section 6 shows the conclusions.
2. Algorithm Architecture
In order to realize high‐level automated driving and secure the safety of passengers, a more
accurate position estimation of the host vehicle is required. In this section, the architecture of our
automated driving system, which we designed, and the proposed vehicle localization structure are
explained. The configuration of the test vehicle for automated driving is also included in this section.
The perception block estimates the position of the ego‐vehicle (called localization with commercialized sensors) and also computes collision probability
and probabilistic predictions of surrounding objects. All system modules make use of information
from various installed environment sensors. The main modules are vision, lidar, radar, and vehicle
sensors. This information is used to predict the motions of the surrounding objects over the prediction
horizon. The motion‐planning part utilizes the perception result along with the predicted information
of the environment to determine lane changes or to maintain the motion of the subject vehicle. The
control block uses both the environment representation and planned vehicle motion to optimize the
steering and longitudinal acceleration inputs to the vehicle.
A differential GPS unit (OxTS RT3002) was also installed for the reference position of the ego‐vehicle. The vehicle setups for this
paper aimed to use close‐to‐market sensors for the much faster and more feasible realization of
autonomous driving technology in the near future [12]. We used NTRIP correction signals for the DGPS. In addition, the DGPS is independent of the GPS position input to the system, since it only
serves as a reference to verify the development of algorithm performance. The throttle, brake, and
steering actuators were used for vehicle automation. These systems are connected through the controller area network (CAN), and the discretized command signals are transmitted in each cycle. We set up
an embedded PC and MicroAutoBox as high‐ and low‐level controllers. Figure 3 depicts the test
vehicle and the installed sensors, including detection range and field of view (FOV). For lane‐level
localization using ADAS commercial sensors, we used low‐cost GPS, vehicle proprioceptive sensors
(velocity, yaw‐rate), and front vision sensors and radars. The specifications of the major sensors
utilized in this research are listed in Table 1.
Table 1. Specifications of the major vehicle sensors.

Sensor Type    Range    Resolution    Noise (RMS)    Units
Wheel speed    0–130    0.035         0.3            m/s
Yaw rate       ±120     0.0625        0.5            deg/s
A radar measurement is regarded as a stationary guardrail candidate if it is a low-speed object and its position exists within 20 m of the ego-vehicle center. The guardrail information ($Z_{guardrail}$) for use in map-matching can be obtained by accumulating these points over several steps using vehicle sensor information and the backward transformation $T_{backward}$,
where

$$T_{backward} = \begin{bmatrix} \cos(\dot{\psi}\,\Delta t) & \sin(\dot{\psi}\,\Delta t) & -v\,\Delta t \\ -\sin(\dot{\psi}\,\Delta t) & \cos(\dot{\psi}\,\Delta t) & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad (1)$$

with $v$ the longitudinal speed, $\dot{\psi}$ the yaw rate, and $\Delta t$ the sampling time.
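For illustration, the accumulation can be sketched in a few lines of Python; this is a minimal sketch assuming the reconstructed form of Eq. (1), and the function names and data layout are illustrative rather than taken from the actual implementation:

```python
import numpy as np

def backward_transform(v, yaw_rate, dt):
    # Homogeneous SE(2) transform mapping points from the previous
    # vehicle frame into the current one (form assumed from Eq. (1)).
    th = yaw_rate * dt
    return np.array([
        [ np.cos(th), np.sin(th), -v * dt],
        [-np.sin(th), np.cos(th),  0.0   ],
        [ 0.0,        0.0,         1.0   ],
    ])

def accumulate_guardrail_points(scans, inputs, dt):
    """Accumulate low-speed radar detections over several steps.

    scans  : list of (N_i, 2) point arrays, oldest first, each in the
             vehicle frame of its own time step
    inputs : list of (v, yaw_rate) ego-motion inputs per step
    Returns a single (N, 2) array expressed in the newest vehicle frame.
    """
    acc = np.empty((0, 2))
    for pts, (v, w) in zip(scans, inputs):
        if len(acc):
            # shift already-accumulated points by one step of ego motion
            homo = np.hstack([acc, np.ones((len(acc), 1))])
            acc = (backward_transform(v, w, dt) @ homo.T).T[:, :2]
        acc = np.vstack([acc, pts])
    return acc
```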
Figure 4. Major characteristics of an HD map: the current vehicle position is estimated with a variety of road information, such as road curvature.
4. Localization Algorithm
The main stages of the vehicle positioning algorithm are presented in this section. In order to
improve the position estimation accuracy, the map‐matching result can be updated with vehicle
sensor data. The ICP‐based map‐matching results are evaluated and selected to update the estimation
filter. To avoid the problem of false matching, a validation gate was designed.
ICP map-matching estimates the rigid transformation that minimizes the point-to-plane cost

$$J = \sum_{i=1}^{N} \left[ \eta_i \cdot \left( R\,p_i + T - q_i \right) \right]^2 \quad (2)$$

where $p_i = [p_{i,x},\, p_{i,y}]^T$ denotes the point cloud acquired from the environment sensors (vision and radar), and $q_i = [q_{i,x},\, q_{i,y}]^T$ indicates the points of the HD map with regard to the host vehicle's local frame. $R$ and $T$ represent the transformation matrices, rotation and translation, obtained by ICP matching. Additionally, $\eta_i$ represents the surface normal at $q_i$.
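To make the minimization of Eq. (2) concrete, the following sketch performs one linearized 2D point-to-plane step under a small-angle assumption; the correspondences $(p_i, q_i, \eta_i)$ are assumed to be given by a closest-point search against the HD map, and the code is illustrative rather than the authors' implementation:

```python
import numpy as np

def point_to_plane_step(p, q, n):
    """One linearized 2D point-to-plane ICP step (small-angle assumption).

    p : (N, 2) sensor feature points (lane/guardrail)
    q : (N, 2) corresponding closest points on the HD map
    n : (N, 2) unit surface normals at the map points
    Returns (theta_r, t_x, t_y) approximately minimizing Eq. (2).
    """
    # derivative of (R p_i) w.r.t. theta at theta = 0 is (-p_y, p_x)
    a_theta = -p[:, 1] * n[:, 0] + p[:, 0] * n[:, 1]
    A = np.column_stack([a_theta, n[:, 0], n[:, 1]])  # Jacobian rows
    b = np.sum((q - p) * n, axis=1)                   # signed residuals
    theta_r, t_x, t_y = np.linalg.lstsq(A, b, rcond=None)[0]
    return theta_r, t_x, t_y
```

In the full ICP loop, this step alternates with re-establishing closest-point correspondences until the estimated transformation converges [10,11].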
Since we used two-dimensional matching in this study, $R$ and $T$ can be written as

$$R = \begin{bmatrix} \cos\theta_r & -\sin\theta_r \\ \sin\theta_r & \cos\theta_r \end{bmatrix}, \qquad T = \begin{bmatrix} t_x \\ t_y \end{bmatrix} \quad (3)$$

where the matching result is expressed as $(\theta_r, t_x, t_y)$: $\theta_r$ is the rotation angle in radians, and $(t_x, t_y)$ is the amount of translation in the $x$ and $y$ directions of the local vehicle frame. This result is applied to correct the vehicle position. Let $(X_V, \psi)$ represent the present global vehicle position and heading. The ICP matching yields the corrected pose $(X_V^*, \psi^*)$:

$$X_V^* = R\,X_V + T, \qquad \psi^* = \psi + \theta_r \quad (4)$$

$$X_V = \begin{bmatrix} x & y \end{bmatrix}^T \quad (5)$$
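As a brief illustration, one reading of Eqs. (4) and (5) applies the matching result to the global pose as follows (a minimal sketch under the stated assumptions, not the authors' code):

```python
import numpy as np

def apply_icp_correction(x, y, psi, theta_r, t_x, t_y):
    # Corrected pose per Eq. (4): X_V* = R X_V + T, psi* = psi + theta_r
    c, s = np.cos(theta_r), np.sin(theta_r)
    x_star = c * x - s * y + t_x
    y_star = s * x + c * y + t_y
    return x_star, y_star, psi + theta_r
```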
The basic framework for the EKF includes an estimation of the states of the discretized nonlinear dynamic system, which can be expressed as
$$X_k = f(X_{k-1}, U_k) + w_k$$
$$Z_{lane,k} = h(X_{k-1}) + v_{lane,k} \quad (6)$$
$$Z_{guardrail,k} = h(X_{k-1}) + v_{guardrail,k}$$

where $w_k$ is the process noise associated with the proprioceptive sensors, which is assumed to be zero-mean and Gaussian; $v_{lane,k}$ and $v_{guardrail,k}$ are the measurement noises of lane and guardrail matching. $U_k$ indicates the external input (longitudinal speed and yaw rate), and $Z_{lane,k}$ and $Z_{guardrail,k}$ are the position corrections from each matching method.
Assuming that the control signals (velocity and yaw rate) are approximately constant during the sampling period and applying the Euler approximation, the nominal discrete process model is derived as

$$X_k = f(X_{k-1}, U_k) = \begin{bmatrix} x_{k-1} + v_k\,\Delta t \cos\psi_{k-1} \\ y_{k-1} + v_k\,\Delta t \sin\psi_{k-1} \\ \psi_{k-1} + \dot{\psi}_k\,\Delta t \end{bmatrix} \quad (7)$$

The measurement update then follows the standard EKF equations

$$S_k = H P_{k|k-1} H^T + R_k, \qquad K_k = P_{k|k-1} H^T S_k^{-1} \quad (8)$$

$$P_{k|k} = (I - K_k H)\, P_{k|k-1} \quad (9)$$

where $H = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$.
The time update rate is the same as the frequency of the vehicle sensors, 100 Hz, and the measurement updates run at the vision (10 Hz) and radar (20 Hz) sampling rates.
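The prediction and double-update steps can be sketched as below, assuming the state $[x, y, \psi]$, the process model of Eq. (7), and the measurement model of Eqs. (8) and (9); the covariances and the call schedule in the comments are illustrative assumptions:

```python
import numpy as np

def ekf_predict(x, P, u, dt, Q):
    """Time update at 100 Hz using the Euler-discretized kinematic
    model of Eq. (7); state x = [x, y, psi], input u = (v, yaw_rate)."""
    v, w = u
    x_pred = np.array([x[0] + v * dt * np.cos(x[2]),
                       x[1] + v * dt * np.sin(x[2]),
                       x[2] + w * dt])
    F = np.array([[1.0, 0.0, -v * dt * np.sin(x[2])],   # Jacobian of f
                  [0.0, 1.0,  v * dt * np.cos(x[2])],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, H, R):
    """Measurement update per Eqs. (8)-(9); called separately for the
    lane correction (10 Hz) and the guardrail correction (20 Hz)."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```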
To reject false matches, the normalized innovation error is tested against a validation gate:

$$\bar{e} = (Z_k - H X_{k|k-1})^T S_k^{-1} (Z_k - H X_{k|k-1})$$

$$V_k = (\bar{e} \le g^2)\;\&\;(e_{longi} \le \gamma_{longi})\;\&\;(e_{lati} \le \gamma_{lati})\;\&\;(e_{yaw} \le \gamma_{yaw}) \quad (10)$$

where $g^2$ is selected according to the confidence level. The normalized error $\bar{e}$ follows a chi-squared distribution with the number of measurement degrees of freedom. $e_{longi}$, $e_{lati}$, and $e_{yaw}$ are the differences between the localization results and the references, and $\gamma_{longi}$, $\gamma_{lati}$, and $\gamma_{yaw}$ are the error thresholds of each state, which were set to 10.0, 3.0, and 45 degrees, respectively, in this research.
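A minimal sketch of the gate in Eq. (10) follows; the chi-squared confidence level and the units of the thresholds are assumptions, while the threshold values are those stated above:

```python
import numpy as np
from scipy.stats import chi2

def validation_gate(z, x_pred, H, S, e_longi, e_lati, e_yaw,
                    gammas=(10.0, 3.0, np.deg2rad(45.0)), conf=0.99):
    """Accept a map-matching correction only if it passes Eq. (10).

    z, x_pred, H, S : innovation terms from the EKF measurement update
    e_longi/lati/yaw: differences between localization result and reference
    gammas          : per-state thresholds (paper values; units assumed)
    """
    r = z - H @ x_pred                         # innovation
    e_bar = float(r.T @ np.linalg.inv(S) @ r)  # normalized error
    g2 = chi2.ppf(conf, df=len(z))             # chi-squared gate threshold
    return (e_bar <= g2
            and abs(e_longi) <= gammas[0]
            and abs(e_lati) <= gammas[1]
            and abs(e_yaw) <= gammas[2])
```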
5. Experimental Results
Figure 6. Algorithm verification scenes (vehicle color: red, localization result; black, low-cost GPS; blue, DGPS (RT3002) for reference). (a) Lane + guardrail matching (junction (JC): entry). (b) Guardrail matching only (no lane; JC: middle).
Figure 7a shows the distance error of the localization result relative to the reference DGPS (OxTS-
RT3002) output over time. Figure 7b shows changes of localization errors on the longitudinal
direction and Figure 7c,d expresses errors on lateral and yaw angles over time for the partial section,
including the junction, plotted with 3-sigma bounds (blue dashed lines) to show that the filter is well tuned for estimating the vehicle's position and posture. The four error metrics are computed over a 570-s period. In sections with no environment measurements, or where map-matching failed, the accumulated positioning error increases slightly until a proper map-matched correction is obtained. The performance of the proposed positioning algorithm enables
lateral steering control to be applied safely.
Figure 7. Test data‐based localization simulation results on highways. (a) Distance error; (b)
longitudinal position error; (c) lateral position error; (d) yaw angle error.
6. Summary
Lane and guardrail detection‐based vehicle localization using ADAS environment sensors for
realizing autonomous highway driving is presented in this paper. The proposed algorithm consists
of ADAS sensor‐based environment representation, application of digital HD maps, map‐matching‐
based position correction, and localization filters. In the environment representation part, we used
front vision (Mobileye)‐based lane detection results and extracted guardrail features by using front
radar (Delphi). In addition, we used the HD map of the Korean Expressway distributed by the NGII
of South Korea. We processed environment measurements such as lanes and guardrails and applied
the HD map. Map‐matching was conducted with the environment representation result, and then the
designated filter was utilized to fuse and update the correction results. The validation gate was
designed to prevent inaccuracies that might be caused by false matching results.
The positioning performance of the proposed algorithm was proven through open‐loop
simulation based on actual driving data. The test showed that accuracy within a few centimeters can be achieved, which is sufficient for self-driving control. We are now in the process of applying the simulation-verified positioning algorithm to actual vehicle tests.
Meanwhile, localization remains challenging under various circumstances, such as low visibility of road markers, tunnel sections (long GPS shadow zones), lane-split sections when entering toll gates, and so forth. We will develop this algorithm further toward ADAS-sensor-based toll-to-toll autonomous highway driving, and bring autonomous driving technologies closer to commercialization by improving the completeness of autonomous driving in urban areas.
Author Contributions: Conceptualization, D.S. and K.‐m.P.; methodology, D.S.; software, D.S.; validation, D.S.
and K.‐m.P.; formal analysis, D.S.; investigation, M.P. and K.‐m.P.; resources, D.S.; data curation, M.P.; writing—
original draft preparation, D.S.; writing—review and editing, M.P.; visualization, D.S.; supervision, M.P.; project
administration, M.P.; funding acquisition, M.P. All authors have read and agreed to the published version of the
manuscript.
Funding: This work was supported by a Road Traffic Authority grant funded by the Korea government
(KNPA; POLICE‐L‐00003‐02‐101, Development of Information Provision Technology with IoT‐based Traffic
Control Devices and Its Operation Management), the Basic Science Research Program through the National
Research Foundation of Korea (NRF) funded by the Ministry of Education (2018R1D1A1B0704814313), and
Sookmyung Women’s University Research Grants (1‐2003‐2008).
References
1. Shin, D.; Kim, H.; Park, K.; Yi, K. Development of Deep Learning Based Human‐Centered Threat
Assessment for Application to Automated Driving Vehicle. Appl. Sci. 2020, 10, 253.
2. Shin, D.; Yi, S.; Park, K.; Park, M. An Interacting Multiple Model Approach for Target Intent Estimation at
Urban Intersection for Application to Automated Driving Vehicle. Appl. Sci. 2020, 10, 2138.
3. Skog, I.; Handel, P. In‐car positioning and navigation technologies—A survey. IEEE Trans. Intell. Transp.
Syst. 2009, 10, 4–21.
4. Adam, C.; Schubert, R.; Mattern, N.; Wanielik, G. Probabilistic road estimation and lane association using
radar detections. In Proceedings of the 14th International Conference on Information Fusion, Chicago, IL,
USA, 5–8 July 2011; pp. 1–8.
5. Han, S.‐B.; Kim, J.‐H.; Myung, H. Landmark‐based particle localization algorithm for mobile robots with a
fish‐eye vision system. IEEE/ASME Trans. Mechatron. 2012, 18, 1745–1756.
6. Schreiber, M.; Knöppel, C.; Franke, U. Laneloc: Lane marking based localization using highly accurate
maps. In Proceedings of the 2013 IEEE Intelligent Vehicles Symposium (IV), Gold Coast, Australia, 26–28
June 2013; pp. 449–454.
7. Tao, Z.; Bonnifait, P.; Fremont, V.; Ibanez‐Guzman, J. Lane marking aided vehicle localization. In
Proceedings of the 16th International IEEE Conference on Intelligent Transportation Systems (ITSC 2013),
The Hague, The Netherlands, 6–9 October 2013; pp. 1509–1515.
8. Weber, Y.; Kanarachos, S. The correlation between vehicle vertical dynamics and deep learning‐based
visual target state estimation: A sensitivity study. Sensors 2019, 19, 4870.
9. Hata, A.Y.; Osorio, F.S.; Wolf, D.F. Robust curb detection and vehicle localization in urban environments.
In Proceedings of the 2014 IEEE Intelligent Vehicles Symposium Proceedings, Ypsilanti, MI, USA, 8–11
June 2014; pp. 1257–1262.
10. Besl, P.J.; McKay, N.D. Method for registration of 3-D shapes. IEEE Trans. Pattern Anal. Mach. Intell. 1992,
14, 239–256.
11. Park, S.‐Y.; Subbarao, M. An accurate and fast point‐to‐plane registration technique. Pattern Recognit. Lett.
2003, 24, 2967–2976.
12. Furgale, P.; Schwesinger, U.; Rufli, M.; Derendarz, W.; Grimmett, H.; Mühlfellner, P.; Wonneberger, S.;
Timpner, J.; Rottmann, S.; Li, B. Toward automated driving in cities using close‐to‐market sensors: An
overview of the v‐charge project. In Proceedings of the 2013 IEEE Intelligent Vehicles Symposium (IV),
Gold Coast, Australia, 26–28 June 2013; pp. 809–816.
13. Chen, Y.; Medioni, G.G. Object modeling by registration of multiple range images. Image Vis. Comput. 1992,
10, 145–155.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).