
Journal of Imaging

Article
LiDAR-Based Sensor Fusion SLAM and Localization for
Autonomous Driving Vehicles in Complex Scenarios
Kai Dai 1 , Bohua Sun 1, * , Guanpu Wu 1 , Shuai Zhao 2 , Fangwu Ma 1 , Yufei Zhang 1 and Jian Wu 1

1 State Key Laboratory of Automotive Simulation and Control, Jilin University, Changchun 130025, China
2 Automotive Data Center, CATARC, Tianjin 300000, China
* Correspondence: [email protected]

Abstract: LiDAR-based simultaneous localization and mapping (SLAM) and online localization
methods are widely used in autonomous driving, and are key parts of intelligent vehicles. However,
current SLAM algorithms have limitations in map drift, and localization algorithms based on a single
sensor have poor adaptability to complex scenarios. A SLAM and online localization method based
on multi-sensor fusion is proposed and integrated into a general framework in this paper. In the
mapping process, constraints consisting of normal distributions transform (NDT) registration, loop
closure detection, and real time kinematic (RTK) global navigation satellite system (GNSS) positions
are applied at the front-end, and a pose graph optimization algorithm is applied at the back-end, to
achieve an optimized map without drift. In the localization process, the error state Kalman filter
(ESKF) fuses the LiDAR-based localization position and the vehicle states to realize more robust and
precise localization. The open-source KITTI dataset and field tests are used to test the proposed
method. The test results show that the proposed method achieves 5–10 cm mapping accuracy and
20–30 cm localization accuracy, and realizes online autonomous driving in complex scenarios.

Keywords: LiDAR SLAM; autonomous vehicle; localization; multi-sensor fusion

Citation: Dai, K.; Sun, B.; Wu, G.; Zhao, S.; Ma, F.; Zhang, Y.; Wu, J. LiDAR-Based Sensor Fusion SLAM and Localization for Autonomous Driving Vehicles in Complex Scenarios. J. Imaging 2023, 9, 52. https://doi.org/10.3390/jimaging9020052

Academic Editor: Pierre Gouton

Received: 17 November 2022; Revised: 21 January 2023; Accepted: 13 February 2023; Published: 20 February 2023

Copyright: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction
With the development of intelligent and connected vehicle technology, the intelligent transportation system with autonomous driving passenger cars, commercial vehicles, and taxis has undergone tremendous changes in the perception of complex scenarios. Vehicle localization is a key issue that must be solved in autonomous driving, and how to realize high-precision vehicle localization under the condition of unavailable satellites or unstructured roads is one of the technical problems to be solved urgently. Localization techniques based on vision [1] and satellite observations can achieve centimeter-level localization but heavily rely on satellite signals, traffic signs, and initialization [2]. LiDAR-based localization techniques are largely invariant to illumination and satellite signal changes. Therefore, high-precision maps with denser point clouds are required, and map-based multi-sensor fusion localization should be widely used to cover different driving conditions [3].
LiDAR SLAM is widely used in the construction of 3D point cloud maps [4]. The architecture of a simultaneous localization and mapping (SLAM) system consists of the front-end and the back-end. The front-end seeks to interpret the sensor data to obtain constraints as the basis for optimization approaches, such as point cloud registration, loop closure detection, or the Global Navigation Satellite System (GNSS) pose. The back-end focuses on computing the best map result based on optimization techniques with the given constraints [5]. Many registration methods have been proposed for the front-end, such as the iterative closest point (ICP) [6], the normal distribution transformation (NDT) [7], and feature-based methods [8]. However, typical registration methods suffer from drift in large-scale tests, due to the poor performance in loop closure detection and the position correction
with absolute measurements. The back-end optimization process can reduce the drift based on typical back-end algorithms, such as the early extended Kalman filter (EKF) [9] or the currently common pose-graph optimization [10]. Besides its advantages in accuracy and efficiency, the back-end optimization process also provides a framework that is more amenable to analysis.
Multi-sensor fusion localization for autonomous vehicles is mainly based on the GNSS,
inertial measurement unit (IMU), camera, LiDAR, and vehicle states [11,12]. LiDAR-based
methods can provide precise localization under the condition of weak satellite signals [13].
However, global localization and environmental degradation are still important issues for
LiDAR-based methods. Complementary sensor fusion is an effective method to solve these
issues. LiDAR shows good performance in scenarios with rich 3D structure or texture features, real-time kinematic (RTK) GNSS provides a precise absolute position, and the IMU and vehicle states provide the position and orientation of the vehicle independently of the external scenario.
Taking the above-mentioned into consideration, LiDAR-based SLAM and localization
still have problems to be solved. A SLAM and localization method based on multi-sensor
fusion is proposed and integrated into a general framework in this paper, to solve the map
drift and localization failure and meet the demand of high-precision localization under
the condition of unavailable satellites, extreme climate, or road structure changes. A pose
graph considering the loop closure and RTK-GNSS position is used to optimize the map.
The LiDAR-based localization result and vehicle states are integrated into an error state
Kalman filter (ESKF) to obtain robust and precise localization.
Figure 1 is the framework of this article. In the offline mapping process, a pose-graph optimization LiDAR SLAM is proposed based on NDT registration, loop closure detection, and RTK-GNSS position constraints to generate an optimized 3D point cloud map. In the online localization process, the inertial navigation system (INS) is used as the prediction model in the Kalman filter propagation phase, and the LiDAR localization and vehicle velocity are used by an error-state Kalman filter as the measurements.

Figure 1. The framework that estimates the optimal position and attitude (PA) for the autonomous vehicle by combining offline mapping and online localization. NDT: normal distributions transform, RTK-GNSS: real time kinematic-global navigation satellite system, IMU: inertial measurement unit, INS: inertial navigation system.
The main contributions of this paper are summarized as follows.
• The NDT registration, scan context-based loop closure detection, and RTK-GNSS position are integrated into a LiDAR SLAM framework, and a pose graph is innovatively used to combine these multiple constraints to optimize the positions and reduce map drift.
• The LiDAR matching localization position and the vehicle states are fused by an ESKF, which takes full advantage of the vehicle velocity constraints of ground autonomous vehicles to optimize the localization results and provide robust and accurate localization.
• A general framework with mapping and localization is proposed, which is tested on the KITTI dataset [14] and in real scenarios. Results demonstrate the effectiveness of the proposed framework.
The rest of the paper is structured as follows. The related work about mapping and
localization is presented in Section 2. The offline mapping process is introduced in Section 3
and the online localization method is introduced in Section 4. The experimental evaluation is
given in Section 5. The discussion is given in Section 6. Finally, the conclusion and future
work are presented in Section 7.

2. Related Work
In this section, a brief overview of algorithms related to LiDAR SLAM and multi-sensor fusion localization methods is given, including the point cloud registration
algorithms, loop closure detection algorithms, pose graph algorithms, filter-based sensor
fusion algorithms, and their interaction.
With the development of LiDAR SLAM, various registration algorithms have been
proposed. The ICP algorithm is widely used in the registration of point clouds. Driven by increasing requirements on computational efficiency and accuracy, a variety of ICP variants have been derived [15]. However, ICP is very sensitive to the initial guess. Different from ICP, the NDT registration algorithm builds a statistical probability model of the point cloud, which is more efficient and accurate. Study [16] proposed a 3D-NDT registration algorithm as an improvement of the 2D-NDT algorithm [17] and compared it qualitatively and quantitatively with the standard ICP algorithm. Results show
that the method is faster and more reliable. Study [7] proposed an NDT-based SLAM
method, which can achieve long-range high-precision map establishment and localization
in dynamic scenarios. Li et al. [18] improved the accuracy of stereo visual SLAM by
removing dynamic obstacles. Wen et al. [19] compared the performance of NDT-based
graph optimization SLAM under complex urban conditions; the results show that the
performance of the NDT SLAM algorithm is positively related to the traffic environment.
Loop closure is essential for correcting drift error and building a globally consistent
map [5]. Visual-based loop closure detection is often limited by illumination variance and
environment changes. The early LiDAR-based methods for place recognition focus on
descriptors from structural information [20]. However, the descriptor method is limited by
the lack of rotational invariance and by poor point cloud resolution. Study [21] proposed a histogram
method to address these problems but still causes false recognition. To address the afore-
mentioned issues, studies [22,23] proposed the scan context method, which proposed a
more efficient bin encoding function for place recognition and is widely used in LiDAR
SLAM currently; in addition, the loop closure detection method based on deep learning
has also been gradually applied to SLAM [24].
Graph-based optimization [25], which optimizes the full trajectory and map of the
vehicle from the full set of measurements, has received attention in many studies in recent
years. Some general frameworks and open-source implementation of a pose-graph method
are proposed by [26,27]. Study [28] proposed a tutorial for the reader to implement graph-
based SLAM. To improve the robustness of pose-graph SLAM, study [29] proposed a
novel formulation that allows the back-end to change parts of the topological structure
of the graph during the optimization process and validates it through experiments with loop closure
constraints. To obtain accurate positions for mapping in large-scale environments, study [30]
proposed global positioning system (GPS) and LiDAR odometry (GLO)-SLAM, which uses
LiDAR to verify the reliability of GNSS, and the LiDAR odometry also will be optimized by
means of reliable GPS data. In addition, study [31] added IMU/odometry pre-integration
results under the framework of graph optimization, which effectively reduced navigation
drift. With the development of deep learning, related technologies have also been applied
to the field of SLAM [32,33].
The multi-sensor fusion method is usually used in SLAM and localization areas.
Fusing multiple sensors and making the whole system accurate, robust, and applicable
J. Imaging 2023, 9, 52 4 of 23

for various scenes is a challenging task. Study [34] integrated 2D LiDAR/IMU/GPS into a localization system for urban and indoor scenarios, and IMU and RTK-GNSS for full-scene localization; vehicle velocity is good complementary information for localization, especially in satellite-denied and environmentally degraded conditions.

3. The Offline Mapping


The online LiDAR localization module relies on a pre-built map. The offline mapping
aims to obtain a 3D point cloud map representation of the scenario. The NDT-based point
cloud registration and scan context-based loop closure detection are innovatively combined
into the front-end and the pose-graph is used in the back-end to optimize the map.

3.1. LiDAR SLAM Front-End


3.1.1. NDT Based Registration
Compared with the ICP algorithm, NDT divides the point cloud space into cells and each cell is modeled as a continuous Gaussian distribution. Taking the better calculation efficiency and registration accuracy of NDT into account, NDT is chosen as
the point cloud registration method. The process of NDT can be expressed as follows.
In the point cloud, point sets X and Y are the consecutive frames, X is the frame at the
previous moment, Y is the frame at the next moment:

X = { x1 , x2 , . . . , x n } (1)

Y = { y1 , y2 , . . . , y n } (2)
Assuming that the transformation between X and Y can be expressed as follows:

p = [T_x \ T_y \ T_z \ R_x \ R_y \ R_z]^T \qquad (3)

where T is the translation vector and R is the rotation vector.


The mean of all points in X can be expressed as:

\mu = \frac{1}{N_x} \sum_{i=1}^{N_x} x_i \qquad (4)

where N_x is the number of points in X. The covariance of X can be expressed as follows:

\Sigma = \frac{1}{N_x - 1} \sum_{i=1}^{N_x} (x_i - \mu)(x_i - \mu)^T \qquad (5)

Assuming that the transformation p transforms point y_i to y_i', the transformation process can be expressed as follows:

y_i' = T(p, y_i) = R y_i + T \qquad (6)

After the transformation, the point y_i' is in the same coordinate system as the target point set X, and its degree of coincidence can be expressed as a Gaussian distribution:

f(X, y_i') = f(X, T(p, y_i)) = \frac{1}{\sqrt{2\pi}\sqrt{|\Sigma|}} \exp\left(-\frac{(y_i' - \mu)^T \Sigma^{-1} (y_i' - \mu)}{2}\right) \qquad (7)

The joint probability distribution of Y and X can be expressed as follows:

\psi = \prod_{i=1}^{N_y} f(X, T(p, y_i)) \qquad (8)

where N_y is the number of points in Y.
The objective function can be expressed as follows:

\max \psi = \max \prod_{i=1}^{N_y} f(X, T(p, y_i)) \qquad (9)

Therefore, maximizing the joint probability ψ means that the transformation has the highest degree of coincidence, and the optimization variables T and R represent the translation and rotation between two consecutive frames, respectively.
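As an illustration of Equations (4)–(9), the following minimal sketch (illustrative only; the function and variable names are assumptions, not the implementation used in this paper) computes the per-cell Gaussian statistics and evaluates the NDT likelihood score of a candidate transformation with NumPy.

```python
import numpy as np

def cell_statistics(points):
    """Mean and covariance of the target points falling in one NDT cell, Eqs. (4)-(5)."""
    mu = points.mean(axis=0)
    diff = points - mu
    cov = diff.T @ diff / (len(points) - 1)
    return mu, cov

def ndt_score(target_cell_points, source_points, R, T):
    """Negative log of the joint Gaussian likelihood of Eqs. (6)-(9) for one cell.

    Maximizing the likelihood psi is equivalent to minimizing this score over (R, T).
    """
    mu, cov = cell_statistics(target_cell_points)
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(3))   # regularize near-singular cells
    transformed = source_points @ R.T + T             # y_i' = R y_i + T, Eq. (6)
    diff = transformed - mu
    # Sum of Mahalanobis terms = -log(psi) up to a constant, Eqs. (7)-(9)
    return 0.5 * np.einsum('ij,jk,ik->', diff, cov_inv, diff)

# Toy usage: score the identity transform against a synthetic cell
rng = np.random.default_rng(0)
target = rng.normal(size=(50, 3))
source = target + rng.normal(scale=0.05, size=(50, 3))
print(ndt_score(target, source, np.eye(3), np.zeros(3)))
```

In practice, the transformation p is found by minimizing this score summed over all occupied cells, for example with Newton or quasi-Newton updates.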

3.1.2. Scan Context Based Loop Closure Detection


Compared with the feature descriptors of the environment, few studies focus on the
structural information to describe scenes. Scan context proposes a non-histogram method
of global descriptors, which directly records the 3D structure of the visible space and can
be deployed in LiDAR-based place recognition effectively. The lightweight and efficient
encoding method, which can improve the accuracy of loop closure detection, is conducive
to storing point cloud information. The scan context method is applied for the offline
mapping process. Firstly, scan context is used to detect the loop closure frame. After
detecting the candidate frame in the historical frame, NDT is used to register the loop
closure frame with the current point cloud frame to obtain the precise loop pose.
Figure 2 shows the flow chart of scan context and loop closure detection. In the point cloud segmentation process, the point cloud space is cut into Nr rings along the increasing radius and the rings are cut into Ns sectors:

d_r = \frac{L_{max}}{N_r} \qquad (10)

where d_r represents the width of a ring, L_{max} represents the maximum measurement distance of the LiDAR, and N_r is the number of rings.

Figure 2. Scan context-based loop closure detection; the current 3D point cloud scan is the start of this process, which can provide loop closure detection positions for the offline mapping position optimization.
After segmentation, the segmented bin cells can be represented as a set P:

P = \bigcup_{i \in [N_r],\, j \in [N_s]} p_{ij} \qquad (11)

where p_{ij} represents the set of points of the ith ring segmentation unit in the jth sector.
In the scan context generation process, the scan context I is represented as an N_r × N_s matrix; each element in the matrix represents the maximum z-value of all 3D points in the corresponding bin.
The distance function between the scan contexts of two point cloud frames is defined as:

d(I^q, I^c) = \frac{1}{N_s} \sum_{j=1}^{N_s} \left(1 - \frac{c_j^q \cdot c_j^c}{\lVert c_j^q \rVert\, \lVert c_j^c \rVert}\right) \qquad (12)

which can be used as a similarity score, where I^q is the scan context of the current frame, c_j^q is the jth column of I^q, I^c is the scan context of a historical frame, and c_j^c is the jth column of I^c.
To handle the case where the current frame is rotated relative to the historical frame, which changes the order of the column vectors in the scan context obtained at the current time and causes a large distance between the two frames, the historical frame I^c is shifted column by column, yielding N_s scan contexts; the distance to the scan context of the current frame is calculated for each shift and the smallest one is selected, which can be expressed as follows:

D(I^q, I^c) = \min_{n \in [N_s]} d(I^q, I^c_n) \qquad (13)

The loop frame can be found by comparing the similarity of scan contexts between the current frame and historical frame point clouds; when the distance function is less than a certain threshold, the historical frame is considered to be a loop frame.
In the precise position calculation process, scan context is used to calculate the rotation angle ϕ between the current frame and the loop frame to improve the calculation efficiency and accuracy of the NDT, and ϕ is used as the initial position for the NDT registration process:

n^* = \underset{n \in [N_s]}{\arg\min}\; d(I^q, I^c_n) \qquad (14)

\varphi = \frac{2\pi}{N_s} \cdot n^* \qquad (15)
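To make Equations (10)–(15) concrete, the sketch below (hypothetical helper names, not the authors' code) builds the N_r × N_s scan-context matrix and computes the column-shift-aligned distance and rotation estimate with NumPy.

```python
import numpy as np

def scan_context(points, n_rings=20, n_sectors=60, l_max=80.0):
    """N_r x N_s descriptor; each bin stores the max z of its points, Eqs. (10)-(11)."""
    sc = np.zeros((n_rings, n_sectors))
    r = np.linalg.norm(points[:, :2], axis=1)
    theta = np.arctan2(points[:, 1], points[:, 0]) % (2 * np.pi)
    ring = np.minimum((r / (l_max / n_rings)).astype(int), n_rings - 1)
    sector = np.minimum((theta / (2 * np.pi / n_sectors)).astype(int), n_sectors - 1)
    for i, j, z in zip(ring, sector, points[:, 2]):
        sc[i, j] = max(sc[i, j], z)
    return sc

def sc_distance(sc_q, sc_c):
    """Column-wise cosine distance of Eq. (12)."""
    num = np.sum(sc_q * sc_c, axis=0)
    den = np.linalg.norm(sc_q, axis=0) * np.linalg.norm(sc_c, axis=0) + 1e-12
    return np.mean(1.0 - num / den)

def aligned_distance(sc_q, sc_c):
    """Eqs. (13)-(15): try all column shifts, return min distance and rotation angle."""
    n_sectors = sc_q.shape[1]
    dists = [sc_distance(sc_q, np.roll(sc_c, n, axis=1)) for n in range(n_sectors)]
    n_star = int(np.argmin(dists))
    return dists[n_star], 2 * np.pi / n_sectors * n_star
```

Shifting the columns of the historical descriptor corresponds to rotating the scan about the z-axis, which is why the minimizing shift n* also yields the initial rotation angle for the subsequent NDT registration.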

3.1.3. RTK-GNSS Based Localization


Real time kinematic localization is a satellite navigation technique used to enhance
the precision of localization data derived from satellite-based navigation systems. RTK
relies on a single reference station to provide real-time corrections for the rover, providing up to centimeter-level accuracy. There are, however, many situations with severe multipath and signal blockage near urban buildings or in forests; in such conditions, a stable and precise position and attitude can still be provided for autonomous vehicles by combining RTK-GNSS and the IMU.

3.2. Back-End Optimization


After interframe association and submap matching, there are inevitable cumulative
errors in the point cloud map. Pose-graph optimization is used to further eliminate these cumulative errors, with the loop closure positions and RTK-GNSS positions used as constraints. The back-end optimization step is summarized in Algorithm 1.

Algorithm 1. The process of back-end optimization

Input:
LiDAR odometry positions x_i, x_j
RTK-GNSS position z_i
Loop closure position z_{ij}'
Output:
Optimized vehicle position x_opt
1: Trajectory alignment for x_i, z_i and z_{ij}'
2: for each position x_i do
3:   if the optimization cycle count h is reached then
4:     execute the optimization process:
5:     x_opt = arg min F(x_i, x_j, z_i, z_{ij}')
6:   else
7:     add RTK-GNSS position z_i constraint
8:     if a loop closure position is detected then
9:       add loop closure position z_{ij}' constraint
10:    end if
11:  end if
12: end for
13: return optimized vehicle position x_opt

3.2.1. Graph Generation


The graph consists of edges and nodes, as shown in Figure 3; x_i represents nodes from the LiDAR odometry, z_i represents the prior position from RTK-GNSS, and e_i represents the edge between x_i and z_i. z_{ij} represents the transformation between x_j and x_i, z_{ij}' represents the expected observation from loop closure, and e_{ij} represents the edge between z_{ij} and z_{ij}'.

Figure 3. Graph structure for back-end optimization.

3.2.2. Graph Optimization
Graph optimization takes all the constraints into a non-linear optimization problem, which will consider all the observation measurements:

F(x) = \sum_{i,j} e(x_i, x_j, z_{ij})^{T}\, \Omega_{ij}\, e(x_i, x_j, z_{ij}) \qquad (16)
where F(x) represents the errors over all edges and Ω_{ij} is the information matrix indicating the importance of each constraint in the global graph optimization. To adjust the state quantity x to minimize the value of the residual, it is necessary to obtain the Jacobian of the residual with respect to the state quantity and then optimize with the gradient descent method. The solution of this optimization is the x_{opt} satisfying the following function:

x_{opt} = \arg\min_{x} F(x) \qquad (17)

To integrate the RTK-GNSS into the graph optimization, the error between the LiDAR odometry x_i and the RTK-GNSS position z_i can be represented as follows:

e_i = \ln\left(z_i^{-1} x_i\right)^{\vee} \qquad (18)

The residual e_i after adding a disturbance term to x_i can be expressed as follows:

e_i = \ln\left(z_i^{-1} \exp(\delta\xi_i^{\wedge})\, x_i\right)^{\vee} \qquad (19)

The error between z_{ij} and z_{ij}' can be represented as follows:

e_{ij} = \ln\left(z_{ij}^{-1} x_i^{-1} x_j\right)^{\vee} \qquad (20)

The residual e_{ij} after adding disturbances to x_i and x_j can be expressed as follows:

e_{ij} = \ln\left(z_{ij}^{-1} x_i^{-1} \exp((-\delta\xi_i)^{\wedge})\, \exp(\delta\xi_j^{\wedge})\, x_j\right)^{\vee} \qquad (21)

The residual is expanded after adding the disturbance term, and the Jacobian matrix J of the residual with respect to the state quantity can be obtained.
A first-order Taylor expansion of the residuals can be expressed as follows:

e(x + \Delta x) \approx e + J \Delta x \qquad (22)

F(x + \Delta x) \approx (e + J \Delta x)^T \Omega_{ij} (e + J \Delta x) \qquad (23)

The state quantity x_{opt} after correction can be expressed as follows:

x_{opt} = x + \Delta x \qquad (24)
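As a toy illustration of Equations (16) and (17), the following sketch assembles relative-pose edges (consecutive odometry and one loop-closure-style edge) plus absolute RTK-GNSS-like priors into a single weighted least-squares problem over simplified 2D poses and solves it with scipy.optimize.least_squares; the 2D parameterization, the weights, and the variable names are assumptions rather than the paper's SE(3) implementation.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy 2D pose graph: each node state is (x, y, yaw).
odom = [(0, 1, np.array([1.0, 0.0, 0.0])),        # consecutive odometry edges z_ij
        (1, 2, np.array([1.0, 0.0, 0.0])),
        (0, 2, np.array([2.0, 0.0, 0.0]))]         # loop-closure-style edge
priors = [(0, np.array([0.0, 0.0, 0.0])),          # absolute (GNSS-like) priors z_i
          (2, np.array([1.9, 0.1, 0.0]))]
sqrt_info_odom, sqrt_info_prior = 1.0, 10.0        # square roots of the Omega_ij weights

def residuals(flat):
    poses = flat.reshape(-1, 3)
    res = []
    for i, j, z in odom:                            # e(x_i, x_j, z_ij): relative-pose error
        dx = poses[j, :2] - poses[i, :2]
        c, s = np.cos(poses[i, 2]), np.sin(poses[i, 2])
        local = np.array([c * dx[0] + s * dx[1], -s * dx[0] + c * dx[1],
                          poses[j, 2] - poses[i, 2]])
        res.append(sqrt_info_odom * (local - z))
    for i, z in priors:                             # e_i: absolute-position error, Eq. (18)
        res.append(sqrt_info_prior * (poses[i] - z))
    return np.concatenate(res)

x0 = np.array([[0, 0, 0], [1.2, 0.1, 0], [2.3, 0.2, 0]], dtype=float).ravel()
x_opt = least_squares(residuals, x0).x.reshape(-1, 3)   # x_opt = argmin F(x), Eq. (17)
print(x_opt)
```

Here the weighted sum of squared residuals plays the role of F(x), and the solver's trust-region iterations stand in for the gradient-based correction of Equations (22)–(24).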

4. The Online Localization


A multi-sensor fusion localization method based on the ESKF is proposed, which will
estimate the vehicle position and attitude (PA) jointly by fusing vehicle states and LiDAR
localization pose.

4.1. LiDAR Localization Based on 3D Point Cloud Map


The LiDAR localization based on a 3D point cloud map estimates the position of the
vehicle in real-time, and the position can be used for the Kalman filter observation update.
In this process, the RTK-GNSS position is used as the initial position for LiDAR localization
to improve matching accuracy and efficiency. The NDT algorithm is used as the registration method to match the real-time point cloud with the local map; the LiDAR localization step is summarized in Algorithm 2.

Algorithm 2. LiDAR localization in the prior 3D point cloud map

Input:
RTK-GNSS position z_i
Point cloud p_i
Prior 3D point cloud global map M
Output:
LiDAR localization position x_lidar
1: Load 3D point cloud map M
2: if the initial position z_i is received then
3:   load local submap M_sub from the global map M
4:   if the submap M_sub needs updating then
5:     update submap M_sub
6:   else
7:     calculate the position between p_i and M_sub:
8:     NDT registration x_lidar = p_i ∝ M_sub
9:   end if
10: else
11:   wait for the initial position z_i
12: end if
13: return LiDAR localization position x_lidar
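The sketch below illustrates the flow of Algorithm 2 under stated simplifications: the submap is cropped by radius around the RTK-GNSS initial position and the pose is refined against it; a nearest-neighbor point-to-point cost is used here in place of the NDT registration, and all names are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial import cKDTree

def crop_submap(global_map, center_xy, radius=50.0):
    """Step 3 of Algorithm 2: keep map points within a radius of the GNSS initial position."""
    keep = np.linalg.norm(global_map[:, :2] - center_xy, axis=1) < radius
    return global_map[keep]

def localize(scan, submap, init_pose):
    """Refine (x, y, yaw) of the scan against the submap (simplified registration step)."""
    tree = cKDTree(submap[:, :2])

    def cost(pose):
        x, y, yaw = pose
        c, s = np.cos(yaw), np.sin(yaw)
        pts = scan[:, :2] @ np.array([[c, s], [-s, c]]) + np.array([x, y])
        d, _ = tree.query(pts)                      # distance to nearest map point
        return np.mean(d ** 2)

    return minimize(cost, init_pose, method="Nelder-Mead").x

# Toy usage with a synthetic map and scan
rng = np.random.default_rng(1)
global_map = rng.uniform(-100.0, 100.0, size=(5000, 3))
scan = crop_submap(global_map, np.array([0.0, 0.0]), radius=30.0)
print(localize(scan, crop_submap(global_map, np.array([0.0, 0.0])), np.array([1.0, -1.0, 0.05])))
```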

4.2. Filter State Equation


In the filter, the error of the state variables is expressed as follows:

X = [\delta P^T,\ \delta V^T,\ \varphi^T,\ \varepsilon^T,\ \nabla^T]^T \qquad (25)

where δP is the position error, δV is the velocity error, φ is the attitude error, ε is the gyroscope bias error, and ∇ is the accelerometer bias error. The state transition equation in continuous time can be expressed as follows:

\dot{X} = F_t X + B_t W \qquad (26)

According to the derivation of the IMU kinematics model,

F_t = \begin{bmatrix} 0_{3\times3} & I_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & F_{23} & 0_{3\times3} & C_b^n \\ 0_{3\times3} & 0_{3\times3} & F_{33} & C_b^n & 0_{3\times3} \\ & & 0_{3\times15} & & \\ & & 0_{3\times15} & & \end{bmatrix} \qquad (27)

F_{23} = \begin{bmatrix} 0 & -f_U & f_N \\ f_U & 0 & -f_E \\ -f_N & f_E & 0 \end{bmatrix} \qquad (28)
The east-north-up (ENU) frame and the right-forward-up (RFU) frame are chosen as the navigation reference frame n and the body frame b, respectively, where f_E is the acceleration in the east direction, f_N is the acceleration in the north direction, f_U is the acceleration in the up direction, and C_b^n is the direction cosine matrix from the b frame to the n frame:

F_{33} = \begin{bmatrix} 0 & \omega \sin L & -\omega \cos L \\ -\omega \sin L & 0 & 0 \\ \omega \cos L & 0 & 0 \end{bmatrix} \qquad (29)

where ω is the angular velocity of the Earth's rotation, L is the latitude, and W includes the gyroscope noise ω_g and the accelerometer noise ω_a:

W = [w_{gx}\ w_{gy}\ w_{gz}\ w_{ax}\ w_{ay}\ w_{az}]^T \qquad (30)

B_t = \begin{bmatrix} 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & C_b^n \\ -C_b^n & 0_{3\times3} \\ 0_{6\times3} & 0_{6\times3} \end{bmatrix} \qquad (31)
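For intuition on how Equations (26)–(31) are used, the following sketch (assumed first-order discretization and illustrative names, not the exact filter of this paper) builds F_t and B_t and performs one covariance prediction step of the error-state Kalman filter.

```python
import numpy as np

def build_ft(f_enu, w_ie, lat, c_bn):
    """Continuous-time error-state matrix F_t of Eqs. (27)-(29).
    f_enu: specific force in the ENU frame, w_ie: Earth rotation rate, lat: latitude."""
    f_e, f_n, f_u = f_enu
    f23 = np.array([[0.0, -f_u, f_n],
                    [f_u, 0.0, -f_e],
                    [-f_n, f_e, 0.0]])
    f33 = np.array([[0.0, w_ie * np.sin(lat), -w_ie * np.cos(lat)],
                    [-w_ie * np.sin(lat), 0.0, 0.0],
                    [w_ie * np.cos(lat), 0.0, 0.0]])
    ft = np.zeros((15, 15))
    ft[0:3, 3:6] = np.eye(3)          # position error driven by velocity error
    ft[3:6, 6:9] = f23                # velocity error driven by attitude error
    ft[3:6, 12:15] = c_bn             # velocity error driven by accelerometer bias
    ft[6:9, 6:9] = f33                # attitude error dynamics
    ft[6:9, 9:12] = c_bn              # attitude error driven by gyroscope bias
    return ft

def build_bt(c_bn):
    """Noise input matrix B_t of Eq. (31); W = [gyro noise, accel noise], Eq. (30)."""
    bt = np.zeros((15, 6))
    bt[3:6, 3:6] = c_bn
    bt[6:9, 0:3] = -c_bn
    return bt

def propagate(P, ft, bt, Q, dt):
    """One ESKF prediction step: first-order discretization of Eq. (26)."""
    phi = np.eye(15) + ft * dt        # state transition matrix
    Qd = bt @ Q @ bt.T * dt           # discretized process noise
    return phi @ P @ phi.T + Qd
```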

4.3. Filter Measurement Update Equation


To compensate for the loss of the localization signal under complex driving scenarios and to enhance the robustness of the localization system, the LiDAR localization position and the vehicle velocity are used as observation inputs:

Y = \left[\delta P_L^T \ \ \varphi_L^T \ \ \delta V_v^T\right]^T \qquad (32)

where δP_L is the LiDAR localization position error, φ_L is the attitude error, and δV_v is the vehicle velocity error.
The observation equation is as follows:

Y = G_t X + C_t N \qquad (33)
J. Imaging 2023, 9, 52 10 of 23
J. Imaging 2023, 9, x FOR PEER REVIEW 11 of 24

where  
I3X3 03X3 03X3 03X6
G = 0 0 I 0 T (34)
N = ntP nP nP3X3n 3X3
n 3X3n nV3X6nV nV 

J. Imaging 2023, 9, x FOR PEER REVIEW   11 of 24 (35)
03X3 I3X3 03X3
E N U E N U E N U
L L L
03X6
L L L v v v

N is the observation noise and can be expressed as follows:


 I 3 X 3 03 X 3 03 X 3 
 
C = 0 I 0 T iT
(36)
N = n PNE = nnPPN nPn PnUP t nn ϕEn3X 3nnϕ3NX 3nV n3ϕXnU3V nnV VvE nVvN nVvU
h
(35) (35)
0L 3 X 3 03LX 3 I 3 X L3 
E N U E N U E N U
L L L L L L v v v
L L L

 I 3 X I33X3 3 X 3 3X3
 
03 X 3003X3 0
5. Experimental Verification and =  0 I 3 X 3 I03X3
t =  03 X 3 3X3
CCt Performance 0
3 XAnalysis
3 3X3

(36) (36)
I
03 X 3 with 
0 3X3 0 3X3 3X3
This section introduces experiments X 3  KITTI dataset and field tests based on
03 X 3 I 3the
theExperimental
5. proposed method.
Verification and Performance Analysis
5. Experimental Verification and Performance Analysis
This section introduces experiments with the KITTI dataset and field tests based on
5.1. The
ThisExperiment
section Based onexperiments
introduces KITTI Dataset with the KITTI dataset and field tests based on
the proposed method.
the proposed
The KITTI method.
dataset was jointly founded by the Karlsruhe Institute of Technology in
Germany
5.1. and the Toyota
The Experiment Based onAmerican Institute of Technology. It is currently the world’s
KITTI Dataset
5.1. The
largest Experiment Based on KITTI Dataset
Theautonomous
KITTI dataset driving localization
was jointly founded andby computer visionInstitute
the Karlsruhe algorithm of evaluation
Technologyda- in
taset. The
It KITTI dataset
contains LiDAR was jointly
data, IMU founded
data, by the Karlsruhe
RTK-GNSS data, Institute
velocity ofdata,
Technology
and theinlocaliza-
Germany
Germany
and the Toyota
and the Toyota
American
American
Institute
Institute of
of Technology.
Technology.
It is currently
It is localization
the
currently theaccuracy.
world’s
world’s KITTI
tion groundtruth,
largest which is used to evaluate the mapping and
largestautonomous driving
autonomous driving localization
localization andcomputer
and computer vision
vision algorithm
algorithm evaluation
evaluation da- dataset.
has
It multipleLiDAR
contains sequence datasets
data, IMU for various
data, RTK-GNSS scenarios;
data, sequence
velocity 00 was
data, used
and theinlocalization
this study.
taset. It contains LiDAR data, IMU data, RTK-GNSS data, velocity data, and the localiza-
The mapping
groundtruth,
tion groundtruth,
and
which localization
is used
which is used
result
to to is
evaluate shown
evaluatethe
in
the mapping
Figure
mapping and
4.
andlocalization
localization accuracy.
accuracy. KITTI has
KITTI
multiple sequence
has multiple datasets
sequence forforvarious
datasets variousscenarios; sequence0000was
scenarios; sequence was used
used in this
in this study. The
study.
mapping and and
The mapping localization result
localization resultis is
shown
shownin
inFigure
Figure 4.

Figure 4. Optimized map and LiDAR localization result based on KITTI dataset.
Figure 4. Optimized map and LiDAR localization result based on KITTI dataset.
Figure 4. Optimized map and LiDAR localization result based on KITTI dataset.
5.1.1. Mapping Performance Analysis Based on KITTI Dataset
5.1.1.Back-end
5.1.1. Mapping optimization
Mapping Performance
Performance Analysis
Analysis
plays anBased on KITTI
Based
important Dataset
on KITTI
role inDataset
the process of mapping; Figures 5
Back-end
show theoptimization
and 6Back-end results of plays an important
optimization role
performance,in the process
the
optimization plays an important role in the process ofofmapping;
abscissa Figures
in the figure
mapping; 5 5 and 6
represents
Figures
and 6
the indexshow the results
of the ofdata of optimization
frame where performance,
the position the abscissa in the figure represents
show the results optimization performance, theisabscissa
saved; inin the
the figure
following, the abscissa
represents the indexis
the index of the
represented data frame
by Index, where
hasthe position is saved; inIn theFigure
following, it the
canabscissa
seenis that the
of the data frame wherewhich
the position the same meaning.
is saved; in the following, the 6,abscissa berepresented
is by
represented by Index, which has the same meaning. In Figure 6, it can be seen that the
optimized
Index, whichlongitudinal,
has the samelateral andInaltitude
meaning. Figure 6,error
it canare
bereduced
seen thatto thecentimeter-level,
optimized longitudinal,which
optimized longitudinal, lateral and altitude error are reduced to centimeter-level, which
effectively
lateral eliminates
and eliminates
effectively
the
altitude error cumulative
are reduced
the cumulative
error of the front-end
to centimeter-level,
error
odometry
which effectively
of the front-end odometry
and
and improves
improves
eliminates
the
the
mapping
cumulative accuracy.
error of the front-end odometry and improves the mapping accuracy.
mapping accuracy.

Figure 5. The unoptimized position error in longitudinal, lateral, and altitude directions of KITTI
dataset,5.
Figure
Figure 5.respectively.
The unoptimized
The unoptimized position
position error
error in
in longitudinal,
longitudinal, lateral,
lateral, and
and altitude
altitude directions
directions of
of KITTI
KITTI
dataset, respectively.
dataset, respectively.
J.
J.J.Imaging 2023,9,
Imaging2023,
Imaging 2023, 9,52
9, xx FOR
FOR PEER
PEER REVIEW
REVIEW 11 of
12
12 of 23
of 24
24

J. Imaging 2023, 9, x FOR PEER REVIEW 12 of 24

Figure
Figure 6.
Figure 6. The
6. Theoptimized
The optimizedposition
optimized positionerror
position errorin
error longitudinal,
ininlongitudinal, lateral,
longitudinal,lateral, and
and
lateral, altitude
altitude
and directions
directions
altitude of
of KITTI
directions KITTI da-
da-
of KITTI
taset,
taset, respectively;
respectively; the
the error
error is
is significantly
significantly reduced
reduced after
after optimization.
optimization.
dataset, respectively; the error is significantly reduced after optimization.

Figure
FigureFigure 77 shows
shows the
6. The optimized the coincidence
position degree
degree between
error in longitudinal,
coincidence between the
lateral, and groundtruth
thealtitude
groundtruth estimated
estimated
directions trajectory.
trajectory.
of KITTI da-
trajectory.
taset,
It can respectively;
be seen the
that error
there is significantly
is
It can be seen that there isisaalarge a large reduced
deviation
largedeviation after optimization.
between
deviationbetween the unoptimized
unoptimized trajectory and
between the unoptimized trajectory and the
the
groundtruth,
groundtruth, the optimized trajectory error is significantly reduced. Table 1 shows the
the optimized trajectory error is significantly reduced. Table 1 shows the
Figure 7 shows the coincidence degree between the groundtruth estimated trajectory.
trajectory
trajectory accuracy
accuracy after
after optimization
optimization is
is significantly
significantly improved
improved
It can be seen that there is a large deviation between the unoptimized trajectory and the
and
and basically
basically coincides
coincides
with
withthe
groundtruth, groundtruth,
thegroundtruth,
groundtruth,
the optimizedand
and the
the average
theaverage
andtrajectory
average error
error
errorerror is
is about
isabout
about10
is significantly 10 cm.
cm.
10reduced.
cm. Table 1 shows the
trajectory accuracy after optimization is significantly improved and basically coincides
with the groundtruth, and the average error is about 10 cm.

Figure 7.
Figure7.
Figure The
7. The comparison
The comparison of
comparison of trajectory
of trajectory before
trajectory before and
beforeand after
andafter optimization
afteroptimization with
optimizationwith groundtruth.
withgroundtruth.
groundtruth.
Figure 7. The comparison of trajectory before and after optimization with groundtruth.
Table
Table 1. Mapping
1.Mapping
Table1. performance
Mappingperformance comparison
performancecomparison before
comparisonbefore and
beforeand after
andafter optimization
afteroptimization [35]
optimization[35] (units:
[35](units: meter).
(units:meter).
meter).
Table 1. Mapping performance comparison before and after optimization [35] (units: meter).
Max
Max
Max Min
Min
Min Mean
Mean
Mean RMSE
RMSE
RMSE STD
STD
STD
Max Min Mean RMSE STD
Before
Before optimization
optimization
Beforeoptimization
optimization 34.31
34.31
34.31 0.02
0.02 13.61
13.61 16.61
16.61 9.53
9.53
Before 34.31 0.02 0.02 13.61 13.61 16.61 16.619.53 9.53
After
Afteroptimization
After optimization
optimization 0.23
0.23
0.23 0.01
0.01
0.01 0.11
0.11
0.11 0.13
0.13
0.13 0.09
0.09
0.09
After optimization 0.23 0.01 0.11 0.13 0.09

5.1.2.
5.1.2.
5.1.2. Localization
5.1.2.Localization
Localization
Localization Performance
Performance
Performance
Performance Analysis
Analysis
Analysis
Analysis Based Based
Based
Based on
on
on KITTI KITTI
on KITTI
KITTI
DatasetDataset
Dataset
Dataset
TheLiDAR
The
The LiDAR
LiDAR localization
localization
localization result
result
result is fused
is
is fused fused with
withwith IMUvehicle
IMU IMU
and and vehicle
and vehicle velocity
velocityvelocity to improve
to
to improve improve
the the
the
localization
localization accuracy in the case of scenario degradation. As shown in Figures 8 and 9, aa
localization accuracy
accuracy in in
thethe
case case
of of scenario
scenario degradation.
degradation. As As
shown shown
in in
Figures Figures
8 and 9, 8
a and 9,
localization
localization performance
localizationperformance
performance testtest
test is
is conducted
is conducted
conducted basedbased
on theon
based the
prior
on prior
theKITTI
prior KITTI
point
priorKITTI
KITTI point
cloud
point cloud
map.
point cloud
cloudmap.map.
map.

Figure 8. LiDAR localization based on prior KITTI point cloud map; the white points in the figure
are the point cloud map built by the offline mapping process, the red line is the trajectory of LiDAR
matching results, the orange line is groundtruth, and the blue line is the fused trajectory.
J. Imaging 2023, 9, x FOR PEER REVIEW 13 of 2
J. Imaging 2023, 9, x FOR PEER REVIEW 13 of 2

Figure 8. LiDAR localization based on prior KITTI point cloud map; the white points in the figure
Figure
are 8. LiDAR
the point localization
cloud based
map built by the on prior
offline KITTI point
mapping cloud
process, themap; the is
red line white points in the
the trajectory figur
J. Imaging 2023, 9, 52 12 of 23of LiDAR
are the point cloud map built by the offline mapping process, the red line is the trajectory
matching results, the orange line is groundtruth, and the blue line is the fused trajectory. of LiDAR
matching results, the orange line is groundtruth, and the blue line is the fused trajectory.

Figure 9. The comparison of fused trajectory and groundtruth. The fused localization data basically
Figure9.9.The
Figure
coincides The
with comparison
comparison of fused
of fused
the groundtruth, trajectory
trajectory and
andthat
indicating groundtruth.
groundtruth.
the onlineThe Thelocalization
fused fused
localization localization
results data data basicall
meetbasically
expectations.
coincideswith
coincides withthethe groundtruth,
groundtruth, indicating
indicating thatonline
that the the online localization
localization results
results meet meet expectations.
expectations.

As
As shown in Figure 10 and Table 2, the maximum position error on the KITTI datase
Asshown
is within
withinshown ininFigure 10 and
Figure Table
10 and 2, the2,maximum
Table position
theismaximum error on
position the KITTI
error thedataset
onlocalization
KITTI datase
is 3535cm,
cm, the
the average
average position
position errorerror
is withinwithin
20 cm,20and
cm,a and
stablea localization
stable result resul
is within 35
is maintained.
maintained.cm, the average position error is within 20 cm, and a stable localization resul
is
is maintained.

Figure10.
Figure 10.The
Theposition
position error
error of localization
of localization resultresult in longitudinal,
in longitudinal, lateral,lateral, and altitude
and altitude directions
directions,
Figure 10. The position error of localization result in longitudinal, lateral, and altitude directions
respectively.
respectively.
respectively.
Table2.2.Localization
Table Localization error
error compared
compared withwith groundtruth
groundtruth (units: (units:
meter).meter).
Table 2. Localization error compared with groundtruth (units: meter).
Max
Max MinMin MeanMean RMSE RMSE STD STD
Max Min Mean RMSE STD
0.35
0.35 0.080.08 0.18 0.18 0.16 0.16 0.07 0.07
0.35 0.08 0.18 0.16 0.07
5.2.The
5.2. TheField
FieldTest
Test Vehicle
Vehicle andand
TestTest Results
Results
5.2. The Field Test Vehicle and Test Results
To
Tofurther
furtherverify thethe
verify effectiveness of the
effectiveness ofproposed method,
the proposed a four-wheel
method, steeringsteering
a four-wheel and and
To
four-wheel further
hub verify
motor the
drive effectiveness
vehicle is of the
developed proposed
by our method,
team. It is a four-wheel
equipped with
four-wheel hub motor drive vehicle is developed by our team. It is equipped with sensors steering
sensors and
four-wheel
for data hub
collection motor
and candrive vehicle
feedback is
vehicledeveloped
states by our
information team.
through It is
theequipped
controller
for data collection and can feedback vehicle states information through the controller area with
area sensor
for data(CAN)
network
network collection
bus. and can feedback vehicle states information through the controller are
(CAN) bus.
network (CAN) bus.
5.2.1. Test Vehicle and Sensor Configuration
5.2.1.ATest Vehicle
32 beams and Sensor
LiDAR, Configuration
5.2.1. Test Vehicle and an RTK-GNSS
Sensor system and an IMU are equipped on the testing
Configuration
A 32electric
intelligent beamsvehicle.
LiDAR,Sensor
an RTK-GNSS system
specifications of theand
test an IMUare
vehicle areshown
equipped on the
in Figure 11 testing
and A
Table 32
3.beams
Gyro LiDAR,
and an RTK-GNSS
accelerometer bias system
stability of and
IMU an
are IMU
5 are
deg/h equipped
and 0.5 mg,
intelligent electric vehicle. Sensor specifications of the test vehicle are shown in Figure on the
respec- testing
11
intelligent
tively.
and Table electric
In addition,
3. vehicle.
Gyrothe vehicle
and Sensor
velocityspecifications
accelerometer can be obtained
bias stabilityoffrom
theIMU
of test vehicle
theare
on-board are
CAN
5 deg/h shown
andbus, in
0.5 and
mg,Figure
respec11
and
the Table
groundtruth3. Gyro and accelerometer
is provided by RTK-GNSS bias stability
system. of IMU are 5 deg/h and
tively. In addition, the vehicle velocity can be obtained from the on-board CAN bus, and 0.5 mg, respec
tively.
the In addition,
groundtruth the vehicle
is provided byvelocity
RTK-GNSS can be obtained from the on-board CAN bus, and
system.
the groundtruth is provided by RTK-GNSS system.
J. Imaging 2023, 9, x FOR PEER REVIEW 14 of 24
J.J.Imaging 2023,9,9,52x FOR PEER REVIEW
Imaging2023, 13 of
14 of23
24

Figure 11. Test vehicle and sensors configuration.


Figure11.
Figure 11.Test
Testvehicle
vehicleand
andsensors
sensorsconfiguration.
configuration.
Table 3. Sensor specifications of test vehicle (Velodyne HDL-32E from Velodyne, San Jose, CA,
USA.
TableStarNeto,
3. Sensor Newton-M2 from
specifications StarNeto
of test vehicleTechnology, Beijing, China).
(Velodyne HDL-32E from Velodyne, San Jose, CA,
Table 3. Sensor specifications of test vehicle (Velodyne HDL-32E from Velodyne, San Jose, CA, USA.
USA. StarNeto, Newton-M2 from StarNeto Technology, Beijing, China).
StarNeto,Sensors Specifications
Newton-M2 from StarNeto No.
Technology, Beijing, China). Frequency/Hz Accuracy
Sensors Velodyne, HDL-32E,
Specifications No. Frequency/Hz 2 cm,
Accuracy
3D LiDAR
Sensors Specifications No. 1 Frequency/Hz 10 Accuracy
32 beams
Velodyne, HDL-32E, 0.09
2 cm,deg
3D LiDAR Velodyne, HDL-32E, 1 10 2 cm,
3D LiDARsystem StarNeto, Newton-M2,
32 beams 1 1 10 50 2 cm,
0.09 deg
RTK-GNSS 32 beams 0.09 deg
L1/L2
StarNeto, RTK
Newton-M2, 0.1 deg
2 cm,
RTK-GNSS
RTK-GNSS system StarNeto, Newton-M2,
system 1 1 50 50 2 cm,
L1/L2 L1/L2
RTK RTK 5 deg/h,
0.1 deg
0.1 deg
IMU Newton-M2 1 100
50.5
5 deg/h, mg
deg/h,
IMUIMU Newton-M2
Newton-M2 1 1 100 100
Vehicle velocity On-board CAN bus 1 100 0.5 0.1
mg
0.5 m/s
mg
Vehicle velocity On-board CAN bus 1 100 0.1 m/s
Vehicle velocity On-board CAN bus 1 100 0.1 m/s
5.2.2. Field Test Mapping Performance Analysis
5.2.2. Field
5.2.2.TheField Test
TestMapping
proposed Mapping
mapping Performance
method Analysis
Performance Analysis
was tested in a real scenario to verify its perfor-
The
mance. proposed
TheInproposed mapping
the field mapping method
test, the dataset was
methodwas tested in ain
wascollected
tested real scenario
ina the to verify
realindustrial
scenario park
to itswith
verifyperformance.
itsscenario
perfor-
In the
change. field test, the dataset was collected in the industrial
mance. In the field test, the dataset was collected in the industrial park with park with scenario change.scenario
As
change.As shown
shown in in Figures
Figures 12 12 and
and 13,13,two
two point
point cloud
cloud maps
maps werewere constructed
constructed by by offline
offline
mapping
mapping process,
process, and
and the
the map
map drift
drift was
was effectively
effectively eliminated
eliminated
As shown in Figures 12 and 13, two point cloud maps were constructed by offline after
after optimization.
optimization. AsAs
shown
shown in
in Figures
Figures 14
14 and
and 15,
15, due
due to
to the
the use
use of
of graph
graph optimization
optimization
mapping process, and the map drift was effectively eliminated after optimization. As algorithm,
algorithm, the
the optimized
optimized
trajectory
shown in basically
trajectory basically14
Figures coincides
coincides
and 15, due with
withtothe
thegroundtruth.
the groundtruth. It
It can
can be
use of graph optimization be seen from
from Figures
seenalgorithm, the 16
Figures 16and
and17
optimized 17
that
that the position
the position
trajectory basicallyerror
error of the three
of the three
coincides axes
with axes before optimization
before optimization
the groundtruth. gradually
It can begradually
seen from increases
increases
Figures over over
16 and 11 m,
m,
17
and
and the
the average
average error
error before
before the
the optimization
optimization isis1.6
1.6m.
m. As
As shown
shown in
in Figures
Figures 18
18 and
and 19,
19, the
the
that the position error of the three axes before optimization gradually increases over 1 m,
error
error of
ofthethethree axes
axesis
threeerror isreduced
reduced to
tocentimeter-level
centimeter-level after optimization, and the average
and the average before the optimization is 1.6 m.after optimization,
As shown in Figures and18 the
and average
19, the
error
error isis88cm. ItItcan be seen from the above analysis that the proposed mapping method
error of the three axes is reduced to centimeter-level after optimization, and the method
cm. can be seen from the above analysis that the proposed mapping average
can
can reduce
reduce the
theItposition
position error significantly
errorfromsignificantly and
and construct
construct a globally consistent map.
error is 8 cm. can be seen the above analysis thata the
globally
proposed consistent
mapping map.method
can reduce the position error significantly and construct a globally consistent map.

Figure12.
Figure 12.The
Themap
mapdrift
driftin
insmall
smallscale
scalescenario
scenarioisiseliminated
eliminatedby byback-end
back-endoptimization.
optimization.The
Thefigure
figure
on the
Figure left is
12.isTheunoptimized, and
map drift inand
smallthe right one is optimized. It can be seen that the map drift of the
on the left unoptimized, thescale
rightscenario is eliminated
one is optimized. by back-end
It can optimization.
be seen that Theof
the map drift figure
the
optimized map
on the leftmap is significantly
is unoptimized, andreduced.
the right one is optimized. It can be seen that the map drift of the
optimized is significantly reduced.
optimized map is significantly reduced.
J.J.Imaging 2023,9,9,52x FOR PEER REVIEW
Imaging2023, 14 of
15 of23
24
J. Imaging 2023, 9, x FOR PEER REVIEW 15 of 24
J. Imaging 2023, 9, x FOR PEER REVIEW 15 of 24

Figure13.
Figure
Figure 13.The
13. Themap
The mapdrift
map driftin
drift inlarge
in largescale
large scalescenario
scale scenariois
scenario isiseliminated
eliminatedby
eliminated byback-end
by back-endoptimization.
back-end optimization.
optimization. The
The
The figure
figure
figure
Figure
on the 13. The
upper map
left is drift in large
unoptimized, scale
and scenario
the upperis right
eliminated
one isby back-end Itoptimization.
optimized. can be seen The
that figure
the map
on
onthe
theupper
upper left
left is
is unoptimized,
unoptimized, and and the upper
upper right
right one
one isis optimized.
optimized.ItItcan canbe
beseen
seenthat
thatthe
themap
map
on the
drift ofupper
the left is
optimized unoptimized,
map is the upper
significantly right one
reduced. The isbottom
optimized.
figureIt can
is becomplete
the seen that the map
large-scale
driftof
drift
drift ofthe
theoptimized
optimizedmap mapisisis significantly
significantly reduced.
reduced. TheThe bottom
bottom figure
figure is the
is the the complete
complete large-scale
large-scale map
mapof
map
the
after
after
optimized
optimization.
optimization.
map significantly reduced. The bottom figure is complete large-scale
after optimization.
map after optimization.

Figure14.
Figure 14. The
14. Thecomparison
comparisonof
comparison oftrajectory
trajectory before
before and
and after
after optimization
optimization with
with groundtruth
groundtruth ofofofsmall
small
Figure
Figure 14. The comparison ofoftrajectory
trajectorybefore
before and
and after
after optimization
optimization with
with groundtruth
groundtruth of small
small
scale
scale
scale scenario.
scenario.
scenario.
scale scenario.

Figure
Figure15.
Figure 15.
15. The
15.The comparison
Thecomparison of
comparisonof
comparison trajectory
oftrajectory before
trajectorybefore and
beforeand after
andafter optimization
afteroptimization with
optimization groundtruth
with
with of of
groundtruth
groundtruth of large
large
Figure of trajectory before and after optimization with groundtruth oflarge
large
scale
scale
scale scenario.
scenario.
scenario.
scale scenario.
J. Imaging 2023, 9, x FOR PEER REVIEW 16 of 2

J. Imaging 2023,
J. Imaging 2023,9,9,x52
FOR PEER REVIEW 15 of 23 16 of 2

J. Imaging 2023, 9, x FOR PEER REVIEW 16 of 24

Figure 16. Small scale scenario unoptimized mapping trajectory position error in longitudinal, lat
eral, and altitude directions. Before optimization, the error is at the meter level.
Figure
Figure
Figure 16.16.Small
16. Small
Small scale
scale
scale scenario
scenario
scenario unoptimized
unoptimized
unoptimized mapping
mapping mapping
trajectory
trajectory trajectory position
position error
position error error in longitudinal,
in longitudinal,
in longitudinal, lateral,
lat- lat
eral,altitude
and
eral, andaltitude
and altitude directions.
directions. Before
directions. Before optimization,
optimization,
Before the the
optimization, error isthe
error error
atisthe
at is atlevel.
meter
the meter the meter level.
level.

Figure
Figure17.
Figure 17.Large
Largescale scenario
scale unoptimized
scenario mapping
unoptimized
unoptimized mapping trajectory
mapping
trajectory position
position error
trajectory in
in longitudinal,
position
error lat-
error in longitudinal,
longitudinal, lateral, lat
eral, and altitude
eral,altitude
and and altitude directions. Before
directions.
directions. optimization, the
Before optimization,
Before optimization, error
the error istheis at the
error
at the meter
is at
meter level.
the meter level.
level.
Figure 17. Large scale scenario unoptimized mapping trajectory position error in longitudinal, lat
eral, and altitude directions. Before optimization, the error is at the meter level.

Figure 18. Small scale scenario optimized mapping trajectory position error in longitudinal, lateral,
and altitude directions. After optimization, the error is at the centimeter level.
Figure18.
Figure 18.Small
Small scale
scale scenario
scenario optimized
optimized mapping
mapping trajectory
trajectory position
position error inerror in longitudinal,
longitudinal, lateral, lateral
andaltitude
and altitudedirections.
directions. After
After optimization,
optimization, the error
the error is at
is at the the centimeter
centimeter level. level.
Figure 18. Small scale scenario optimized mapping trajectory position error in longitudinal, lateral
and altitude directions. After optimization, the error is at the centimeter level.
J.J.Imaging 2023,9,9,52
Imaging2023, x FOR PEER REVIEW 16 of 23
17 24

Figure 19.
Figure 19. Large
Large scale
scale scenario
scenario optimized
optimized mapping
mapping trajectory position error in longitudinal, lateral,
and altitude directions. After optimization, theerror
and altitude directions. After optimization, the errorisisat
atthe
thecentimeter
centimeterlevel.
level.

5.2.3.
5.2.3. Field
Field Test
TestLocalization
LocalizationPerformance
PerformanceAnalysis
Analysis
Based
Based on on the
the prior
prior point
point cloud
cloud map,
map, five
five different
different field
field tests
tests were
were implemented
implemented to to
verify
verify the
the localization
localization performance.
performance. The The field
field tests
tests included
included different
different driving
driving conditions
conditions
and
and scenarios.
scenarios.
Five
Five sets of field
sets of fieldtests
testsrepresent
representdifferent
differentdriving
driving conditions
conditions andand travel
travel distances.
distances. As
As shown in Figures 20–25, due to the small changes in the scenario,
shown in Figures 20–25, due to the small changes in the scenario, the fused trajectory the fused trajec-
ba-
tory basically
sically coincides
coincides with thewith the groundtruth
groundtruth under under the normal
the normal driving driving scenario
scenario and the and the
curve
curve driving scenario of one lap, and the maximum error does not
driving scenario of one lap, and the maximum error does not exceed 45 cm. From Figures exceed 45 cm. From
Figures
26–31, we 26–31, we conducted
conducted anotheranother
two setstwo sets of experiments
of experiments under different
under different scenarios,
scenarios, due to
due
changes in the environment, the LiDAR localization position has drifted, but thebut
to changes in the environment, the LiDAR localization position has drifted, the
maxi-
maximum positioning error after fusing the IMU and the vehicle velocity
mum positioning error after fusing the IMU and the vehicle velocity can still be controlled can still be
controlled within 55 cm. From Figures 32–34, in the large-scale scenario,
within 55 cm. From Figures 32–34, in the large-scale scenario, the localization performance the localization
performance
is still robust.isItstill
canrobust.
be seenItfrom
can Table
be seen fromthe
4 that Table 4 thatposition
average the average
errorposition
is withinerror
30 cm, is
within 30 cm, which meets the autonomous driving lane-level localization requirements.
which meets the autonomous driving lane-level localization requirements. Field tests of
J. Imaging 2023, 9, x FOR PEER REVIEW
sce-
Field tests scenarios results show that the localization algorithm based on the prior18point 24
narios results show that the localization algorithm based on the prior point cloud map can
cloud map can achieve good performance.
achieve good performance.

Table 4. Average errors from 5 field test scenarios.

Field Test Sequence Test Scenario Average Error RMSE


01 Normal driving, 1 lap 21.6 cm 23.2 cm
02 Curve driving, 1 lap 22.5 cm, 24.3 cm
03 Normal driving, 2 laps 28.3 cm 29.2 cm
04 Curve driving, 2 laps 25.5 cm 27.1 cm
05 Large scale, 1 laps 29.2 cm 29.6 cm

Figure 20. Field test NO.1. One lap under normal driving scenario localization test in prior point cloud map. The white points in the figure are the point cloud map built by the offline mapping process, the red line is the trajectory of LiDAR matching results, the orange line is groundtruth, and the blue line is the fused trajectory.

Figure 21. Field test NO.1. One lap under normal driving scenario localization test trajectory comparison. The comparison of fused trajectory with groundtruth.

Figure 22. Field test NO.1. One lap under normal driving scenario localization error in longitudinal, lateral, and altitude directions, respectively; the localization error is at the centimeter level.

Figure 23. Field test NO.2. One lap under curve driving scenario localization test in prior point cloud map. The white points in the figure are the point cloud map built by the offline mapping process, the red line is the trajectory of LiDAR matching results, the orange line is groundtruth, and the blue line is the fused trajectory.

Figure 24. Field test NO.2. One lap under curve driving scenario localization test trajectory comparison. The comparison of fused trajectory with groundtruth.

Figure 25. Field test NO.2. One lap under curve driving scenario localization error in longitudinal, lateral, and altitude directions, respectively; the localization error is at the centimeter level.

Figure 26. Field test NO.3. Two laps under normal driving scenario localization test in prior point cloud map. The white points in the figure are the point cloud map built by the offline mapping process, the red line is the trajectory of LiDAR matching results, the orange line is groundtruth, and the blue line is the fused trajectory.

Figure 27. Field test NO.3. Two laps under normal driving scenario localization test trajectory comparison. The comparison of fused trajectory with groundtruth.

Figure 28. Field test NO.3. Two laps under normal driving scenario localization error in longitudinal, lateral, and altitude directions, respectively; the localization error is at the centimeter level.

Figure 29. Field test NO.4. Two laps under curve driving scenario localization test in prior point cloud map. The white points in the figure are the point cloud map built by the offline mapping process, the red line is the trajectory of LiDAR matching results, the orange line is groundtruth, and the blue line is the fused trajectory.

Figure 30. Field test NO.4. Two laps under curve driving scenario localization test trajectory comparison. The comparison of fused trajectory with groundtruth.

Figure 31. Field test NO.4. Two laps under curve driving scenario localization error in longitudinal, lateral, and altitude directions, respectively; the localization error is at the centimeter level.

Figure 32. Field test NO.5. Large-scale scenario localization test in prior point cloud map. The white points in the figure are the point cloud map built by the offline mapping process, the red line is the trajectory of LiDAR matching results, the orange line is groundtruth, and the blue line is the fused trajectory.

Figure 33. Field test NO.5. Large scale scenario localization test trajectory comparison. The comparison of fused trajectory with groundtruth.

Figure 34. Field test NO.5. Large scale scenario localization error in longitudinal, lateral, and altitude directions, respectively; the localization error is at the centimeter level.


6. Discussion
The experimental results can be summarized as follows:
The offline mapping method proposed in this paper effectively eliminates map drift, achieves a mapping accuracy of 5–10 cm, and supplies a stable and reliable map data source for the localization task.
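To make the structure of the mapping back-end concrete, the sketch below assembles the three kinds of constraints used in the mapping process (NDT odometry between keyframes, a scan-context-style loop closure, and RTK-GNSS position priors) into a pose graph and optimizes it. GTSAM is chosen here purely for illustration; the paper does not prescribe this library, and the keyframe poses, loop pair, and noise sigmas below are synthetic placeholders.

```python
# Illustrative pose-graph sketch (GTSAM used for brevity; not the authors' implementation).
import numpy as np
import gtsam

graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()

# GTSAM Pose3 sigmas are ordered (roll, pitch, yaw, x, y, z); values are placeholders.
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.02, 0.02, 0.02, 0.10, 0.10, 0.10]))
loop_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.05, 0.05, 0.05, 0.20, 0.20, 0.20]))
gnss_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([10.0, 10.0, 10.0, 0.05, 0.05, 0.05]))  # loose rotation, tight position

num_keyframes = 5
step = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(1.0, 0.0, 0.0))  # synthetic NDT odometry: ~1 m forward per keyframe

for i in range(num_keyframes):
    initial.insert(i, gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(float(i), 0.0, 0.0)))
    if i > 0:
        # Relative-pose constraint from NDT scan registration between consecutive keyframes.
        graph.add(gtsam.BetweenFactorPose3(i - 1, i, step, odom_noise))
    # RTK-GNSS position prior on every keyframe (rotation left effectively unconstrained).
    graph.add(gtsam.PriorFactorPose3(i, gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(float(i), 0.0, 0.0)), gnss_noise))

# Loop-closure constraint, e.g., detected by scan context, between the first and last keyframes.
graph.add(gtsam.BetweenFactorPose3(0, num_keyframes - 1,
                                   gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(4.0, 0.0, 0.0)), loop_noise))

# Batch optimization of all keyframe poses; the corrected poses would then be used to re-assemble the map.
result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose3(num_keyframes - 1))
```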
In the online localization process, we use the multi-sensor fusion method to achieve a
positioning accuracy of 20–30 cm, which benefits from the good mapping accuracy and the
design of the multi-sensor fusion model.
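As a rough illustration of the fusion idea, the following sketch propagates a position/velocity state with IMU acceleration and corrects it with the LiDAR map-matching position and the measured vehicle velocity. It is deliberately reduced: the attitude and bias states of the real ESKF are omitted, and the noise values, sampling period, and measurements are illustrative assumptions, so it should be read as a schematic of the error-state update rather than the implemented filter.

```python
# Heavily reduced error-state-style fusion sketch: IMU acceleration drives the prediction,
# LiDAR map matching corrects position, wheel speed corrects velocity.
import numpy as np

dt = 0.01                                   # IMU period (illustrative)
p, v = np.zeros(3), np.zeros(3)             # nominal position and velocity
P = np.eye(6) * 0.1                         # covariance of the error state [dp, dv]

Q = np.diag([0.0] * 3 + [0.05] * 3) * dt    # process noise driven by IMU acceleration noise (placeholder)
R_lidar = np.eye(3) * 0.05 ** 2             # LiDAR map-matching position noise (m^2, placeholder)
R_vel = np.eye(3) * 0.10 ** 2               # wheel-speed noise ((m/s)^2, placeholder)

F = np.eye(6)
F[0:3, 3:6] = np.eye(3) * dt                # dp_{k+1} = dp_k + dv_k * dt

def predict(acc_world):
    """Propagate the nominal state with IMU acceleration and the error covariance with F."""
    global p, v, P
    p = p + v * dt + 0.5 * acc_world * dt ** 2
    v = v + acc_world * dt
    P = F @ P @ F.T + Q

def correct(z, H, R):
    """Kalman update on the error state, followed by injection into the nominal state."""
    global p, v, P
    y = z - H @ np.concatenate([p, v])      # innovation w.r.t. the nominal state
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    dx = K @ y                              # estimated error state [dp, dv]
    p, v = p + dx[0:3], v + dx[3:6]         # error injection
    P = (np.eye(6) - K @ H) @ P

H_pos = np.hstack([np.eye(3), np.zeros((3, 3))])   # LiDAR observes position
H_vel = np.hstack([np.zeros((3, 3)), np.eye(3)])   # wheel odometry observes velocity

predict(np.array([0.2, 0.0, 0.0]))                 # one IMU step with a synthetic acceleration
correct(np.array([0.01, 0.0, 0.0]), H_pos, R_lidar)
correct(np.array([0.10, 0.0, 0.0]), H_vel, R_vel)
print("position:", p, "velocity:", v)
```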

7. Conclusions and Future Work


This paper presented a LiDAR-based sensor fusion SLAM and localization method for autonomous vehicle offline mapping and online localization. In the mapping process, NDT registration, scan context loop closure detection and the RTK-GNSS position are combined in the front-end and back-end through a pose graph, which optimizes the keyframe poses, effectively eliminates map drift, and achieves a mapping accuracy of 5–10 cm. In the online localization process, the ESKF fuses complementary sensor information, such as the LiDAR map-matching position, IMU measurements and vehicle velocity; by fully exploiting the velocity constraints of ground vehicles, it reaches a localization accuracy of 20–30 cm and remains robust to environmental changes in various challenging scenarios. Such stable mapping and localization results can help autonomous vehicles safely complete lane-level navigation tasks. Furthermore, the proposed framework can incorporate additional sensors for offline mapping and online localization in different applications. In future work, we will tightly couple the IMU and LiDAR in the mapping system to further reduce front-end drift and improve map quality.

Author Contributions: Conceptualization, K.D. and B.S.; data curation, K.D. and S.Z.; software, K.D.;
formal analysis, K.D., G.W. and B.S.; funding acquisition, B.S., J.W. and F.M.; writing—original draft
preparation, K.D. and B.S.; writing—review and editing, G.W. and Y.Z. All authors have read and
agreed to the published version of the manuscript.
Funding: This work was supported in part by the National Natural Science Foundation of China under
Grant 52102457, in part by the China Postdoctoral Science Foundation under Grant 2021M691207 and
in part by the Science and Technology Development Plan Project of Changchun under Grant 21QC09.
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: Not applicable.
Conflicts of Interest: The authors declare no conflict of interest.


Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.
