CoreSLAM: a SLAM Algorithm in less than 200 lines of C code

Bruno Steux, Oussama El Hamzaoui
Mines ParisTech - Center of Robotics, Paris, FRANCE

Abstract—This paper presents a Laser-SLAM algorithm which has been programmed in less than 200 lines of C-language code. Our idea was to develop and implement a very simple SLAM algorithm that could be easily integrated into our particle-filter based localization subsystem. For our experiments, we have been using a homebrew robotic platform called MinesRover, a six-wheel robot fully equipped with sensors, including a Hokuyo URG04 laser scanner. A typical example of our experiments is presented, showing the good performance of the algorithm. Furthermore, the full source code of the map update and map matching functions is provided in this paper. This work shows the possibility of performing complex tasks using simple and easily programmable algorithms.

Index Terms—SLAM, laser scanner, mobile robot, localization, mapping, particle filter.

I. INTRODUCTION

THIS article presents a new SLAM algorithm called CoreSLAM. This algorithm was developed as a part of the robotics system called CoreBots, developed at Mines ParisTech.

CoreSLAM was named after the tiny size of its code, smaller than 200 lines of C-language code. It was so small that we decided to publish its source code in a conference paper, so that it would not just be open source, but published source! What encouraged us to do so is the good performance we obtained during our experiments.

After a short review of existing SLAM algorithms, we will present the motivations which led us to create the CoreSLAM algorithm. Section III will be devoted to the description of the algorithm. We will then introduce the robotic platform used for our tests, and discuss the results obtained.

II. PREVIOUS WORK

Much research attempts to solve the problem of simultaneous localization and mapping, commonly called SLAM. The algorithms developed can be classified according to the types of sensors used or the calculation methods adopted. There are SLAM algorithms based on computer vision with one camera [1] or several cameras [8], and algorithms that use a laser sensor or sonars [4]. Concerning the computation methods, two large families exist: on the one hand, algorithms based on the use of Kalman filters [7]; on the other hand, algorithms using particle filters [2], or Rao-Blackwellized particle filters - a mix of particle and Kalman filtering - like FastSLAM [5]. Some research has focused on comparing these algorithms in performance and speed of calculation [6], showing that the problem is still not solved in all cases...

Among laser SLAM algorithms based on the use of particle filters, DP-SLAM is one of the best known [2]. DP-SLAM works by maintaining a joint distribution over robot poses and maps via a particle filter. The algorithm associates a map to each particle, and focuses on the problem of sharing parts of maps among particles in order to minimize memory (and time, through map copies). The problem with DP-SLAM is that it is rather complex to integrate into an existing particle-filter based localization subsystem, like ours - which is very similar to the one described in [3].

During our review, we have seen that most algorithms are tested on robots that are slow compared to ours - their speed hardly exceeding 1 m/s - and/or use very expensive lasers - generally SICK or IBEO laser scanners with ranges of 100 meters. The speed of our robot - which reaches 3 m/s - combined with the short range of the Hokuyo URG04, creates major but interesting problems for the tasks of localization and mapping.

In addition, most if not all SLAM algorithms developed so far use methods requiring many lines of code or complex mathematics, and so require much effort to understand and validate - generally relying on many different parameters that influence the results. We began our work with an important goal: making a simple algorithm, easy to understand and still offering good performance - and most of all, easy to integrate into an existing particle filtering localization framework. We added one last requirement: our algorithm had to be embeddable in the end, so it had to minimize memory consumption and use integer computation in critical loop code.

III. METHODOLOGY

Contrary to DP-SLAM, we decided to use only one map. The advantage of DP-SLAM over CoreSLAM is thus the theoretical ability not to get lost in long corridors - this, and not loop closing (which cannot be achieved in DP-SLAM without an external process), is indeed the goal of the map-per-particle concept. As a matter of fact, we decided that this advantage wasn't worth the complexity, especially as we could rely on good odometry on our platform, and given that our goal was to close rather small loops (exploring laboratories instead of corridors...).

As the idea of CoreSLAM was to integrate laser information into our particle-filter based localization subsystem, we had to write two main functions:
• The scan-to-map distance function, which acts as the likelihood function used to test each state position hypothesis (particle) in the filter. The source code of the ts_distance_scan_to_map function is provided below (Algorithm 2). Note that it simply consists in a sum of all the map values at the impact points of the scan (relative to the position of the particle). This is a very fast procedure, using only integer operations, which enables us not to be limited in the number of particles while running in real-time. However, that way of estimating likelihood implies that the map is constructed in a specific way...
• The map update function, used to build the map as the robot moves forward, which is discussed below in detail.

Figure 1. Result of the integration into the map of a range scan by the laser. One can see the "V" shape of the hole drawn around each laser impact. (Labels in the figure: distance measured by laser; (x1, y1); (x2, y2); value written in map at (xp, yp); TS_HOLE_WIDTH.)

Building a map compatible with a particle filter is not straightforward: the peakiness of the likelihood function is very important for the efficiency of the filter, and thus should be easily controllable. We achieved this by using grey-level maps, the map update consisting in digging holes whose width directly corresponds to our likelihood function peakiness. For each obstacle detected, the algorithm does not draw a single point, but a function with a hole whose lowest point is at the position of the obstacle (see figure 1 for a zoom on a newly created map). As a result, the constructed maps do not look like traditional maps, but like surfaces with attracting holes. The integration into the map is done through an αβ filter (the map-integration line of Algorithm 4), resulting in a convergence of the map towards the newer profiles. The map update is called at the last updated position of the particle filter (the weighted average of all the particles after likelihood evaluation), optionally with a latency (see discussion below).

The source code of the ts_map_update function is provided below (Algorithm 3). It uses the function ts_map_laser_ray (Algorithm 4), which is the trickiest part. The latter uses a Bresenham algorithm to draw the laser rays in the map, with another enhanced Bresenham algorithm inside to compute the right profile. It uses only integer computation and no integer division in critical parts. Even if it is executed only once per step (at 10 Hz in our case), it is critical that this procedure be fast, since, due to the high resolution (1 cm per pixel) of our map, a big part of our map frame is touched by each laser scan.

CoreSLAM can easily be integrated into a particle filter, but can also be used without one. Indeed, the constructed map, with its attracting holes and slopes, can be used with any gradient descent algorithm. The matching algorithm converges more easily towards the obstacles, as the hole-function acts as a guide. Our stand-alone version uses a very simple Monte-Carlo algorithm to match the current scan with the map and to retrieve the updated position of the robot (a sketch is given at the end of this section). Indeed, the stand-alone version was developed in order to tune the parameters of CoreSLAM: the integration speed of the scans into the map (parameter quality of the ts_map_update function, see Algorithm 3), which was set to 50; the width of the "hole" surrounding the impact points in our map (set to a fixed value of 600 mm in our implementation); and the scale of the map (1 point = 1 centimeter in our case). In this version, the odometry can be ignored or used as a starting point for the Monte-Carlo search (in figures 5 and 6, the odometry is ignored so that odometry and localization by laser can be compared to each other).

The version based on particle filtering is necessary to manage ambiguous situations, and thus is necessary for re-localization (when we start with a full map instead of a blank map). It is also the best for the integration of odometry, since we can cope with non-systematic errors like slippage by integrating a non-linear error model into the filter (by setting the slippage probability to 10%, we make 10% of the particles stay where they originally lie, with high model noise, while 90% of the particles evolve with the odometry, with low model noise).

The particle filter is also a very good framework for integrating sensors other than odometry and laser. We have also integrated a GPS (for outdoor use) and a compass (without much success at the moment, the sensor being very sensitive to magnetic noise in our case).

We'll finish our presentation of CoreSLAM by discussing two issues relative to the accuracy of localization and the latency of integration of the data into the map.

Subpixel accuracy: even if our map has a resolution of 1 cm, it is possible to measure displacements smaller than 1 cm, due to the fact that our ts_distance_scan_to_map function takes several points into account in its calculation. Even a 1 mm move can be measured, because some of the laser points will land in another cell of the map.

Latency: the concept of latency - the time between a laser scan and its integration into the map - is necessary to measure small displacements relative to the map resolution. For instance, in order to correctly measure the displacement of an object moving at 1 cm/s with a map resolution of 1 cm and a measurement frequency of 1 Hz, it is necessary to wait for 10 measurements; the latency should then be 10. Indeed, the theoretical latency formula is:

Latency = (MapResolution × MeasurementFrequency) / RobotSpeed

Concerning our robot and given our map resolution, the formula becomes: Latency = (0.01 × 10) / RobotSpeed. Our robot operates at a rather high speed (2 m/s); its speed rarely falls to 0.01 m/s, and even at this speed, a latency equal to 1 is sufficient to do the calculations. Moreover, the subpixel accuracy mentioned before makes the task easier. We thus decided to remove the latency management code from CoreSLAM, but recall that it is necessary for applying CoreSLAM to slow-moving objects.

We couldn't end the description of our algorithm without discussing loop closure. Our algorithm doesn't cover loop closure, but can be integrated with any loop-closing algorithm. Indeed, we are currently developing a tinyLoopCloser algorithm that should complement CoreSLAM nicely.
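To make the stand-alone matching concrete, here is a minimal sketch of the simple Monte-Carlo search described above. It is our illustration, not one of the paper's published listings: random perturbations around a starting position (e.g. the odometry estimate) are kept whenever they lower the scan-to-map distance. The function name, step sizes and iteration count are ours and purely illustrative; the sketch assumes the CoreSLAM header of Algorithm 1.

/* Minimal sketch (ours, not a published listing) of the simple Monte-Carlo
 * scan matcher: keep a random perturbation of the position whenever it
 * lowers the scan-to-map distance. */
#include <stdlib.h>

ts_position_t ts_monte_carlo_search(ts_scan_t *scan, ts_map_t *map,
                                    ts_position_t *start, int iterations)
{
    ts_position_t best = *start, cur;
    int i, d, best_dist = ts_distance_scan_to_map(scan, map, &best);

    for (i = 0; i < iterations; i++) {
        cur = best;
        /* illustrative step sizes: +/- 25 mm in x and y, +/- 2.5 degrees */
        cur.x += 50.0 * (rand() / (double)RAND_MAX - 0.5);
        cur.y += 50.0 * (rand() / (double)RAND_MAX - 0.5);
        cur.theta += 5.0 * (rand() / (double)RAND_MAX - 0.5);
        d = ts_distance_scan_to_map(scan, map, &cur);
        if (d < best_dist) {
            best = cur;      /* impacts fell deeper into the holes: keep it */
            best_dist = d;
        }
    }
    return best;
}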

IV. PLATFORM DESCRIPTION


The platform used for our experiments is the MinesRover (see figure 2), a homebrew robot developed jointly by Mines ParisTech and SAGEM DS. This rover has four driving and steering wheels and two odometry wheels (see figure 3). The mechanical architecture of the robot, based on a rocker-bogie articulation, yields very good odometry information, since the two central free-running wheels always - at least theoretically - keep contact with the ground. This mechanical setup offers protection against the problem of slippage, without shock absorbers.

Figure 2. The Mines Rover platform. Visible in the picture are the dome for the camera, the steering servomotors, the Hokuyo URG-04 laser range scanner, the GPS receiver (grey square) and the emergency ultrasonic sensor at the front of the robot.

Figure 3. The Mines Rover mechanical architecture. It is a six-wheel robot, with four steering and driving wheels, and two free-rotating wheels equipped with 2000-point encoders. The Qwerk module is at the center of the robot. Its 200 MHz microprocessor is able to manage all the sensors and actuators reliably.

The robot is powered by a 4-cell LiPo battery (14.8 V, 4.1 Ah). The four 45 W DC motors enable a top speed of 3 m/s.

The GPS receiver allows a location accuracy that can reach 1 meter (using one EGNOS satellite). The Hokuyo URG04 laser sensor offers 10 Hz horizontal scanning, with a range of 5.6 meters. In order to retrieve the robot's direction, we combine the results of a compass and the GPS. The ultrasonic sensor provides an opportunity to discover any obstacle not detected by the laser (like stair steps), and is mostly used as an emergency stop. The robot also embeds a 5-axis IMU that provides yaw rate information and the inclination of the robot (not used in our experiments yet).

The electronics of the robot is centered on a Qwerk module, designed by charmedlabs.com. It is a board based on an ARM9 microprocessor coupled with Xilinx FPGA logic. A Linux operating system is installed on the Qwerk and provides USB bus support for the GPS and the Hokuyo. The FPGA logic takes care of the I²C bus support, the odometer inputs and the servo-motor control.

The laser sensor used is a Hokuyo URG-04LX, connected through USB. While it is a great sensor for the price, it suffers from several shortcomings:
• Its maximum range is limited to 5.6 meters, which, considering the speed of our robot (3 m/s), is severely limiting. In our cluttered environment, many measures by the sensor gave 0 meters (indicating a bad or no reflection) or ambiguous measures (like the lower parts of moving chairs, or boxes on the ground next to walls, where both the front of the box and the wall behind can be detected).
• The frequency of measure (10 Hz) is also limited. Running at 3 m/s towards a wall, for instance, the wall appears inclined, since the last point of the measure, taken after 240° of rotation of the laser, is 240/360 * 3/10 = 0.2 m, i.e. 20 centimeters, nearer than the first point of the measure (though actually being at the same distance). On our robot, it was necessary to take this into consideration, so we corrected every scan using a constant (for each scan) longitudinal and rotational speed, as sketched in the code below.
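The per-scan correction mentioned in the second bullet can be sketched as follows. This is our illustration, not part of the published CoreSLAM source: each scan point is re-expressed in the robot frame at the start of the 240° sweep, assuming a constant longitudinal speed v (in mm/s) and yaw rate omega (in degrees/s) during the scan. The helper name and signature are ours.

/* Sketch (ours) of the scan de-skewing described above: re-express each
 * point in the robot frame at the start of the sweep, assuming constant
 * longitudinal speed v (mm/s) and yaw rate omega (deg/s) over the scan. */
#include <math.h>

void ts_scan_deskew(ts_scan_t *scan, double v, double omega)
{
    int i;
    for (i = 0; i != scan->nb_points; i++) {
        /* time elapsed at point i: the 240-degree sweep covers 2/3 of the
           10 Hz mirror revolution, i.e. about 66.7 ms in total */
        double t = (2.0 / 3.0) * 0.1 * i / scan->nb_points;
        double a = omega * t * M_PI / 180;   /* rotation since scan start */
        double c = cos(a), s = sin(a);
        double x = scan->x[i], y = scan->y[i];
        scan->x[i] = c * x - s * y + v * t;  /* undo forward motion (mm) */
        scan->y[i] = s * x + c * y;
    }
}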
Figure 4. Software architecture diagram of the Mines Rover robotic platform. On the left, the embedded software; on the right, the part of the software running on a desktop PC. Communication between the Mines Rover and the operator is done using an HMI on the desktop PC. The particle filtering algorithm runs on the desktop computer in order to reduce the burden on the robot's ARM9. We use a wireless connection, with a MIMO router in the robot allowing a good quality of connection.

Figure 5. The speed of the robot estimated using odometry alone, and using our laser CoreSLAM (without taking odometry into account), during the experiment shown in figure 7. (Plot: speed in m/s versus time in units of 100 ms; two curves, "Measured by odometry" and "Measured by laser".) Note the good match between the measures. We can observe a small latency of the laser measure (of around 1 frame), which is foreseeable considering the nature of the sensor. This demonstrates that our odometry is very good (and perfectly calibrated) and that the laser CoreSLAM is able to cope with the high velocity of the robot (reaching 2.5 m/s in this experiment).

V. RESULTS

The experiments were conducted in the Electronics Laboratory of Mines ParisTech. This environment was really challenging for the algorithm: it is a highly cluttered environment, with boxes and computers lying on the ground, and many tables and shelves (see figure 9) that are very difficult to detect and trace using the laser scanner.

Figures 5, 6 and 7 show results for the same sample of data. In the first two figures, we see the comparison between odometry and the estimation of movement by laser alone, and they show that the two are very similar. The map in figure 7 was constructed by combining laser and odometry information, and shows a good accuracy of reconstruction, the loop being almost closed. Interestingly, the slippage observed in this experiment (see figure 8) was corrected by the laser information in a nice way.

Note that the speed of the robot in our experiments reached 2.5 m/s and the yaw rate reached 150°/s (measured by odometry and confirmed by the laser measurements), the very high angular speed of the robot being favored by its four steering wheels.

Figure 6. The yaw rate measured by odometry alone and by our laser CoreSLAM (without taking into account the available odometry information). (Plot: yaw rate in deg/s versus time in units of 100 ms; two curves, "Measured by odometry" and "Measured by laser".) Note the yaw rate reaching 250°/s in this experiment, demonstrating the very high steering ability of the Mines Rover. Note that in this experiment, corresponding to the map shown in figure 7, we observed a slippage of the odometry (negative peaks at frames 220 and 230) when the driver of the robot hit the right wall (see the map in figure 8). The CoreSLAM based on laser alone ignored this.

Figure 7. A map of the laboratory obtained during our experiments. In gray, the "holes" map constructed by the robot. In red, we have overlaid all the scans taken by the laser. In blue, the reconstructed trajectory of the robot. The loop closure is almost perfect here. This map was obtained by combining information from the odometers and the laser (Monte-Carlo search).

Figure 8. The map constructed using odometry only. Note the slippage when the driver hit the right wall of the lab.

Figure 9. A picture of the lab where the experiments took place. It corresponds to the upper left part of the map shown in figure 7. Note that the laser sees both the left wall and the shelves, the two open doors and the table legs.

VI. CONCLUSIONS

We have developed a simple and efficient algorithm which is able to perform SLAM using data from a laser sensor. It was designed to be integrated easily and efficiently into any particle-filtering based localization system. It was tested indoors in a very cluttered environment, with a fast robot, using a very affordable laser sensor. The CoreSLAM algorithm shows good performance in this situation, being almost able to close a loop while being open-loop only. As the source code is very small and optimized, we decided to publish it in this paper.

During the next stages of the algorithm's development, we will try to use our platform in an outdoor environment by fusing with GPS and compass. We are also working on providing a very short and simple loop-closing algorithm (tinyLoopCloser) that automatically detects loop closures and corrects the trajectory of the robot accordingly.

REFERENCES

[1] Andrew J. Davison. Real-time simultaneous localisation and mapping with a single camera. Ninth IEEE International Conference on Computer Vision (ICCV'03), Volume 2, 2003.
[2] Austin Eliazar and Ronald Parr. DP-SLAM: Fast, robust simultaneous localization and mapping without predetermined landmarks. Proceedings of the International Joint Conference on Artificial Intelligence, 2003.
[3] Dieter Fox, Sebastian Thrun, Wolfram Burgard, and Frank Dellaert. Particle filters for mobile robot localization.
[4] Feng Lu and Evangelos Milios. Robot pose estimation in unknown environments by matching 2D range scans. Journal of Intelligent and Robotic Systems, 1997.
[5] Michael Montemerlo, Sebastian Thrun, Daphne Koller, and Ben Wegbreit. FastSLAM: A factored solution to the simultaneous localization and mapping problem. In Proceedings of the AAAI National Conference on Artificial Intelligence, pages 593–598, 2002.
[6] Robert Ouellette and Kotaro Hirasawa. A comparison of SLAM implementations for indoor mobile robots. Proceedings of the 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2007.
[7] Søren Riisgaard and Morten Rufus Blas. SLAM for Dummies: A Tutorial Approach to Simultaneous Localization and Mapping.
[8] Joan Solà, André Monin, Michel Devy, and Teresa Vidal-Calleja. Fusing monocular information in multicamera SLAM. IEEE Transactions on Robotics, Vol. 24, No. 5, 2008.

Bruno Steux
[email protected]
Assistant Professor at Mines ParisTech - Center of Robotics

Oussama El Hamzaoui
[email protected]
PhD Student at Mines ParisTech - Center of Robotics

Algorithm 1 CoreSLAM: header and data structures

#ifndef _CoreSLAM_H_
#define _CoreSLAM_H_

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define TS_SCAN_SIZE 8192
#define TS_MAP_SIZE 2048
#define TS_MAP_SCALE 0.1
#define TS_DISTANCE_NO_DETECTION 4000
#define TS_NO_OBSTACLE 65500
#define TS_OBSTACLE 0
#define TS_HOLE_WIDTH 600

typedef unsigned short ts_map_pixel_t;

typedef struct {
    ts_map_pixel_t map[TS_MAP_SIZE * TS_MAP_SIZE];
} ts_map_t;

typedef struct {
    double x[TS_SCAN_SIZE], y[TS_SCAN_SIZE];
    int value[TS_SCAN_SIZE];
    int nb_points;
} ts_scan_t;

typedef struct {
    double x, y;    // in mm
    double theta;   // in degrees
} ts_position_t;

void ts_map_init(ts_map_t *map);
int ts_distance_scan_to_map(ts_scan_t *scan, ts_map_t *map, ts_position_t *pos);
void ts_map_update(ts_scan_t *scan, ts_map_t *map, ts_position_t *position,
                   int quality);

#endif // _CoreSLAM_H_
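To show how the pieces fit together, the following minimal sketch (ours, not one of the paper's listings) drives the API declared above. get_next_scan is a hypothetical scan provider, and the position search is only indicated by a comment; a particle filter or the Monte-Carlo matcher sketched in Section III would go there.

/* Minimal usage sketch (ours) of the CoreSLAM API declared above.
 * Assumes the CoreSLAM header (Algorithm 1) is included.
 * get_next_scan is a hypothetical driver call, not part of the paper. */
extern int get_next_scan(ts_scan_t *scan);

int main(void)
{
    static ts_map_t map;      /* 8 MB: keep it off the stack */
    static ts_scan_t scan;
    /* start at the center of the map; positions are in mm */
    ts_position_t pos = { 0.5 * TS_MAP_SIZE / TS_MAP_SCALE,
                          0.5 * TS_MAP_SIZE / TS_MAP_SCALE, 0 };

    ts_map_init(&map);
    while (get_next_scan(&scan)) {
        /* a particle filter or a Monte-Carlo search would refine pos here
           by minimizing ts_distance_scan_to_map(&scan, &map, ...) */
        ts_map_update(&scan, &map, &pos, 50);  /* quality = 50, as in the paper */
    }
    return 0;
}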

Algorithm 2 CoreSLAM: the scan-to-map distance measure

// Requires the CoreSLAM header (Algorithm 1), <math.h> and <stdint.h>
int ts_distance_scan_to_map(ts_scan_t *scan, ts_map_t *map, ts_position_t *pos)
{
    double c, s;
    int i, x, y, nb_points = 0;
    int64_t sum;

    c = cos(pos->theta * M_PI / 180);
    s = sin(pos->theta * M_PI / 180);
    // Translate and rotate scan to robot position
    // and compute the distance
    for (i = 0, sum = 0; i != scan->nb_points; i++) {
        if (scan->value[i] != TS_NO_OBSTACLE) {
            x = (int)floor((pos->x + c * scan->x[i] - s * scan->y[i]) * TS_MAP_SCALE + 0.5);
            y = (int)floor((pos->y + s * scan->x[i] + c * scan->y[i]) * TS_MAP_SCALE + 0.5);
            // Check boundaries
            if (x >= 0 && x < TS_MAP_SIZE && y >= 0 && y < TS_MAP_SIZE) {
                sum += map->map[y * TS_MAP_SIZE + x];
                nb_points++;
            }
        }
    }
    if (nb_points) sum = sum * 1024 / nb_points;
    else sum = 2000000000;
    return (int)sum;
}

void ts_map_init(ts_map_t *map)
{
    int x, y, initval;
    ts_map_pixel_t *ptr;
    initval = (TS_OBSTACLE + TS_NO_OBSTACLE) / 2;
    for (ptr = map->map, y = 0; y < TS_MAP_SIZE; y++) {
        for (x = 0; x < TS_MAP_SIZE; x++, ptr++) {
            *ptr = initval;
        }
    }
}

Algorithm 3 CoreSLAM: full scan map update

void ts_map_update(ts_scan_t *scan, ts_map_t *map, ts_position_t *pos, int quality)
{
    double c, s, q;
    double x2p, y2p;
    int i, x1, y1, x2, y2, xp, yp, value;
    double add, dist;

    c = cos(pos->theta * M_PI / 180);
    s = sin(pos->theta * M_PI / 180);
    x1 = (int)floor(pos->x * TS_MAP_SCALE + 0.5);
    y1 = (int)floor(pos->y * TS_MAP_SCALE + 0.5);
    // Translate and rotate scan to robot position
    for (i = 0; i != scan->nb_points; i++) {
        x2p = c * scan->x[i] - s * scan->y[i];
        y2p = s * scan->x[i] + c * scan->y[i];
        xp = (int)floor((pos->x + x2p) * TS_MAP_SCALE + 0.5);
        yp = (int)floor((pos->y + y2p) * TS_MAP_SCALE + 0.5);
        dist = sqrt(x2p * x2p + y2p * y2p);
        add = TS_HOLE_WIDTH / 2 / dist;   // extend the ray beyond the hit point
        x2p *= TS_MAP_SCALE * (1 + add);
        y2p *= TS_MAP_SCALE * (1 + add);
        x2 = (int)floor(pos->x * TS_MAP_SCALE + x2p + 0.5);
        y2 = (int)floor(pos->y * TS_MAP_SCALE + y2p + 0.5);
        if (scan->value[i] == TS_NO_OBSTACLE) {
            q = quality / 2;
            value = TS_NO_OBSTACLE;
        } else {
            q = quality;
            value = TS_OBSTACLE;
        }
        ts_map_laser_ray(map, x1, y1, x2, y2, xp, yp, value, q);
    }
}

Algorithm 4 CoreSLAM: laser ray map update

#define SWAP(x, y) (x ^= y ^= x ^= y)

void ts_map_laser_ray(ts_map_t *map, int x1, int y1, int x2, int y2,
                      int xp, int yp, int value, int alpha)
{
    int x2c, y2c, dx, dy, dxc, dyc, error, errorv, derrorv, x;
    int incv, sincv, incerrorv, incptrx, incptry, pixval, horiz, diago;
    ts_map_pixel_t *ptr;

    if (x1 < 0 || x1 >= TS_MAP_SIZE || y1 < 0 || y1 >= TS_MAP_SIZE)
        return; // Robot is out of map

    x2c = x2; y2c = y2;
    // Clipping
    if (x2c < 0) {
        if (x2c == x1) return;
        y2c += (y2c - y1) * (-x2c) / (x2c - x1);
        x2c = 0;
    }
    if (x2c >= TS_MAP_SIZE) {
        if (x1 == x2c) return;
        y2c += (y2c - y1) * (TS_MAP_SIZE - 1 - x2c) / (x2c - x1);
        x2c = TS_MAP_SIZE - 1;
    }
    if (y2c < 0) {
        if (y1 == y2c) return;
        x2c += (x1 - x2c) * (-y2c) / (y1 - y2c);
        y2c = 0;
    }
    if (y2c >= TS_MAP_SIZE) {
        if (y1 == y2c) return;
        x2c += (x1 - x2c) * (TS_MAP_SIZE - 1 - y2c) / (y1 - y2c);
        y2c = TS_MAP_SIZE - 1;
    }

    dx = abs(x2 - x1); dy = abs(y2 - y1);
    dxc = abs(x2c - x1); dyc = abs(y2c - y1);
    incptrx = (x2 > x1) ? 1 : -1;
    incptry = (y2 > y1) ? TS_MAP_SIZE : -TS_MAP_SIZE;
    sincv = (value > TS_NO_OBSTACLE) ? 1 : -1;
    if (dx > dy) {
        derrorv = abs(xp - x2);
    } else {
        SWAP(dx, dy); SWAP(dxc, dyc); SWAP(incptrx, incptry);
        derrorv = abs(yp - y2);
    }
    error = 2 * dyc - dxc;
    horiz = 2 * dyc;
    diago = 2 * (dyc - dxc);
    errorv = derrorv / 2;
    incv = (value - TS_NO_OBSTACLE) / derrorv;
    incerrorv = value - TS_NO_OBSTACLE - derrorv * incv;
    ptr = map->map + y1 * TS_MAP_SIZE + x1;
    pixval = TS_NO_OBSTACLE;
    for (x = 0; x <= dxc; x++, ptr += incptrx) {
        // Draw the "V" profile of the hole around the hit point
        if (x > dx - 2 * derrorv) {
            if (x <= dx - derrorv) {
                pixval += incv;
                errorv += incerrorv;
                if (errorv > derrorv) {
                    pixval += sincv;
                    errorv -= derrorv;
                }
            } else {
                pixval -= incv;
                errorv -= incerrorv;
                if (errorv < 0) {
                    pixval -= sincv;
                    errorv += derrorv;
                }
            }
        }
        // Integration into the map (alpha-beta filter)
        *ptr = ((256 - alpha) * (*ptr) + alpha * pixval) >> 8;
        if (error > 0) {
            ptr += incptry;
            error += diago;
        } else error += horiz;
    }
}
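Finally, as an illustration of the non-linear odometry error model described in Section III, here is a minimal sketch of how a particle could be propagated with a 10% slippage probability. This is our sketch, not one of the paper's listings; gauss is a hypothetical zero-mean Gaussian sampler, and the noise magnitudes are illustrative.

/* Sketch (ours) of the non-linear odometry error model of Section III:
 * with 10% probability a particle ignores odometry and diffuses with high
 * noise (slippage); otherwise it follows odometry with low noise.
 * gauss is a hypothetical N(0, sigma) sampler; magnitudes are illustrative. */
#include <stdlib.h>

extern double gauss(double sigma);

void propagate_particle(ts_position_t *p, double dx, double dy, double dtheta)
{
    if (rand() / (double)RAND_MAX < 0.10) {
        /* slipping particle: stays where it lies, with high model noise */
        p->x += gauss(100);       /* mm */
        p->y += gauss(100);
        p->theta += gauss(10);    /* degrees */
    } else {
        /* normal particle: evolves with the odometry, with low model noise */
        p->x += dx + gauss(10);
        p->y += dy + gauss(10);
        p->theta += dtheta + gauss(1);
    }
}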
