
Engineering
Industrial & Management Engineering fields

Okayama University, Year 2003

Three-dimensional localization and mapping for mobile robot in disaster environments

Keiji Nagatani, Okayama University
Hiroshi Ishida, Okayama University
Satoshi Yamanaka, Okayama University
Yutaka Tanaka, Okayama University

This paper is posted at eScholarship@OUDIR: Okayama University Digital Information Repository.
http://escholarship.lib.okayama-u.ac.jp/industrial engineering/73
Proceedings of the 2003 IEEE/RSJ Intl. Conference on Intelligent Robots and Systems, Las Vegas, Nevada, October 2003

Three-dimensional Localization and Mapping for Mobile Robot in Disaster Environments

Keiji Nagatani, The Graduate School of Natural Science and Technology, Okayama University, 3-1-1 Tsushima-naka, Okayama 700-8530, Japan
Hiroshi Ishida, The Graduate School of Natural Science and Technology, Okayama University, 3-1-1 Tsushima-naka, Okayama 700-8530, Japan
Satoshi Yamanaka, Department of Systems Engineering, Okayama University, 3-1-1 Tsushima-naka, Okayama 700-8530, Japan
Yutaka Tanaka, The Graduate School of Natural Science and Technology, Okayama University, 3-1-1 Tsushima-naka, Okayama 700-8530, Japan

Abstract — To relieve damages of earthquake disasters, the Special Project for Earthquake Disaster Mitigation in Urban Areas has been kicked off in Japan. Our research group has been a part of the sub-project on modeling of disaster environments for search and rescue since 2002. In this project, our group aims to develop a three-dimensional mapping algorithm installed in a mobile robot to search for victims in a collapsed building. To realize this mission, it is important to map environment information, and the mapping simultaneously requires localization (this is called the SLAM problem).

In this research, we build a three-dimensional map with a laser range finder, and we estimate the robot's location in a global map using a correlation technique. In this paper, we introduce our localization and mapping method, and we report a result of a preparatory experiment on localization.

I. INTRODUCTION

Since the great Hanshin-Awaji earthquake of 1995, much research related to rescue robotics has been performed. Against this background, the Special Project for Earthquake Disaster Mitigation in Urban Areas was kicked off in Japan. A consortium of robotics research groups is in charge of one pillar of that project: disaster relief by robotic technology.

Our research group has been in charge of a part of the sub-project on modeling of disaster environments for search and rescue since 2002. In this project, our group aims to develop a three-dimensional mapping algorithm installed in a mobile robot to search for victims in a collapsed building.

A target environment of our research is a narrow ditch leading to a wide space in a collapsed building. According to a rescue dog trainer, one of the requirements in searching for victims is that a rescue dog pass through a partially broken narrow ditch in a disaster field (such as a collapsed building). This is a very stressful and dangerous job for the dogs and the trainers. Therefore, we assume such an environment as our target environment.

In such an environment, many robotic technologies are required for search and rescue by mobile robots, and we focus on the following topics in this research:
1) Localization in three-dimensional space
2) Mapping a three-dimensional environment

In an unknown environment, the above topics are complementary to each other, because a robot must localize itself in a partially mapped environment. Figure 1 introduces the idea of our localization method: a robot acquires local information in (1), localizes itself in the partially mapped environment in (2), and expands the environment information in (3). To acquire local environment information, we use a laser range finder mounted on a mobile robot.

Fig. 1. Introduction to our localization method

In this paper, we introduce an algorithm that maps an environment and localizes a mobile robot's position in three dimensions. We also report a result of a preparatory experiment to verify the algorithm.

0-7803-7860-1/03/$17.00 © 2003 IEEE


II. RELATED WORKS

This research relates to the following research fields.

In the field of search and rescue robotics, many research topics focus on mechanisms to overcome uneven ground. Hirose et al. proposed snake-type mobile robots [1] and Gunryu (several mobile robots with arms that overcome rocks by cooperating with each other) [2] for search and rescue tasks. Murphy proposed a parent-and-child robot [3]: the parent mobile robot can navigate on uneven ground, and the child robot (which stands by inside the parent) can explore places like cliffs using a rope connecting it to the parent robot. Many mobile robots for search and rescue (including the above robots) have a crawler mechanism for locomotion that can overcome uneven ground. We have also chosen such a crawler-type mobile robot as our research platform. However, we focus on research in localization and mapping instead of locomotion mechanisms.

In the field of localization and mapping, SLAM (simultaneous localization and mapping) has recently become a popular class of algorithms. Thrun et al. systematized a SLAM algorithm for multiple mobile robots with a Bayesian method, and successfully implemented it on mobile robots in two-dimensional environments [4]. Choset et al. proposed a SLAM algorithm using the generalized Voronoi graph [5]; it was also successfully implemented on mobile robots in a two-dimensional environment. In our approach, the target environment is a three-dimensional space. A robot acquires local environment information and estimates its position in a partially mapped global map using a correlation technique.

For the representation of environments, three-dimensional polygon methods are used in the field of computer graphics. Representation using cubic grids is also a basic method. However, with these methods the data size for representing a large environment is so huge that some kind of data compression is required for real-time navigation of mobile robots. The DEM (Digital Elevation Map) is one of the popular methods to compress three-dimensional environment information (e.g. in [6]). It represents uneven ground by planar grids, each grid holding depth information. Unfortunately, it is impossible for a DEM to express unevenness of both the ceiling and the ground. In our research, we use an S-DEM (Sphere-DEM) to represent a local environment, and the global map is represented by the relative locations of several S-DEMs.

III. LOCALIZATION AND MAPPING METHOD

To explore an unknown environment, a mobile robot needs to map the environment and to localize itself in the built map simultaneously (this is called SLAM). On uneven ground (such as the inside of a collapsed building), geographical features are good information for three-dimensional localization of the robot. Therefore, we adopt the following method for SLAM of a mobile robot:

1) Moving a suitable distance from the last sensing point
2) Sensing three-dimensional range data
3) Building a local map (represented by three-dimensional range information from the robot's location)
4) Comparing the local map with the pre-constructed global map
5) Localizing the robot's position in the global map
6) Adding the local map information to the global map according to the robot's estimated position
7) Returning to 1

In the following sub-sections, we introduce the details of the above method.

A. Acquisition of three-dimensional range data

We assume that the robot has a laser range finder to construct a local map. A conventional laser range finder can detect ranges in a plane, so we rotate the sensor on an axis parallel to the plane to acquire three-dimensional data. Figure 2 shows the idea of this method. Using this method and slope sensors, the robot can detect three-dimensional range data relative to the global horizontal plane.

Fig. 2. Acquisition of three-dimensional environment data

B. Sphere digital elevation map (S-DEM)

Representing a local map by standard grids (or voxels) requires a huge memory area in a three-dimensional environment. To save computer resources, a popular method for representing uneven ground is the digital elevation map (DEM). The map uses two-dimensional grids on a horizontal plane of the world, and each grid has height information to represent the three-dimensional world. Unfortunately, it is difficult to represent a space like a narrow ditch leading to a wide space, which can exist inside collapsed buildings.
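The seven-step loop above can be sketched in a few lines. This is a minimal illustration under simplifying assumptions, not the authors' implementation: the "maps" are toy one-dimensional range arrays, and build_local_map, localize, and merge are illustrative stand-ins for the paper's components. The matching inside localize uses the same minimize-the-squared-difference idea as the correlation technique of Section III-C.

```python
import numpy as np

def build_local_map(ranges):
    # 3) represent the sensed ranges as a local map (here: a 1-D array)
    return np.asarray(ranges, dtype=float)

def localize(local_map, global_map):
    # 4)-5) slide the local map along the global map and keep the offset
    # with the smallest sum of squared range differences
    m = len(local_map)
    scores = [np.sum((global_map[o:o + m] - local_map) ** 2)
              for o in range(len(global_map) - m + 1)]
    return int(np.argmin(scores))

def merge(global_map, local_map, offset):
    # 6) write the local information into the global map at the found pose
    global_map[offset:offset + len(local_map)] = local_map
    return global_map

# toy run: a partially known "global map" of ranges, and a fresh local
# scan that was actually taken at offset 3
global_map = np.array([5.0, 5.0, 4.0, 2.0, 1.0, 2.0, 4.0, 5.0])
local_map = build_local_map([2.0, 1.0, 2.0])
offset = localize(local_map, global_map)   # -> 3
merge(global_map, local_map, offset)
```

In the paper the global map keeps relative poses between several S-DEMs rather than overwriting a single array, so the overwrite in merge is only a toy stand-in for step 6.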

In this research, we use the sphere-DEM (S-DEM) to represent a local map. The S-DEM is one type of elevation map; however, each height value is stored in a grid on the sphere's surface instead of on a flat plane. Each sensing point is represented by r(θ, φ), and the sensor's location is the center of the sphere. Figure 3 shows an example of an S-DEM in a two-dimensional case.

Fig. 3. Construction of the sphere digital elevation map (S-DEM)
Fig. 4. An example of a virtual S-DEM

C. Localization method

Using the S-DEM representation, a global map is represented by several S-DEMs and the relative locations among them. The robot's localization is then performed by detecting the relative location between one of the S-DEMs in the global map and an S-DEM at the robot's current position.

To calculate the relative location between two S-DEMs, we use a correlation technique with the following method. Firstly, we transform each boundary position r_gl(θ, φ) of the S-DEM in the global map into Cartesian coordinates (x, y, z). Secondly, we move the S-DEM's origin (x_g, y_g, z_g) to an arbitrary position (x_v, y_v, z_v). Then, we generate each virtual boundary position r_v(θ_v, φ_v) from (x, y, z) to form a virtual S-DEM whose center is (x_v, y_v, z_v). Finally, the correlation between the virtual S-DEM and the local S-DEM is calculated by the following:

    d = Σ_θ Σ_φ { r_v(θ, φ) - r_lc(θ, φ) }²    (1)

where r_lc(θ, φ) is a boundary position of the S-DEM in the local map.

Then, the location (x_v, y_v, z_v) that minimizes the value of d is the relative location between the global S-DEM and the local S-DEM.

Figure 4 shows an example of constructing a virtual S-DEM in a two-dimensional case. In the left figure, the S-DEM is represented by a group of gray segments, and the thick black curves are detected boundaries of the environment. The upper boundary is divided into two curves because of occlusion. When the origin of the S-DEM is virtually moved to the lower left, a virtual S-DEM is constructed as in the right figure.

Once the robot has calculated the value of the correlation d in (1) at every candidate for the virtual S-DEM's origin (x_v, y_v, z_v), it knows its location by picking the minimum value of d.

If the robot used only range sensor data for localization (and assumed many candidates of position and orientation), the calculation time would be too long. Practically, we assume that the robot uses a gyro sensor with a compass for detecting its orientation (and pose) to reduce the cost. It then estimates its x-y location and altitude using the above algorithm.

D. Expanding the global map

Once the robot knows its location in the global map, it merges the local S-DEM information to expand the global map. The details of the merging method are not determined yet. (It is one of our future works.)

IV. HARDWARE

Currently, we are setting up a sensor unit and a crawler-type mobile robot for experiments.

A. Sensor unit

The objective of the sensor unit mounted on our mobile robot is to detect environment information relative to a horizontal plane of the ground in the target environment. Therefore, we designed a sensor unit that consists of the following components:
1) a laser range finder (produced by SICK)
2) a gyro sensor unit (produced by NEC Tokin)
3) an actuator to lift the laser range finder
4) an actuator to rotate the laser range finder
An overview of the sensor unit is shown in Figure 5.
1) Laser range finder: The laser range finder (LRF) can detect ranges up to 8 meters within 5 millimeters of error. We trust ranges up to 3 meters, because the angular error becomes large at farther detection points.
2) Gyro sensor unit: The gyro sensor unit (produced by NEC Tokin) includes not only gyros but also a geomagnetism sensor to cancel drift errors in the gyros. It can detect three rotational angles around rectangular coordinates (φx, φy, φz).
3) A lifting mechanism:

We also added a function to lift the laser range finder. When a robot enters a narrow ditch, its height should be low. However, in a wide space, it is better for the robot to detect environment information with the sensor at a higher position. Therefore, we designed a mechanism to lift the laser range finder. The lifting mechanism includes slide joints, fixed joints, a trapezoidal screw and a DC motor. An overview of the mechanism is shown in Figure 6.

Fig. 5. An overview of our sensor unit
Fig. 6. A lifting mechanism of the laser range finder

The position of the laser range finder (x, y) is simply calculated by the kinematics of the slide joints using the parameters shown in Figure 6,

    y = sqrt(l² - w²)    (3)

where l is the link length shown in Figure 6 and the length w is detected by an encoder attached to the DC motor.

All functions of the sensor were designed to be performed automatically. However, in our current implementation, we operate the sensors manually.

B. Mobile robot (crawler type)

To mount the sensor unit, we developed a crawler-type mobile robot. It consists of two crawlers, two motors, batteries and a control computer. Basically, we consider a partially autonomous searching task instead of fully autonomous navigation. Figure 7 shows a photograph of the robot. We are now setting up the controller of the sensor unit.

Fig. 7. A crawler type mobile robot

V. PREPARATORY EXPERIMENT

A. Target environment

Currently, we do not have a test field of a collapsed building; hence, the experiment has been carried out in a simple indoor environment. Figure 8 shows the target environment of this experiment. It includes a flat floor with several obstacles. We also assume that the pose of the sensor unit is not always parallel to the horizontal plane of the ground. Therefore, the sensor unit should measure its inclination with the gyro sensor.

Fig. 8. A target environment

B. Procedure of experiment

To confirm the validity of our localization algorithm using our sensor unit, we have performed a simple preparatory experiment as follows.

(P-1) Firstly, we locate the sensor unit parallel to the ground, and the initial location of the unit is defined as the origin of the global coordinates. The unit then scans range data from -45[deg] to 40[deg] in tilting angle (the horizontal direction is equal to 0[deg]) in steps of 1[deg]. At each angle, the sensor measures range data over the panning angle (from -90[deg] to 90[deg]) in steps of 1[deg]. In this experiment, the tilting angle is measured by the gyro sensor unit. Thus the range data is stored into an S-DEM in the global map, r_gl(θ, φ).

(P-2) Secondly, we move the sensor unit a certain distance (and to a certain pose). The sensor unit then scans local data again (the same as in the first procedure), and the data is stored into an S-DEM in the local map, r_lc(θ, φ).

(P-3) Thirdly, we generate virtual S-DEMs from the global S-DEM to produce candidates of the robot's positions. In this experiment, candidates of the origin of the virtual S-DEM are located in (-50 < x < 50), (-30 < y < 60), (-10 < z < 10).

(P-4) Finally, we find the minimum value of d in equation (1), which gives the estimated location of the current sensor unit. We then compare the estimated location with the measured location.

C. Experimental result

At the first location of the sensor unit, we detected environment information using (P-1). Then we constructed the first S-DEM of the global map using the data of the laser range finder. Figure 9 shows the raw information from the laser range finder.

Fig. 9. A result of laser range data

It is very difficult to illustrate an S-DEM because each depth from the boundary is extended from the origin of the sphere. So, we sliced the S-DEM at θ = -30[deg] (the horizontal direction is equal to 0[deg]), shown in Figure 10. In this figure, unknown areas are illustrated in gray.

Fig. 10. A sliced S-DEM (θ = -30[deg]) in the global map

In the second step, we moved the sensor unit to the location (x = 22, y = 36, z = 5)[cm], with the pose of the unit unchanged. Then we measured environment data (P-2) and calculated correlations (P-3), (P-4). The estimation result was calculated as (x = 23, y = 35, z = 3)[cm]. It almost matches the measured location. Figure 11 shows a sliced S-DEM at θ = -30[deg]. We can see that the shape of the free space at the bottom of this figure matches the same place in Figure 10.

Fig. 11. A sliced S-DEM (θ = -30[deg]) in a local map

Next, we suppose that the robot has stepped on a rock, as shown in Figure 12. The moved location is the same as in the last experiment, (x = 22, y = 36, z = 5)[cm], and the tilted angle of the body is 8[deg].

The result of the estimated location is (x = 24, y = 35, z = 5)[cm], and it is almost the same as the measured location.

Fig. 12. A tilted pose of the robot

In these experiments, the grid size in the recalculation of the virtual S-DEM is 1[cm]. Therefore, we suppose that these are reasonable results from the point of view of accuracy.

On the other hand, we have had the experience that the estimated position along the z axis was not accurate. This is unavoidable in this experiment, because we use range data from -45[deg] to 40[deg] in the tilting motion and do not use ceiling information. Our intuition is that the robot can localize more accurately by using a full-size S-DEM.

VI. CONCLUSION AND FUTURE WORKS

In this paper, we proposed the concept of the S-DEM to represent environment information and to localize in a partially developed map. According to our preparatory experiments, the robot can estimate its location in a collapsed building.

We still have the following future works. One of the important future works is to set up the mobile platform with the constructed sensor unit to verify the proposed methods of localization and mapping. Another future work is to discuss a method of viewing the detected environment information for a human. Finally, we will try to apply our system to the test field of disaster environments in Kobe.

VII. ACKNOWLEDGMENTS

This research is fully supported by The Special Project for Earthquake Disaster Mitigation in Urban Areas of the Japanese Ministry of Education, Culture, Sports, Science and Technology.

VIII. REFERENCES

[1] E. F. Fukushima and S. Hirose, "An efficient steering control formulation for the articulated body mobile robot KR-II," Autonomous Robots, vol. 3, no. 1, pp. 7-18, 1996.
[2] S. Hirose, T. Shirasu and E. F. Fukushima, "Proposal for cooperative robot Gunryu composed of autonomous segments," Robotics and Autonomous Systems, Elsevier, vol. 17, pp. 107-118, 1996.
[3] R. Murphy, M. Ausmus, M. Bugajska, T. Ellis, T. Johnson, N. Kelley, J. Kiefer, and L. Pollock, "Marsupial-like mobile robot societies," IEEE Intelligent Systems, vol. 15, no. 3, pp. 364-365, 1999.
[4] S. Thrun, "A probabilistic online mapping algorithm for teams of mobile robots," International Journal of Robotics Research, vol. 20, no. 5, pp. 335-363, 2001.
[5] H. Choset and K. Nagatani, "Topological simultaneous localization and mapping (SLAM): toward exact localization without explicit localization," IEEE Trans. on Robotics and Automation, vol. 17, no. 2, pp. 125-137, 2001.
[6] R. Collins, Y. Tsin and A. Lipton, "Using a DEM to determine geospatial object trajectories," in DARPA Image Understanding Workshop, pp. 115-122, 1998.

