
See discussions, stats, and author profiles for this publication at: https://www.researchgate.net/publication/330362469

Lidar Application for Mapping and Robot Navigation on Closed Environment

Article in Journal of Measurements Electronics Communications and Systems · June 2018
DOI: 10.25124/jmecs.v4i1.1696

3 authors, including: Angga Rusdinar (Telkom University) and Rizki Ardianto Priramadhi (Telkom University)

All content following this page was uploaded by Angga Rusdinar on 16 October 2019.


Journal of Measurement, Electronic, Communication, and Systems 00 (2015) 000~000
www.jmecs.org/content/1738-494x
submitted manuscript under review

LIDAR Application for Mapping and Robot Navigation on Closed Environment


I. Maulana1, A. Rusdinar1 and R. A. Priramadhi1
1School of Electrical Engineering, Telkom University, Bandung, 40287, Indonesia

Abstract
Mapping and navigation on robots are now widely implemented in areas such as industry, home appliances, the military, exploration, and automated vehicles. In environments that are hard for humans to reach, mapping and navigation algorithms are absolutely necessary. To support these algorithms, a Light Detection and Ranging (LIDAR) sensor is needed. In this research, a LIDAR sensor unit is built using one laser sensor rotated by a servo motor, so that the distance between the sensor position and the measured points around it can be calculated. The distance information is converted into Cartesian coordinates; from these, local maps are built and localization information can be determined. The results show that the proposed sensor works properly and that the algorithm is able to generate map and position information.

Keywords: Mapping and Navigation; LIDAR; Localization; Local Map; Global Map

1. Introduction

Environmental mapping is one of the most important aspects of robotics research when dealing with localization, positioning, automatic navigation, and search and rescue [1]. Environmental mapping can be used in narrow caves, low-oxygen underground passageways, and other unknown environments that humans cannot reach. In addition, many manufacturing, mining, and industrial jobs that are still done manually by humans are very prone to accidents. A device that can map and navigate automatically is therefore needed so that robots can replace human workers.

Creating a map of an environment requires sensors that have a high level of accuracy and can reach points at long distances. LIDAR is a remote sensing technology with the potential to map, monitor, and estimate spatial element locations in many fields and applications related to the provision of geospatial databases [2]. Due to its data density and high accuracy, a LIDAR sensor is very suitable for robot mapping and navigation. Many researchers have reported using laser range finders in their systems: in [1], an RP Lidar sensor is used to create a map with a mobile robot. Rusdinar used laser range finders to correct the pose error of a mobile robot using a particle filter [3] and developed a vision-based indoor localization method for autonomous vehicles [4]. A. Rubinstein and T. Erez created LiTank, a robot used for tunnel mapping with Lidar sensors made by Velodyne [5].

Automatic navigation of the robot is also required in the mapped environment so that the robot can run properly. Data from the LIDAR can be mapped and used by the robot to localize itself in the environment; the mapping data is then used for navigation and motion planning. In addition, LIDAR data is also used to estimate the robot position required while mapping the surrounding environment. Localization and path planning are fundamental to the problem of robot navigation. To achieve successful navigation, the robot must be able to localize itself while simultaneously producing a map of the environment (SLAM) [6].

The main objective of this research is to create a spinner for a LIDAR sensor in order to scan the surrounding environment over a range of 360 degrees using only one sensor. The spinner is tested by building a map from the data obtained; the results of two scans are then compared so that the movement of the robot position can be predicted. Finally, the spinner is tested by creating a navigation path on the previously generated map.

It is expected that the device can be developed further so that it can be used on a robot for mapping and navigation. Most LIDAR spinner devices on the market are expensive, so researchers are encouraged to develop better and cheaper sensor spinners for research on robot mapping and navigation.

2. System Design

Mapping and navigation require several stages: data collection from the sensor, robot position estimation, map creation, and navigation. The sensor is rotated 360 degrees using a stepper motor. An Arduino retrieves data from the LIDAR-Lite V3 sensor through its PWM interface. Data from the sensor is taken at each step of the stepper motor rotation, sent over Bluetooth, and read through serial communication by Matlab. After the data is obtained, Matlab processes it into a map and navigation information. Each movement of the stepper motor is controlled by a command from Matlab, sent via Bluetooth over the serial link.
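The per-step acquisition loop described above (one distance reading per motor step, streamed over the Bluetooth serial link) can be sketched as follows. This is an illustrative Python sketch, not the authors' code: the "step,distance" packet format is an assumption, and 400 steps per revolution follows the 1:2 belt drive described later in Sec. 2.3.

```python
# Illustrative sketch (not the authors' Matlab/Arduino code): decoding
# per-step LIDAR readings as they might arrive over the serial link.
# The "step,distance_cm" packet format is a hypothetical assumption.

STEPS_PER_REV = 400  # effective resolution of the belt-driven spinner

def parse_packet(line: str) -> tuple[float, int]:
    """Turn one 'step,distance_cm' line into (angle_deg, distance_cm)."""
    step_str, dist_str = line.strip().split(",")
    step, dist = int(step_str), int(dist_str)
    angle = step * 360.0 / STEPS_PER_REV  # each step advances 0.9 degrees
    return angle, dist

# Example: three packets from one partial sweep
packets = ["0,150", "100,152", "200,148"]
readings = [parse_packet(p) for p in packets]
# readings -> [(0.0, 150), (90.0, 152), (180.0, 148)]
```

Each decoded pair is what the later polar-to-Cartesian conversion consumes.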

Figure 1 shows a flowchart of the mapping and navigation system. The program first connects Matlab to the Arduino over serial communication. Matlab then commands the Arduino to move the sensor and send data at every step. Once the data is obtained, Matlab processes it into a probability map. Matlab then retrieves the data again and stores it in different variables; the first data set and the newly processed data are passed to the scan matching function to obtain a new robot position. The position and orientation data are then used to add the new data to a global map. The data retrieval and global map updates continue until the whole environment is mapped. Once the map is complete, the previously obtained map data is padded with margins for the robot using the map inflation function. After that, a graph of traversable paths is built with the PRM function, and the shortest path is then searched based on the map and the PRM.

Fig. 1. Flowchart of System

2.1 Occupancy Grid

The Occupancy Grid maps the multidimensional space of the environment (usually 2D or 3D) into cells, where each cell stores a probabilistic state value [7]. Environmental information can be drawn from distance sensors, cameras, and bump sensors, which are commonly used to detect obstacles around the robot. There are two representations of the Occupancy Grid map: the Binary Occupancy Grid and the Probability Occupancy Grid.

The Binary Occupancy Grid uses the value True to represent occupied workspace (an obstacle) and False to represent free workspace. This grid shows where the obstacles are and whether the robot can move through a given space. The Probability Occupancy Grid instead uses probability values to create a more detailed map representation, and it is the preferred occupancy-grid method. Each cell holds a value representing the probability of that cell being occupied: a value close to 1 represents high certainty that the cell contains an obstacle, while a value close to 0 represents certainty that the cell is unoccupied and obstacle-free [8].

2.2 Probability Road Map

A Probability Road Map (PRM) is a network graph of paths that may exist in a map, determined by the free, unobstructed space [9]. Basically, PRM works by taking random samples from the map and checking whether each sample lies in free, obstacle-free space. A local plan is then made, and the plans are connected to one another based on proximity.

Fig. 2. PRM and minimum robot path planning [10]

There are two stages to PRM. The first is the construction stage, which builds a road map approximating the movements the robot can make in the environment. Initially, random configurations are created and then connected to some of their neighbors, usually the closest ones within a specified distance. The second is the query stage, in which the initial and goal position configurations are connected through the graph and Dijkstra's algorithm determines the shortest path.
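The probability occupancy grid of Sec. 2.1 can be illustrated with a minimal log-odds sketch. Python is used here for illustration only (the paper itself works in Matlab's Robotics System Toolbox), and the grid size and update probabilities are assumptions.

```python
# Minimal probability occupancy grid sketch (illustrative, not the
# paper's Matlab code). Log-odds storage makes repeated sensor updates
# simple additions; probabilities near 1 mean "occupied", near 0 "free".
import math

class ProbabilityGrid:
    def __init__(self, width, height, prior=0.5):
        self.logodds = [[math.log(prior / (1 - prior))] * width
                        for _ in range(height)]

    def update(self, row, col, p_hit):
        """Fuse one observation that reports occupancy with prob. p_hit."""
        self.logodds[row][col] += math.log(p_hit / (1 - p_hit))

    def probability(self, row, col):
        l = self.logodds[row][col]
        return 1 - 1 / (1 + math.exp(l))

grid = ProbabilityGrid(10, 10)
for _ in range(3):               # three "hit" observations of one cell
    grid.update(2, 3, 0.7)
occ = grid.probability(2, 3)     # climbs toward 1 with repeated hits
free = grid.probability(0, 0)    # untouched cell stays at the 0.5 prior
```

Repeated consistent observations drive a cell's probability toward certainty, which is why averaging many scans improves the map.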

2.3 System Design

Figure 3 shows the connections between the components in the system. First, the sensor data is read by the Arduino through a slip ring so that the sensor can rotate 360 degrees continuously. The Arduino controls the motion of a stepper motor through a motor driver. The sensor data is sent to a PC via Bluetooth, and the PC, running Matlab, processes the data into a map.

In the design of the LIDAR spinner, several important parts, shown in Figure 4, affect the quality of the device. First, a stepper disk and a LIDAR disk are connected by a rubber belt; with a 1:2 ratio between the disks, the resolution of the 200-step stepper motor is effectively increased to 400 steps, so the mapping can be finer. Second, a slip ring connects the continuously rotating LIDAR sensor to the Arduino Uno; this component is essential because it keeps the sensor cable from twisting as the sensor rotates. Third, an encoder measures the angular orientation of the sensor and can be used as feedback for stepper motor speed control.
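The angular resolution implied by the belt drive can be checked with a line of arithmetic, using the numbers from the spinner description (200-step motor, 1:2 disk ratio):

```python
# Worked check of the spinner's angular resolution (values from Sec. 2.3).
MOTOR_STEPS = 200
BELT_RATIO = 2                               # sensor disk turns half as fast
effective_steps = MOTOR_STEPS * BELT_RATIO   # 400 steps per sensor revolution
deg_per_step = 360 / effective_steps         # 0.9 degrees of scan per step
```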

2.4 LIDAR-Lite V3 and Data Collection

LIDAR-Lite V3 is a low-cost laser rangefinder sensor developed by Garmin. The sensor has a long range and good accuracy. It measures distance by calculating the time difference between the emission of a laser beam and its reception after being reflected off an object.

There are two basic configurations for this device, I2C and PWM. In this project, the PWM configuration is used to take distance data from the sensor. The angle data is derived from the count of steps of the driving stepper motor. The LIDAR-Lite V3 sensor is mounted on the sensor spinner, which is connected to the stepper motor by a belt. The spinner was designed in Solidworks and then printed in plastic. Figure 5 shows the LIDAR mounted on the printed spinner and the stepper motor.
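Reading the PWM output reduces to scaling a pulse width to centimeters. The sketch below assumes the roughly 10 microseconds-per-centimeter scale that Garmin documents for the LIDAR-Lite PWM output; verify the exact constant against the datasheet before relying on it. `pulse_in_us` stands in for an Arduino `pulseIn()` reading.

```python
# Sketch of converting a PWM pulse width to distance for LIDAR-Lite V3.
# The 10 us/cm scale factor is an assumption to check against the
# Garmin datasheet; pulse_in_us stands in for Arduino's pulseIn().

US_PER_CM = 10.0  # assumed PWM scale factor

def pulse_to_distance_cm(pulse_in_us: float) -> float:
    """Convert a measured high-pulse duration to distance in cm."""
    return pulse_in_us / US_PER_CM

# A 1000 us pulse would correspond to a 100 cm (1 m) target
print(pulse_to_distance_cm(1000.0))
```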

Fig. 3. Connection of Hardware Component

Fig. 5. Printed LIDAR and Spinner

Fig. 4. Design of Hardware

Data from the LIDAR sensor is read by the Arduino and then sent to the PC via Bluetooth communication. The distance and angle data are then converted into Cartesian form as follows:

i. Convert degrees to radians:

θ_rad = θ_deg × (π / 180)    (1)

where θ_rad denotes the angle in radians and θ_deg the angle in degrees.

ii. Convert polar to Cartesian:

x = rho × sin(θ_rad)    (2)
y = rho × cos(θ_rad)    (3)

where x is the position on the x-axis, y is the position on the y-axis, and rho is the measured distance.
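Equations (1)-(3) translate directly into code. The sketch below follows the paper's convention exactly (x uses sine and y uses cosine, i.e. angles are measured from the y-axis); it is an illustrative Python version, not the authors' Matlab code.

```python
# Sketch of Eqs. (1)-(3): converting one (distance, angle) LIDAR
# reading to Cartesian coordinates as written in the paper.
import math

def polar_to_cartesian(rho_cm: float, theta_deg: float) -> tuple[float, float]:
    theta_rad = theta_deg * math.pi / 180.0   # Eq. (1)
    x = rho_cm * math.sin(theta_rad)          # Eq. (2)
    y = rho_cm * math.cos(theta_rad)          # Eq. (3)
    return x, y

x, y = polar_to_cartesian(100.0, 90.0)
# at 90 degrees the point lies on the x-axis: x = 100, y ~ 0
```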

Maps and navigation are generated in Matlab, with the Robotics System Toolbox loaded, as used in this research.

Fig. 6. Graph of Standard Deviation of Distance Measurement by Sensor
3. Testing and Analysis
The first test is the sensor measurement test against a straight wall. This test aims to compare the distance read by the LIDAR sensor with the actual distance at several specified distances. The test is done by placing the sensor at a measured distance from the barrier to the sensor surface; the barrier is a white concrete wall. Data retrieval from the LIDAR-Lite V3 sensor uses the PWM value read by the Arduino controller. Thirty samples are taken for each distance, one reading per second. The sensor readout data is processed by the PLX-DAQ add-on, stored in Microsoft Excel, and plotted.

Table 1. Distance Measurement Results by LIDAR Sensor

Actual distance (cm) | Average of 30 distance measurements by sensor (cm) | Relative error | Standard deviation of distance measurement by sensor (cm)
15 | 16.00 | 0.0667 | 1.05
16 | 16.73 | 0.0458 | 1.53
17 | 17.13 | 0.0078 | 1.59
18 | 18.30 | 0.0167 | 1.95
19 | 19.50 | 0.0263 | 1.69
20 | 20.50 | 0.0250 | 1.94
30 | 33.27 | 0.1089 | 3.18
75 | 74.93 | 0.0009 | 1.57
90 | 90.87 | 0.0096 | 1.38
100 | 103.80 | 0.0380 | 2.11
150 | 155.17 | 0.0345 | 2.12
300 | 308.17 | 0.0272 | 3.51
600 | 603.37 | 0.0056 | 3.77
1200 | 1218.77 | 0.0156 | 10.11
2000 | 2008.87 | 0.0044 | 9.62

Based on Table 1, if the distance measurement by the LIDAR sensor is repeated 30 times and averaged, the measured value has a relative error of less than 0.109 with respect to the actual distance. Based on the graph in Fig. 6, measurements at smaller distances are better, shown by the decreasing standard deviation. The test is limited to 20 meters because beyond 20 meters the distance data is no longer obtained continuously. It can be concluded that a good measurement distance limit for mapping is less than 20 meters, and the minimum measurement limit can be adjusted to the robot dimensions.

The second test is distance measurement and visualization at a certain angle. This test aims to determine the reading results over a certain angular range; these reading results determine the accuracy of the sensor data used for mapping. The test is done by placing the sensor in a room with a white styrofoam barrier. Measurements were made from 0 degrees to 90.882 degrees in increments of 1.782 degrees, giving 20 data samples. The actual distance at each angle is measured with a tape measure. Data retrieval from the LIDAR-Lite V3 sensor uses the PWM value read by the Arduino controller; the sensor readout data is processed by the PLX-DAQ add-on for storage in Microsoft Excel. One reading is taken every second.
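The statistics reported in Table 1 are a mean over 30 samples and a relative error against the actual distance. The sketch below recomputes one row's relative error; since the 30 raw samples are not published, the sample list here is synthetic.

```python
# Illustrative recomputation of one Table 1 row's statistics (the 30
# raw samples are not published, so the sample list is synthetic).
import statistics

actual_cm = 30.0
samples = [33.27] * 30          # synthetic stand-in for 30 real readings

average = statistics.mean(samples)
relative_error = abs(average - actual_cm) / actual_cm
print(round(relative_error, 4))  # 0.109, matching the worst row of Table 1
```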

Fig. 7. Graph of sensor and distance measurement results

Fig. 8. Measurement Room and Plotting Result

Based on the test results in Figure 7, the LIDAR distance readings at each angle are close to the actual distance values, and based on Figure 8 the visualization of the measurements at each angle is close to the shape of the test room, although some error remains. The distance reading error itself can be mitigated by building a map divided into cells of a certain resolution.

The third test is distance measurement on certain surface colors. This test aims to determine the effect of the surface color on the distance reading. The test is performed by facing the sensor at paper targets colored red, green, blue, white, and black at a distance of 1 meter. Twenty samples are taken for each color. Data retrieval from the LIDAR-Lite V3 sensor uses the PWM value read by the Arduino controller; the sensor readout data is processed by the PLX-DAQ add-on for storage in Microsoft Excel. One reading is taken every second.

Table 2. Distance Measurement Results by LIDAR Sensor on Certain Colors (cm)

No. | Red | Green | Blue | White | Black
1 | 103 | 104 | 104 | 101 | 88
2 | 103 | 105 | 106 | 103 | 99
3 | 105 | 105 | 105 | 102 | 104
4 | 114 | 105 | 105 | 102 | 105
5 | 110 | 104 | 103 | 105 | 96
6 | 102 | 103 | 105 | 104 | 101
7 | 102 | 106 | 104 | 102 | 102
8 | 105 | 103 | 103 | 103 | 98
9 | 103 | 104 | 107 | 104 | 106
10 | 102 | 104 | 106 | 101 | 103
11 | 105 | 105 | 107 | 105 | 106
12 | 104 | 103 | 101 | 102 | 104
13 | 103 | 104 | 103 | 104 | 101
14 | 105 | 103 | 105 | 102 | 96
15 | 102 | 104 | 104 | 101 | 102
16 | 104 | 106 | 103 | 101 | 98
17 | 103 | 104 | 103 | 100 | 99
18 | 102 | 103 | 103 | 101 | 102
19 | 106 | 103 | 102 | 102 | 100
20 | 105 | 103 | 102 | 103 | 98
Average (cm) | 104.4 | 104.05 | 104.05 | 102.4 | 100.4
Standard deviation (cm) | 2.96 | 0.99 | 1.67 | 1.43 | 4.22

Based on the test results in Table 2, each measured color has a different standard deviation, which can influence the mapping result. Surface colors with a high standard deviation can produce large reading errors because of the wide spread of values.

The fourth test is distance measurement on specific surface materials. This test aims to determine the effect of the measured surface material on the distance reading. The test is almost the same as the color test, except that a different material is measured by the sensor in each run.

Table 3. Distance Measurement Results by LIDAR Sensor on Certain Materials (cm)

No. | Concrete | Wood | Plastic | Iron | Cardboard
1 | 81 | 87 | 84 | 102 | 101
2 | 104 | 97 | 102 | 104 | 109
3 | 103 | 106 | 101 | 103 | 110
4 | 105 | 100 | 103 | 105 | 117
5 | 106 | 103 | 102 | 103 | 118
6 | 102 | 96 | 104 | 103 | 110
7 | 100 | 101 | 101 | 109 | 111
8 | 100 | 101 | 103 | 105 | 108
9 | 107 | 100 | 102 | 106 | 106
10 | 106 | 98 | 104 | 105 | 113
11 | 101 | 101 | 103 | 103 | 109
12 | 100 | 101 | 104 | 95 | 110
13 | 105 | 101 | 99 | 105 | 120
14 | 106 | 99 | 101 | 108 | 115
15 | 106 | 115 | 100 | 104 | 106
16 | 100 | 111 | 101 | 106 | 114
17 | 102 | 117 | 103 | 105 | 116
18 | 97 | 105 | 102 | 109 | 107

19 | 81 | 105 | 101 | 105 | 111
20 | 107 | 102 | 100 | 107 | 116
Average (cm) | 102.42 | 100.71 | 101.09 | 104.6 | 111.35
Standard deviation (cm) | 7.39 | 6.62 | 4.24 | 2.99 | 4.75

Based on the standard deviation values in Table 3, the measured material can influence the measurement result. Differences in measurement results in environments with significant material variation can affect the shape of the mapping visualization.

The fifth test is a visualization test of the LIDAR sensor scan results. This test aims to compare the visualization of LIDAR scans over a 360-degree rotation with the actual shape of the test environment. The test is done by rotating the LIDAR sensor through 360 degrees using the stepper motor; the stepper motor spinner takes 404 steps to rotate 360 degrees, and data is collected at every step. The data is transmitted over the serial link and read by Matlab, which processes the scan data into a visualization with an X mark at the robot position.

Fig. 9. Test room

Fig. 10. Test room dimensions

Fig. 11. Sensor Scan Mapping results

Based on the scan data, the form of the visualization is close to the original shape of the room. The visualized walls are not perfectly straight lines because of the variation in sensor readout error. The robot is at coordinate (0, 0). Subtracting the X coordinates of the left and right sides, the visualized length is 191.22 cm, an error of 0.006%; subtracting the Y coordinate values on the lower side gives 212.48 cm, an error of 3.6%. Based on the visualization, the obtained dimensions are close to the actual dimensions.

The last test is the mapping and navigation test of the data from the sensor spinner.
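The dimension check in the fifth test amounts to taking the extent of the scanned Cartesian points along one axis and comparing it with the known room size. The sketch below uses synthetic scan points and an assumed room dimension, not the paper's data.

```python
# Sketch of the fifth test's dimension check (synthetic points, not the
# paper's data): estimate a wall-to-wall length from the spread of the
# scanned X coordinates and compare it with a known room dimension.
def estimated_length(points_cm):
    xs = [x for x, _ in points_cm]
    return max(xs) - min(xs)   # left-to-right extent of the scan

scan = [(-95.6, 10.0), (-95.0, 40.0), (95.6, 12.0), (95.4, 38.0)]
length = estimated_length(scan)                  # ~191.2 cm here
error_pct = abs(length - 191.0) / 191.0 * 100    # vs an assumed 191 cm room
```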

Fig. 12. 1st, 10th and 20th steps of mapping

Based on the visualization of the mapping results in Figure 12, there is an error in reading the orientation of the robot, which causes an error in the mapping visualization. Because the position and orientation estimation is based on the current and previous scan data, the resulting error grows and the mapping result worsens; this can be seen in the visualization, which moves further from the actual shape at every mapping step.

Fig. 13. Test room map

Fig. 14. PRM

Fig. 15. Path planning

Based on the test results in Figures 13, 14, and 15, the map can be used as a navigation reference for the robot. Once the path is obtained, the robot can be controlled to follow the path.
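The two PRM stages described in Sec. 2.2 (construction, then a Dijkstra query) can be condensed into a small sketch. This is illustrative Python, not the paper's Matlab PRM: the node positions and the connection radius are assumptions.

```python
# Minimal PRM-style sketch: connect nodes closer than a radius
# (construction stage), then run Dijkstra between two configurations
# (query stage). Node positions and RADIUS are illustrative assumptions.
import heapq, math

nodes = {0: (0, 0), 1: (0, 2), 2: (2, 2), 3: (2, 4), 4: (4, 4)}
RADIUS = 2.5

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# construction stage: undirected edges between nearby free-space nodes
edges = {i: [] for i in nodes}
for i in nodes:
    for j in nodes:
        if i < j and dist(nodes[i], nodes[j]) <= RADIUS:
            edges[i].append((j, dist(nodes[i], nodes[j])))
            edges[j].append((i, dist(nodes[i], nodes[j])))

def shortest_path_length(start, goal):
    """Query stage: Dijkstra's algorithm over the road map."""
    best = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            return d
        if d > best.get(u, math.inf):
            continue
        for v, w in edges[u]:
            nd = d + w
            if nd < best.get(v, math.inf):
                best[v] = nd
                heapq.heappush(heap, (nd, v))
    return math.inf

path_len = shortest_path_length(0, 4)  # 0-1-2-3-4, total length 8.0
```

Adding more nodes densifies the graph and shortens the found paths, at the cost of more construction and query time, which mirrors the trade-off noted below.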

The determination of the number of nodes in the PRM involves a trade-off: more nodes give a better result, but the calculation time is longer.

4. Conclusion

From the results of testing and analysis, this final project concludes that the LIDAR-Lite V3 sensor has a different standard deviation for each measured distance, ranging from 1.05 cm to 10.11 cm for measurements under 20 meters. A good distance for the LIDAR-Lite V3 sensor to map the environment is less than 20 meters, with a standard deviation of 9.62 cm. The color and material of the measured medium can affect the distance readings of the sensor. Position estimation using scan matching produced errors visible in the visualization results; the resulting error grows because each stage of the test uses the position estimate from the previous stage. Errors in the position and orientation estimates can cause mapping errors due to the insertion of poorly aligned scan data.

References
[1] M. A. Markom, A. H. Adom, E. S. M. M. Tan, S. A. A. Shukor, N. A. Rahim, A. Y. M. Shakaff, "A Mapping Mobile Robot using RP LIDAR Scanner," IEEE Transactions on Robotics and Intelligent Sensors, 2015.
[2] E. Prahasta, Pengolahan Data LIDAR. Bandung: Informatika, 2015.
[3] A. Rusdinar, J. Kim, S. Kim, "Error pose correction of mobile robot for SLAM problem using laser range finder based on particle filter," International Conference on Control, Automation and Systems (ICCAS), 2010.
[4] A. Rusdinar, S. Kim, "Vision-Based Indoor Localization Using Artificial Landmarks and Optical Flow and a Kalman Filter," International Journal of Fuzzy Logic and Intelligent Systems, vol. 13, no. 2, June 2013, pp. 133-139.
[5] A. Rubinstein, T. Erez, "Autonomous Robot for Tunnel Mapping," International Conference on the Science of Electrical Engineering, 2016.
[6] N. Jain, Y. P. Kumar, K. S. Nagla, "Corner Extraction From Indoor Environment For Mobile Robot Mapping," presented at India Conference (INDICON), New Delhi, India, 2015.
[7] A. Elfes, "Using Occupancy Grids for Mobile Robot Perception and Navigation," IEEE Transactions on Computers, 1989.
[8] https://www.mathworks.com/help/robotics/ug/occupancy-grids.html [Accessed 20 December 2017, 20:19 WIB].
[9] https://www.mathworks.com/help/robotics/ug/occupancy-grids.html [Accessed 21 December 2017, 08:00 WIB].
[10] P. Corke, Robotics, Vision and Control: Fundamental Algorithms in MATLAB. Berlin: Springer, 2011.

Ikhsan Maulana was born in Tasikmalaya, September 15th, 1995. He graduated from High School 5 Tasikmalaya and then studied Electrical Engineering at Telkom University.

Angga Rusdinar received his B.Eng. degree in Electrical Engineering from Sepuluh November Institute of Technology, Indonesia, in 2001, his M.Eng. degree from the School of Electrical Engineering and Informatics, Indonesia, in 2006, and his Ph.D. from the School of Electrical Engineering, Pusan National University, Korea. His research interests include robotics, robot vision, localization, and navigation systems. He is now a lecturer at Telkom University and head of Autonomous and Control System (INACOS).

Rizki Ardianto Priramadhi received his Bachelor, Master, and Doctor of Electrical Engineering degrees from ITB, Indonesia. He currently works as a lecturer in the Electrical Engineering Department of Telkom University, Bandung, Indonesia.
