Abstract— In this paper, we describe a positioning method for a moving mobile robot in a known environment. The proposed technique estimates the robot's location by letting a camera recognize QR-code markers fixed in the environment. These fixed markers provide reference points listed in a database. The system calculates the angle to each landmark and then corrects the robot's direction. An extended Kalman filter corrects the position and orientation of the robot from the error between the measured viewing angle and the estimate to each datum. Results from experiments in real environments, including analysis, show that the approach improves localization accuracy and is sufficient for robot navigation tasks.

Keywords – Data fusion; Sensor fusion; Laser range finder; Localization; Omni-camera; Sonar; Extended Kalman filter; GPS.

I. INTRODUCTION

Mobile robots, such as self-propelled vehicles, are used in many industrial, service, medical, and military applications [1-3]. The research and development of these robots is currently of interest to many researchers in electronics and automatic control around the world. One of the issues of concern is the precise positioning of the robot while guiding it to its destination in an indoor or outdoor environment where no GPS signal is available. In general, a robot consists of three basic components: sensors, a controller, and actuators [4-6]. For positioning tasks, the sensing part consists of electronic sensors, hardware signal-processing circuits, and software programs that play a key role in accurately estimating the robot's position, orientation, and motion from the sensing data [7,8]. However, because each sensor normally detects only one or two ambient characteristics with limited accuracy, employing more sensors increases the probability of an accurate estimate of the robot's position. That is why sensor fusion is now widely applied in modern robotic systems to increase measurement accuracy, as previously presented by the authors and their research team [9-11]. The basis of this method is probabilistic reasoning; the extended Kalman filter (EKF) provides a probabilistic solution for combining information from the robot's internal and external sensors. However, the above methods require complex hardware and processing, especially for real-time data.

Some works [12,13] have used QR-code points as landmarks for the robot to localize itself as it approaches them, or have combined them with the Dijkstra algorithm [14,15] to find information about the desired destination. However, these methods are also quite complicated: the robot has to perform a large amount of computation, and it is difficult for it to accurately follow the desired trajectory when operating in an environment with many obstacles. Recent works of the authors' group propose navigation methods based on virtual paths [16,17]. These methods require the robot's operating environment to have a flat floor with a very small distance between the Hall sensors and the floor, so the robot's operating range is also quite limited. Starting from these points, this paper presents the construction of a sensing system for a mobile robot that fuses two sensors, wheel encoders (odometry) and a camera recognizing fixed QR-code markers, together with the associated electronic circuit design and an efficient communication network between the sensors and the controller. On that basis, a sensor-fusion software program based on extended Kalman filter theory was developed and is presented here; it significantly improves the accuracy of robot positioning.

The rest of this paper is organized as follows. Section II presents the electronic hardware structure of the sensors and the data mechanisms. Section III provides the sensor fusion method with the EKF. Section IV addresses experimental results with real robots and their analysis. Finally, conclusions are provided in Section V.

Fig. 1. (a) The robot in the experiment; (b) Detailed mechanical drawing of the base of the robot; (c) Kinetic model of mobile robot.
II. SENSOR SYSTEM DESIGN

The robot in Fig. 1 is built as a two-wheeled type. Each wheel is attached to an independently controlled electric motor. Different types of sensors are designed and installed in the sensing unit of the mobile robot, as shown in Fig. 2. Their technical features are presented as follows.

To calculate depth, stereo image sensing methods use two cameras. Intel RealSense stereo depth technology brings 3D perception to devices and machines that can otherwise only see in two dimensions. The RealSense D435, shown in Fig. 4, offers long-range capability, excellent depth resolution, and active infrared (IR) stereo, and is designed for ease of setup and mobility. The D435's wide field of view and global shutter make it well suited for robotic navigation and object recognition. A single camera can cover more ground with a broader field of vision, resulting in fewer blind spots, and the global shutter sensors give remarkable low-light sensitivity, allowing robots to navigate places with the lights out. The camera is connected to the computer via a USB port, and embedded programming is required to collect its data. The control program is developed by us in C++.

Although a marker may be detected at any rotation in the scene, the detection procedure must be able to determine its original rotation so that each corner can be identified unambiguously. This is also done using binary encoding. A dictionary of markers is the set of markers that can be used in a specific application; it is simply the list of the binary encodings of its markers. The dictionary size and the marker size are the two most important properties of a dictionary.

The ArUco module comes with a number of predefined dictionaries covering a range of dictionary sizes and marker sizes. A marker's ID is simply the index of that marker in the dictionary; for example, the first five markers in a dictionary have the IDs 0, 1, 2, 3, and 4.
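For illustration, a minimal C++ sketch (ours, not the authors' implementation; the choice of DICT_6X6_250 is an assumption) loads one of OpenCV's predefined ArUco dictionaries using the classic opencv_contrib interface:

```cpp
// Minimal sketch: selecting a predefined ArUco dictionary with the classic
// OpenCV contrib API (pre-4.7 interface). DICT_6X6_250 is an assumed choice.
#include <opencv2/aruco.hpp>
#include <iostream>

int main() {
    // A predefined dictionary of 250 markers, each encoded on a 6x6 binary grid.
    cv::Ptr<cv::aruco::Dictionary> dictionary =
        cv::aruco::getPredefinedDictionary(cv::aruco::DICT_6X6_250);

    // The marker ID is just its index in the dictionary: 0, 1, 2, ...
    std::cout << "Markers in dictionary: " << dictionary->bytesList.rows << std::endl;
    return 0;
}
```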
E. Create markers

The markers must be printed and placed in the environment before they can be detected. The drawMarker() method in OpenCV can be used to create the marker images.
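A minimal sketch of marker generation with this method (assumed usage; the marker ID, pixel size, and file name are illustrative):

```cpp
// Minimal sketch: generating a printable marker image with the classic
// OpenCV aruco API. ID 23, 200x200 pixels and the file name are illustrative.
#include <opencv2/aruco.hpp>
#include <opencv2/imgcodecs.hpp>

int main() {
    cv::Ptr<cv::aruco::Dictionary> dictionary =
        cv::aruco::getPredefinedDictionary(cv::aruco::DICT_6X6_250);

    cv::Mat markerImage;
    // Draw marker ID 23 as a 200x200 pixel image with a 1-bit border.
    cv::aruco::drawMarker(dictionary, 23, 200, markerImage, 1);
    cv::imwrite("marker23.png", markerImage);
    return 0;
}
```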
F. Marker recognition

Given an image containing ArUco markers, the detection step must produce a list of the identified markers, as shown in Fig. 6. Each detected marker carries the following information: the image positions of its four corners (in their original order) and the marker ID.

Fig. 6. Recognized markers and ID.
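A minimal detection sketch under the same assumptions (classic opencv_contrib interface; the image file names are illustrative):

```cpp
// Minimal sketch: detecting ArUco markers in a camera frame.
#include <opencv2/aruco.hpp>
#include <opencv2/imgcodecs.hpp>
#include <vector>

int main() {
    cv::Mat frame = cv::imread("frame.png");

    cv::Ptr<cv::aruco::Dictionary> dictionary =
        cv::aruco::getPredefinedDictionary(cv::aruco::DICT_6X6_250);
    cv::Ptr<cv::aruco::DetectorParameters> params =
        cv::aruco::DetectorParameters::create();

    std::vector<int> ids;                           // marker IDs
    std::vector<std::vector<cv::Point2f>> corners;  // four corners per marker
    cv::aruco::detectMarkers(frame, dictionary, corners, ids, params);

    // Overlay the detected corners and IDs for visual inspection (as in Fig. 6).
    cv::aruco::drawDetectedMarkers(frame, corners, ids);
    cv::imwrite("frame_detected.png", frame);
    return 0;
}
```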
G. Estimated location

After the markers have been detected, we want to deduce the camera pose from them. Camera pose estimation requires the calibration parameters of the camera; these can be obtained with the calibrateCamera() method, following OpenCV's calibration tutorial, if the camera has not yet been calibrated. When ArUco markers are used for pose estimation, the pose of each individual marker can be estimated separately. As shown in Fig. 7, the camera pose with respect to a marker is the three-dimensional transformation from the marker's coordinate system to the camera's coordinate system. It is defined by a rotation vector rvec and a translation vector tvec. The Kalman filter will use these quantities to locate the robot.

Fig. 7. Coordinate system associated with the camera.
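A minimal pose-estimation sketch (assumed usage; the 0.15 m marker side length and the variable names are illustrative, and cameraMatrix/distCoeffs are the results of calibrateCamera()):

```cpp
// Minimal sketch: estimating the pose of each detected marker.
#include <opencv2/aruco.hpp>
#include <vector>

void estimatePoses(const std::vector<std::vector<cv::Point2f>>& corners,
                   const cv::Mat& cameraMatrix, const cv::Mat& distCoeffs,
                   std::vector<cv::Vec3d>& rvecs, std::vector<cv::Vec3d>& tvecs) {
    // One rvec/tvec pair per marker: the transform from the marker frame
    // to the camera frame (Fig. 7). 0.15 m marker side length is assumed.
    cv::aruco::estimatePoseSingleMarkers(corners, 0.15f,
                                         cameraMatrix, distCoeffs, rvecs, tvecs);
}
```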
III. SENSOR FUSION

The positioning task determines the coordinates and orientation of the robot in its environment. The position of the robot (coordinates and direction) in the global coordinate system is represented by the state vector $x = [x, y, \theta]^T$, where $x, y$ are the coordinates of the center of the robot and $\theta$ is the angle between the robot's $X_R$ axis and the $X_G$ axis of the global coordinate system.

When the angular velocities of the right and left wheels are set to $\omega_R$ and $\omega_L$, then over a sufficiently short sampling interval $\Delta t$ the input control signals, i.e. the displacement increments of the two wheels, are

$$\Delta s_R = R\,\omega_R\,\Delta t, \qquad \Delta s_L = R\,\omega_L\,\Delta t,$$

where $R$ is the wheel radius. From these, the distance increment $\Delta s$ of the robot center and the orientation increment $\Delta\theta$ are

$$\Delta s = \frac{\Delta s_R + \Delta s_L}{2}, \qquad \Delta\theta = \frac{\Delta s_R - \Delta s_L}{L},$$

with $L$ the distance between the two wheels, and the state is propagated as

$$\begin{aligned}
x_k &= x_{k-1} + \Delta s_k \cos(\theta_{k-1} + \Delta\theta_k/2) \\
y_k &= y_{k-1} + \Delta s_k \sin(\theta_{k-1} + \Delta\theta_k/2) \\
\theta_k &= \theta_{k-1} + \Delta\theta_k.
\end{aligned}$$
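For illustration, a minimal C++ sketch of this odometry update from the wheel increments (ours, not the authors' code):

```cpp
// Minimal sketch of the odometry prediction above. The wheel base L is an
// assumed parameter of the robot platform.
#include <cmath>

struct Pose { double x, y, theta; };

// Propagate the pose from the wheel displacement increments dsR, dsL (meters).
Pose predictPose(const Pose& p, double dsR, double dsL, double wheelBaseL) {
    const double ds     = 0.5 * (dsR + dsL);           // center displacement
    const double dtheta = (dsR - dsL) / wheelBaseL;    // heading increment
    Pose out;
    out.x     = p.x + ds * std::cos(p.theta + 0.5 * dtheta);
    out.y     = p.y + ds * std::sin(p.theta + 0.5 * dtheta);
    out.theta = p.theta + dtheta;
    return out;
}
```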
The displacement increments $\Delta s_R$ and $\Delta s_L$ (proportional to the set wheel speeds $\omega_R$ and $\omega_L$ and measured by the encoders) are the input variables affected by systematic and random errors. Under disturbance they can be expressed as the sum of a nominal value and a noise term:

$$\Delta s_R = \Delta s_{R0} + \varepsilon_R, \qquad \Delta s_L = \Delta s_{L0} + \varepsilon_L,$$

where $\Delta s_{R0}$ and $\Delta s_{L0}$ are the nominal values of the input signal, and $\varepsilon_R$, $\varepsilon_L$ are process noise values, assumed to be independent, zero-mean, normally distributed white noise.

The EKF calculation is carried out in the following phases:

1) Starting

At step $k = 0$ the robot starts from an exactly known initial position with the posterior state estimate $\hat{x}_0$ and the error covariance estimate $P_0$.

2) Prediction phase

The a priori state $\hat{x}_k^-$ is calculated according to the motion model above with the input control signal $u = [\Delta s_R\ \ \Delta s_L]^T$ measured by the encoders. To obtain the a priori covariance $P$, the matrices $Q$, $A$ and $W$ are defined as follows. $Q_k$ is the input noise covariance matrix of size $[2\times 2]$:

$$Q_k = \mathrm{covar}(\Delta s_{Rk}, \Delta s_{Lk}) = \begin{bmatrix} \delta_R |\Delta s_{Rk}| & 0 \\ 0 & \delta_L |\Delta s_{Lk}| \end{bmatrix}$$
Here $\delta_R$ and $\delta_L$ are error constants representing the uncertainty parameters of the motor control circuit and of the interaction of the wheels with the floor, and they are independent of each other. Thus, the variance of the process noise is proportional to the absolute value of the increment traveled by each wheel, $\Delta s_R$ and $\Delta s_L$. Based on the observed position and orientation errors, the parameter $\delta$ of the input covariance matrix $Q_k$ is set to $\delta_R = \delta_L \equiv \delta = 0.005$.

$A_k$ is the $[3\times 3]$ Jacobian of the motion model with respect to the state, calculated as

$$A_k = \begin{bmatrix}
\dfrac{\partial f_x}{\partial x} & \dfrac{\partial f_x}{\partial y} & \dfrac{\partial f_x}{\partial \theta} \\[4pt]
\dfrac{\partial f_y}{\partial x} & \dfrac{\partial f_y}{\partial y} & \dfrac{\partial f_y}{\partial \theta} \\[4pt]
\dfrac{\partial f_\theta}{\partial x} & \dfrac{\partial f_\theta}{\partial y} & \dfrac{\partial f_\theta}{\partial \theta}
\end{bmatrix}
= \begin{bmatrix}
1 & 0 & -\Delta s_k \sin\!\left(\theta_{k-1} + \dfrac{\Delta\theta_k}{2}\right) \\[4pt]
0 & 1 & \Delta s_k \cos\!\left(\theta_{k-1} + \dfrac{\Delta\theta_k}{2}\right) \\[4pt]
0 & 0 & 1
\end{bmatrix}$$

$W_k$ is the $[3\times 2]$ Jacobian of the motion model with respect to the inputs $\Delta s_R$ and $\Delta s_L$, calculated as

$$W_k = \begin{bmatrix}
\dfrac{\partial f_x}{\partial \Delta s_R} & \dfrac{\partial f_x}{\partial \Delta s_L} \\[4pt]
\dfrac{\partial f_y}{\partial \Delta s_R} & \dfrac{\partial f_y}{\partial \Delta s_L} \\[4pt]
\dfrac{\partial f_\theta}{\partial \Delta s_R} & \dfrac{\partial f_\theta}{\partial \Delta s_L}
\end{bmatrix}
= \begin{bmatrix}
\dfrac{1}{2}\cos\!\left(\theta_{k-1} + \dfrac{\Delta\theta_k}{2}\right) - \dfrac{\Delta s_k}{2L}\sin\!\left(\theta_{k-1} + \dfrac{\Delta\theta_k}{2}\right) &
\dfrac{1}{2}\cos\!\left(\theta_{k-1} + \dfrac{\Delta\theta_k}{2}\right) + \dfrac{\Delta s_k}{2L}\sin\!\left(\theta_{k-1} + \dfrac{\Delta\theta_k}{2}\right) \\[6pt]
\dfrac{1}{2}\sin\!\left(\theta_{k-1} + \dfrac{\Delta\theta_k}{2}\right) + \dfrac{\Delta s_k}{2L}\cos\!\left(\theta_{k-1} + \dfrac{\Delta\theta_k}{2}\right) &
\dfrac{1}{2}\sin\!\left(\theta_{k-1} + \dfrac{\Delta\theta_k}{2}\right) - \dfrac{\Delta s_k}{2L}\cos\!\left(\theta_{k-1} + \dfrac{\Delta\theta_k}{2}\right) \\[6pt]
\dfrac{1}{L} & -\dfrac{1}{L}
\end{bmatrix}$$

With these matrices, the a priori error covariance is propagated in the standard EKF form $P_k^- = A_k P_{k-1} A_k^T + W_k Q_k W_k^T$.
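A minimal sketch of the prediction-phase covariance propagation using these Jacobians (ours, not the authors' code; OpenCV's fixed-size matrices are used for brevity):

```cpp
// Minimal sketch of P_k^- = A_k P A_k^T + W_k Q_k W_k^T, with delta = 0.005
// as stated in the text and L the assumed wheel base.
#include <opencv2/core.hpp>
#include <cmath>

cv::Matx33d predictCovariance(const cv::Matx33d& P, double theta,
                              double dsR, double dsL, double L,
                              double delta = 0.005) {
    const double ds  = 0.5 * (dsR + dsL);
    const double dth = (dsR - dsL) / L;
    const double a   = theta + 0.5 * dth;

    // Jacobian A_k with respect to the state (x, y, theta).
    cv::Matx33d A(1, 0, -ds * std::sin(a),
                  0, 1,  ds * std::cos(a),
                  0, 0,  1);

    // Jacobian W_k with respect to the inputs (dsR, dsL).
    cv::Matx<double, 3, 2> W(
        0.5 * std::cos(a) - ds / (2 * L) * std::sin(a),
        0.5 * std::cos(a) + ds / (2 * L) * std::sin(a),
        0.5 * std::sin(a) + ds / (2 * L) * std::cos(a),
        0.5 * std::sin(a) - ds / (2 * L) * std::cos(a),
        1.0 / L, -1.0 / L);

    // Input noise covariance Q_k, proportional to the traveled increments.
    cv::Matx22d Q(delta * std::abs(dsR), 0,
                  0, delta * std::abs(dsL));

    return A * P * A.t() + W * Q * W.t();
}
```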
3) Correction phase

As mentioned above, for each marker the camera-based marker position estimator outputs a translation vector tvec and a rotation vector rvec. From these vectors we obtain the deviations along the x and y axes as well as the angular deviation $\Delta\theta$ of each marker with respect to the robot. The magnitudes of the translation vector components are the distance deviations along the corresponding camera axes:

$$\Delta x_C = \mathrm{tvec}[0], \qquad \Delta y_C = \mathrm{tvec}[1].$$

For the rotation vector, we need to convert it to Euler angles to find the rotation angle of each marker about the z-axis (perpendicular to the camera plane and passing through the camera center). We first use the Rodrigues transform to convert the rotation vector rvec into a $[3\times 3]$ rotation matrix, using the CvInvoke.Rodrigues function in OpenCV, and then convert the rotation matrix into Euler angles, which gives the rotation angles about the x, y and z axes. Since the markers are mounted on the ceiling, parallel to the camera and at a constant distance from it, we are only interested in the rotation about the z-axis, which is $\Delta\theta$. The measurement vector $z_k$ then has the value

$$z_k = [\Delta x_C \ \ \Delta y_C \ \ \Delta\theta_C]^T.$$
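A minimal sketch of this conversion (ours, not the authors' code; cv::Rodrigues is the C++ counterpart of the CvInvoke.Rodrigues call mentioned above):

```cpp
// Minimal sketch: converting a marker's rotation vector rvec into the yaw angle
// about the camera z-axis, which is the angular deviation used in z_k.
#include <opencv2/calib3d.hpp>
#include <cmath>

double yawFromRvec(const cv::Vec3d& rvec) {
    cv::Mat R;
    cv::Rodrigues(rvec, R);   // rotation vector -> 3x3 rotation matrix
    // Z-axis Euler angle (yaw) extracted from the rotation matrix.
    return std::atan2(R.at<double>(1, 0), R.at<double>(0, 0));
}
```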
The position and rotation of each marker with respect to the global coordinate system is $[x_m \ \ y_m \ \ \theta_m]^T$, and the position of the camera in the robot's coordinate system is $[x_R \ \ y_R \ \ \theta_R]^T$. The camera position is converted from the robot coordinate system to the global coordinate system as follows:

$$\begin{aligned}
x_{G,k} &= \hat{x}_k + x_R\cos\hat{\theta}_k - y_R\sin\hat{\theta}_k \\
y_{G,k} &= \hat{y}_k + x_R\sin\hat{\theta}_k + y_R\cos\hat{\theta}_k \\
\theta_{G,k} &= \hat{\theta}_k + \theta_R
\end{aligned}$$

The offset between the marker and the camera in the global coordinate system, illustrated in Fig. 8, is calculated as

$$\begin{aligned}
\Delta x_{G,k} &= x_m - x_{G,k} = x_m - \hat{x}_k - x_R\cos\hat{\theta}_k + y_R\sin\hat{\theta}_k \\
\Delta y_{G,k} &= y_m - y_{G,k} = y_m - \hat{y}_k - x_R\sin\hat{\theta}_k - y_R\cos\hat{\theta}_k \\
\Delta \theta_{G,k} &= \theta_m - \theta_{G,k} = \theta_m - \hat{\theta}_k - \theta_R
\end{aligned}$$

Fig. 8. Calculation of offset between marker and robot.

These deviations in the global coordinate system are then converted into the coordinate system attached to the camera, so that the predicted measurements corresponding to the vector $z_k$ are calculated by

$$\begin{aligned}
\Delta x_{C,k} &= \Delta x_{G,k}\cos\theta_{G,k} + \Delta y_{G,k}\sin\theta_{G,k} \\
\Delta y_{C,k} &= -\Delta x_{G,k}\sin\theta_{G,k} + \Delta y_{G,k}\cos\theta_{G,k} \\
\Delta \theta_{C,k} &= \Delta\theta_{G,k}
\end{aligned}$$
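A minimal sketch of this predicted camera-frame measurement (ours, not the authors' code; the parameter names are illustrative):

```cpp
// Minimal sketch of the predicted measurement h(x). markerX/markerY/markerTheta
// are the marker's known global pose; camX/camY/camTheta are the camera's pose
// in the robot frame; xHat/yHat/thetaHat are the a priori robot state.
#include <opencv2/core.hpp>
#include <cmath>

cv::Vec3d predictMeasurement(double xHat, double yHat, double thetaHat,
                             double markerX, double markerY, double markerTheta,
                             double camX, double camY, double camTheta) {
    // Camera pose in the global frame.
    const double xG  = xHat + camX * std::cos(thetaHat) - camY * std::sin(thetaHat);
    const double yG  = yHat + camX * std::sin(thetaHat) + camY * std::cos(thetaHat);
    const double thG = thetaHat + camTheta;

    // Offset between marker and camera in the global frame.
    const double dxG  = markerX - xG;
    const double dyG  = markerY - yG;
    const double dthG = markerTheta - thG;

    // Rotate the offset into the camera frame.
    return cv::Vec3d( dxG * std::cos(thG) + dyG * std::sin(thG),
                     -dxG * std::sin(thG) + dyG * std::cos(thG),
                      dthG);
}
```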
The measurement matrix $H_k$ is the Jacobian of the predicted measurement with respect to the state:

$$H_k = \begin{bmatrix}
\dfrac{\partial \Delta x_C}{\partial x} & \dfrac{\partial \Delta x_C}{\partial y} & \dfrac{\partial \Delta x_C}{\partial \theta} \\[4pt]
\dfrac{\partial \Delta y_C}{\partial x} & \dfrac{\partial \Delta y_C}{\partial y} & \dfrac{\partial \Delta y_C}{\partial \theta} \\[4pt]
\dfrac{\partial \Delta \theta_C}{\partial x} & \dfrac{\partial \Delta \theta_C}{\partial y} & \dfrac{\partial \Delta \theta_C}{\partial \theta}
\end{bmatrix}
= \begin{bmatrix}
-\cos\theta_{G,k} & -\sin\theta_{G,k} & -\sin\theta_{G,k}(x_m - \hat{x}_k) + \cos\theta_{G,k}(y_m - \hat{y}_k) \\
\sin\theta_{G,k} & -\cos\theta_{G,k} & -\cos\theta_{G,k}(x_m - \hat{x}_k) - \sin\theta_{G,k}(y_m - \hat{y}_k) \\
0 & 0 & -1
\end{bmatrix}$$

The error in the measurement from the camera can be up to 1 mm, depending on the type of camera, the quality of the image obtained from it, and whether it has been correctly calibrated. In addition, during robot movement, as shown in Fig. 10, the camera may shake, which leads to inaccurate results. The camera error was determined by moving the robot to fixed positions on the floor and measuring the deviation of the camera's measurement from the ground truth. After many tests at different positions, the maximum deviation along the axes is 0.05 m and the maximum rotation angle error is 1.5°, so we take var(∆x) = var(∆y) = 0.0025 m² and var(∆θ) = 0.000685 rad². The measurement noise Jacobian $V$ has size $[3\times 3]$ and, since the camera noise enters additively, equals the identity matrix; the measurement noise covariance matrix $R$ is

$$R = \begin{bmatrix}
\mathrm{var}(\Delta x) & 0 & 0 \\
0 & \mathrm{var}(\Delta y) & 0 \\
0 & 0 & \mathrm{var}(\Delta\theta)
\end{bmatrix}$$
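A minimal sketch of the resulting correction step (ours; the standard EKF gain and update equations are assumed here, since they are not reproduced above):

```cpp
// Minimal sketch of the correction step. z is the camera measurement
// [dxC, dyC, dthetaC], zPred the predicted measurement h(x), and H, R the
// matrices defined above.
#include <opencv2/core.hpp>

void correct(cv::Vec3d& xHat, cv::Matx33d& P,
             const cv::Vec3d& z, const cv::Vec3d& zPred,
             const cv::Matx33d& H, const cv::Matx33d& R) {
    const cv::Matx33d S = H * P * H.t() + R;     // innovation covariance
    const cv::Matx33d K = P * H.t() * S.inv();   // Kalman gain
    xHat = xHat + K * (z - zPred);               // a posteriori state
    P = (cv::Matx33d::eye() - K * H) * P;        // a posteriori covariance
}
```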
IV. EXPERIMENTAL RESULTS

Fig. 11. Robot trajectory estimate in various EKF configurations.

Fig. 12. Deviation in X and Y directions between anticipated and actual placements.

Fig. 13. A sequence of images showing the motion of the robot in an experimental environment during the autonomous navigation operation.

V. CONCLUSIONS

The results of fusing the data from the two sensors, camera and encoder, with the extended Kalman filter show that the improved positioning estimate allows the robot to accurately follow the desired route in autonomous mode. This paper provides the relevant calculations, simulations and experimental results to clarify the problem.

Further quantitative studies will be applied to the designed robotic system, which promises more useful results for the research field of mobile robots, with potential products in industry, healthcare, national security, etc.

ACKNOWLEDGMENT

The authors would like to thank Thai Nguyen University of Technology, Viet Nam for the support.

REFERENCES

[1] A. Q. Pham, H. M. La, K. T. La and M. T. Nguyen, "A magnetic wheeled robot for steel bridge inspection," in Advances in Engineering Research and Application, Springer International Publishing, pp. 11-17, 2020.
[2] M. T. Nguyen, "An energy-efficient framework for multimedia data routing in internet of things (IoTs)," EAI Endorsed Transactions on Industrial Networks and Intelligent Systems, vol. 6, no. 19, 2019.
[3] M. A. K. Niloy et al., "Critical design and control issues of indoor autonomous mobile robots: A review," IEEE Access, vol. 9, pp. 35338-35370, 2021.
[4] M. T. Nguyen, H. M. La and K. A. Teague, "Collaborative and compressed mobile sensing for data collection in distributed robotic networks," IEEE Transactions on Control of Network Systems, vol. 5, no. 4, pp. 1729-1740, Dec. 2018.
[5] X. Zhong, Y. Li, S. Zhu, W. Chen, X. Li and J. Gu, "LVIO-SAM: A multi-sensor fusion odometry via smoothing and mapping," 2021 IEEE International Conference on Robotics and Biomimetics (ROBIO), 2021, pp. 440-445, doi: 10.1109/ROBIO54168.2021.9739244.
[6] M. T. Nguyen and K. A. Teague, "Compressive and cooperative sensing in distributed mobile sensor networks," MILCOM 2015 - 2015 IEEE Military Communications Conference, 2015, pp. 1033-1038.
[7] N. Dawar, S. Ostadabbas and N. Kehtarnavaz, "Data augmentation in deep learning-based fusion of depth and inertial sensing for action recognition," IEEE Sensors Letters, vol. 3, no. 1, pp. 1-4, Jan. 2019, Art no. 7101004.
[8] M. T. Nguyen and H. R. Boveiri, "Energy-efficient sensing in robotic networks," Elsevier Measurement, vol. 158, p. 107708, 2020.
[9] T. T. Hoang, P. M. Duong, N. T. T. Van, D. A. Viet and T. Q. Vinh, "Development of an EKF-based localization algorithm using compass sensor and LRF," 2012 12th International Conference on Control Automation Robotics & Vision (ICARCV), 2012, pp. 341-346.
[10] T. T. Hoang, P. M. Duong, N. T. T. Van, D. A. Viet and T. Q. Vinh, "Multi-sensor perceptual system for mobile robot and sensor fusion-based localization," 2012 International Conference on Control, Automation and Information Sciences (ICCAIS), 2012, pp. 259-264.
[11] T. T. Hoang, D. T. Hiep, P. M. Duong, N. T. T. Van, B. G. Duong and T. Q. Vinh, "Proposal of algorithms for navigation and obstacles avoidance of autonomous mobile robot," 2013 IEEE 8th Conference on Industrial Electronics and Applications (ICIEA), 2013.
[12] Ľ. Ilkovičová, J. Erdélyi and A. Kopáčik, "Positioning in indoor environment using QR codes," International Conference on Engineering Surveying (INGEO), Prague, Czech Republic, pp. 117-122, 2014.
[13] S. Suresh, P. R. Anand and D. S. Lenin, "A novel method for indoor navigation using QR codes," International Journal of Applied Engineering Research, vol. 10, no. 77, 2015.
[14] H. Zhang, C. Zhang, W. Yang and C.-Y. Chen, "Localization and navigation using QR code for mobile robot in indoor environment," 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO), 2015, pp. 2501-2506.
[15] A. Sneha, V. Teja, T. K. Mishra and K. N. Satya Chitra, "QR code based indoor navigation system for attender robot," EAI Endorsed Transactions on Internet of Things, vol. 6, no. 21, 2020.
[16] V. C. Thanh, T. L. T. Dong, N. N. A. Quan and T. T. Hoang, "Autonomous vehicle navigation using inertial sensors and virtual path," National Conference "High-Tech Application in Practice in 2021", 2021.
[17] V. C. Thanh, N. N. A. Quan, T. L. T. Dong, T. T. Hoang and M. T. Nguyen, "Fusion of inertial and magnetic sensors for autonomous vehicle navigation and freight in distinctive environment," ICERA 2021: International Conference on Engineering Research and Applications, Springer, Cham, Dec. 2021.