Real Time Autonomous Robot For Object TR
Volume: 63 Issue: 6
Publication Year: 2020
Abstract— Researchers and robotics development groups have recently paid special attention to autonomous mobile robot navigation in indoor environments using vision sensors. A camera serves as the sensor that provides the data required for robot navigation and object detection. The aim of this project is to construct a mobile robot with an integrated vision system that uses a webcam to locate, track and follow a moving object. To achieve this task, multiple image processing algorithms are implemented and processed in real time. A mini-laptop collects the necessary data and sends it to a PIC microcontroller, which converts the processed data into the robot's proper orientation. Such a vision system can be utilized for object recognition in robot control applications. The results demonstrate that the proposed mobile robot can be successfully operated through a webcam that detects the object and distinguishes a tennis ball based on its color and shape.
Keywords--- Autonomous mobile robot, image processing, object tracking, mobile robot, vision-
based navigation
I. INTRODUCTION
Mobile robots incorporating vision systems are highly desirable because they can move around their environment freely, without being fixed to one particular physical position [1]. A mobile robot needs appropriate real-world information in real time for improved navigation [2]. In addition, object tracking methods are used in various robot vision applications, such as autonomous vehicle navigation and surveillance. Although many studies and experiments have been carried out in recent decades, they remain committed to solving the problematic issues of tracking a desired object in chaotic and noisy environments [3]. Vision capability improves the precision and versatility of intelligent robot solutions [4]. There exist many applications involving robots with vision systems, particularly in the manufacturing industry, scientific research and space navigation, because the vision system provides numerous advantages [5]. To create an autonomous mobile robot equipped with a vision system (i.e. a camera) that navigates to desired locations and locates desired objects, a controller must be programmed to adjust the robot's movements [6]. Such an autonomous robot, acting on the images obtained, could accomplish many tasks through image processing, such as securing buildings and maneuvering through dangerous environments. In this work, the Visual Basic programming language was used to program the RoboRealm software [7], and Matlab was used to program the PIC microcontroller using a fuzzy logic concept [14-16]. In this paper, we developed a real-time autonomous robot for object tracking: a camera mounted at the front of the robot, within its vision range, serves as the sensor, and the information extracted from the camera is used to actuate the robot's motion.
II. METHODOLOGY
Figure 1 depicts the system's overall layout. The device includes a camera attached to a mini-laptop as a vision sensor. The laptop is connected to the main controller, a PIC chip, which interfaces with the robot's motors through a motor driver circuit [8]. The mobile robot is programmed to find its path and to follow and monitor an object using the webcam [9]. Once the object is detected, a control system helps the robot keep tracking the object and follow it as it travels. The boards are mounted on the robot, including the camera that captures images; these images are processed, and the compact microcontroller adjusts the robot's motion and orientation according to the object's location. The robot must be capable of independently detecting an object of interest and tracking it [10].
A web camera is mounted on the front of the robot. Due to cost constraints and the purposes of small robots, the setup is intended to be as simple as possible [11]. A PC mounted on top of the robot serves as the main controller for faster image processing, given the computational complexity and extensive processing that image processing requires. This PC holds the decision-making algorithms. It obtains and processes image data from the camera, instructs the microcontroller to communicate with the positioning motors, and enables the robot to move [12]. The PC was chosen because its data processing results can be visualized on the computer monitor, which facilitates the fine-tuning process. A microcontroller is used to interpret and process the incoming signals captured by the camera and to adjust the robot accordingly. The microcontroller works as an interface, converting the computer's serial data into motor control signals to coordinate the robot's motors and drives. The robot's movements are determined by two stepper motors, which the microcontroller controls via a UCN5804B motor driver circuit to simplify the programming [13].
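The PC-to-PIC serial link described above can be sketched as a single-character command map. The letters "S" (stop) and "B" (backward) appear later in the paper; "F", "L" and "R" for forward and turning are assumptions added here for illustration:

```python
# Sketch of the PC-side command encoding assumed for the PC-to-PIC serial
# link. "S" and "B" appear in the paper; "F", "L", "R" are illustrative.

COMMANDS = {
    "F": "both stepper motors forward",
    "B": "both stepper motors backward",
    "L": "turn left",
    "R": "turn right",
    "S": "both stepper motors stopped",
}

def encode_command(letter: str) -> bytes:
    """Encode a single-letter motion command as one byte for the serial port."""
    if letter not in COMMANDS:
        raise ValueError(f"unknown command: {letter!r}")
    return letter.encode("ascii")
```

On the PIC side, each received byte would be matched against the same table and translated into step and direction signals for the motor driver.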
Figure 1. System layout: vision system (camera) connected to the microcontroller for controlling the movement and speed of the motors.
The three layers of the mobile robot are spaced 6 cm apart to allow the circuits to be mounted at a sufficient distance. The complete structure of the robot is shown in Figures 2 and 3. The main board is placed on the top layer, while the microcontroller and the stepper motor controller circuit are on the second layer. The stepper motors are carried on both sides of that layer, and a mini-laptop is placed on the third layer. A castor ball is mounted at the front of the layer. The battery that powers the robot is located on the second layer.
The motor driver circuit, shown in Figure 7, controls the stepper motors and consists of a UCN5804B controller for stepper motor speed and direction control. Three operating modes can be selected according to the desired operation; for stepper motors, the two-phase operating mode is usually used, and a DIP switch is used to choose the appropriate mode. Figure 7 shows the schematic view of the UCN5804B circuit, while Figure 4 shows the circuit's actual configuration [17]. In both figures, two UCN5804B chips were used to power each motor independently. A 12 V power supply (battery) was used for the stepper motors, while the circuit itself was operated from a 5 V source [7].
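The UCN5804B advances the motor one step per pulse on its step input, with a separate direction input. A minimal sketch of the pulse train a controller would emit (the pin encoding and helper function are illustrative, not from the paper; the triggering edge is assumed to be the rising edge):

```python
def step_pulse_train(steps, forward):
    """Return (direction, step) pin states for a given number of motor steps.

    The driver is assumed to advance one step per rising edge on the step
    input, so each step is emitted as a low-then-high pair on the step pin
    while the direction pin is held constant.
    """
    direction = 1 if forward else 0
    states = []
    for _ in range(steps):
        states.append((direction, 0))  # step pin low
        states.append((direction, 1))  # rising edge: one step assumed
    return states
```

A microcontroller would replay such a sequence on its output pins, with a delay between transitions that sets the motor speed.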
from the view of the camera. Figure 8 shows the image processing task.
Apply Thresholding
Pixels with dark or white colors, which do not contain sufficient green, are omitted using thresholding. A degree of hysteresis is applied to the object to remove any green noise caused by changes in lighting or intensity. Figure 12 shows the segmentation of the green ball from the background for further processing. Noise created by additional factors, such as lighting intensity and the presence of moving particles, is filtered out by a mean (average) filter. Averaging the pixels further smooths and blurs the object, which has the benefit of removing unwanted variations caused by noise. Figure 13 shows the resulting object. Comparing Figures 11 and 12, it can be seen that the green pixels in Figure 12 are connected, removing any gaps between neighboring pixels or any pixels that are not part of the ball. This also helps to eliminate textures beyond the ball's diameter [13].
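The thresholding and mean-filtering steps above can be sketched with NumPy. The threshold values below are illustrative assumptions; the paper does not give exact numbers:

```python
import numpy as np

def green_mask(rgb, min_green=100, margin=30):
    """Keep pixels whose green channel dominates red and blue.

    Dark or white pixels, where green is not sufficiently dominant, are
    omitted, as in the thresholding step described in the text. The
    threshold values are illustrative assumptions.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (g >= min_green) & (g - r >= margin) & (g - b >= margin)

def mean_filter(mask, size=3):
    """Smooth the binary mask with an averaging (mean) filter.

    Isolated noisy pixels are suppressed by averaging each pixel with its
    neighbors and re-thresholding at 0.5.
    """
    h, w = mask.shape
    padded = np.pad(mask.astype(float), size // 2)
    out = np.zeros_like(mask, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (size * size) >= 0.5
```

A single non-green pixel inside an otherwise green region is removed by the averaging step, matching the gap-filling behavior described above.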
group objects. It is a method of discarding any pixels that are not part of the ball by adding a limit. The result of point clustering is shown in Figure 13. The centroid of the object is computed as x̄ = (1/n) Σ x_n, where x_n is the coordinate of the nth pixel and n is the total number of pixels in the chosen object. A bounding box encircles the ball, and its size is important for evaluating how far the ball is from the camera. The robot will move backwards and stop if the box is large, to avoid hitting the ball.
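The centroid and bounding-box computation described above can be sketched as follows (the function name and return shape are illustrative):

```python
import numpy as np

def ball_geometry(mask):
    """Compute the centroid and bounding-box size of the segmented ball.

    The centroid is the mean of the pixel coordinates, x_bar = (1/n) * sum(x_n),
    as in the text; the bounding-box size is used to judge the distance
    between the ball and the camera.
    """
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no ball in view
    cx, cy = xs.mean(), ys.mean()
    width = xs.max() - xs.min() + 1
    height = ys.max() - ys.min() + 1
    return (cx, cy), max(width, height)
```

The centroid steers the robot left or right, while the box size feeds the distance decision described below.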
Figure 18. Forward
The box size determines whether the ball is close to the robot. If the robot is sufficiently close to the ball, the PIC receives an "S" signal telling the robot to stop moving. For example, the robot stops if the box size is 136, which falls within the range 130 to 230 (130 + 100). If the ball approaches the robot so closely that the box size exceeds 230, the robot moves backwards to avoid colliding with the ball; in this case the PIC receives a "B" signal, as shown in Figure 19.
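The distance decision above can be sketched as a function mapping the box size to a command letter. The thresholds 130 and 230 and the letters "S" and "B" come from the text; "F" for forward is an assumption:

```python
def motion_command(box_size):
    """Map the bounding-box size to a motion command, per the ranges in the text.

    A box size between 130 and 230 (130 + 100) means the ball is at the right
    distance, so "S" (stop) is sent; above 230 the ball is too close and "B"
    (backward) is sent; otherwise the robot keeps approaching ("F" is an
    assumed forward command).
    """
    if box_size > 230:
        return "B"  # ball too close: back away
    if 130 <= box_size <= 230:
        return "S"  # ball at the right distance: stop
    return "F"      # ball far away: keep moving forward
```

With a box size of 136, as in the example above, the function returns "S" and the robot stops.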
IV. CONCLUSION
The vision-based object tracking robot was successfully accomplished using only a webcam as the main object detection sensor, demonstrating a great ability to distinguish a tennis ball based on its color and shape and to track the ball as it travels in any direction. The robot is fitted with a mechanism to search for the ball: if the ball is not present in the camera's view, the robot spins in place until the ball is detected. Extensive image processing techniques and algorithms need to be processed on board using a mini-laptop for rapid processing to accomplish the task. The interpreted information is transmitted to the microcontroller and converted into real-world orientation.
V. ACKNOWLEDGMENTS
The authors would like to thank the Ministry of Higher Education Malaysia and Universiti Tun
Hussein Onn Malaysia for supporting this research under the Fundamental Research Grant Scheme
(FRGS) vot FRGS/1/2018/TK04/UTHM/02/14 and TIER1 Grant vot H158.
REFERENCES
[1] F. Gul, W. Rahiman, and S. S. Nazli allay, A comprehensive study for robot navigation
techniques. Cogent Engineering. 3, 6 (2019)
[2] A. Pandey, Mobile Robot Navigation and Obstacle Avoidance Techniques: A Review.
International Robotics & Automation Journal. 2, 3 (2017)
[3] A. Aouf, L. Boussaid, and A. Salk, Same Fuzzy Logic Controller for Two-Wheeled Mobile
Robot Navigation in Strange Environments. Journal Robotics. (2019)
[4] M. Gheisarnejad, and M. Khooban, Supervised control strategy in trajectory tracking for a
wheeled mobile robot, in IET Collaborative Intelligent Manufacturing, 1, 1(2019)
[5] A. Mohamed, C. Yang, and A. Cangelosi, Stereo Vision based Object Tracking Control for a
Movable Robot Head. IFAC. 5, 49. (2016)
[6] X. Guo, C. Wang and Z. Qu, Object Tracking for Autonomous Mobile Robot based on
Feedback of Monocular-vision, 2007 2nd IEEE Conference on Industrial Electronics and
Applications, (2007) 467-470, Harbin, China.
[7] H. Cheng, L. Lin, Z. Zheng, Y. Guan and Z. Liu, An autonomous vision-based target tracking
system for rotorcraft unmanned aerial vehicles, 2017 IEEE/RSJ International Conference on
Intelligent Robots and Systems (IROS), (2017) 1732-1738, Vancouver, BC.
[8] H. Jha, V. Lodhi and D. Chakravarty, Object Detection and Identification Using Vision and
Radar Data Fusion System for Ground-Based Navigation, 2019 6th International Conference
on Signal Processing and Integrated Networks (SPIN), (2019) 590-593, Noida, India.
[9] M. C. Le, and M. H Le, Human Detection and Tracking for Autonomous Human-following
Quadcopter, 2019 International Conference on System Science and Engineering (ICSSE),
(2019) 6-11, Dong Hoi, Vietnam.
[10] J. Martinez-Gomez, A. Fernandez-Caballero, I. Garcia-Varea, L. Rodriguez, and C. R.
Gonzalez, A Taxonomy of Vision Systems for Ground Mobile Robots. International Journal
of Advanced Robotic Systems. 11, 7 (2014)
[11] H. Naeem, J. Ahmad and M. Tayyab, Real-time object detection and tracking, INMIC,
(2013) 148-153, Lahore, Pakistan.
[12] Y. Wu, Y. Sui and G. Wang, Vision-Based Real-Time Aerial Object Localization and
Tracking for UAV Sensing System, in IEEE Access, 5, (2017)
[13] P. Chen, Y. Dang, R. Liang, W. Zhu and X. He, Real-Time Object Tracking on a Drone with
Multi-Inertial Sensing Data, in IEEE Transactions on Intelligent Transportation Systems, 19,
1, (2018)
[14] N. Farah, M. H. Talib, N. S. Shah, Q. Abdullah, Z. Ibrahim, J. B. Lazi, A. and Jidin, A Novel
Self-Tuning Fuzzy Logic Controller Based Induction Motor Drive System: An Experimental
[16] T. Sadeq, C. K. Wai, E. Morris, Q. A. Tarbosh and Ö. Aydoğdu, "Optimal Control Strategy to
Maximize the Performance of Hybrid Energy Storage System for Electric Vehicle
Considering Topography Information," in IEEE Access, 8 (2020)
[17] M. Qazwan Abdullah, Objects Tracking Autonomously with Vision System. Universiti Tun
Hussein Onn Malaysia, (2012)
AUTHORS PROFILE
Qazwan Abdullah was born in Taiz, Yemen, and received his bachelor's degree in Electrical and Electronic
Engineering from Universiti Tun Hussein Onn Malaysia (UTHM) in 2013. He also received his Master of
Science in Electrical and Electronic Engineering in 2015 from the same university. Currently, he is a PhD
student at the Faculty of Electrical & Electronic Engineering, with research interests in control systems,
wireless technology and microwaves.
Nor Shahida Mohd Shah received her B.Eng. in Electrical and Electronic Engineering from Tokyo Institute
of Technology, Japan, in 2000. She then received her M.Sc. in Optical Communication from University of
Malaya, Malaysia, in 2003, and her PhD in Electrical, Electronics and System Engineering from Osaka
University, Japan, in 2012. She is currently a Senior Lecturer at Universiti Tun Hussein Onn Malaysia. Her
research interests are optical and wireless communication.
Mahathir Mohamad received the B.Sc. degree from Universiti Putra Malaysia, in 2000, the M.Sc. degree
from the National University of Malaysia, in 2003, and the Ph.D. degree from Osaka University, in 2012.
He currently holds the position of Senior Lecturer at Universiti Tun Hussein Onn Malaysia. He is
majoring in applied mathematics.
Muaammar Hadi Kuzman Ali was born in Yemen in 1988. He received a bachelor's degree in electrical
engineering (robotics) from Universiti Teknologi Malaysia in 2011, and a master's degree in electrical
engineering from the same university. His research interest is robotics.
N. Farah was born in Yemen in 1988. He received a bachelor's degree in electrical engineering (power
electronics and drives) from Universiti Teknikal Malaysia Melaka in 2015, and a master's degree in
electrical engineering from the same university, where he is currently enrolled as a PhD student. His
current research interests are self-tuning fuzzy logic control of AC motor drives and predictive control of
induction motor drives.
A. Salh is currently a postdoctoral researcher in Electrical and Electronic Engineering at Universiti Tun
Hussein Onn Malaysia. He received the Bachelor of Electrical and Electronic Engineering from IBB
University, IBB, Yemen (2007), and the Master and PhD of Electrical and Electronic Engineering from
Universiti Tun Hussein Onn Malaysia, Malaysia (2015 & 2020).
Maged Aboali was born in Yemen in 1993. He received the B.Eng. degree (mechatronics, with Hons.)
from Universiti Teknikal Malaysia Melaka in 2015, and the M.S. degree (Hons.) in electronic engineering
from Universiti Teknikal Malaysia Melaka in 2018. His main research interests include artificial
intelligence, image processing, computer vision, speech synthesis, stereo vision, AC motor drives and
control of induction motor drives.
Mahmod Abd Hakim Mohamad has been a lecturer at the Department of Mechanical Engineering, Centre
for Diploma Studies, Universiti Tun Hussein Onn Malaysia (UTHM), since 2011. In June 1996, he
graduated with a Certificate in Mechanical Engineering, and one year later obtained his diploma in the
same field from Port Dickson Polytechnic. He furthered his tertiary education at Kolej Universiti Teknologi
Tun Hussein Onn (KUiTTHO), obtaining a Bachelor's Degree in Mechanical Engineering with honors in
2002, and after 2010 completed an M.Sc. in Aerospace Engineering at Universiti Putra Malaysia, where he
became involved in research on computational fluid dynamics. He is active in research and innovation
products in the field of teaching and learning, especially in design engineering, and has published various
journal and conference papers.
Abdu Saif received his Master of Science degree in Project Management in 2017 from University of
Malaya, Malaysia, and his B.E. degree in Electronics Communication in 2005 from IBB University, Yemen.
He has more than 9 years of industrial experience in telecommunications companies. He is currently
pursuing his Doctorate Degree in Electrical Engineering (majoring in wireless communication) at the
Faculty of Engineering, University of Malaya, Kuala Lumpur, Malaysia. His research interests include
wireless networks, 3D coverage by UAV, Internet of flying things, emergency management systems, and
public safety communication for 5G.