Computer Vision Based Object Grasping 6DoF Robotic Arm Using Picamera
Abstract—This article presents the design of a robust robotic arm that can perform multifunctional tasks. The manipulator is controlled by an Arduino Mega 2560 microcontroller. The aim of the project is to coordinate all axes of the manipulator to lift, carry, and unload objects at a desired location, which requires precise drive motion control using electric motors as the drive system. Further experiments implement a camera-based 3D vision system, integrated with a computer vision algorithm, that recognizes object deformation and spatial coordination and controls deviation from the original training. The 3D visualization system is able to detect objects as well as their distance from the end-effector and transmits the corresponding signals to the drive system. The vision system requires separate computing hardware capable of processing complex vision algorithms; we use a Raspberry Pi to process the vision data, making the vision system capable of recognizing the specified object according to program commands.

Keywords-robotic arm; computer vision system; color detection; object recognition

I. INTRODUCTION

Contemporary research shows that AI has successfully adapted to the evolving field of computer vision, and the evidence suggests that modern manufacturing is contingent on robotics [1]. A robot can be defined as a programmable, self-controlled device consisting of electronic, electrical, and mechanical units; in general, it is a machine that performs in place of a living agent. Picking and placing objects is the major task in industrial environments [2]. Robotics involves the coordination of a wide range of disciplines, including kinematics, signal analysis, information theory, artificial intelligence, and probability theory. A robotic arm is a robot manipulator, typically programmable, with capabilities comparable to those of a human arm.

In this study, a robot arm system is designed to perform multifunctional tasks: it detects and identifies a red object, grips it, and drops it at a desired location. An image of the items is taken through a camera, all items in the picture are recognized using image processing methods, and the coordinates of every detected item are determined on the computer and sent to the robot arm.

II. RELATED WORK

Computer vision is a field that incorporates techniques for acquiring, processing, analyzing, and understanding images. Through these techniques, high-dimensional data is converted to numerical or symbolic data [3], [4], [5]. In the realm of AI, computer vision attempts to emulate the capacities of human vision through electronic observation and understanding of images [2]. In many applications, computers are preprogrammed to use computer vision to carry out a particular task, and recently, learning-based methods have also become common for such applications [6], [7], [8]. In this context, vision means the conversion of visual images (the input of the retina) into descriptions that can interface with other processes and elicit appropriate action. This image understanding can be viewed as the extraction of symbolic information from image data, using models built with the aid of geometry, physics, statistics, and learning theory. Image segmentation is based on color intensity and texture.
fixings and attachments, etc. The list of all components in the robot arm kit is shown in Table I, and Fig. 2 shows the complete robotic arm kit with the microcontroller and Picamera.

IV. COMPUTER VISION ALGORITHM

Our system is based on two algorithms, Algorithm 1 and Algorithm 2. Algorithm 1 performs red color detection and is used to identify red objects. For each pixel, the distance D between the pixel value and a reference color value is computed. The threshold is a constant set manually according to the tolerance allowed to the algorithm: a high threshold accepts "less clear" red pixels as red, whereas a lower threshold makes the algorithm stricter. If the distance D is less than the threshold, the pixel is selected; otherwise it is not.
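A minimal sketch of this per-pixel test is given below, assuming the frame arrives as an H×W×3 NumPy array in RGB order; the reference color and the threshold constant are illustrative assumptions, not the paper's exact values.

```python
import numpy as np

# Reference color and threshold are illustrative assumptions; the paper
# sets the threshold manually according to the desired tolerance.
REFERENCE_RGB = np.array([200.0, 30.0, 30.0])   # nominal "red"
THRESHOLD = 90.0  # higher accepts "less clear" reds; lower is stricter

def red_mask(frame_rgb):
    """Algorithm 1 sketch: mark pixels whose Euclidean distance D to the
    reference color is below the threshold."""
    diff = frame_rgb.astype(np.float64) - REFERENCE_RGB  # per-pixel difference
    dist = np.linalg.norm(diff, axis=-1)                 # D for every pixel
    return dist < THRESHOLD                              # boolean red mask
```

Computing D for all pixels at once keeps the test vectorized instead of looping over the image in Python.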
V. WORKING OF THE ROBOTIC ARM

This work successfully implements the zoned functionality of the robot arm. A model robot that can rotate, magnetize an item, and lower and raise its arm, controlled throughout by the microcontroller, was constructed. The development board was assembled and set up as required for correct operation of the controller, and it has been interfaced to the servo and DC motors.

Fig. 3 describes the working flow of the system. First, the computer vision detection model takes continuous video from the camera to detect red objects, applying filters to the video and marking the center of a detected object with a circle. Second, the Raspberry Pi sends a serial signal to the Arduino, which in turn generates the command and signals the robotic arm. The robotic arm is then activated, collecting the object and dispatching it to the desired destination. These tasks are carried out by the six servos (six DOF) driven by the Arduino program. If no red object is detected, the robotic arm performs no operation.
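The Pi-side loop might look like the following sketch; it is an illustration under stated assumptions, not the authors' code. The serial port name, baud rate, one-byte "pick" command, and HSV bounds are all invented for the example, and OpenCV 4 with pyserial is assumed.

```python
import cv2
import serial

# Port, baud rate, and the one-byte command are assumptions; the paper
# only states that the Pi sends a serial signal to the Arduino.
arduino = serial.Serial('/dev/ttyACM0', 9600, timeout=1)

# Illustrative HSV bounds for red; red also wraps around hue ~180, so a
# second range could be OR-ed in for robustness.
LOWER_RED = (0, 120, 70)
UPPER_RED = (10, 255, 255)

cap = cv2.VideoCapture(0)  # Picamera exposed as a video device
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_RED, UPPER_RED)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        blob = max(contours, key=cv2.contourArea)  # largest red region
        m = cv2.moments(blob)
        if m['m00'] > 0:
            cx = int(m['m10'] / m['m00'])          # object center x
            cy = int(m['m01'] / m['m00'])          # object center y
            cv2.circle(frame, (cx, cy), 10, (0, 255, 0), 2)  # green marker
            arduino.write(b'P')  # ask the Arduino to run the pick sequence
    # If no red object is detected, nothing is sent and the arm stays idle.
    cv2.imshow('detection', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
arduino.close()
```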
We also used another object for detection, one that is not red and whose coordinates are not given; the arm then does not move and does not pick that object up, because the detector recognizes no color other than red. The system waits for a red object and otherwise does not move in any direction.

Fig. 4 and Fig. 5 show the outputs obtained after implementation. In Fig. 4, the red object is detected and a green circle is drawn at its center on the original image. In the other picture (Fig. 5), the robotic arm is picking the object after detection.

Figure 5. Picking and dropping the object.

TABLE II. THE LIFE AND DEAD ZONES OF THE SERVOS

Servo   Life zone        Dead zone
1       0° to 310°       311° to 360°
2       40° to 320°      0° to 39°, 321° to 360°
3       50° to 290°      0° to 49°, 291° to 360°
4       0° to 230°       231° to 360°
5       0° to 300°       301° to 360°
6       0° to 180°       181° to 360°

B. Testing the Maneuverability of the Arm/Manipulator

There are six servos in the robotic arm, since the project is based on six degrees of freedom (6 DOF). They can move in different directions, performing for example pitch, yaw, and roll at the elbow, and each servo is attached at a different place in the mechanical structure of the arm. Each servo moves through different angles to pick up the object and drop it at the desired location. Each servo has a life zone (the area in which the servo can move) and a dead zone (the area where the servo cannot move). After performing experiments we obtained the results shown in Fig. 6 and Table II, which describe the life and dead zones of the robotic arm.
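One way to respect these zones in software is to clamp every commanded angle into the corresponding life zone before it is sent to the arm. The helper below is our own illustration built from Table II; the paper does not state that its firmware performs such a check.

```python
# Life zones taken from Table II (angles in degrees).
LIFE_ZONES = {
    1: (0, 310),
    2: (40, 320),
    3: (50, 290),
    4: (0, 230),
    5: (0, 300),
    6: (0, 180),
}

def safe_angle(servo, angle):
    """Clamp a requested angle into the servo's life zone so that a
    command never falls in the dead zone."""
    lo, hi = LIFE_ZONES[servo]
    return max(lo, min(hi, angle))

# Example: 350 deg lies in servo 2's dead zone and is clamped to 320 deg.
assert safe_angle(2, 350) == 320
```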
VII. CONCLUSION

In this paper we presented our experience with a 6 DOF articulated robotic manipulator and a 3D vision system. We developed a robotic arm that can perform multifunctional tasks with the help of computer vision, coordinating all axes of the manipulator toward a desired object and unloading it at a desired location. We also built a 3D vision system, based on a camera and a computer vision algorithm, that recognizes object deformation and spatial coordination/deviation from the original training. The 3D visualization system is able to detect objects as well as their distance from the end-effector and transmits the corresponding signals to the drive system. The vision system requires separate computing hardware capable of processing complex vision algorithms.

VIII. FUTURE WORK

Today, robots perform human labor in all kinds of places. Best of all, they do the jobs that are unhealthy or impractical for people, which frees workers for more skilled jobs, including the programming, maintenance, and operation of robots.

Future work aims at implementing a wireless protocol so that an operator at one end can control the robotic arm wirelessly at the other end, at adding voice recognition so that the arm can be controlled by voice, and, most ambitiously, at making it a
mind-controlled robotic arm, connected to existing neurons or to electrodes implanted in the human brain, decoding the signals from the brain and using them to control the robotic arm.

ACKNOWLEDGMENT

This work was supported by the National Science Foundation of China (No. 61174016).

REFERENCES

[1] A. E. Kuzucuoglu and G. Erdemir, "Development of a web-based control and robotic applications laboratory for control engineering education," Information Technology and Control, vol. 40, no. 4, pp. 352–358, 2011.
[2] C.-Y. Tsai, C.-C. Wong, C.-J. Yu, C.-C. Liu, and T.-Y. Liu, "A hybrid switched reactive-based visual servo control of 5-DOF robot manipulators for pick-and-place tasks," IEEE Systems Journal, vol. 9, no. 1, pp. 119–130, 2015.
[3] A. D. Kulkarni, Computer Vision and Fuzzy-Neural Systems. Prentice Hall PTR, 2001.
[4] M. Ebner, "A parallel algorithm for color constancy," Journal of Parallel and Distributed Computing, vol. 64, no. 1, pp. 79–88, 2004.
[5] D. Forsyth and J. Ponce, Computer Vision: A Modern Approach. Prentice Hall, 2002.
[6] G. Bradski, A. Kaehler, and V. Pisarevsky, "Learning-based computer vision with Intel's open source computer vision library," Intel Technology Journal, vol. 9, no. 2, 2005.
[7] C. H. Lampert, H. Nickisch, and S. Harmeling, "Learning to detect unseen object classes by between-class attribute transfer," in IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2009, pp. 951–958.
[8] A. Saxena, J. Driemeyer, and A. Y. Ng, "Robotic grasping of novel objects using vision," The International Journal of Robotics Research, vol. 27, no. 2, pp. 157–173, 2008.
[9] M. Kazemi, K. K. Gupta, and M. Mehrandezh, "Randomized kinodynamic planning for robust visual servoing," IEEE Transactions on Robotics, vol. 29, no. 5, pp. 1197–1211, 2013.
[10] D. Song, C. H. Ek, K. Huebner, and D. Kragic, "Task-based robot grasp planning using probabilistic inference," IEEE Transactions on Robotics, vol. 31, no. 3, pp. 546–561, 2015.
[11] R. Szabó and A. S. Gontean, "Full 3D robotic arm control with stereo cameras made in LabVIEW," in FedCSIS Position Papers, 2013, pp. 37–42.
[12] Y. Hasuda, S. Ishibashi, H. Kozuka, H. Okano, and J. Ishikawa, "A robot designed to play the game 'rock, paper, scissors'," in IEEE International Symposium on Industrial Electronics (ISIE), 2007, pp. 2065–2070.
[13] A. Shaikh, G. Khaladkar, R. Jage, and T. P. J. Taili, "Robotic arm movements wirelessly synchronized with human arm movements using real time image processing," in Texas Instruments India Educators' Conference (TIIEC), 2013, pp. 277–284.
[14] S. Manzoor, R. U. Islam, A. Khalid, A. Samad, and J. Iqbal, "An open-source multi-DOF articulated robotic educational platform for autonomous object manipulation," Robotics and Computer-Integrated Manufacturing, vol. 30, no. 3, pp. 351–362, 2014.
[15] T. P. Cabre, M. T. Cairol, D. F. Calafell, M. T. Ribes, and J. P. Roca, "Project-based learning example: controlling an educational robotic arm with computer vision," IEEE Revista Iberoamericana de Tecnologias del Aprendizaje, vol. 8, no. 3, pp. 135–142, 2013.