
International Journal for Research in Applied Science & Engineering Technology (IJRASET)
ISSN: 2321-9653; IC Value: 45.98; SJ Impact Factor: 7.538
Volume 10 Issue XI, November 2022 - Available at www.ijraset.com
DOI: https://doi.org/10.22214/ijraset.2022.47594

Automated Object Identifying and Sorting Robot


Dhiransh Saxena¹, Potnuru Sai Vamsi², Chaitanya V Eedala³, Gagan Karthik K⁴
¹ ² ⁴Dept. of Electronics and Communication Engineering, ³Dept. of Computer Science, VIT University, Vellore, India

Abstract: The rapid growth of industries has led to the need for various automation robots that can make operational processes more efficient. Industries in sectors such as food, mining and automobiles demand a robot that can identify and sort items which are either too heavy or too dangerous to be handled manually by humans. Such a robot also gets the work done with minimal error and in less time, which in turn increases the profitability of the operation. In this project we built a prototype of such a robotic arm, which can automatically identify and sort different objects using OpenCV and the YOLO algorithm. The robotic arm is controlled by a microcontroller and has four servo motors. It has four degrees of freedom with which to sort the objects.
Keywords: robotic arm, object detection, OpenCV, YOLO, ESP8266, Blynk, ThingSpeak, object sorting

I. INTRODUCTION
Robotics has become an indispensable part of our lives, from strategic military operations to dull applications such as household cleaning. Among the various applications of robotics, object picking has been one of the most rapidly emerging fields. The demand for lifting objects that are too heavy or too dangerous for humans to handle has been the main factor behind the need for such robots.
The aim of this project is to construct a demo model of a robotic arm that can pick up certain objects and place them at a specific location desired by the user, and to use the same approach to sort objects. The robotic arm being developed is inexpensive, built from lightweight plastic. The arm has 4 degrees of freedom, categorized as gripper, forward/backward, up/down and left/right, which together allow it to comfortably grab an object. We also worked with a cloud platform for two-way data transfer. The project can be further extended to include negative angles and to increase the arm's strength so that it can comfortably operate on heavy objects.

II. LITERATURE SURVEY


Paper [1] presents the development of an autonomous ball picking robot. The main objective was to increase the capability of a 6-axis industrial robot, and the project was performed to demonstrate the robot's autonomous capability to deal with a dynamic operational environment. The developed system features visual processing of two cameras on an external computer. One camera searches for objects randomly launched at the robot to determine their location and colour, while the other camera is used for digital feedback control of the gripper-style end-effector. The project demonstrated how external sensors and processing can control a robot to grip and sort objects by colour and location.
In paper [2], a robotic sorting arm based on a colour recognition technique was demonstrated. Image processing algorithms and an inverse kinematics algorithm were combined to develop a robotic arm that can sort objects according to their shape and colour. The paper also noted that performing the image processing on a Raspberry Pi reduces the size of the system while improving power efficiency, that installing light sensors can reduce interference from the environment, and that a more sophisticated neural network can help the system differentiate a larger variety of objects.
In paper [3], the team designed and implemented a robotic arm able to accomplish simple tasks such as light material handling. They built the arm from acrylic material, with servo motors used to actuate the links and execute the arm movements. The design was limited to four degrees of freedom, since this allows most of the necessary movements while keeping the cost and complexity of the robot competitive.
In paper [4], an object sorting robotic arm based on colour sensing was developed, which can be used to categorize objects moving on a conveyor belt. The proposed method of categorization is based on the colour of the object; in this project the system categorizes balls of three different colours. The detection of a particular colour is done by a light-intensity-to-frequency converter method. The robotic arm is controlled by a microcontroller-based system which drives DC servo motors.


Paper [5] discussed the process of sorting objects and how it can be carried out by autonomous machines that can recognize objects. It presented a system in which a robotic arm sorts objects according to their colour and shape. Objects are categorized into three colours: red, blue and green. The objects are also differentiated by shape into two categories, one with edges and the other without. The image of the object to be sorted is captured using a webcam and image processing is done using MATLAB. The robotic arm was controlled by an ARM7-based system, and geared DC motors were used to operate it.
Paper [6] presents a colour-based object sorting system which uses machine vision and image processing operations. The main objective was to develop a compact, simple and accurate object sorting machine using real-time colour image processing to continuously evaluate and inspect colour deformity with camera-based machine vision. The servo motors used in the robo-arm play a vital role, as the arm's movement depends wholly on the control signals given to them. Hence, to operate the system accurately, synchronization between the IR sensors, the DC motors of the conveyor belt and the robo-arm is essential.
Paper [7] presented a system in which a robotic arm sorts objects according to their colour and shape. The objects are categorized into three colours: red, blue and green. They are also differentiated by shape into two categories, one with edges and the other without. The image of the object to be sorted is captured using a webcam and image processing is done using MATLAB. An ARM7-based system was used to control the robotic arm, and geared DC motors were used to operate it.
Paper [8] describes the development of a new compact soft actuation unit intended for use in multi-degree-of-freedom and small-scale robotic systems such as the child humanoid robot "iCub". Compared with other existing series-elastic linear or rotary implementations, the proposed design shows higher integration density and wider passive deflection. The miniaturization of the newly developed high-performance unit was achieved with the use of a new rotary spring module based on a novel arrangement of linear springs. The control scheme is a velocity-based controller that generates command signals from the desired simulated stiffness using the spring deflection state.
Paper [9] deals with the design of a synchronized robotic arm, which is used to perform basic activities such as picking up objects and placing them. The authors designed a robotic arm and synchronized it with a working arm, allowing it to perform the same task as the working arm does. The work done by the robotic arm is highly precise, since a 230 oz-in digital servo motor is used. This robotic arm can also be used for precision work: for instance, when a task must be done very precisely under conditions that do not suit human beings, the arm can be operated remotely to accomplish it. The programming is done on an ATmega8 microcontroller using the Arduino framework.

III. METHODOLOGY
A. Robotic Arm Controller
To control the robotic arm, an IoT-based platform is used. In this project we have used Blynk, which is capable of connecting a phone to different microcontrollers. Blynk provides a cloud database that is used to communicate between the Blynk app and the microcontroller.
In Blynk, four buttons are defined for the four different parameters, and a voltage variable is assigned to each of these buttons.
To control the robotic arm, the ESP8266 connects to Blynk using unique IDs. When the buttons in the Blynk app are pressed, data is sent to the cloud and then fetched by the ESP8266, which drives the servo motors that control the robotic arm.
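
The button-to-servo flow can be pictured with a short Python sketch. This is only a minimal illustration, assuming the legacy blynklib Python client and an arbitrary virtual-pin layout (V0 for the gripper); the paper does not specify these details, and in the actual project the equivalent logic runs in the ESP8266 firmware.

# pip install blynklib  -- legacy Blynk Python client (an assumption; the real
# project handles Blynk events on the ESP8266 itself)
import blynklib

BLYNK_AUTH = "YOUR_AUTH_TOKEN"            # hypothetical placeholder token
blynk = blynklib.Blynk(BLYNK_AUTH)

# V0 is assumed to carry the gripper button; the forward/backward, up/down and
# left/right buttons (e.g. V1-V3) would each get an analogous handler.
@blynk.handle_event('write V0')
def gripper_button_handler(pin, value):
    # Blynk delivers button states as a list of strings, e.g. ['1'] when pressed.
    pressed = value[0] == '1'
    target_angle = 90 if pressed else 0   # illustrative target, not the paper's calibration
    print("Gripper servo -> {} degrees".format(target_angle))
    # On the real arm, the microcontroller would write this angle to the servo's PWM pin.

while True:
    blynk.run()                           # process incoming button events from the Blynk cloud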

Fig. 1. Circuit Diagram


B. Object detection and automatic sorting


To detect objects, we use the OpenCV library and the YOLO (You Only Look Once) algorithm with Python. Objects are detected using a web camera, and one of three different signals is then sent for our small objects. The signals are as follows:
1) A signal '1' is sent when an eraser is detected.
2) A signal '2' is sent when an elephant is detected.
3) A signal '0' is sent for a NULL REQUEST, that is, when we want the robotic arm to stop and wait for the next data.
The cloud used in this project is the ThingSpeak cloud, an open-source cloud platform. The ESP8266 is used to exchange this data with the ThingSpeak cloud; the detection-and-upload step is sketched below.
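
The following is a minimal sketch of this step, assuming an OpenCV DNN YOLOv3 model. The file names, confidence threshold, ThingSpeak write key and the presence of an 'eraser' label in the model's class list are assumptions, since the paper does not give these details.

import cv2
import numpy as np
import requests

CFG, WEIGHTS, NAMES = "yolov3.cfg", "yolov3.weights", "coco.names"  # assumed model files
THINGSPEAK_WRITE_URL = "https://api.thingspeak.com/update"
WRITE_API_KEY = "YOUR_WRITE_KEY"            # hypothetical placeholder

# Map detected class names to the signals listed above.
SIGNAL_MAP = {"eraser": 1, "elephant": 2}   # 'eraser' assumes a suitably trained label set

net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
classes = open(NAMES).read().strip().split("\n")
output_layers = net.getUnconnectedOutLayersNames()

cap = cv2.VideoCapture(0)                   # web camera
ret, frame = cap.read()
signal = 0                                  # default: NULL REQUEST
if ret:
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    for output in net.forward(output_layers):
        for det in output:                  # det = [cx, cy, w, h, objectness, class scores...]
            scores = det[5:]
            class_id = int(np.argmax(scores))
            if scores[class_id] > 0.5:      # confidence threshold (assumed)
                signal = SIGNAL_MAP.get(classes[class_id], signal)
cap.release()

# Push the signal to ThingSpeak field1; the ESP8266 later reads it back.
requests.get(THINGSPEAK_WRITE_URL,
             params={"api_key": WRITE_API_KEY, "field1": signal},
             timeout=10)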
The response of the robotic arm on receiving the different signals is as follows (a sketch of the receiving side follows this list):
a) On receiving the signal '1', the robotic arm picks up the object and places it to the right.
b) On receiving the signal '2', the robotic arm picks up the object and places it to the left.
c) On receiving the signal '0', the robotic arm stops and waits until the next data is received, and both LEDs turn on.
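
In the actual system this receiving side runs on the ESP8266; purely for illustration, the same polling logic can be mirrored in Python using ThingSpeak's read-last-entry REST call. The channel ID, read key and 15-second polling interval below are assumptions.

import time
import requests

CHANNEL_ID = "0000000"                      # hypothetical channel ID
READ_API_KEY = "YOUR_READ_KEY"              # hypothetical read key
LAST_URL = "https://api.thingspeak.com/channels/{}/fields/1/last.json".format(CHANNEL_ID)

def act_on_signal(signal):
    # Mirrors the responses listed above; the real servo moves happen on the ESP8266.
    if signal == 1:
        print("Pick up the object and place it to the RIGHT")
    elif signal == 2:
        print("Pick up the object and place it to the LEFT")
    else:
        print("NULL REQUEST: stop, turn both LEDs on, wait for the next data")

while True:
    resp = requests.get(LAST_URL, params={"api_key": READ_API_KEY}, timeout=10)
    if resp.ok:
        act_on_signal(int(resp.json().get("field1") or 0))
    time.sleep(15)                          # ThingSpeak's free tier accepts roughly one write per 15 s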

C. Flowchart

IV. RESULTS

Fig. 2. Hardware Model of the Robotic Arm


Fig. 3. Detection of Person and Eraser with Label and Confidence Value

Fig. 4. Python Command Window - data is sent to the cloud (ERASER)

Fig. 5. Graphical Data Visualisation in Thingspeak (ERASER)

Fig. 6. '1' is displayed in the numeric field when ERASER is detected


Fig. 7. Detection of Person with Label and Confidence Value

Fig. 8. Python Command Window - data is sent to the cloud (ELEPHANT)

Fig. 9. Graphical Data Visualisation in Thingspeak (ELEPHANT)

Fig. 10. '2' is displayed in the numeric field when ELEPHANT is detected

Fig. 3 and Fig. 7 show the detection of the respective objects using the YOLO algorithm. Fig. 4 and Fig. 8 show the output from the Python console, which gives real-time information about the working of the system. Fig. 5 and Fig. 9 show a graph of the received data plotted against the time at which it was received. Fig. 6 and Fig. 10 show the final data that is fetched by the ESP8266.


REFERENCES
[1] Karl Nicolaus, Jeremy Hooper, Richard Wood and Chan Ham, "Development of an autonomous ball picking robot", 2016 International Conference on Collaboration Technologies and Systems (CTS), IEEE, 2016.
[2] Yonghui Jai, Guojun Yang and Jafar Saniie, "Real time colour based sorting robotic arm system", International Conference on Electro Information Technology (EIT), IEEE, 2017.
[3] Ashraf Elfasakhany, Eduardo Yanez, Karen Baylon, Ricardo Salgado, "Design and Development of a Competitive Low-Cost Robot Arm", International Journal of Emerging Technology and Advanced Engineering, Volume 3, Issue 5, May 2013, pp. 10-15.
[4] Aji Joy, "Object sorting robotic arm based on colour sensing", International Journal of Advanced Research in Electrical, Electronics and Instrumentation Engineering (An ISO 3297: 2007 Certified Organization), Vol. 3, Issue 3, March 2014.
[5] Balkeshwar Singh, Kumaradhas, N. Sellappan, "Evolution of Industrial Robots and their Applications", IJEIT, Volume 2, Issue 2, November 2011, pp. 24-26.
[6] Prof. D. B. Rane, Gunjal Sagar S, Nikam Devendra V, Shaikh Jameer U, "Automation of Object Sorting Using an Industrial Roboarm and MATLAB Based Image Processing", International Journal of Emerging Technology and Advanced Engineering, Volume 5, 2004.
[7] Lee, J. Kim, M. Lee, et al., "3D visual perception system for bin picking in automotive sub-assembly automation", Automation Science and Engineering (CASE), 2012 IEEE International Conference on, 2012, pp. 706-713.
[8] N. Tsagarakis, M. Laffranchi, B. Vanderborght, and D. Caldwell, "A compact soft actuator unit for small scale human friendly robots", IEEE International Conference on Robotics and Automation (ICRA), 2009, pp. 4356-4362.
[9] Goldy Katal, Saahil Gupta, Shitij Kakkar, "Design And Operation Of Synchronized Robotic Arm", International Journal of Research in Engineering and Technology, Volume 02, Issue 08, Aug 2013.

