
MAJOR PROJECT FINAL PRESENTATION

ON
DESIGN AND IMPLEMENTATION OF AI CONTROLLED ROBOTIC ARM

21R21A6629 - Lakshmi Priya Y
21R21A6626 - K Anirudh
22R25A6604 - Bhaskara Sathwik
21R21A6609 - C Nithin

Under the Guidance of

Dr. K. SivaKrishna
Associate Professor
Department of Computer Science and Engineering-Artificial Intelligence & Machine Learning
29/04/2025

Contents
• Abstract
• Introduction
• Literature Survey
• Existing System
• Disadvantages of Existing System
• Proposed System
• Objectives
• Architecture
• Algorithms and Flowcharts
• UML Diagrams
• Implementation (Sample Code)
• Result and Discussion
• Conclusion and Future Enhancement
• References
• Publication Details
Abstract

The AI-Controlled Robotic Arm is an intelligent automation system designed for object recognition,
classification, and sorting based on color detection. Using OpenCV and a camera module, the system
processes real-time video feed to identify red, green, and blue objects. The identified color signals are
transmitted to an Arduino-based motor control unit, which directs the robotic arm for precise
pick-and-place operations. The system utilizes HSV color space for robust color segmentation, reducing
false detections. The integration of AI and embedded systems enhances accuracy and efficiency, making it
suitable for industrial automation and smart manufacturing. This project demonstrates a practical
approach to intelligent robotic control, providing a foundation for future advancements in autonomous
systems and machine vision applications.

Introduction
• The AI-Controlled Robotic Arm is an advanced automation system that combines computer vision and
embedded control to perform intelligent object sorting based on color detection.

• Utilizing OpenCV and a camera module, the system captures a real-time video feed, processes the
images in the HSV color space, and identifies objects of predefined colors with high accuracy (see the
segmentation sketch after this slide).

• The robotic arm is controlled via an Arduino-based system that receives color detection signals and
executes precise pick-and-place actions, ensuring efficient object handling.

• This system enhances automation in industries by reducing human effort, increasing operational
efficiency, and providing a cost-effective solution for sorting and classification tasks.

• AI and real-time computer vision enhance robotic adaptability for applications like automation, smart
warehouses, and manufacturing quality control.
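
The bullets above describe HSV-based color segmentation. The following minimal sketch shows how such segmentation can be done with OpenCV; the HSV ranges are illustrative assumptions, not values taken from the project, and would normally be tuned for the camera and lighting in use.

import cv2
import numpy as np

# Illustrative HSV ranges (assumed values, normally tuned per camera and lighting).
# Red needs two ranges because its hue wraps around 0/180 in OpenCV's HSV scale.
COLOR_RANGES = {
    "red":   [((0, 120, 70), (10, 255, 255)), ((170, 120, 70), (180, 255, 255))],
    "green": [((40, 70, 70), (80, 255, 255))],
    "blue":  [((100, 150, 70), (130, 255, 255))],
}

def segment_colors(frame_bgr):
    """Return one binary mask per color for a single BGR frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    masks = {}
    for name, ranges in COLOR_RANGES.items():
        mask = np.zeros(hsv.shape[:2], dtype=np.uint8)
        for lower, upper in ranges:
            mask = cv2.bitwise_or(mask, cv2.inRange(hsv, lower, upper))
        masks[name] = mask
    return masks

In practice, masks like these are usually cleaned with a morphological opening and a minimum region size before a detection is reported, which helps keep false detections low.
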
Literature Survey
1. Sriram et al., 2023 [1]. "IoT-enabled 6DOF robotic arm with inverse kinematic control: design and implementation." Key findings: developed an IoT-controlled robotic arm with inverse kinematics for precise movements.
2. Haggerty et al., 2020 [2]. "Modeling, reduction, and control of a helically actuated inertial soft robotic arm via the Koopman operator." Key findings: introduced a novel control method for soft robotic arms using Koopman operators.
3. Yeshmukhametov et al., 2019 [3]. "A novel discrete wire-driven continuum robot arm with passive sliding disc." Key findings: designed a wire-driven robotic arm with improved flexibility and kinematic efficiency.
4. Bhuyan & Mallick, 2014 [4]. "Gyro-accelerometer based control of a robotic arm using AVR microcontroller." Key findings: proposed a motion-controlled robotic arm using sensors for real-time control.

Literature Survey
5. Feng et al., 2015 [5]. "Optimization-based controller design and implementation for the Atlas robot in the DARPA Robotics Challenge Finals." Key findings: implemented an optimization-based control system for complex robotic movements.
6. Hanson et al., 2020 [6]. "A Neuro-Symbolic Humanlike Arm Controller for Sophia the Robot." Key findings: developed a human-like robotic arm controller with neuro-symbolic AI techniques.
7. Arezzo et al., 2017 [7]. "Total mesorectal excision using a soft and flexible robotic arm: a feasibility study." Key findings: demonstrated the use of soft robotic arms for minimally invasive surgical procedures.
8. Ranjbar et al., 2024 [8]. "Kinematic matrix: One-shot human action recognition using kinematic data structure." Key findings: proposed a kinematic matrix-based action recognition technique for robotic applications.
Existing System
• Traditional object sorting systems rely on human operators, making the process time-consuming,
labor-intensive, and prone to errors.

• Many robotic arms operate based on predefined movements, which limits their ability to adapt
dynamically to variations in object placement, size, or color.

• Some existing robotic systems use basic image processing techniques, but the lack of advanced
AI-based computer vision reduces accuracy in object detection under different conditions.

• Most conventional robotic systems do not have real-time feedback mechanisms, preventing
continuous monitoring and adaptation for optimized performance.

Disadvantages of Existing System


• High Dependency on Human Intervention: Many conventional systems rely heavily on manual
operation, leading to inefficiencies and higher chances of human error.

• Limited Accuracy in Object Recognition: Traditional methods struggle to accurately detect and
classify objects in varying lighting conditions and complex environments.

• Lack of Real-Time Adaptability: Existing solutions do not dynamically adjust to new conditions or
unexpected object placements, reducing overall efficiency.

• Higher Operational Costs: Manual intervention and outdated technology increase labor costs and
maintenance expenses, making these systems less cost-effective in the long run.
Proposed System
• The system utilizes computer vision to identify and classify objects in real-time, reducing the need for
manual intervention.

• Advanced image processing techniques and machine learning models enhance the accuracy of object
recognition, even in varying environmental conditions.

• The system processes data instantly and makes immediate decisions, improving operational efficiency.

• The proposed system communicates with hardware components (e.g., robotic arms) to perform precise
actions based on detected objects, as outlined in the sketch after this slide.

• By reducing human dependency and increasing automation, the system lowers operational costs while
allowing for future scalability and improvements.
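
A minimal sketch of the vision-to-hardware link, assuming pySerial on the vision side: only the 'A' command for red and the COM3 / 9600 baud settings appear in the sample code later, so the green and blue command bytes here are illustrative assumptions.

import serial  # pySerial

# Hypothetical color-to-command mapping; 'B' and 'C' are assumed values,
# only 'A' for red comes from the sample implementation.
COLOR_COMMANDS = {"red": b'A', "green": b'B', "blue": b'C'}

def send_sort_command(port, detected_color):
    """Send a one-byte command to the Arduino when a known color is detected."""
    cmd = COLOR_COMMANDS.get(detected_color)
    if cmd is not None:
        port.write(cmd)

# Usage sketch:
# arduino = serial.Serial('COM3', 9600, timeout=1)
# send_sort_command(arduino, "red")
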

Objectives
• To develop an efficient and accurate object detection system using computer vision techniques.

• To integrate real-time image processing for immediate decision-making and response.

• To enable seamless communication between the vision system and hardware components for
automated actions.

• To enhance system scalability and adaptability for future improvements and applications.
Architecture

Flow Chart
Use Case Diagram

Implementation (Sample Code)

import cv2
import numpy as np   # needed for np.any()
import serial

arduino = serial.Serial('COM3', 9600)   # serial link to the Arduino motor controller
cap = cv2.VideoCapture(0)               # default camera

while True:
    ret, frame = cap.read()
    if not ret:
        break
    # Convert to HSV and threshold for red objects
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    red_mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
    if np.any(red_mask):
        arduino.write(b'A')             # command the arm to pick the red object
    cv2.imshow("Detection", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
arduino.close()
cv2.destroyAllWindows()
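
The listing above signals the arm as soon as any red pixel appears in the mask. A possible refinement, shown here as a hedged sketch (the kernel size and area threshold are assumed values, and it presumes OpenCV 4's findContours signature), cleans the mask and requires one reasonably large region before writing to the serial port, which addresses the false detections discussed in the Abstract and Conclusion.

import cv2
import numpy as np

def significant_detection(mask, min_area=800):
    """Return True only if the mask contains a sufficiently large connected region."""
    kernel = np.ones((5, 5), np.uint8)
    # Morphological opening removes small, isolated noise pixels from the mask.
    cleaned = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    # OpenCV 4 returns (contours, hierarchy).
    contours, _ = cv2.findContours(cleaned, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return any(cv2.contourArea(c) > min_area for c in contours)

# In the main loop, the single-pixel check could become:
#   if significant_detection(red_mask):
#       arduino.write(b'A')
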
Result and Discussion

Fig. 1: Initial Position          Fig. 2: Picking the Object

Result and Discussion

Fig. 3: Holding the Object        Fig. 4: Dropping the Object


Conclusion & Future Scope
• The system effectively detects red, green, and blue objects in real-time, enabling automated responses
through Arduino control.

• Real-time processing, supported by OpenCV and serial communication, ensures detection and action
with minimal delay.

• The system performs well under controlled lighting conditions but faces challenges with varying
lighting and complex backgrounds.

• Future enhancements could include machine learning algorithms for improved object recognition and
adaptive thresholding.

• The system can be expanded to detect more colors and support complex sorting tasks for broader
industrial automation applications.

References
[1] Sistu, G., Leang, I., & Yogamani, S. (2019). Real-time Joint Object Detection and Semantic
Segmentation Network for Automated Driving. arXiv preprint arXiv:1901.03912.

[2] Leclercq, P., & Bräunl, T. (2001). A Color Segmentation Algorithm for Real-Time Object Localization
on Small Embedded Systems. In R. Klette, S. Peleg, & G. Sommer (Eds.), Robot Vision (pp. 69–76).

[3] Zhang, L., & Wang, L. (2018). Color sensors and their applications based on real-time color image
segmentation for cyber physical systems. EURASIP Journal on Image and Video Processing, 2018(1).

[4] (2011). Real time object detection using a novel adaptive color thresholding method. In Proceedings
of the 2011 international ACM workshop on Ubiquitous meta user interfaces (pp. 35–42). ACM.

[5] (2014). A Real-Time and Effective Object Recognition and Localization Method. Applied Mechanics
and Materials, 615, 107–112.
