
THUMB AND INDEX FINGER CONTROLLED

LED BRIGHTNESS

A PROJECT REPORT
Submitted by

MUHAMMED ZALZAMEEN MK
[Register No: 2122J0942]
Under the guidance of

Mr. MICHAEL RAJ S, MCA., M.Phil., (Ph.D.,)

[PG Coordinator & Asst. Professor, Department of Computer Applications]

In partial fulfillment for the award of the degree

of

BACHELOR OF COMPUTER APPLICATIONS

In

DEPARTMENT OF COMPUTER APPLICATIONS

NILGIRI COLLEGE OF ARTS AND SCIENCE

(AUTONOMOUS)

MARCH 2024
NILGIRI COLLEGE OF ARTS AND SCIENCE
(AUTONOMOUS)
DEPARTMENT OF COMPUTER APPLICATIONS
PROJECT WORK

MARCH - 2024

This is to certify that the project entitled

THUMB AND INDEX FINGER CONTROLLED


LED BRIGHTNESS

is a bona fide record of project work done by

MUHAMMED ZALZAMEEN MK

[Register No: 2122J0942]


Bachelor of Computer Applications during the years 2021-2024

Project Guide Head of the Department

Submitted for the project viva-voce examination held on..............................

Internal Examiner External Examiner


DECLARATION

I hereby declare that the project work, “THUMB AND INDEX FINGER CONTROLLED
LED BRIGHTNESS”, submitted to Nilgiri College of Arts & Science (Autonomous) in partial
fulfillment of the requirements for the award of the degree of Bachelor of Computer Applications, is a
record of original project work done by me during the period of December 2023 to March 2024 under
the guidance of Mr. MICHAEL RAJ S, MCA., M.Phil., (Ph.D.,), PG Coordinator &
Asst. Professor, Department of Computer Applications, Nilgiri College of Arts and
Science (Autonomous), Thaloor.

MUHAMMED ZALZAMEEN MK

Signature of the Student

PLACE :

DATE :
ACKNOWLEDGEMENT
I am grateful to Almighty God for giving me the strength, knowledge and understanding to
complete this project. His love has been more than sufficient to keep and sustain me.

I hereby express my sincere gratitude to the people whose cooperation helped me in the
successful completion of my project work. I thank them from the depths of my heart for the
valuable assistance they rendered to me.

I express my deep sense of gratitude to our most respected Principal Dr. SENTHIL KUMAR
G, M.Sc., M.Phil., Ph.D., Nilgiri College of Arts and Science (Autonomous), for the outstanding
facilities provided to carry out my project work.

I am extremely grateful and deeply indebted to our Head of the Department Mr. P
MUTHUKUMAR, MCA., M.Phil., B.Ed., (Ph.D.,), Department of Computer Applications,
Nilgiri College of Arts and Science (Autonomous), Thaloor, for the valuable facilities and help
provided to me.

I express my deep sense of esteem to my guide Mr. MICHAEL RAJ S, MCA., M.Phil.,
(Ph.D.,), PG Coordinator & Asst. Professor, Nilgiri College of Arts and Science (Autonomous),
for all his encouragement, valuable advice and timely instructions at various stages of my project
work.

I thankfully recall the help and kind cooperation rendered by my close friends, family and
all my near and dear ones towards the successful completion of my project work.
LIST OF FIGURES
Fig 1: Thumb and Index Finger at minimum level
Fig 2: Thumb and Index finger at medium level
Fig 3: Thumb and index finger at maximum level
Fig 4: LED brightness at minimum level
Fig 5: LED brightness at medium level
Fig 6: LED brightness at maximum level
LIST OF ABBREVIATIONS

 LED: Light Emitting Diode


 AI: Artificial Intelligence
 IoT: Internet of Things
 ML: Machine Learning
 RGB: Red, Green, Blue
 GUI: Graphical User Interface
ABSTRACT

In this study, we propose a novel method for controlling LED brightness using thumb and
index finger gestures. The aim is to create an intuitive and interactive interface that allows users to
adjust the brightness of LEDs with simple hand movements. The system comprises a wearable device
equipped with sensors to detect the movements of the thumb and index finger. These movements are
translated into corresponding changes in LED brightness levels, providing users with real-time control
over the lighting environment.

The design incorporates machine learning algorithms to accurately interpret the gestures and
adjust the brightness levels accordingly. By leveraging the natural dexterity of the human hand, our
approach offers a user-friendly interface that is accessible to individuals of all ages and abilities.
TABLE OF CONTENTS

S.NO TITLE PAGE No

1 PROBLEM DEFINITION 1

1.1.1 Overview
1.1.2 Problem Statement

2 INTRODUCTION 3

2.1 System Specification


2.1.1 Hardware Specification
2.1.2 Software Specification
3 SYSTEM STUDY 6

3.1 Existing System


3.1.1 Drawbacks
3.2 Proposed System
3.2.1 Features

4 LITERATURE SURVEY 11

5 SYSTEM DESIGN AND DEVELOPMENT 13
5.1 Input Design
5.2 Output Design
5.3 Description of Modules

6 TESTING AND IMPLEMENTATION 18

7 CONCLUSION 24

8 SCOPE OF FUTURE ENHANCEMENT 26

9 REFERENCES 28

10 APPENDICES 30
DIAGRAMS
SAMPLE CODING
SAMPLE INPUT
SAMPLE OUTPUT
1. PROBLEM DEFINITION

1.1.1 OVERVIEW

The integration of gesture control into electronic devices has revolutionized human-computer
interaction, offering intuitive and natural interfaces. In this context, controlling LED brightness
through thumb and index finger gestures presents an innovative approach to lighting control systems.
This section provides an overview of the problem domain and sets the stage for the subsequent
problem statement.

1.1.2 PROBLEM STATEMENT

The problem at hand involves designing and implementing a system that enables users to
adjust LED brightness levels using thumb and index finger gestures. The system should be capable of
accurately interpreting these gestures in real time and translating them into corresponding changes in
LED brightness.

2. INTRODUCTION

Gesture-based control systems have emerged as a promising avenue for enhancing
human-computer interaction, offering intuitive interfaces that leverage natural body movements.
In this context, the ability to control LED brightness through thumb and index finger gestures
presents an innovative and user-friendly approach to lighting control systems. Gesture-based
interfaces allow users to interact with devices using natural hand movements, mimicking
everyday actions. By leveraging gestures, these interfaces can enhance user experiences,
streamline interactions, and improve accessibility for individuals with mobility impairments.

2.1 SYSTEM SPECIFICATION

2.1.1 Hardware Requirement:

Processor : Core i5

RAM : 4 GB

Storage : 256 GB

2.1.2 Software Requirement:

Front End : Python, C++

Operating system : Windows 10

3. SYSTEM STUDY

System study is the first stage of a system development life cycle. This gives a clear picture
of what the physical system actually is. The system study is done in two phases. In the first phase,
the preliminary survey of the system is done which helps in identifying the scope of the system.
The second phase of the system study is a more detailed and in-depth study in which the
identification of the user's requirements and the limitations and problems of the present system is
carried out. After completing the system study, a system proposal is prepared for the user.

3.1 EXISTING SYSTEM

The existing system relies on manual control mechanisms for operating LED brightness,
involving multiple switches and wires. Its primary aim is to facilitate the management of LED lighting
setups, typically found in various environments such as homes, offices, or public spaces. However,
this traditional approach to LED control is cumbersome and labor-intensive, requiring physical
manipulation of switches by individuals to adjust lighting configurations. Moreover, the reliance on
manual input makes the system less flexible and efficient, often leading to inefficiencies and
limitations in controlling the LED brightness effectively.

3.1.1 Drawbacks

 Limited Range of Control


 Accuracy and Precision
 Response Time
 Interference and Noise

3.2 PROPOSED SYSTEM

Thumb and Index Finger-Controlled LED Brightness: develop a system that utilizes the
movement of the thumb and index finger to control the brightness of an LED. By leveraging computer
vision algorithms, the system detects the position of these fingers in real time and adjusts the LED
brightness accordingly, providing a novel, touch-free way of controlling lighting.

3.2.1 Features

 Real-time Gesture Recognition


 LED Brightness Control

PACKAGE SELECTION
cv2 (OpenCV):
• This package is used for computer vision tasks such as capturing video frames from a
webcam, processing and drawing on frames, and displaying images.
Mediapipe:
• The mediapipe package provides solutions for various media processing tasks, including hand
tracking. In this code, it is used for detecting hand landmarks.
Serial:
• The serial package (pySerial) facilitates serial communication with external devices such as an
Arduino board.

4. LITERATURE SURVEY

A Review on Hand Gesture Recognition for Human-Computer Interaction: - This review


provides an overview of hand gesture recognition techniques for human-computer interaction,
including applications like thumb and index finger-controlled LED brightness. It discusses the
challenges, such as varying lighting conditions and occlusions, and reviews the use of machine
learning algorithms and depth sensors in gesture recognition systems.

A Survey on Hand Gesture Recognition Techniques for Human-Computer Interaction: -


This survey discusses various hand gesture recognition techniques for human-computer
interaction, including those using thumb and index finger movements to control LED brightness.
It covers methods like computer vision-based recognition, sensor-based recognition, and hybrid
approaches. The survey highlights the challenges and future directions in this field.

A Survey on Hand Gesture Recognition Techniques for Sign Language Recognition: -


This survey focuses on hand gesture recognition techniques for sign language recognition, which
can also be applied to thumb and index finger-controlled LED brightness. It discusses different
approaches, including neural networks, Hidden Markov Models (HMMs), and rule-based
systems. The survey evaluates the performance of these techniques and discusses their
applications and limitations.

5. SYSTEM DESIGN AND DEVELOPMENT

System design transforms the logical representation of what the system is required to do into
a physical specification. The specifications are converted into a physical reality during development.
The design forms a blueprint of the system and shows how the components relate to each other. The
design phase proceeds according to an orderly sequence of steps, beginning with the review and
assignment of tasks and ending with package design. The design phase is the life cycle phase in
which the detailed design of the system selected in the study phase is accomplished. A smooth
transition from the study phase to the design phase is necessary because the design phase continues
the activities of the earlier phase. The first step in the design phase is to design the database, and then
the input and output, within predefined guidelines.

5.1 INPUT DESIGN

Input design deals with the data that should be given as input, how the data should be arranged or
coded, the dialogue to guide operating personnel in providing input, methods for preparing input
validations, and the steps to follow when errors occur. Input design is the process of converting a
user-oriented description of the input into a computer-based system. This design is important to avoid
errors in the data input process and to show the management the correct direction for getting correct
information from the computerized system. This is achieved by creating a user-friendly screen for
data entry that can handle large volumes of data. The goal of designing input is to make data entry
easy and error-free.

When data is entered, it is checked for validity, and appropriate messages are provided as
needed so that the user is never left confused. Thus, the objective of input design is to create an input
layout that is easy to follow. In this project, the input is the live video stream captured from the
webcam, from which the positions of the thumb and index finger are extracted in each frame.

5.2 OUTPUT DESIGN

A quality output is one which meets the requirements of the end user and presents the
information clearly. The objective of output design is to convey information clearly, trigger an
action, or confirm an action. An efficient, intelligible output design improves the system's
relationship with the user and helps in decision making. In output design, the emphasis is on
displaying the output on the screen in a predefined format. The primary considerations in the design
of output are the information requirements and objectives of the end users. In this project, the output
display shows the webcam frame with the tracked thumb and index finger tips, the line drawn
between them, and the LED whose brightness changes in real time with the distance between the
two fingers.

5.3 DESCRIPTION OF MODULES

Arduino Board:
The Arduino board serves as the controller for the LED. It receives the intensity values
computed by the computer vision program over the serial link and drives the LED accordingly.

LED (Light Emitting Diode):
The LED is the output device whose brightness is controlled by the finger gestures. Connect
the LED to one of the PWM (Pulse Width Modulation) pins on the Arduino for smooth brightness
control.

Arduino Code:
The Arduino code reads the intensity value arriving on the serial port, constrains it to a
suitable range for LED brightness, and then controls the LED using PWM. It continuously loops
through this process, adjusting the LED brightness in real time based on the finger positions.
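The value-scaling step described above - mapping a raw reading onto the 0-255 PWM range - can be illustrated in Python. The helpers below mirror the behaviour of Arduino's map() and constrain() functions (integer arithmetic, clamping); the Python function names are my own, chosen for illustration:

```python
def arduino_map(x, in_min, in_max, out_min, out_max):
    # Integer linear interpolation, matching the behaviour of Arduino's map()
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

def constrain(x, lo, hi):
    # Clamp x into [lo, hi], matching Arduino's constrain()
    return max(lo, min(hi, x))

# A raw 10-bit analog reading (0-1023) scaled to an 8-bit PWM duty cycle (0-255)
print(arduino_map(512, 0, 1023, 0, 255))                    # 127
print(constrain(arduino_map(2000, 0, 1023, 0, 255), 0, 255))  # 255
```

Note that map() itself does not clamp, which is why out-of-range readings are wrapped in constrain() before being handed to the PWM output.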

6. TESTING AND IMPLEMENTATION

Testing is a set of activities that can be planned in advance and conducted systematically. It
is aimed at ensuring that the system works accurately and efficiently before live operation
commences.

Testing objectives

There are several rules that can serve as testing objectives:

● Testing is a process of executing a program with the intent of finding an error.


● A good test case is one that has a high probability of finding an undiscovered
error.
● A successful test is one that uncovers an undiscovered error.

Testing is vital to the success of the system. System testing makes the logical assumption that
all parts of the system are subject to a variety of tests: online response, volume, stress, recovery,
security and usability tests. A series of tests is performed before the system is ready for user
acceptance testing.

Testing Strategies

● Manual Testing
○ Usability Testing
○ Acceptance Testing

Manual Testing

This testing is performed without the help of automated testing tools. The software tester
prepares test cases for different sections and levels of the code, executes the tests and reports the
results to the manager. Manual testing is time- and resource-consuming, and the tester needs to
confirm whether or not the right test cases are used. A major portion of testing involves manual
testing.

Usability Testing

Usability testing refers to evaluating a product or service by testing it with representative
users. Typically, during a test, participants try to complete typical tasks while observers watch,
listen and take notes. The goal is to identify any usability problems, collect qualitative and
quantitative data and determine the participants' satisfaction with the product. Usability testing lets
the design and development teams identify problems before they are coded; the earlier issues are
identified and fixed, the less expensive the fixes are in terms of both staff time and possible impact
on the schedule. The proposed system has been subjected to usability testing and found to be
working successfully. The testing also covered the following objectives:

● Determine whether participants are able to complete specific tasks successfully.
● Identify how long it takes to complete specific tasks.
● Find out how satisfied participants are with the product.
● Identify the changes required to improve user performance and satisfaction.
● Analyze the performance to see whether it meets the usability objectives.

Acceptance Testing

When the software is ready to be handed over to the customer, it goes through the last phase
of testing, where it is tested for user interaction and response. This is important because even if the
software matches all user requirements, it may be rejected if the user does not like the way it appears
or works.

Alpha testing - The team of developers themselves perform alpha testing by using the system
as if it were being used in a work environment. They try to find out how users would react to actions
in the software and how the system should respond to inputs.

Beta testing - After the software is tested internally, it is handed over to users to use under
their production environment for testing purposes only. This is not yet the delivered product.
Developers expect users at this stage to surface minor problems that were previously overlooked.
Libraries:

 OpenCV (cv2)
OpenCV (Open Source Computer Vision Library) is a popular open-source library for
computer vision and image processing tasks.

 Mediapipe
Mediapipe is a Google-developed library for building machine learning pipelines for
various tasks, including hand tracking, pose estimation, and face detection.

 Serial
The pySerial package, used for serial communication with an Arduino board.

IMPLEMENTATION
Hardware Setup:
Gather necessary hardware components, including an Arduino board, LEDs, a webcam, and
any required peripherals.
Connect the LEDs to the Arduino board and ensure they are properly powered and wired.

Software Installation:
Install the Arduino IDE on your computer and set up the Arduino board for programming.
Install OpenCV library and any dependencies on your development environment for image processing
and gesture recognition.

Gesture Recognition Algorithm:
Develop or utilize pre-existing hand gesture recognition algorithms using OpenCV and
Mediapipe. Implement algorithms for hand detection, tracking, and gesture recognition.
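The core gesture in this project is the pinch between the thumb and index finger, measured as the Euclidean distance between their fingertip landmarks. A minimal sketch of that computation (on normalized (x, y) coordinates in [0, 1], as Mediapipe returns them) might look like this; the function name is illustrative, not part of any library:

```python
import math

def pinch_distance(thumb_tip, index_tip):
    """Euclidean distance between two (x, y) landmark tuples in normalized [0, 1] coordinates."""
    return math.hypot(thumb_tip[0] - index_tip[0], thumb_tip[1] - index_tip[1])

# A 3-4-5 right triangle in normalized coordinates gives a distance of 0.5
print(pinch_distance((0.0, 0.0), (0.3, 0.4)))  # ≈ 0.5
```

Because the coordinates are normalized by the frame dimensions, the same thresholds work regardless of the webcam resolution.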

Arduino Programming:
Write Arduino code to receive gesture commands from the computer via serial
communication.
Implement functions to control the LEDs based on the received commands.
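On the computer side, each brightness update is transmitted as a single byte, so the value must be clamped to the 0-255 range before being written to the serial port. A small helper (the name is my own, for illustration) might look like this:

```python
def frame_intensity(value):
    """Clamp an intensity to the 0-255 range and pack it as the single byte sent over serial."""
    return bytes([max(0, min(255, int(value)))])

print(frame_intensity(128))  # b'\x80'
print(frame_intensity(999))  # b'\xff'
```

On the Arduino side, the matching sketch would read this byte with Serial.read() and pass it to analogWrite() on a PWM pin.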

Integration and Deployment:


Integrate all components of the system, including hardware and software, into a cohesive unit.

User Training :
Provide user training on how to interact with the system using hand gestures.

7. CONCLUSION

This project proposes a thumb and index finger-controlled LED brightness system utilizing
computer vision algorithms. By detecting and interpreting specific hand gestures, users can control the
brightness and intensity of the LED unit, offering a touchless and intuitive experience. This approach
eliminates the inconvenience of traditional control methods such as buttons and remote controls,
promoting accessibility and potentially opening doors for innovative applications in various fields,
such as smart homes, entertainment systems, and interactive art installations.

Furthermore, this project has provided valuable insights into sensor interfacing,
microcontroller programming, and real-time signal processing. The knowledge gained from this
experience will undoubtedly inform future projects and contribute to our understanding of embedded
systems and human-machine interaction.

However, it is important to acknowledge that this project is in its early stages, and further
research and development are needed to refine the system's accuracy, robustness, and user experience.
Future work may explore incorporating advanced gesture recognition techniques, implementing user
customization options, and investigating potential applications in various contexts.

In summary, our thumb and index finger-controlled LED brightness project represents a
successful fusion of hardware and software engineering principles to create a practical and user-
friendly interface. As we continue to refine and expand upon this technology, we look forward to
exploring its full potential and making meaningful contributions to the field of interactive systems.

8. SCOPE OF FUTURE ENHANCEMENT

In the future, the Thumb and Index finger controlled LED brightness system holds significant
potential for enhancement, with opportunities to refine functionality and elevate user experience.
Potential avenues for improvement include the development of more sophisticated brightness control
algorithms, integration of machine learning techniques for personalized interactions, and incorporation
of multi-modal input options. Additionally, advancements in user interface design, IoT integration,
and energy optimization offer promising pathways for enhancing usability, accessibility, and energy
efficiency. By embracing these possibilities and continuously iterating based on user feedback, the
system can evolve into a more versatile and user-centric solution, catering to a broader range of needs
and preferences.

9. REFERENCES

 Bradski, G. OpenCV. Retrieved from https://opencv.org/
 Hassan, M. cvzone. Retrieved from https://www.computervision.zone/
 OpenAI. (2022). GitHub. Retrieved from https://github.com/openai/gpt-3.5-turbo
 OpenAI. Retrieved from https://openai.com
 Rosebrock. CV Zone. Retrieved from https://www.computervision.zone/
 Thompson, R. P. Mediapipe. Retrieved from https://pypi.org/project/mediapipe/

10. APPENDICES

A. USE CASE DIAGRAM

[Use case diagram: the USER performs gestures captured by the CAMERA; GESTURE CONTROL interprets them and the ARDUINO adjusts the LED BRIGHTNESS]

B. SAMPLE CODING

import cv2
import mediapipe as mp
import serial
import time

# Initialize the Mediapipe Hands module
mp_hand = mp.solutions.hands
hands = mp_hand.Hands()

# Open the webcam
cap = cv2.VideoCapture(0)

# Configure the serial port (update the port name based on your setup)
ser = serial.Serial('COM9', 9600, timeout=1)  # Adjust port name and baud rate
time.sleep(2)  # Allow the Arduino to initialize

while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        continue

    # Flip the frame horizontally for a selfie-view display
    frame = cv2.flip(frame, 1)

    # Convert the BGR image to RGB
    rgb_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)

    # Process the frame and detect hand landmarks
    results = hands.process(rgb_frame)

    if results.multi_hand_landmarks:
        for hand_landmarks in results.multi_hand_landmarks:
            # Extract the landmarks for the thumb and index finger tips
            thumb_tip = hand_landmarks.landmark[mp_hand.HandLandmark.THUMB_TIP]
            index_tip = hand_landmarks.landmark[mp_hand.HandLandmark.INDEX_FINGER_TIP]

            # Get the pixel coordinates of the thumb and index finger tips
            thumb_x, thumb_y = int(thumb_tip.x * frame.shape[1]), int(thumb_tip.y * frame.shape[0])
            index_x, index_y = int(index_tip.x * frame.shape[1]), int(index_tip.y * frame.shape[0])

            # Draw a line between the thumb and index finger
            cv2.line(frame, (thumb_x, thumb_y), (index_x, index_y), (255, 0, 0), 2)

            # Draw circles on the thumb and index finger tips
            cv2.circle(frame, (thumb_x, thumb_y), 10, (0, 255, 0), -1)
            cv2.circle(frame, (index_x, index_y), 10, (0, 0, 255), -1)

            # Calculate the normalized distance between the thumb and index finger
            distance = ((thumb_tip.x - index_tip.x) ** 2 + (thumb_tip.y - index_tip.y) ** 2) ** 0.5

            # Map the distance to LED intensity (adjust these values based on your requirements)
            intensity = int(255 - min(255, max(0, distance - 0.05) / 0.1 * 255))

            # Send the intensity value to the Arduino
            ser.write(bytes([intensity]))

    # Display the frame
    cv2.imshow('Hand Tracking', frame)

    if cv2.waitKey(1) & 0xFF == 27:  # Press 'Esc' to exit
        break

cap.release()
ser.close()
cv2.destroyAllWindows()
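The distance-to-intensity mapping in the loop above can be checked in isolation: pinch distances of 0.05 or less give full brightness (255), distances of 0.15 or more turn the LED off, and values in between fall on a linear ramp. The standalone function name below is my own, for testing purposes:

```python
def distance_to_intensity(distance):
    # Same formula as in the sample code: a linear ramp from 255 down to 0
    # as the normalized distance grows from 0.05 to 0.15
    return int(255 - min(255, max(0, distance - 0.05) / 0.1 * 255))

print(distance_to_intensity(0.03))  # 255 (fingers pinched together)
print(distance_to_intensity(0.20))  # 0 (fingers spread apart)
```

Tightening or widening the 0.05-0.15 window changes how far the fingers must move to span the full brightness range.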
C. SAMPLE INPUT

Fig 1: Thumb and index finger at minimum level

Fig 2: Thumb and index finger at medium level


Fig 3: Thumb and index finger at maximum level

D. SAMPLE OUTPUT

Fig 4: LED brightness at minimum level

Fig 5: LED brightness at medium level

Fig 6: LED brightness at maximum level

