
A PROJECT REPORT ON

SIGN LANGUAGE DETECTION

Submitted in partial fulfillment of the requirements for the award of the degree of

BACHELOR OF COMPUTER APPLICATION

Submitted by: SAHIL SHARMA (02591102019)
BCA, 5th Semester

Under the supervision of:
DR. SUPREET KAUR SAHI

SRI GURU TEGH BAHADUR INSTITUTE OF MANAGEMENT AND INFORMATION TECHNOLOGY
Gurudwara Nanak Piao Campus, G.T. Karnal Road, Delhi-110033
(Approved by AICTE, Ministry of HRD, Govt. of India)
Affiliated to Guru Gobind Singh Indraprastha University, Delhi
2019-2022

DECLARATION

This project, titled "SIGN LANGUAGE DETECTION", submitted towards the completion of my course requirements for Semester V, is my original work and has been carried out under the guidance of DR. SUPREET KAUR SAHI. Material borrowed from other sources and incorporated in the report has been duly acknowledged and referenced. I understand that I am liable and accountable for this project; it was done entirely by me without infringing the copyright of any organization. I further declare that the work reported in this project has not been submitted, and will not be submitted, either in part or in full, for the award of any other degree or diploma in this or any other institute or university.

Sahil Sharma
(02591102019)
CERTIFICATE

This is to certify that the project entitled "SIGN LANGUAGE DETECTION", done by Sahil Sharma, university enrolment number 02591102019, is an authentic work carried out by him at SRI GURU TEGH BAHADUR INSTITUTE OF MANAGEMENT & INFORMATION TECHNOLOGY. The matter embodied in this project work has not been submitted earlier for the award of any degree or diploma, to the best of my knowledge and belief. The suggestions approved by the faculty were duly incorporated.

DATE: 04/12/2021

HOD: MR. AMANDEEP SINGH

GUIDE: DR. SUPREET KAUR SAHI


ACKNOWLEDGEMENT

I am extremely grateful to my guide, DR. SUPREET KAUR SAHI, for being a source of inspiration and for her constant support in the design, implementation and evaluation of the project. I am thankful to her for her constant constructive criticism and invaluable suggestions, which benefited me greatly while developing the project on "SIGN LANGUAGE DETECTION".

She has been a constant source of inspiration and motivation for hard work, and has been very cooperative throughout this project work, with candour and pleasure. Through this column, it is my utmost pleasure to express my warm thanks to her for the encouragement, cooperation and consent without which I might not have been able to accomplish this project.

Place: New Delhi

Sahil Sharma

BCA, 5th Semester

HOD: MR. AMANDEEP SINGH

GUIDE: DR. SUPREET KAUR SAHI


Abstract

The project aims at building a program that can recognize the various hand gestures used for different signs, enabling contactless communication with the system. We use libraries such as MediaPipe, which can perceive the shape and motion of hands, the main requirement for the program. MediaPipe Hands is a high-fidelity hand and finger tracking solution. It employs machine learning to infer 21 3D landmarks of a hand from a single frame. MediaPipe Hands utilizes an ML pipeline of multiple models working together: a palm detection model that operates on the full image and returns an oriented hand bounding box, and a hand landmark model that operates on the cropped image region defined by the palm detector and returns high-fidelity 3D hand keypoints. Different hand gestures can be assigned to functions such as clicking images, applying filters, or controlling the volume using hands only. The program can also recognize some sign language gestures.
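As a hedged sketch of how the 21 landmarks described above can drive a finger counter: the landmark indices below follow MediaPipe's hand-model numbering, but the simple tip-above-joint rule is an illustrative assumption, not the project's exact logic.

```python
# Minimal finger-counting sketch over MediaPipe-style hand landmarks.
# A landmark is an (x, y) pair in normalized image coordinates, where
# y grows downward. Indices follow the MediaPipe Hands numbering:
# fingertips at 8, 12, 16, 20 and the PIP joints below them at
# 6, 10, 14, 18. The thumb is ignored here for simplicity.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky tips
FINGER_PIPS = [6, 10, 14, 18]   # corresponding PIP joints

def count_raised_fingers(landmarks):
    """Count fingers whose tip is above its PIP joint (smaller y)."""
    count = 0
    for tip, pip in zip(FINGER_TIPS, FINGER_PIPS):
        if landmarks[tip][1] < landmarks[pip][1]:
            count += 1
    return count

# Fake landmark list for an upright hand with index and middle raised.
landmarks = [(0.5, 0.5)] * 21
landmarks[8] = (0.45, 0.20)   # index tip above its PIP
landmarks[6] = (0.45, 0.35)
landmarks[12] = (0.50, 0.18)  # middle tip above its PIP
landmarks[10] = (0.50, 0.33)

print(count_raised_fingers(landmarks))  # prints 2
```

In the real pipeline the `(x, y)` pairs would come from `results.multi_hand_landmarks`; this sketch only isolates the counting rule.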
TABLE OF CONTENTS

CHAPTER NO. TOPIC

1. INTRODUCTION
   1.1. INTRODUCTION TO THE PROJECT
2. REQUIREMENT ANALYSIS
   2.1. SOFTWARE REQUIREMENT ANALYSIS
   2.2. USE CASE DIAGRAM
3. SOFTWARE DESIGN
   3.1. INTRODUCTION
   3.2. DATA FLOW DIAGRAM (LEVEL 0)
   3.3. DATA FLOW DIAGRAM (LEVEL 1)
   3.4. DATA FLOW DIAGRAM (LEVEL 2)
4. TESTING
   4.1. INTRODUCTION
   4.2. TEST CASES
5. FUTURE WORK
6. CONCLUSION AND ACHIEVEMENTS
APPENDICES

LIST OF FIGURES

FIGURE 1. USE CASE
FIGURE 2. DATA FLOW DIAGRAM LEVEL 0
FIGURE 3. DATA FLOW DIAGRAM LEVEL 1
FIGURE 4. ER DIAGRAM
CHAPTER 1
INTRODUCTION

Introduction

As we know, vision-based hand gesture recognition is an important part of human-computer interaction. Until recently, keyboards and mice have played the most significant role in human-computer interaction. However, with the rapid development of hardware and software, new types of human-computer interaction methods have become necessary. In particular, technologies such as speech recognition and gesture recognition receive great attention in the field of human-computer interaction.

The posture of the body or the gesture of the hand denotes a sign, and the movement of the body or the hand conveys a message. Gesture can thus be used as a tool of communication between computer and human. It differs greatly from traditional hardware-based methods and can accomplish human-computer interaction through gesture recognition, since we can make many different signs using our hands. Hand gesture recognition has great value in many applications, such as sign language recognition, augmented and virtual reality, and sign language interpreters for the disabled.

The objective of our project is to provide a real-time hand finger counter that also acts as a sign language detector and triggers certain functions on specific hand gestures, helping us perform some tasks hands-free, in a contactless manner, without any special hardware requirements.

We are using Python for the project as it provides many features and comes with a large number of libraries and modules; this keeps the code small, makes it easy to analyze and debug, and allows a simple GUI that is friendly for the user.

1.1 SCOPE OF THE PROJECT

 Fast working
 Easy to use
 One-click access

1.2 OBJECTIVES OF THE PROJECT

 Interpretation of sign language
 Contact-free volume control
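As a hedged sketch of the contact-free volume-control objective: the distance between the thumb tip and index fingertip can be linearly mapped to a volume percentage. The calibration bounds below are illustrative assumptions, and the actual system-volume call (via pycaw on Windows, as in the appendix) is omitted.

```python
import math

def hand_distance(p1, p2):
    """Euclidean distance between two (x, y) landmark points in pixels."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def distance_to_volume(dist, d_min=30, d_max=250):
    """Linearly map a thumb-to-index distance (pixels) to a volume in [0, 100].

    d_min/d_max are illustrative calibration bounds: fingers pinched
    together -> 0%, fully spread -> 100%.
    """
    dist = max(d_min, min(d_max, dist))  # clamp to the calibrated range
    return round((dist - d_min) / (d_max - d_min) * 100)

# Thumb tip at (100, 100), index fingertip at (100, 240): distance 140 px.
print(distance_to_volume(hand_distance((100, 100), (100, 240))))  # prints 50
```

The resulting percentage would then be passed to the audio endpoint (e.g. pycaw's `IAudioEndpointVolume`) each frame.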

CHAPTER 2
REQUIREMENT ANALYSIS

A basic working computer with a working camera is required, as the project does not need any special hardware to run. On the software side, the system should have the following libraries: matplotlib, opencv and mediapipe.

2.1 Hardware Requirements

 Working camera
 Processor: i3 or above
 RAM: 4 GB or more

2.2 Software Requirements

 Operating System: Windows 10
 Python
 Visual Studio Code / Jupyter Notebook
Use Case Diagram

[Figure 1. Use case diagram: the USER logs in, starts the webcam and performs a gesture; the SYSTEM captures video, captures the gesture, extracts features, translates and recognizes the gesture, and displays the result.]
CHAPTER 3
SOFTWARE DESIGN

3.1 Introduction

This chapter focuses on the design of the system, using diagrams to illustrate graphically certain sections of the software system. As information moves through software, it is modified by a series of transformations. A data flow diagram (DFD) is a graphical representation that depicts information flow and the transforms that are applied as data move from input to output.

The data flow diagram may be used to represent a system or software at any level of abstraction. In fact, DFDs may be partitioned into levels that represent increasing information flow and functional detail. The DFD therefore provides a mechanism for functional modelling as well as information flow modelling.

Symbols used in Data Flow Diagrams

Symbol                      Description
Process                     Performs some transformation of its input data to yield output data.
Source or External Entity   A source of external input.
Data Flow Direction         Shows the data flow between sources and processes.
3.2 Data Flow Diagram (Level 0)

[Figure 2. Level 0 DFD: the system tracks the User's hands, performs hand gesture recognition, sign language detection and finger counting, and returns hand gesture and sign language information to the user.]
3.3 Data Flow Diagram (Level 1)

[Figure 3. Level 1 DFD: the user logs in or registers and passes through user authentication (overseen by the ADMIN); the camera captures images as input; the system performs sign language detection, hand gesture recognition and volume control, and gives output to the user.]
3.4 Data Flow Diagram (Level 2)

1) LOGIN/REGISTER

[Level 2 DFD: an existing user logs in, which fetches user data from the user-information store and leads to the homepage; a new user registers, which saves the user ID and password to the user-information store.]

2) SIGN LANGUAGE DETECTION

[Level 2 DFD: the camera captures an image of the user as input; the sign language detection process produces the output shown to the user.]

3) HAND GESTURE RECOGNITION

[Level 2 DFD: the camera takes the user's hand movement as input; the hand gesture recognition process controls the volume.]
CHAPTER 4
TESTING
4.1 INTRODUCTION

Testing software means measuring or assessing the software to determine its quality. Testing is a measuring instrument for software quality, with the unit of measurement being the number of defects found during testing. Testing activities also help to achieve software quality, and testing is essential in the development of any software system.

4.2 LEVELS OF TESTING

4.2.1 Unit Testing

Unit testing comprises the set of tests performed, usually by the programmers, prior to the integration of the unit into a larger program. This is the lowest level of testing and is done by the programmer who develops the unit, and who can therefore test it in great detail.

4.2.2 System Testing

After integration testing is completed, the entire system is tested as a whole. The functional specifications or requirements specification are used to derive the test cases. At this level, system testing looks for errors in end-to-end functional quality and in attributes such as performance, reliability, volume, stress tolerance, usability, maintainability and security. Independent testers can carry out this testing.
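As a hedged unit-testing sketch tied to this project: the appendix fires a gesture only after it has been seen in several consecutive frames. A small, self-contained test of that debouncing idea might look like the following (the `GestureDebouncer` class is an illustrative assumption, not the project's actual code):

```python
import unittest

class GestureDebouncer:
    """Fire a gesture only after it is seen in `needed` consecutive frames."""
    def __init__(self, needed=5):
        self.needed = needed
        self.count = 0

    def update(self, seen):
        if seen:
            self.count += 1
            if self.count == self.needed:
                self.count = 0      # reset so the gesture can fire again
                return True
        else:
            self.count = 0          # gesture lost: restart the streak
        return False

class DebouncerTest(unittest.TestCase):
    def test_fires_after_consecutive_frames(self):
        d = GestureDebouncer(needed=3)
        self.assertEqual([d.update(True) for _ in range(3)],
                         [False, False, True])

    def test_gap_resets_the_count(self):
        d = GestureDebouncer(needed=3)
        d.update(True)
        d.update(True)
        d.update(False)  # gesture lost: counter resets
        self.assertFalse(d.update(True))

# Run the tests (illustrative; normally `python -m unittest` would be used).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DebouncerTest)
unittest.TextTestRunner().run(suite)
```

Isolating the counting rule in a class like this makes it testable without a camera, which is exactly what unit testing at this level is for.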

Test Case Table

1. Successful user login
   Description: Log in to the program with the login name and the correct password.
   Expected result: Login is successful and the user moves further.
   Result: Passed

2. Unsuccessful user login due to wrong password
   Description: Log in to the system with a wrong password.
   Expected result: Login fails with the error 'Invalid Password'.
   Result: Passed

3. Unsuccessful user login due to invalid login id
   Description: Log in to the program with an invalid login id.
   Expected result: Login fails with the error 'Invalid Login Id'.
   Result: Passed

4. Successfully recognize hands
   Description: When a hand is shown to the camera it should be recognized.
   Expected result: Keypoints are shown on the hand.
   Result: Passed

5. Successfully recognize gesture
   Description: When the index and middle fingers of either hand are shown, an image is clicked.
   Expected result: The clicked image is visible on the screen.
   Result: Passed

6. Successfully count fingers
   Description: The fingers that are shown should be counted.
   Expected result: The number of fingers that are up is shown.
   Result: Passed

5. Future Work

This system can be used as an interpreter by the disabled to interact with other people easily, and it can also help them interact with the computer system more easily and efficiently. We can also add more functionality to make day-to-day life easier and more contactless.

6. Conclusion

Instead of a web camera, a better-resolution camera could be used for better images, and an infrared camera could enable night use as well. The mechanism is not yet very accurate.

SCREENSHOTS

 AUTHENTICATION:
[screenshot]

 HOMEPAGE:
[screenshot]

1) FINGER COUNTER
[screenshot]

2) SIGN LANGUAGE DETECTION
[screenshot]

3) VOLUME CONTROL USING HAND GESTURE
[screenshot]
APPENDICES

CODING:

from tkinter import *
import os
import cv2
import time
import numpy as np
import mediapipe as mp
import matplotlib.pyplot as plt
import math
from ctypes import cast, POINTER
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume

mp_hands = mp.solutions.hands

# Set up the Hands functions for images and videos.
hands = mp_hands.Hands(static_image_mode=True, max_num_hands=2,
                       min_detection_confidence=0.5)
hands_videos = mp_hands.Hands(static_image_mode=False, max_num_hands=2,
                              min_detection_confidence=0.5)

# Initialize the mediapipe drawing class.
mp_drawing = mp.solutions.drawing_utils

def detectHandsLandmarks(image, hands, draw=True, display=True):

    # Create a copy of the input image to draw landmarks on.
    output_image = image.copy()

    # Convert the image from BGR into RGB format.
    imgRGB = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)

    # Perform the Hands Landmarks Detection.
    results = hands.process(imgRGB)

    # Check if landmarks are found and are specified to be drawn.
    if results.multi_hand_landmarks and draw:

        # Iterate over the found hands.
        for hand_landmarks in results.multi_hand_landmarks:

            # Draw the hand landmarks on the copy of the input image.
            mp_drawing.draw_landmarks(
                image=output_image, landmark_list=hand_landmarks,
                connections=mp_hands.HAND_CONNECTIONS,
                landmark_drawing_spec=mp_drawing.DrawingSpec(
                    color=(255, 255, 255), thickness=2, circle_radius=2),
                connection_drawing_spec=mp_drawing.DrawingSpec(
                    color=(0, 255, 0), thickness=2, circle_radius=2))

    # Check if the original input image and the output image are to be displayed.
    if display:

        # Display the original input image and the output image.
        plt.figure(figsize=[15, 15])
        plt.subplot(121); plt.imshow(image[:, :, ::-1]); plt.title("Original Image"); plt.axis('off')
        plt.subplot(122); plt.imshow(output_image[:, :, ::-1]); plt.title("Output"); plt.axis('off')

    # Otherwise, return the output image and the results of the detection.
    else:
        return output_image, results
# Note: login_success_screen, login_screen and main_screen are created in the
# GUI/login code, which is not included in this listing.
def delete_login_success():
    login_success_screen.destroy()
    login_screen.destroy()
    main_screen.destroy()
def qu():
    camera_video = cv2.VideoCapture(0)
    camera_video.set(3, 1280)
    camera_video.set(4, 960)
    cv2.namedWindow('Fingers Counter', cv2.WINDOW_NORMAL)

    # Iterate until the webcam is accessed successfully.
    while camera_video.isOpened():

        ok, frame = camera_video.read()

        # If the frame is not read properly, continue to the next iteration
        # to read the next frame.
        if not ok:
            continue
        frame = cv2.flip(frame, 1)

        # Perform Hands landmarks detection on the frame.
        frame, results = detectHandsLandmarks(frame, hands_videos, display=False)

        # Check if the hands landmarks in the frame are detected.
        if results.multi_hand_landmarks:
            # Count the number of fingers up of each hand in the frame.
            # (countFingers is defined elsewhere in the project; it is not
            # included in this listing.)
            frame, fingers_statuses, count = countFingers(frame, results, display=False)

        # Display the frame.
        cv2.imshow('Fingers Counter', frame)

        # Wait for 1 ms. If a key is pressed, retrieve the ASCII code of the key.
        k = cv2.waitKey(1) & 0xFF

        # Check if 'ESC' is pressed and break the loop.
        if k == 27:
            break
    camera_video.release()
    cv2.destroyAllWindows()
def write_text():
    camera_video = cv2.VideoCapture(0)
    camera_video.set(3, 1280)
    camera_video.set(4, 960)
    cv2.namedWindow('Selfie-Capturing System', cv2.WINDOW_NORMAL)
    num_of_frames = 5

    # Initialize a dictionary to store the counts of the consecutive frames
    # with the hand gestures recognized.
    counter = {'V SIGN': 0, 'SPIDERMAN SIGN': 0, 'HIGH-FIVE SIGN': 0, 'NO': 0,
               'PLAY': 0, 'YES': 0, 'IHATEYOU': 0}

    # Initialize a variable to store the captured image.
    captured_image = None

    # Iterate until the webcam is accessed successfully.
    while camera_video.isOpened():
        # Read a frame.
        ok, frame = camera_video.read()

        # If the frame is not read properly, continue to the next iteration.
        if not ok:
            continue

        # Flip the frame horizontally for natural (selfie-view) visualization.
        frame = cv2.flip(frame, 1)

        # Get the height and width of the frame of the webcam video.
        frame_height, frame_width, _ = frame.shape
        frame, results = detectHandsLandmarks(frame, hands_videos, draw=False, display=False)

        # Check if the hands landmarks in the frame are detected.
        if results.multi_hand_landmarks:
            # (countFingers and recognizeGestures are defined elsewhere in
            # the project; they are not included in this listing.)
            frame, fingers_statuses, count = countFingers(frame, results, draw=False, display=False)

            # Perform the hand gesture recognition on the hands in the frame.
            _, hands_gestures = recognizeGestures(frame, fingers_statuses, count, draw=False, display=False)

            if any(hand_gesture == "SPIDERMAN SIGN" for hand_gesture in hands_gestures.values()):
                # Increment the count of consecutive frames with the
                # SPIDERMAN hand gesture recognized.
                counter['SPIDERMAN SIGN'] += 1

                # Check if the counter has reached the required number of
                # consecutive frames.
                if counter['SPIDERMAN SIGN'] == num_of_frames:
                    print('ILOVEYOU')
                    font = cv2.FONT_HERSHEY_SIMPLEX
                    cv2.putText(frame, 'ILOVEYOU', (100, 150), font, 3,
                                (0, 255, 255), 5, cv2.LINE_4)
                    counter['SPIDERMAN SIGN'] = 0
            else:
                # Reset the counter, as we are counting consecutive frames
                # with the SPIDERMAN hand gesture.
                counter['SPIDERMAN SIGN'] = 0
                if any(hand_gesture == "HIGH-FIVE SIGN" for hand_gesture in hands_gestures.values()):
                    counter['HIGH-FIVE SIGN'] += 1

                    # Check if the counter has reached the required number of
                    # consecutive frames.
                    if counter['HIGH-FIVE SIGN'] == num_of_frames:
                        print('HELLO')
                        font = cv2.FONT_HERSHEY_SIMPLEX
                        cv2.putText(frame, 'HELLO', (100, 150), font, 3,
                                    (0, 255, 255), 5, cv2.LINE_4)
                        counter['HIGH-FIVE SIGN'] = 0
                else:
                    counter['HIGH-FIVE SIGN'] = 0
                if any(hand_gesture == "NO" for hand_gesture in hands_gestures.values()):
                    counter['NO'] += 1

                    # Check if the counter has reached the required number of
                    # consecutive frames.
                    if counter['NO'] == num_of_frames:
                        print('NO')
                        font = cv2.FONT_HERSHEY_SIMPLEX
                        cv2.putText(frame, 'NO', (100, 150), font, 3,
                                    (0, 255, 255), 5, cv2.LINE_4)
                        # Reset the counter.
                        counter['NO'] = 0
                else:
                    counter['NO'] = 0
                if any(hand_gesture == "PLAY" for hand_gesture in hands_gestures.values()):
                    counter['PLAY'] += 1
                    if counter['PLAY'] == num_of_frames:
                        print('PLAY')
                        font = cv2.FONT_HERSHEY_SIMPLEX
                        cv2.putText(frame, 'PLAY', (100, 150), font, 3,
                                    (0, 255, 255), 5, cv2.LINE_4)
                        # Reset the counter.
                        counter['PLAY'] = 0
                else:
                    counter['PLAY'] = 0
                if any(hand_gesture == "YES" for hand_gesture in hands_gestures.values()):
                    counter['YES'] += 1

                    # Check if the counter has reached the required number of
                    # consecutive frames.
                    if counter['YES'] == num_of_frames:
                        print('YES')
                        font = cv2.FONT_HERSHEY_SIMPLEX
                        cv2.putText(frame, 'YES', (100, 150), font, 3,
                                    (0, 255, 255), 5, cv2.LINE_4)
                        # Reset the counter.
                        counter['YES'] = 0
                else:
                    counter['YES'] = 0
                if any(hand_gesture == "IHATEYOU" for hand_gesture in hands_gestures.values()):
                    counter['IHATEYOU'] += 1
                    if counter['IHATEYOU'] == num_of_frames:
                        print('IHATEYOU')
                        font = cv2.FONT_HERSHEY_SIMPLEX
                        cv2.putText(frame, 'IHATEYOU', (100, 150), font, 3,
                                    (0, 255, 255), 5, cv2.LINE_4)
                        # Reset the counter.
                        counter['IHATEYOU'] = 0
                else:
                    counter['IHATEYOU'] = 0
        cv2.imshow('Selfie-Capturing System', frame)

        # Wait for 1 ms. If a key is pressed, retrieve the ASCII code of the key.
        k = cv2.waitKey(1) & 0xFF

        # Check if 'ESC' is pressed and break the loop.
        if k == 27:
            break
    camera_video.release()
    cv2.destroyAllWindows()
import tkinter as tk

parent = tk.Tk()
parent.geometry("1000x300")
parent.title("MP")
parent.config(bg="blue")

tk.Label(parent, text="REAL TIME FINGER COUNTER AND HAND GESTURE RECOGNITION",
         font="normal 20 bold", fg="black", bg="blue").pack(pady=20)

frame = tk.Frame(parent)
frame.config(bg="blue")
frame.pack()

text_disp = tk.Button(frame, text="sign detection", command=write_text, bg="red")
text_disp.pack(side=tk.LEFT, padx=10)

exit_button = tk.Button(frame, text="Finger counter", bg="red", command=qu)
exit_button.pack(side=tk.LEFT, padx=10)

# volume (the hand-gesture volume-control function) is defined elsewhere in
# the project; it is not included in this listing.
button3 = tk.Button(frame, text="hand gesture", command=volume, bg="red")
button3.pack(side=tk.LEFT, padx=10)

parent.mainloop()
