
PBL Project Report on

AI-Based Sign Language Translator

Submitted in partial fulfillment of the requirements
for the award of the degree of

SE AIML

by

Names of the Students

Borse Soham Bhatu
Soham Sajjan Asgaonkar
Niyamat Hannanpasha Barote
Deep Mahendra Kanade

Under the Guidance of

Asst. Prof. Shilpa Shinde

ALARD CHARITABLE TRUST'S
ALARD COLLEGE OF ENGINEERING & MANAGEMENT
Department of Artificial Intelligence and Machine Learning

CERTIFICATE

This is to certify that the project entitled "AI-Based Sign Language Translator",
submitted by Borse Soham, Niyamat Barote, Deep Kanade, and Soham Asgaonkar
in partial fulfillment of the requirements for the award of the degree of Bachelor of
Engineering in AIML to SPPU, Pune, is a bonafide work carried out during the
academic year 2024-2025.

Guide                                   HOD
AIML Department                         AIML Department
Asst. Prof. Shilpa Shinde               Prof. Disha Nagpure

Place: Alard College of Engineering & Management, Pune


Date:
Acknowledgement

We have great pleasure in presenting the report on the Mini Project "AI-Based Sign
Language Translator". We take this opportunity to express our sincere thanks to our
guide, Asst. Prof. Shilpa Shinde, Department of AIML, ACEM Pune, for providing the
technical guidelines and suggestions regarding the line of work. We would also like to
express our gratitude for her constant encouragement, support, and guidance
throughout the development of the project.
We thank Prof. Disha Nagpure, Head of the AIML Department, ACEM Pune, for her
encouragement during progress meetings and for providing guidelines for writing this report.

We also thank the entire staff of ACEM Pune for their invaluable help rendered during
the course of this work, and we wish to express our deep gratitude to all our colleagues
at ACEM Pune for their encouragement.

Borse Soham    Niyamat Barote    Deep Kanade    Soham Asgaonkar


Declaration

We declare that this written submission represents our ideas in our own words, and where
others' ideas or words have been included, we have adequately cited and referenced the
original sources. We also declare that we have adhered to all principles of academic honesty
and integrity and have not misrepresented, fabricated, or falsified any idea, data, fact, or
source in our submission. We understand that any violation of the above will be cause for
disciplinary action by the Institute and can also invoke penal action from the sources which
have not been properly cited or from whom proper permission has not been taken
when needed.

Borse Soham    Niyamat Barote    Deep Kanade    Soham Asgaonkar

Date:
Index

TITLE                                               PAGE NO.

List of Figures                                     v
List of Tables                                      v
Abstract                                            vi
Contents                                            iv
Chapter 1   Introduction                            1
Chapter 2   Literature Review                       3
    2.1  Motivation                                 3
    2.2  Problem Definition                         3
    2.3  Aim                                        4
    2.4  Objectives                                 4
Chapter 3   Proposed System                         10
    3.1  Analysis / Framework / Algorithm           15
    3.2  Details of Hardware & Software             23
    3.3  Design Details                             25
    3.4  Methodology                                24
Chapter 4   Implementation Plan for Next Semester   29
Chapter 5   Implementation and Testing              31
Chapter 6   Conclusion and Future Work              34
Chapter 7   References                              35
List of Figures

Figure No.   Description                                        Page No.

1            ASL Sign Language                                  1
2.1          Pre-processing and Hand Segmentation               5
2.2          Pre-processing Sign Language Finger-spellings      6
2.3          Translating the Sign Language                      7
2.4          Image Collection of Different ASL Signs            8
2.5          Accuracy of Sign Language Recognition              9
3.1          Timeline Chart                                     12
3.2          Collected Images of Different Signs for A to Z     14
3.3 to 3.5   Different Views of the Collected Images            15
3.6          MediaPipe Landmark System                          16
3.11         Convolutional Neural Network (CNN)                 19
3.12         Average Pooling                                    20
3.13         Fully Connected Layer                              21
3.6.1        System Flowchart                                   24
3.6.2        DFD Level 0                                        24
3.6.3        DFD Level 1                                        25
3.6.4        Sequence Diagram                                   26
5.1 to 5.6   Testing the Characters                             31 to 33

List of Tables

Table No.    Description                                        Page No.

3.1          Comparison Table                                   10
Abstract

Communication is a fundamental human need, yet individuals with hearing and speech
impairments often face challenges in expressing themselves in a society that predominantly
relies on spoken language. Sign language serves as an essential medium for these
individuals, but the lack of widespread understanding creates a communication barrier
between them and the hearing population.
This project aims to bridge that gap by developing a real-time Sign Language to Text and
Speech Conversion System. The system captures hand gestures using a webcam, detects
hand landmarks through computer vision techniques (such as MediaPipe), and utilizes a
deep learning model to recognize the signs. These recognized gestures are then translated
into corresponding text, which is further converted into speech output using text-to-speech
synthesis.
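The recognition stage described above typically does not feed raw pixel coordinates to the classifier; the detected hand landmarks are first made invariant to where the hand sits in the frame and how large it appears. The following is a minimal sketch of such a preprocessing step, assuming 21 (x, y) landmarks of the kind a detector such as MediaPipe reports; the function name and exact normalization scheme are illustrative, not taken from this project's implementation.

```python
import numpy as np

def normalise_landmarks(landmarks):
    """Convert 21 (x, y) hand landmarks into a translation- and
    scale-invariant feature vector suitable for a gesture classifier.

    `landmarks` is a (21, 2) array of image coordinates, e.g. the x/y
    fields produced by a hand-landmark detector such as MediaPipe.
    """
    pts = np.asarray(landmarks, dtype=np.float32)
    # Translate so the wrist (landmark 0) sits at the origin:
    # the same sign made anywhere in the frame yields the same features.
    pts = pts - pts[0]
    # Divide by the largest absolute coordinate so all values lie in
    # [-1, 1]: the same sign made near or far from the camera matches.
    scale = np.abs(pts).max()
    if scale > 0:
        pts = pts / scale
    # Flatten to a fixed-length vector, shape (42,).
    return pts.flatten()
```

A vector of this shape can then be fed to the deep learning model (for example, the CNN and fully connected layers listed in Chapter 3) for per-sign classification.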
The system not only helps in real-time translation of sign language but also enhances
accessibility and inclusiveness in communication. It is designed to support sentence-level
recognition, suggestions, and sentence building, providing a more natural conversational
experience. This solution has applications in educational institutions, public service centers,
and personal communication, making it a valuable tool for the hearing and
speech-impaired community.
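Sentence-level recognition of the kind described above requires turning a noisy stream of per-frame letter predictions into stable text. As a hedged sketch (the class name and threshold values are illustrative assumptions, not this project's implementation), a simple voting buffer can commit a letter only once it dominates recent frames:

```python
from collections import Counter, deque

class SentenceBuilder:
    """Turn a noisy stream of per-frame letter predictions into text.

    A letter is committed only after it wins at least `min_votes` of the
    last `window` frames, which filters out flicker between similar
    signs. Repeating the same letter twice in a row (e.g. "LL") would
    need an explicit pause gesture, omitted here for brevity.
    """

    def __init__(self, window=15, min_votes=12):
        self.history = deque(maxlen=window)
        self.min_votes = min_votes
        self.last_committed = None
        self.text = ""

    def update(self, letter):
        """Feed one per-frame prediction; return the sentence so far."""
        self.history.append(letter)
        winner, votes = Counter(self.history).most_common(1)[0]
        if votes >= self.min_votes and winner != self.last_committed:
            # "SPACE" is a hypothetical class label for the word gap.
            self.text += " " if winner == "SPACE" else winner
            self.last_committed = winner
            self.history.clear()
        return self.text
```

The committed text can then be handed to a text-to-speech engine, and a dictionary lookup over the partial word would provide the word suggestions mentioned above.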
