PBL Project Report I - Format AIML Department
SE AIML
AI-Based Sign Language Translator
by
Borse Soham, Niyamat Barote, Deep Kanade, Soham Asgaonkar
CERTIFICATE
This is to certify that the project entitled "AI-Based Sign Language Translator",
submitted by Borse Soham, Niyamat Barote, Deep Kanade, and Soham Asgaonkar
in partial fulfillment of the requirements for the award of the degree of Bachelor of
Engineering in AIML, to SPPU, Pune, is a bonafide work carried out during the
academic year 2024-2025.
Guide: Dr. Shlipa Shinde, Asst. Prof., AIML Department
HOD: Prof. Disha Nagpure, AIML Department
ACKNOWLEDGEMENT
We have great pleasure in presenting the report on the mini project "AI-Based Sign
Language Translator". We take this opportunity to express our sincere thanks to our
guide, Dr. Shlipa Shinde, Department of AIML, ACEM Pune, for providing the technical
guidelines and suggestions regarding the line of work. We would like to express our
gratitude for her constant encouragement, support, and guidance throughout the
development of the project.
We thank Prof. Disha Nagpure, Head of AIML, ACEM Pune, for her encouragement
during progress meetings and for providing guidelines for writing this report.
We also thank the entire staff of ACEM Pune for their invaluable help rendered during
the course of this work. We wish to express our deep gratitude to all our colleagues
at ACEM Pune for their encouragement.
DECLARATION
We declare that this written submission represents our ideas in our own words, and where
others' ideas or words have been included, we have adequately cited and referenced the
original sources. We also declare that we have adhered to all principles of academic
honesty and integrity and have not misrepresented, fabricated, or falsified any
idea/data/fact/source in our submission. We understand that any violation of the above
will be cause for disciplinary action by the Institute and can also evoke penal action
from the sources which have not been properly cited or from whom proper permission
has not been taken when needed.
Date:
Index
List of Figures
List of Tables
Abstract
Communication is a fundamental human need, yet individuals with hearing and speech
impairments often face challenges in expressing themselves in a society that predominantly
relies on spoken language. Sign language serves as an essential medium for these
individuals, but the lack of widespread understanding creates a communication barrier
between them and the hearing population.
This project aims to bridge that gap by developing a real-time Sign Language to Text and
Speech Conversion System. The system captures hand gestures using a webcam, detects
hand landmarks through computer vision techniques (such as MediaPipe), and utilizes a
deep learning model to recognize the signs. These recognized gestures are then translated
into corresponding text, which is further converted into speech output using text-to-speech
synthesis.
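The recognition stage described above typically classifies hand landmarks rather than raw pixels. As a minimal sketch (assuming MediaPipe-style output of 21 (x, y) hand landmarks with the wrist at index 0; the function names here are illustrative, not part of any library), the landmarks can be normalized to be position- and scale-invariant before being passed to the trained deep learning model:

```python
def normalize_landmarks(landmarks):
    """Make landmarks position- and scale-invariant.

    landmarks: list of (x, y) tuples; index 0 is assumed to be the wrist.
    Translates all points so the wrist is the origin, then divides by the
    largest absolute coordinate so every value lies in [-1, 1].
    """
    wx, wy = landmarks[0]
    # Express every landmark relative to the wrist.
    rel = [(x - wx, y - wy) for x, y in landmarks]
    # Scale by the largest magnitude (guard against all-zero input).
    scale = max(max(abs(x), abs(y)) for x, y in rel) or 1.0
    return [(x / scale, y / scale) for x, y in rel]
```

In a full pipeline, each frame's normalized landmark vector would be fed to the trained classifier, whose predicted label becomes the recognized text.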
The system not only translates sign language in real time but also enhances
accessibility and inclusiveness in communication. It is designed to support
sentence-level recognition, word suggestions, and sentence building, providing a more
natural conversational experience.
experience. This solution has applications in educational institutions, public service centers,
and personal communication, making it a valuable tool for the hearing and
speech-impaired community.