C3 Abstract
(Autonomous)
Cheeryal (V), Keesara (M), Medchal Dist., Telangana - 501 301
DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING
_________________________________________________________________
MINI PROJECT ABSTRACT
IV B.Tech. I SEM ECE - C Section
GUIDE DETAILS:
Name of the Guide: K. Satish Babu
Designation: Sr. Asst. Professor
Department: Electronics and Communication Engineering
Mail ID: [email protected]
Contact Number: 9848717250
Sign language is generally used by people who are deaf, blind, or have speech
impairments, but it becomes difficult for them to communicate with people who do not
understand sign language. Various studies estimate that hundreds of millions of people
worldwide are deaf, blind, or have speech impairments, and they face many problems while
trying to communicate with society in their day-to-day lives. Sign language conveys
information through body language and the orientation and movement of the arms and
fingers.
Hand Gesture Vocalizer is a social-purpose project that helps uplift people who are
speech-, vision-, or hearing-impaired by enabling them to communicate more easily with
the public. The project addresses the need for an electronic device that can translate sign
language into speech and on-screen text, removing the communication gap between speech-
and hearing-impaired people and the general public.
Objective: To develop a portable and potentially low-cost Gesture Vocalizer system using
an Arduino for processing sensor data from a data glove and Bluetooth for wireless
communication. The system would translate hand gestures into spoken language, aiding
communication for individuals with speech impairments and sign language users.
An Arduino board would be used due to its ease of use and prototyping capabilities. It would
process the sensor data (e.g., flex sensor readings) from the data glove.
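The gesture-recognition step on the Arduino can be sketched as below. This is an illustrative C++ sketch, not the project's actual firmware: the bend threshold, the five-sensor layout (thumb to pinky), and the gesture patterns are all assumptions made for the example. On a real glove, Arduino's analogRead() would supply the 0-1023 ADC values, and each sensor's threshold would be found by calibration.

```cpp
#include <array>
#include <cstddef>
#include <string>

// Hypothetical ADC threshold separating a straight finger from a bent one.
// Arduino's analogRead() returns 0-1023 on its 10-bit ADC; a real glove
// would calibrate this per sensor.
constexpr int kBendThreshold = 600;

// Classify one 5-sensor reading (thumb..pinky) into a gesture label.
// The patterns below are illustrative, not a real sign-language mapping.
std::string classifyGesture(const std::array<int, 5>& adc) {
    std::array<bool, 5> bent{};
    for (std::size_t i = 0; i < adc.size(); ++i)
        bent[i] = adc[i] > kBendThreshold;

    if (!bent[0] && !bent[1] && !bent[2] && !bent[3] && !bent[4])
        return "HELLO";   // open palm: no finger bent
    if (bent[0] && bent[1] && bent[2] && bent[3] && bent[4])
        return "STOP";    // closed fist: all fingers bent
    if (bent[0] && !bent[1] && !bent[2] && bent[3] && bent[4])
        return "PEACE";   // index and middle extended
    return "UNKNOWN";     // no pattern matched
}
```

In practice the loop() would sample all five sensors each iteration and pass the readings to a function like this before transmitting the result.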
The Bluetooth module enables wireless communication between the Arduino and a
smartphone or another device running the text-to-speech engine. The processed gesture data
from the Arduino would be transmitted via Bluetooth.
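The transmission side could frame each recognized gesture as one line of text, since a Bluetooth serial module such as an HC-05 simply appears as a serial link to the phone. The sketch below shows one possible framing; the "G:" prefix, the newline terminator, and the repeat-suppression logic are assumptions of this example, not a standard protocol. On the Arduino the framed string would go out via Serial.print(), and the phone app would read one line per gesture and hand the label to its text-to-speech engine.

```cpp
#include <string>

// Frame a gesture label as a single newline-terminated message for the
// Bluetooth serial link (hypothetical "G:<label>\n" format).
std::string frameGesture(const std::string& label) {
    return "G:" + label + "\n";
}

// Suppress repeats so the phone does not speak the same word on every
// loop iteration; only a *change* of gesture produces a message.
class GestureSender {
  public:
    // Returns the framed message to transmit, or "" if the gesture
    // has not changed since the last call.
    std::string update(const std::string& label) {
        if (label == last_) return "";
        last_ = label;
        return frameGesture(label);
    }

  private:
    std::string last_;
};
```

Keeping the protocol line-based makes the phone-side parser trivial: it reads until a newline, strips the prefix, and speaks the remaining text.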
REFERENCES:
1. JETIR: https://www.jetir.org/papers/JETIR2312048.pdf
2. Instructables: https://www.instructables.com
Date of Submission: