Sign Language Using Motion Detection
Synopsis-
Technical Keywords-
Sign Language, Convolutional Neural Network, Image Processing,
Framework, Gestures.
Problem Statement-
We are developing a facility for deaf people that will help
them communicate in their daily life using hand signs.
Abstract-
Sign language has always been a significant means of communication
for hearing and speech impaired people, often referred to as deaf and
dumb. It is frequently the only mode of communication such
individuals have to convey their messages to others, and hence other
people need to be able to comprehend their language. In this project,
a sign language recognition web framework is proposed with the help
of image processing. This application helps in recognizing sign
language gestures. The dataset used is the Indian Sign Language
dataset. The application could be used in schools or any other place
to make communication easier between impaired and non-impaired
people. The proposed method eases the recognition of sign language.
Deep Learning is used for image recognition, and the data is trained
using a Convolutional Neural Network. Using this method, we
recognize the gesture and predict which sign is shown.
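To make the proposed approach concrete, the following is a minimal sketch of such a Convolutional Neural Network classifier in Python using Keras. The layer sizes, the 64x64 grayscale input, the number of classes, and the "dataset" folder layout are illustrative assumptions, not the exact configuration used in this project.

# Minimal CNN sketch for sign image classification (assumptions: 64x64
# grayscale inputs, one folder per class under "dataset/", 26 classes).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 26          # hypothetical: one class per letter sign
IMG_SIZE = (64, 64)

def build_model():
    model = models.Sequential([
        layers.Input(shape=(*IMG_SIZE, 1)),
        layers.Rescaling(1.0 / 255),                 # normalise pixel values
        layers.Conv2D(32, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Load images from a directory tree like dataset/<class_name>/*.jpg
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "dataset", image_size=IMG_SIZE, color_mode="grayscale", batch_size=32)
    model = build_model()
    model.fit(train_ds, epochs=10)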
Goals and Objectives-
Goals- To help differently abled, deaf people through the use of
sign language. It will help them communicate and
interact with their environment and with other people.
Introduction-
Sign language is the most significant way of communication
for impaired people. Establishing an easy way of
communicating with deaf and dumb people is very important.
Everyone should be able to understand sign language, as it would be
useful in case of an emergency. These individuals communicate
through hand signals or gestures. Gestures are essentially physical
actions performed by an individual to convey meaningful
information. People trained in sign language can communicate
efficiently, but this is a problem for untrained people.
Communicating through signs means using hand gestures or
expressions to depict a particular sign. Some people also use signs
to communicate when they are busy, for example during a meeting.
There are many types of signs depending on the language and
region. This paper proposes a method for identifying Indian Sign
Language.
The fields of Machine Learning, Artificial Intelligence, and
Image Processing are advancing and can now be applied across
multiple domains. In this paper, Deep Learning is used.
There are many factors that have to be taken into consideration
in sign language recognition. The angle of the
gesture plays a very important role, and the type of dataset also
plays a vital role in the recognition model.
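As a hedged sketch, the gesture-angle variation mentioned above could be handled by augmenting the training images with small random rotations, shifts, and zooms so that the network sees each sign from slightly different viewpoints; the specific factors below are assumptions, not values taken from this project.

# Data augmentation sketch for angle/viewpoint variation (assumed factors).
import tensorflow as tf
from tensorflow.keras import layers

augmentation = tf.keras.Sequential([
    layers.RandomRotation(0.10),        # roughly +/- 36 degrees of rotation
    layers.RandomTranslation(0.1, 0.1), # small horizontal/vertical shifts
    layers.RandomZoom(0.1),             # small scale changes
])

def augment(image, label):
    # Applied only to the training split; validation images stay untouched.
    return augmentation(image, training=True), label

# train_ds is assumed to be a tf.data.Dataset of (image, label) batches,
# e.g. the one created with image_dataset_from_directory in the earlier sketch:
# train_ds = train_ds.map(augment)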
Motivation of project-
Keeping in mind the drawbacks of existing methods, this
paper proposes a solution using Deep Learning. This
method would also make sign language easier to understand, so that
communication becomes easy even in public places.
FUTURE SCOPE - Key-point matching and other techniques could be
used for more accurate decisions. The model can be trained for more
epochs, and the dataset could be enlarged. Words and numbers could
be added as well. We could also feed in a constant stream of images
and get a resultant string for a particular word. We could also
provide an input field for letters and produce the corresponding sign
as output, if possible, to make it a full sign communication
platform. An Android application could be built for the same
purpose, text to speech could be added, and the system could be
made multilingual as well.
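The "constant stream of images" idea could look roughly like the following sketch, which reads webcam frames with OpenCV, preprocesses them the same way as the training data, and appends each predicted letter to a running string; the model file name, the preprocessing, and the class labels are assumptions for illustration only.

# Rough sketch of continuous recognition from a webcam stream (assumed
# model file "sign_cnn.h5", 64x64 grayscale inputs, A-Z class labels).
import cv2
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("sign_cnn.h5")   # hypothetical saved model
CLASS_NAMES = [chr(c) for c in range(ord("A"), ord("Z") + 1)]  # assumed labels

def preprocess(frame):
    # Convert to grayscale, resize, and scale to match the training pipeline.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    resized = cv2.resize(gray, (64, 64))
    return resized.reshape(1, 64, 64, 1) / 255.0

cap = cv2.VideoCapture(0)
result = ""
while True:
    ok, frame = cap.read()
    if not ok:
        break
    probs = model.predict(preprocess(frame), verbose=0)
    result += CLASS_NAMES[int(np.argmax(probs))]
    cv2.imshow("Sign Language Recognition", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):   # press q to stop
        break
cap.release()
cv2.destroyAllWindows()
print(result)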