Implementation of IoT Based Smart Assistance
G. Rithika
Electrical and Electronics Engineering
Sri Krishna College of Technology
Coimbatore, India
[email protected]

S. Sandhya
Electrical and Electronics Engineering
Sri Krishna College of Technology
Coimbatore, India
[email protected]
Abstract— Communication between normal people and people with vocal and hearing troubles is a difficult task. The sign language used by these people is not understandable by common people, so it creates a communication barrier. People who are paralyzed also require regular assistance. For such people, we have proposed the implementation of IoT-based smart assistance gloves for disabled people. The gloves we designed are simple yet effective compared to the existing systems. With the help of flex sensors, the finger gesture is detected and the corresponding instructions are displayed in the Android app with audio output. The proposed system is implemented with an Arduino Uno and a Raspberry Pi, where the communication between these two modules is done by a wireless serial port module due to its secured data transmission. An alert message is sent through the GSM module during an emergency situation.

Keywords— vocal, hearing, gloves, IoT, communication, flex sensors, wireless serial port module, GSM.

I. INTRODUCTION

A recent survey in India shows that there are millions of people with speaking and hearing disabilities; the number goes up to 2.4 million. Disabled people struggle to communicate with others in their day-to-day life, and it is hard for them to express their emotions. People with disabilities generally use sign language to communicate with each other, but this sign language is difficult for normal people to understand. Not only normal people, but also people with disabilities themselves find it hard to learn these sign languages. It is also difficult for them to socialize, and they are not able to voice their opinions. The same is the case for paralyzed people, who are not able to move or communicate. So, the gloves we have designed allow a person with disabilities to communicate what he/she wants without anyone's help.

Though there are many existing approaches to this problem, we have designed this disabled-assistance glove for a faster and smarter response. There are two types of gesture recognition, namely data-glove based and vision based. Vision-based recognition is not accurate, as it suffers from noise disturbance and its data is difficult to process. Data-glove-based recognition is said to have a faster response than vision-based recognition. We have proposed a data-glove-based approach because the result obtained is accurate, feasible, and also portable. Based on the movements made by the fingers, the flex sensors detect the bend of each finger, and the output varies in terms of resistance. Due to their flexibility and large range of resistance, many commands can be fed into the system. We have used the Arduino Uno for more storage and a faster response. The Raspberry Pi 3B includes an inbuilt Wi-Fi module, Bluetooth, and USB boot, which are used to connect with the Android app. The Arduino Uno and the Raspberry Pi transfer signals between them with the help of a wireless serial port module.

II. LITERATURE SURVEY

In Ref. [1], the model consists of a transmitter and a receiver section. The transmitter section consists of flex sensors, a PIC12F683 microcontroller, and an RF transmitter, whereas the receiver section consists of a PIC18F45K20 microcontroller, an AC driver, an LCD display, a voice module, and an RF receiver. The flex sensor gestures are converted to commands and displayed on the LCD. A voice recorder and playback unit is used to pre-record the voice that is fetched as output on the speaker. This model uses a microcontroller.

In Ref. [2], the ADC channels of the microcontroller are used for digital conversion. The flex sensors are used to detect the finger gesture, and the output is given as audio through a speaker and as text on an LCD. A mode is specified for different actions.

In Ref. [3], the overall project is done using the LabVIEW software and a data acquisition device (DAQ card). Flex sensors are employed to detect the finger movements and the gestures made by the hand using a signal processing kit. In this project, each alphabet in the sign language made by the disabled person is captured and concatenated into the corresponding word. The overall process is implemented and customized on the LabVIEW platform.
2021 7th International Conference on Advanced Computing & Communication Systems (ICACCS)
In Ref. [4], the proposed system is focused on facially paralyzed persons. It uses only one flex sensor with predefined inputs, and the output is displayed on an LCD through a NodeMCU ESP8266.

In Ref. [5], conducting line sensors (CSL) are used to detect the gesture. The line sensor is connected to an Arduino Uno, and the data is fed into the microcontroller in digital form. The output of the line sensor varies with position, so the corresponding command is displayed on the LCD as text.

In Ref. [6], a flex sensor is fitted to the gloves and the instructions are fed into an Arduino ATmega. Whenever the finger makes a gesture, the predefined instructions are displayed on an LCD, and an audio output is also given through a SpeakJet voice chip with signal amplification.

In Ref. [7], five flex sensors are used. Along with the flex sensors, tactile sensors are used, and the measurement of the orientation of the hand is done by an accelerometer. The sensors take analog inputs and produce digital outputs. An ARM processor is used for storing the predefined data. The sound is stored in SPI memory, and the output is generated using a speaker. The output is also given on an LCD display.

In Ref. [8], a non-vision-based approach is used with the help of flex sensors and pressure sensors. A real-time image is captured by the preprocessor; feature extraction is then done using Otsu's algorithm with the help of an SVM classifier. The corresponding text is obtained from the sign language, and MATLAB is used to convert the text into voice.

In Ref. [9], the work is in the area of IoT. Different sensors are used to detect the movement of the hand and fingers. Whenever a gesture is made, the orientation of the hand is detected by an ADXL337 accelerometer, and the gestures are interpreted with the help of a Renesas microcontroller. The hand gestures are captured and converted into the corresponding text and speech format. An LCD is used for displaying the text output and a speaker for the audio output.

In Ref. [10], the project focuses on converting the gestures made by deaf-dumb people into meaningful text/speech. The project is carried out with a MEMS sensor and a microcontroller. The predefined hand gestures made by the disabled people are captured and stored in a database. Whenever a hand gesture is made, the MEMS sensor is accelerated and the signal is sent to the microcontroller. The data is matched by the microcontroller against the database, and the output is given in the form of audio through the speaker. The system also includes a text-to-speech (TTS) converter, where the text is converted into the corresponding speech.

III. METHODOLOGY

As there has been no significant development for disabled people, we have designed the smart assistance gloves for them. The proposed model is designed with the help of flex sensors, and the instructions are fed into the Arduino Uno board. The finger gesture is captured by the flex sensor, and the corresponding output is displayed as a sentence in the Android app; the output is also given as audio. The overall process is carried out by the Arduino Uno, Raspberry Pi, and GSM module. Data transmission between the Arduino Uno and the Raspberry Pi is done with the help of a wireless serial port module. During an emergency, an alert message is sent through the GSM module to the emergency contact.

A. BLOCK DIAGRAM

Fig 1. Transmitter section

Fig 2. Receiver section

B. COMPONENTS

Flex Sensors: Flex sensors detect the amount of bending and deflection. As the bending increases, the resistance also increases; the resistance varies with the linearity of the surface. A flex sensor is a two-terminal device; it is not polarized and does not have positive and negative terminals. It has two pins, namely P1 and P2, where P1 is usually connected to the positive of the power source and P2 is connected to ground, as shown in fig 3.
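The flex-sensor behaviour described above can be sketched as a simple voltage-divider calculation. This is an illustrative sketch, not the authors' code: the part values (about 25 kΩ flat, rising toward 100 kΩ at full bend, read through a fixed 47 kΩ resistor) and the midpoint threshold are assumptions for a typical flex sensor, not values given in this paper.

```python
# Illustrative sketch: mapping flex-sensor resistance to a bend decision.
# The part values (25k flat, ~100k fully bent, 47k fixed divider resistor)
# are assumptions for a typical flex sensor, not values from the paper.

VCC = 5.0        # supply voltage (V)
R_FIXED = 47e3   # fixed divider resistor (ohms)

def divider_voltage(r_flex: float) -> float:
    """Voltage at the junction between the fixed resistor and the flex sensor."""
    return VCC * r_flex / (R_FIXED + r_flex)

def is_bent(r_flex: float, r_flat: float = 25e3, r_bent: float = 100e3) -> bool:
    """Treat the finger as bent once the reading passes the halfway voltage
    between the flat and fully-bent readings."""
    midpoint = (divider_voltage(r_flat) + divider_voltage(r_bent)) / 2
    return divider_voltage(r_flex) >= midpoint
```

On the actual glove this arithmetic happens implicitly: the Arduino's ADC digitizes the divider voltage and the sketch compares it against per-finger thresholds; each combination of bent fingers then selects one stored command.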
Wireless serial port module: This module is compatible with the Arduino Uno board and is used for two-way RF communication. It uses AT commands for configuration. The RF band is 2.405-2.485 GHz, and the operating distance can be up to 200 m. It has five pins, namely ground, power supply, transmitter, receiver, and a command pin, as shown in fig 6.
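Since the module behaves as a transparent serial bridge, what travels over it is just the bytes each side writes. A minimal sketch of how the glove's commands might be framed and parsed follows; the "CMD:&lt;n&gt;" framing is a hypothetical choice for illustration (the paper does not specify a wire format), while the message texts are taken from the app screenshots in fig 10.

```python
# Illustrative sketch of command framing over the transparent serial link.
# The "CMD:<n>\n" format is hypothetical, not specified in the paper;
# the wireless serial port module simply forwards the bytes.

GESTURE_COMMANDS = {
    1: "i need food",
    2: "turn on the fan",
    3: "help me cross the road",
    4: "emergency",
}

def frame(gesture_id: int) -> bytes:
    """Encode a gesture number for transmission (glove/Arduino side)."""
    return f"CMD:{gesture_id}\n".encode("ascii")

def parse(line: bytes):
    """Decode one received line back into its command text (Raspberry Pi side).

    Returns None for malformed lines or unknown gesture numbers."""
    text = line.decode("ascii", errors="replace").strip()
    if not text.startswith("CMD:"):
        return None
    try:
        return GESTURE_COMMANDS.get(int(text[4:]))
    except ValueError:
        return None
```

On real hardware, both ends would wrap these functions around a serial port (for example, pyserial on the Raspberry Pi); only the framing logic is shown here.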
IV. DETAILED WORKING
The smart gloves are used by deaf and dumb people to communicate their basic needs. The movement of a finger is converted into a command: the movement is captured by the flex sensor, and based upon the movement the corresponding command is shown. The flex sensors are connected to the Arduino Uno, where the information is stored. GSM and wireless serial port modules are also connected to the Arduino. We have used two wireless serial port modules; one is connected to the Arduino and the other is connected to the Raspberry Pi, as shown in fig 9.

When the flex sensor moves, the corresponding command is captured, and with the help of the LoRa transceiver the command is communicated to the Raspberry Pi. The instructions are displayed on the webpage, and the output is also given as audio. The output is also displayed through a mobile app, as shown in fig 10, so that the attender or the person's helper can be notified as soon as the person makes a movement. The attender can be notified even when he is far away from the disabled person. In case of emergency, a unique and easy movement is made by the disabled person; an alert is then sent in the form of an email and a message to the person's emergency contact and the attender, as shown in fig 11. This makes it easy to keep a check on the disabled person. The disabled person can voice out through these gloves even with a speaking or hearing disability.

Fig 10. Output in Android app (sample messages: "emergency", "turn on the fan", "i need food", "help me cross the road", "hi how are you")
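The emergency path above (one reserved gesture triggering a GSM text message) can be sketched with the standard GSM text-mode SMS sequence. The AT+CMGF and AT+CMGS commands below are the generic GSM modem commands for text-mode SMS; the phone number and message are placeholders, and actually writing the bytes to the GSM module's serial port is left out.

```python
# Illustrative sketch of the emergency alert. AT+CMGF / AT+CMGS is the
# standard GSM text-mode SMS sequence; the number and message here are
# placeholders, and the serial write to the GSM module is omitted.

CTRL_Z = b"\x1a"  # terminates the SMS body in text mode

def sms_command_sequence(number: str, message: str) -> list:
    """Build the ordered byte strings the controller would write to the modem."""
    return [
        b"AT+CMGF=1\r",                    # select SMS text mode
        f'AT+CMGS="{number}"\r'.encode(),  # address the message
        message.encode() + CTRL_Z,         # body, ended with Ctrl+Z
    ]

def alert_needed(command: str) -> bool:
    """The design reserves one easy, unique gesture for emergencies."""
    return command == "emergency"
```

Between each command and the next, firmware would normally wait for the modem's prompt ("OK" or "&gt;") before sending more bytes; that handshaking is omitted from the sketch.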
VII. CONCLUSION
Thus, the proposed model has the advantage of assisting the deaf and dumb as well as paralyzed patients by displaying the output commands in an Android application, with vocal output through a speaker. By using wireless serial port modules, data transmission is fast and secured. In case of emergency, the GSM module sends alert messages to the respective person. Compared to vision-based techniques, these data-glove-based gloves reduce noise disturbance and require less complex algorithms.

In future, the proposed model can be enhanced with a larger number of commands. Using AI, the data-glove-based system can be further enhanced with speech recognition. For home automation, different gestures can be used to control basic functions such as switching home appliances in an effective manner.
References
[1] Bhaskaran, K. Abhijith, Anoop G. Nair, K. Deepak Ram, Krishnan
Ananthanarayanan, and HR Nandi Vardhan. "Smart gloves for hand
gesture recognition: Sign language to speech conversion system." In
2016 International Conference on Robotics and Automation for
Humanitarian Applications (RAHA), pp. 1-6. IEEE, 2016.
[2] Stefanov, D.H., Bien, Z. and Bang, W.C., 2004. The smart house for
older persons and persons with physical disabilities: structure,
technology arrangements, and perspectives. IEEE transactions on neural
systems and rehabilitation engineering, 12(2), pp.228-250.
[3] Kumuda, S. and Mane, P.K., 2020, February. Smart Assistant for Deaf
and Dumb Using Flexible Resistive Sensor: Implemented on LabVIEW
Platform. In 2020 International Conference on Inventive Computation
Technologies (ICICT) (pp. 994-1000). IEEE.
[4] Lakshmi, K.J., Muneshwar, A., Ratnam, A.V. and Kodali, P., 2020, July.
Patient Assistance using Flex Sensor. In 2020 International Conference
on Communication and Signal Processing (ICCSP) (pp. 00181-00185).
IEEE.
[5] Rajamohan, A., Hemavathy, R. and Dhanalakshmi, M., 2013. Deaf-mute
communication interpreter. International Journal of Scientific
Engineering and Technology, 2(5), pp.336-341.
[6] Kasar, M.S., Deshmukh, A. and Ghadage, P., 2016. Smart
Speaking Glove-Virtual tongue for Deaf and Dumb. International
Journal of Advanced Research in Electrical, Electronics and
Instrumentation Engineering, 5(3), p.7.
[7] Khan, M.A.R., Gowtham, B., Saravanan, A.A., Bharathi, R.A. and
Elakya, A., 2019, March. Smart Electric Vehicle. In 2019 5th
International Conference on Advanced Computing & Communication
Systems (ICACCS) (pp. 954-957). IEEE.
[8] Shrote, S.B., Deshpande, M., Deshmukh, P. and Mathapati, S., 2014.
Assistive Translator for Deaf & Dumb People. International Journal of
Electronics Communication and Computer Engineering, 5(4), pp.86-89.
[9] Leninpugalhanthi, P., Janani, R., Nidheesh, S., Mamtha, R.V.,
Keerthana, I. and Kumar, R.S., 2019, March. Power Theft Identification
System Using IoT. In 2019 5th International Conference on Advanced
Computing & Communication Systems (ICACCS) (pp. 825-830). IEEE.
[10] Rohith, H.R., Gowtham, S. and Sharath Chandra, A.S., 2017. Hand
gesture recognition in real time using IR sensor. International Journal of
Pure and Applied Mathematics, April, 14, pp.15-2017.
[11] Lokhande, P., Prajapati, R. and Pansare, S., 2015. Data gloves for sign
language recognition system. International Journal of Computer
Applications, 975, p.8887.