
IoT-Assisted Gesture-to-Audio Conversion System for Enhancing Accessibility in Speech-Impaired Users
2024 IEEE 5th India Council International Subsections Conference (INDISCON) | 979-8-3503-7675-3/24/$31.00 ©2024 IEEE | DOI: 10.1109/INDISCON62179.2024.10744281

Upasana Pandey
Department of CSE (AI/IoT), ABES Institute of Technology, Ghaziabad, India
[email protected]

Vanshika Kumar
Department of CSE (IoT), ABES Institute of Technology, Ghaziabad, India
[email protected]

Shanu Kaushik
Department of CSE (IoT), ABES Institute of Technology, Ghaziabad, India
[email protected]

Meena Kumari
Department of CSE (AI/IoT), ABES Institute of Technology, Ghaziabad, India
[email protected]

Abstract—In the realm of technology, understanding and catering to the needs of diverse user groups through empathy is paramount. People with speech impairments encounter significant communication hurdles daily, often finding it challenging to express their thoughts and emotions effectively. As a solution, this article presents a groundbreaking innovation: IoT-enabled intelligent gloves tailored specifically for individuals with speech impediments. These gloves make use of embedded systems and IoT technology to interpret hand movements and convert them into spoken words or text, thereby facilitating smooth communication. This is achieved through intuitive gesture recognition algorithms and customizable features that cater to individual preferences. By leveraging IoT connectivity, users can seamlessly interact with other smart devices, broadening the range of communication options available to them. This article delves into the technical aspects of the glove's design, encompassing sensor integration, data processing algorithms, and connectivity protocols. Furthermore, it explores the potential impact of such technology on the lives of speech-impaired users, underscoring the significance of empathy in shaping inclusive technological solutions.

Keywords—Smart Glove, Sign Language, Speech Impaired, Gesture-to-Speech Converter, IoT, Embedded System

I. INTRODUCTION

In the world of technological advancement, a significant change is happening - one focused not just on efficiency or practicality but on a deeper comprehension of human needs and experiences. This shift towards empathy in technology is especially noticeable in the creation of solutions targeted at tackling the obstacles faced by those with speech impairments. Despite the progress in assistive technologies, many individuals with speech impairments still confront substantial barriers to effective communication, limiting their ability to express themselves and fully engage in society.

As of its latest update in January 2022, the World Health Organization (WHO) estimates that around 466 million individuals globally live with disabling hearing loss, of whom approximately 34 million are children. In response to this critical issue, a new kind of technology is emerging - one that surpasses mere functionality and embodies empathy and understanding. Leading this movement are IoT-enabled intelligent gloves tailored for users with speech impairments. These gloves signify a merging of cutting-edge technologies - embedded systems, IoT connectivity, and gesture recognition algorithms - to produce a revolutionary communication tool.

The significance of empathy in technology is its capacity to resonate with the real-life experiences of its users - understanding their difficulties, frustrations, and goals. For individuals with speech impairments, the struggle to communicate effectively can result in feelings of seclusion, dependency, and discomfort. Conventional assistive technologies have offered valuable aid, but they frequently fall short of capturing the subtleties of human expression and interaction.

Nevertheless, with IoT-enabled smart gloves, a change in approach is happening. These gloves not only deliver a communication method but also embody a profound understanding of the unique requirements and experiences of speech-impaired individuals. By translating hand movements into spoken words or text, they enable users to express themselves with increased freedom and autonomy, fostering a sense of empowerment and inclusion. In this introduction, we examine the concept of empathy in technology and its relevance to the creation of IoT-enabled smart gloves for individuals with speech impairments. We probe into the technical complexities of these gloves, analyzing how they employ the potential of IoT and embedded systems to enrich communication and encourage independence. Furthermore, we discuss the potential impact of this technology on the lives of individuals with speech impairments, highlighting the transformative potential of empathy-driven design in shaping a more inclusive and accessible future.

II. LITERATURE SURVEY

In Ref [1], hand gestures are detected using sensors worn by the user, and the data is sent to an Arduino microcontroller. The Arduino processes the gesture data and recognizes specific gestures using predefined algorithms or machine learning models. The recognized gestures are transmitted to a platform, which communicates with the robot to execute corresponding actions. The robot receives the gesture commands and adjusts its movements accordingly, allowing users to control it remotely. The system ensures low latency and accurate gesture recognition for seamless control of the robot in IoT environments.

In Ref [2], sensors capture hand gestures performed by the user, and the data are processed by an Arduino microcontroller. The Arduino employs machine learning algorithms to recognize specific gestures from the sensor data. The recognized gestures are transmitted to an IoT gateway or platform, which coordinates with smart home devices. The smart home devices execute actions corresponding to the recognized gestures, enabling users to control them remotely.

In Ref [3], the Arduino analyzes the gesture data and identifies specific gestures using machine learning algorithms trained on predefined gesture patterns. Upon recognizing a gesture, the Arduino sends commands to an IoT gateway or platform, which communicates with smart home devices. The IoT platform executes the corresponding actions based on the recognized gestures, allowing users to control home appliances and systems remotely.

In Ref [4], sensors placed in the environment capture hand gestures, and the data are transmitted to an Arduino microcontroller. The Arduino processes the gesture data and recognizes specific gestures using predefined algorithms. The recognized gestures are transmitted to an IoT platform, such as a cloud server or a local server, via Wi-Fi or Ethernet connections. The IoT platform interprets the gesture data and triggers actions to control smart home devices, such as turning lights on/off, adjusting thermostats, or locking doors.

In Ref [5], sensors placed around the room capture hand gestures performed by the user, and the data is sent to an Arduino microcontroller. The Arduino processes the gesture data in real time and identifies specific gestures using algorithms or machine learning models. Once a gesture is recognized, the Arduino communicates with a Raspberry Pi board, which serves as the central processing unit. The Raspberry Pi processes the gesture information and triggers actions to control appliances connected to it, such as turning on lights or adjusting the thermostat.

In Ref [6], sensors on a glove worn by the user detect hand gestures, and the sensor data is transmitted wirelessly to an Arduino microcontroller for processing. Algorithms or machine learning models running on the Arduino perform gesture recognition. Once the gestures are recognized, they are sent wirelessly to the robot, which performs the corresponding actions.

In Ref [7], a glove equipped with an accelerometer is worn by the user. The accelerometer detects hand movements and transmits the data to an Arduino microcontroller. The Arduino processes the accelerometer data and identifies specific hand gestures using predefined algorithms. Based on the recognized gestures, the Arduino sends commands to control the movements of the robot; these commands may include instructions to move forward, move backward, turn left, or turn right.

In Ref [8], the system consists of sensors such as accelerometers or flex sensors attached to a glove worn by the user. These sensors capture hand movements and transmit the data to an Arduino microcontroller. The Arduino processes the sensor data using gesture recognition algorithms, which analyze the patterns in the sensor data to identify specific hand gestures. Once a gesture is recognized, the Arduino sends corresponding commands to control the robot's movements or actions. These commands may be transmitted wirelessly to the robot using communication modules such as Bluetooth or Wi-Fi.

III. METHODOLOGY

Since there has been little significant development for disabled individuals in this space, we have designed a smart assistance model especially for them. The proposed model is crafted with the help of motion sensors, and the step-by-step instructions are fed into the Arduino Nano board.

The methodology executes real-time gesture-to-audio conversion using the Arduino NANO, the MPU6050 accelerometer/gyroscope, and the HC05 Bluetooth module. The system architecture lays out the hardware connections and the data flow among components.

Integration of the MPU6050 accelerometer/gyroscope with the Arduino NANO is accomplished over the I2C protocol, ensuring precise motion tracking. An advanced gesture recognition algorithm has been crafted, employing signal processing techniques and machine learning algorithms for accurate gesture detection. Real-time audio synthesis is initiated based on recognized gestures, utilizing pulse-width modulation or digital-to-analog conversion techniques. The HC05 Bluetooth module is configured for wireless communication, enabling seamless interaction with external devices. A user-friendly interface is structured for intuitive control and visualization of gestures and audio playback. Extensive testing and validation procedures are conducted to assess the system's performance and efficiency. Optimization techniques are explored to improve power consumption, memory utilization, and processing speed. The paper concludes by discussing future research directions and potential enhancements to the system, underscoring its contribution to the field of gesture-based interaction and audio synthesis technologies.

A. Block Diagram

Fig 1. Transmission section of the IoT-based real-time gesture-to-audio conversion system for speech-impaired people.
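To make the recognition step concrete, the following is a minimal sketch in plain C++ (not the authors' firmware, which is not published) of a threshold-based classifier that maps MPU6050 accelerometer readings, expressed in units of g, onto a small gesture set. The gesture labels, the spoken phrases, and the 0.5 g threshold are illustrative assumptions only.

```cpp
#include <cassert>
#include <string>

// Illustrative gesture labels; the paper does not list its exact gesture set.
enum class Gesture { Neutral, TiltForward, TiltBackward, TiltLeft, TiltRight };

// Classify a hand pose from accelerometer readings in units of g.
// With the palm held flat, gravity appears on the z-axis (az ~ +1 g);
// tilting the hand shifts gravity onto the x- or y-axis.
// The 0.5 g threshold (~30 degrees of tilt) is an assumed tuning value.
Gesture classifyGesture(float ax, float ay, float az) {
    const float kTilt = 0.5f;          // threshold in g
    if (ax >  kTilt) return Gesture::TiltForward;
    if (ax < -kTilt) return Gesture::TiltBackward;
    if (ay >  kTilt) return Gesture::TiltRight;
    if (ay < -kTilt) return Gesture::TiltLeft;
    return Gesture::Neutral;           // hand approximately flat
}

// Map each gesture to the phrase spoken on the paired device
// (hypothetical phrase assignments).
std::string gestureToPhrase(Gesture g) {
    switch (g) {
        case Gesture::TiltForward:  return "I need water";
        case Gesture::TiltBackward: return "I need help";
        case Gesture::TiltLeft:     return "Yes";
        case Gesture::TiltRight:    return "No";
        default:                    return "";
    }
}
```

A firmware version of such a classifier would run inside the Arduino's loop(), feeding each recognized gesture to the audio-synthesis or Bluetooth stage.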

B. Components

Arduino NANO: The Arduino NANO serves as the central processing unit of the system. It is a compact microcontroller board based on the ATmega328P chip, offering a wide range of digital and analog input/output pins for interfacing with sensors and actuators. The Arduino NANO is programmed using the Arduino Integrated Development Environment (IDE), which allows developers to write and easily upload firmware code [9].

Fig 2: Arduino NANO R3

Accelerometer/Gyroscope MPU6050: The MPU6050 is a motion sensor module that measures acceleration and angular velocity in three dimensions. Consisting of a MEMS (Micro-Electro-Mechanical Systems) accelerometer and gyroscope integrated on a single chip, the MPU6050 is versatile in nature. In conjunction with the Arduino NANO, it communicates via the I2C (Inter-Integrated Circuit) protocol, providing the real-time motion data that is paramount for gesture recognition [10].

Wiring and Connectivity: The Arduino NANO is connected to the MPU6050 and the HC05 Bluetooth module using jumper wires. The MPU6050 module connects to the Arduino's I2C pins (SDA and SCL) for data communication, while the HC05 module connects to the Arduino's hardware UART pins (TX and RX) for serial communication. Proper wiring and correct pin configuration ensure reliable connectivity between these components.

Power Supply: The Arduino NANO can be powered via its USB port, which provides both power and communication with the host device. The MPU6050 module may require a separate power source, which can be supplied from the Arduino's 3.3V or 5V pin, depending on the module's voltage requirements.

Fig 5: Power Supply

Mounting and Enclosure: The components are mounted on a prototyping board or breadboard for easy assembly and testing. Depending on the application, an enclosure may be used to protect the components and provide a more aesthetically pleasing design. The enclosure may include cutouts for accessing the Arduino's USB port, the power supply input, and the Bluetooth module's status LEDs.

Fig 3: Accelerometer/Gyroscope MPU6050
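As a concrete note on the sensor interface described above: MPU6050 samples arrive over I2C as pairs of 8-bit registers forming signed 16-bit values, which must be scaled before use. A minimal sketch of that conversion, assuming the sensor's power-on default full-scale ranges (±2 g and ±250 °/s, per the MPU6050 datasheet):

```cpp
#include <cstdint>

// MPU6050 default sensitivities (datasheet): ±2 g      -> 16384 LSB/g,
//                                            ±250 deg/s -> 131 LSB/(deg/s).
// Raw samples are signed 16-bit values read from register pairs over I2C.
constexpr float kAccelLsbPerG  = 16384.0f;
constexpr float kGyroLsbPerDps = 131.0f;

// Combine the high and low register bytes into one signed 16-bit sample.
inline int16_t combineBytes(uint8_t hi, uint8_t lo) {
    return static_cast<int16_t>((static_cast<uint16_t>(hi) << 8) | lo);
}

// Convert a raw accelerometer sample to units of g.
inline float rawToG(int16_t raw)   { return raw / kAccelLsbPerG; }

// Convert a raw gyroscope sample to degrees per second.
inline float rawToDps(int16_t raw) { return raw / kGyroLsbPerDps; }
```

On the Arduino itself, the byte pairs would be fetched with the Wire library (I2C) before being passed to these conversions; the scaled values then feed the gesture-recognition stage.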


Bluetooth Module HC05: The HC05 Bluetooth module provides wireless communication between the Arduino NANO and external devices such as smartphones or computers [11]. It is a versatile Bluetooth transceiver module offering serial communication over Bluetooth. The HC05 module is connected to the Arduino NANO via the UART (Universal Asynchronous Receiver-Transmitter) interface, allowing bidirectional data transmission. With the HC05 module, the Arduino NANO can easily communicate with external devices, extending its capabilities.

Fig 4: Bluetooth Module HC05

Fig 6: Push Button    Fig 7: LED

Overall, the connectivity and component descriptions used in the methodology provide a robust foundation for implementing real-time gesture-to-audio conversion, enabling seamless interaction and communication between the system components.

IV. DETAILED WORKING

The initiation of the Arduino NANO microcontroller involves uploading the firmware code via the Arduino IDE. Following this, the connection to the MPU6050 accelerometer/gyroscope sensor is established through the I2C protocol. This connection allows for the retrieval of real-time motion data. The sensor captures acceleration and angular velocity in three dimensions, which are then processed continuously by the Arduino firmware. An exceptionally

intricate gesture recognition algorithm, integrated into the Arduino firmware, analyzes the sensor data to identify specific hand movements and gestures. This algorithm is essential in interpreting motion patterns and accurately classifying gestures through signal processing techniques and, where applicable, machine learning algorithms. Upon recognition of a gesture, the Arduino initiates relevant actions such as audio signal generation or commands for wireless transmission. The Bluetooth module HC05, linked to the Arduino through UART, provides wireless communication with external devices, as shown in Fig 8.

Upon the identification of a gesture, data packets are transmitted wirelessly by the Arduino to a receiving device, such as a smartphone or computer. The receiving device interprets the data packets and may produce audio output based on the recognized gesture. Throughout this process, proper wiring and connectivity facilitate flawless communication between the Arduino, the sensor, and the Bluetooth module. Thorough testing and validation processes assure the accuracy, responsiveness, and reliability of the system. This comprehensive approach ensures that the project efficiently translates real-time gestures into audio output, delivering a user-friendly and intuitive interaction experience.

V. RESULTS AND DISCUSSION

A. Hardware Setup

Fig 8: Hardware setup for the proposed system

B. Results

The results obtained from the model discussed above demonstrate its effectiveness in real-time gesture recognition and audio conversion. Through rigorous testing and validation procedures, the system consistently achieved high accuracy and reliability in recognizing a wide range of hand movements and gestures. The implemented gesture recognition algorithm showcased robust performance in interpreting motion data captured by the MPU6050 accelerometer/gyroscope sensor, accurately identifying gestures with minimal latency.

The wireless communication facilitated by the HC05 Bluetooth module enabled seamless interaction with external devices, such as smartphones or computers, allowing users to control audio output based on recognized gestures from a distance [12]. The user-friendly interface provided intuitive control and visualization of gestures and audio playback, enhancing the overall user experience. Furthermore, the optimization techniques implemented in the system ensured efficient power consumption, memory utilization, and processing speed, contributing to its practical viability in real-world applications.

Overall, the results validate the model's capability to translate real-time gestures into audio output effectively, offering a versatile and accessible means of human-computer interaction.

Fig 9: Result of the proposed system

C. Future Scope

Looking ahead, this work lays the groundwork for several potential avenues of enhancement. Firstly, the set of recognized gestures could be expanded, allowing for more diverse and nuanced interactions; this could involve refining the existing algorithms or integrating additional sensors to capture finer details of hand movements. Moreover, incorporating machine learning techniques could enable the system to adapt and learn from user interactions over time, improving gesture recognition accuracy and customization. Additionally, exploring alternative communication protocols or integrating with emerging Internet of Things (IoT) platforms could enhance the system's connectivity and interoperability with a wider range of devices and applications. Furthermore, there is scope for refining the audio synthesis capabilities, possibly including the generation of more complex audio signals, integration of voice commands, or even real-time translation of gestures into speech. Enhancements in power efficiency, miniaturization, and wearability could also make the system more practical and accessible for everyday use.

Overall, the project's future scope lies in further innovation and refinement, aiming to push the boundaries of gesture-based interaction and audio synthesis technologies to create more immersive, intuitive, and inclusive user experiences.
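As a concrete sketch of the wireless data-packet step described in Section IV: the paper does not specify its packet layout, so the framing below - a start byte, a gesture ID, a sequence number, and an additive checksum - is assumed purely for illustration.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Hypothetical 4-byte frame for sending a recognized gesture over the
// HC05 serial link: [0xAA start | gesture ID | sequence no. | checksum].
// The paper does not define its packet format; this layout is an assumption.
std::vector<uint8_t> encodeGestureFrame(uint8_t gestureId, uint8_t seq) {
    const uint8_t start = 0xAA;
    uint8_t checksum = static_cast<uint8_t>(start + gestureId + seq);
    return {start, gestureId, seq, checksum};
}

// Receiver-side validation: check the start byte and recompute the
// additive checksum before acting on the gesture.
bool frameIsValid(const std::vector<uint8_t>& f) {
    if (f.size() != 4 || f[0] != 0xAA) return false;
    return static_cast<uint8_t>(f[0] + f[1] + f[2]) == f[3];
}
```

On the Arduino side these four bytes would be written with Serial.write(); the receiving smartphone application validates the checksum before triggering audio playback for the decoded gesture.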

VI. CONCLUSION

The project demonstrates the feasibility and effectiveness of real-time gesture-to-audio conversion using the Arduino NANO, an MPU6050 accelerometer/gyroscope, and an HC05 Bluetooth module. Through meticulous design and implementation, the system displays robust performance in gesture recognition and audio synthesis, offering users a seamless and intuitive interaction experience. The integration of advanced algorithms for gesture recognition, wireless communication, and audio generation ensures accurate and responsive control, enhancing the system's usability and versatility. Extensive testing and validation procedures confirm the system's accuracy, reliability, and efficiency, affirming its practical viability in various applications [13].

Overall, the project underscores the potential of combining hardware and software components to create interactive systems that bridge the gap between human gestures and digital audio output, paving the way for enhanced user experiences. Such models could have a transformative impact on the lives of disabled people worldwide, making daily tasks easier and allowing them to navigate the world with greater freedom and independence.

ACKNOWLEDGMENT

We express our sincere gratitude to the ABES Institute of Technology for the significant support that made this research possible. We also extend our deepest appreciation to Dr. Upasana Pandey for her valuable guidance and insights throughout this project. Moreover, we acknowledge the contributions of Ms. Meena Kumari for her assistance with the "IoT-Assisted Gesture-to-Audio Conversion System for Enhancing Accessibility in Speech-Impaired Users". Finally, we thank all the participants who volunteered their time and expertise for data collection and testing, without whom this research would not have been feasible.

REFERENCES

[1] V. Kumar, S. Sharma, and R. Singh, "IoT-Based Gesture Controlled Robot Using Arduino," in 2023 International Conference on Computing, Communication, and Intelligent Systems (ICCCIS), 2023.
[2] S. Kumar and R. Sharma, "Gesture Recognition Based IoT Home Automation Using Arduino," in 2022 International Conference on Communication, Computing and Electronics Systems (ICCCES), 2022.
[3] P. Gupta and A. K. Yadav, "Hand Gesture Recognition System for IoT-Based Smart Home Automation," in 2022 International Conference on Computer Communication and Informatics (ICCCI), 2022.
[4] S. Kumar and V. Mishra, "IoT-Based Smart Home Automation System Using Gesture Recognition," in 2022 International Conference on Computational Intelligence in Data Science (ICCIDS), 2022.
[5] A. Verma and A. K. Jain, "Real-Time Gesture Recognition System for Controlling Appliances Using Arduino and Raspberry Pi," in 2021 8th International Conference on Signal Processing and Integrated Networks (SPIN), 2021.
[6] N. V. Jithin, S. Sachin, and M. Rajasekar, "Gesture-Based Wireless Control of Robot Using Arduino," in 2021 3rd International Conference on Devices, Circuits and Systems (ICDCS), 2021.
[7] R. Shrivastava, A. Shukla, S. Singh, et al., "Hand Gesture Controlled Robot Using Arduino and Accelerometer," in 2021 8th International Conference on Signal Processing and Integrated Networks (SPIN), 2021.
[8] S. Srinivasan, P. Raghavendra, and S. Senthil Kumar, "Gesture-Based Home Automation System Using Arduino," in 2020 6th International Conference on Advanced Computing and Communication Systems (ICACCS), 2020.
[9] S. Vijay and S. Sriram, "IoT-Based Gesture Recognition System for Smart Home Automation," in 2020 2nd International Conference on Advanced Research in Engineering and Management (ICAREM), 2020.
[10] https://www.google.com/url?sa=i&url=https%3A%2F%2Ffanyv88.com%3A443%2Fhttp%2Fwww.senith.lk%2Fshop%2Fitem%2F48%2Faccelerometer-mpu6050-module&psig=AOvVaw1HvBrzTq8D3zRcJ7zyfAOx&ust=1711642242030000&source=images&cd=vfe&opi=89978449&ved=0CBIQjRxqFwoTCJil26_qlIUDFQAAAAAdAAAAABAG
[11] https://www.google.com/url?sa=i&url=https%3A%2F%2Ffanyv88.com%3A443%2Fhttps%2Fmakerselectronics.com%2Fproduct%2Fbluetooth-module-hc-05&psig=AOvVaw3K0zmeFd52GP3InzccGh0S&ust=171164284599000&source=images&cd=vfe&opi=89978449&ved=0CBIQjRxqFwoTCMDOxs_slIUDFQAAAAAdAAAAABAP
[12] https://www.google.com/url?sa=i&url=https%3A%2F%2Ffanyv88.com%3A443%2Fhttps%2Fwww.amazon.in%2F3-7V-Lithium-Battery-Versatile-Projects%2Fdp%2FB0CMHS8DSG&psig=AOvVaw0okfHp0nZUA66nBIgaumXx&ust=1711655745196000&source=images&cd=vfe&opi=89978449&ved=0CBIQjRxqFwoTCMjX_NeclYUDFQAAAAAdAAAAABAX
[13] https://imgaz.staticbg.com/images/oaupload/ser1/banggood/images/66/57/7df8b039-8b95-47b9-b273-5f03434989b4.JPG; https://tse1.mm.bing.net/th?id=OIP.yjew9Hle3jGpUyFiyGADHgHaHa&pid=Api&P=0&h=180

