Voice Assisted Real-Time Object Detection
ZAEEM NAZIR
Ph.D. Scholar, Department of Computer Science, Superior University, Lahore, Pakistan.
Email: [email protected]
MUHAMMAD WASEEM IQBAL
Ph.D., Associate Professor Department of Software Engineering, Superior University, Lahore, Pakistan.
Email: [email protected]
KHALID HAMID
Ph.D. Scholar, Department of Computer Science, Superior University, Lahore, Pakistan and Lecturer at
NCBA & E University East Canal Campus Lahore. Corresponding Author Email: [email protected]
HAFIZ ABDUL BASIT MUHAMMAD
Ph.D. Scholar, Department of Computer Science, Superior University, Lahore, Pakistan and Lecturer at
Minhaj University Lahore. Email: [email protected]
M. ASHRAF NAZIR
Ph.D. Scholar, Department of Computer Science, Superior University, Lahore, Pakistan and Lecturer at GC
University, Lahore. Email: [email protected]
QURRA-TUL-ANN
Government College University, Lahore, Pakistan. Email: [email protected]
NAZIM HUSSAIN
Lecturer, Department of Computer Science, Government College University, Lahore, Pakistan.
Email: [email protected]
Abstract
Visual impairment is a problem that is steadily getting worse everywhere. The World Health Organization estimates that 284 million individuals worldwide suffer from near or distance vision impairment. The goal of the proposed work is to create an Android application for blind persons that works with a smartphone and a white cane. The primary distinction between the proposed system and existing ones is the use of the state-of-the-art "You Only Look Once: Unified, Real-Time Object Detection" (YOLO) approach; compared to other algorithms, YOLOv4-tiny performs roughly twice as fast. To recognize the objects in front of a visually impaired person in real time, the YOLOv4-tiny algorithm was trained on both a custom dataset and the COCO dataset. The system then determines how far each object is from the individual and produces an audio output. The camera is initialized using the OpenCV library; it then starts capturing frames and feeding them to the system, and the project is implemented in Python 3. The system employs the YOLOv4-tiny algorithm, trained on both the custom dataset and the COCO dataset, to recognize the objects in front of the user and gauge their distance. Text-to-speech conversion then turns the detected objects into an audio segment. The system outputs an audio segment that tells the user the name of each object and its distance from them, so the user can visualize the objects around them. The proposed method also shields the user from colliding with nearby objects, keeping them safe from harm. An Android-based application that represents the full system is available.
The user of the Android application can choose between an internal camera and an external camera. The internal camera is the one built into the Android phone, while the external camera is an ESP32-CAM attached to the white cane. Objects are detected from real-time video captured by the phone camera or the ESP32-CAM. When the user decides to begin object detection, he opens the camera and feeds the real-time footage to the system. The YOLOv4-tiny model processes each frame, detecting objects and calculating their distances from the user. The audio subsystem then converts each object's label and distance into audio, which the user hears through the smartphone's speaker.
Keywords: Machine Learning, YOLOv4-Tiny, Visual Impairment, Real-Time Object Detection, Data Mining, Hypotenuses, STEM-Based Smartphone
1. INTRODUCTION
Information technology has become one of the most advanced and rapidly evolving technologies for improving individuals' lifestyles. Assistive technology has the potential to enhance quality of life for everyone, and especially for people with visual impairment. IT-based assistive technology offers a very broad range of options for providing computer interaction to blind people [1]. There are millions of visually impaired people worldwide; the majority are over 50 years of age and live in deprived countries. In today's technological society, many blind people manage their daily lives with the help of non-digital aids such as white canes, tactile tiles, and other assistive tools. Still, they face many hurdles and obstacles in unknown environments, such as signs and landmarks, because of their visual impairment. Several navigation-assistive and mobile-device technologies exist for visually impaired people [2]. As technology has evolved from computers to tablets and smartphones, developers have created smartphone applications and software that make daily tasks easier. Visually impaired people face far harder barriers outdoors than indoors, so it is very important to develop portable devices that are also cost-effective. Modern smartphones are usually touch-screen, which poses challenges, especially for students and youngsters with visual impairment [3]. Mobile computing devices provide a flexible and standard platform that people of all age groups can use for communication and mobility. Recently, several customized mobile phone applications have been developed to make the lives of visually impaired people better, and smartphones have become pioneers of this evolving technology, offering an opportunity to enhance quality of life. They provide a user-friendly interface and integrate several wayfinding capabilities, such as navigation-assistance modules, virtual audio displays, electronic travel aids, and text-to-speech applications, to present a variety of possibilities for blind people [4]. The use of mobile phones for education is becoming a more common need, and learnability, intelligibility, and performance are the main quality attributes that meet user criteria. Touchscreen-based smartphones help visually impaired (VI) users perform different tasks easily, for example making calls, taking pictures, giving voice input, listening, using text-to-speech applications, and inverting screen colors [5]. Visually impaired students need special self-supporting devices and assistive technology to perform better in studies and laboratory work. There are many learning and STEM-based smartphone applications designed for such students [6].
2. LITERATURE REVIEW
Kuriakose and his co-workers present a review of different articles and investigate the smartphone sensory systems and assistive travel aids that are helpful for blind and visually impaired (BVI) people. They describe general information on assistive travel aids, including electronic orientation aids, radio-frequency identification (RFID) technology, and position locator devices, as well as common sensing devices such as depth cameras and modern smartphone cameras, and they include a section on assistive navigation systems that classifies optical navigation and the methods used to establish verbal communication with users. They conclude by summarizing the applications that can benefit BVI people [7]. One study developed the titration ColorCam (TCC) Android application, implemented in Java, for color-blind and visually impaired people. The app offers a list of indicator names and titration types, helps detect color and saturation from HSV data, and provides useful data analysis. The methodology effectively raised students' interest in science, and the results reveal the application's potential as a user-friendly mobile app; in the future it could enable blind students to take part in interesting and challenging activities [6]. Similarly, another paper assesses reasonably priced and cost-effective electronic travel aid (ETA) systems on smartphones that can provide advanced accessible information for indoor and outdoor navigation. Scientific achievements were studied using the PRISMA methodology. The meta-analysis shows that ETA prototypes offer better navigation and orientation assistance, with limited attention to touch interfaces and computer vision methods, and no single existing navigation solution was evaluated [4]. In another study, the authors designed an assistive depth-camera function on an Android smartphone that identifies objects so blind and visually impaired people can avoid obstacles. The 3D depth camera can calculate distance, integrate object and obstacle detection, and support interaction through smartphone features such as voice commands and gestures. The results show that the application successfully provides flexible and portable assistive navigation for blind and visually impaired people [8]. Martiniello et al. studied mainstream devices as replacements for traditional aids for visually impaired people. Smartphones and tablets are recent technologies that also help blind people in their activities. An online survey was conducted with a teenage group that had used such technology for more than three months. Most participants preferred these beneficial and supportive devices over traditional tools, since the technologies enable more efficient navigation and fulfill many requirements of blind people, such as audiobooks, object identification, memo recording, and color identification. The study shows that mainstream devices and smartphones are becoming more useful and are being widely adopted by visually impaired people [9].
Another study examines the issues and challenges that blind people face with smartphone-based assistive technologies when performing daily-life activities such as identifying currency notes, recognizing objects of interest, and many others. This paper provides a comprehensive overview of the need for advancement in
the latest assistive technology. Smartphone-based technology needs room for improvement in orientation and cognitive mapping in complex environments, in vision-based substitution, and in sensor-based solutions, all delivered in a cost-effective manner; it also needs to provide better accessibility to media and non-visual information [10]. In another study, Abraham and his co-workers explored the use of smartphones and examined the limited resources available to visually impaired and blind people. Smartphones have the potential to serve as assistive devices for such people. The study individually interviewed 166 candidates from different centers: 53.1% of participants used only a basic phone, while 46.9% used a smartphone, mostly for social media and web browsing, with image and color description being the most frequently used functions. Unfortunately, 90% of the participants were unaware of the assistive capabilities already available on smartphones [11]. Similarly, Nimmolrat et al. produced a pharmaceutical mobile application for visually impaired people. The application contains five necessary functions based on the design and development of smart technology and potentially supports visually impaired people by enhancing the efficiency of health care. Even with a well-designed and well-developed application, there is still room for improvement in health-care services [12]. Another study developed a mobile application for visually impaired people, and for some children challenged from birth, to communicate better with others. The work builds the system as an Android application, with the user-interface design created in Axure RP 9; some existing technologies and smartphone applications were reused so that blind people can communicate with their families more conveniently [13]. Another paper focuses on the usability evaluation of mobile applications for blind people. It presents a systematic literature review (SLR) of usability-evaluation approaches for mobile applications developed for persons with visual impairment. Besides modified and general usability-evaluation methods, the study focuses on the factors influencing satisfaction, simplicity, and efficiency, which are the most important attributes and are particularly helpful for visually impaired people. The analysis confirms that the usability dimensions of efficiency, effectiveness, and satisfaction provide an important basis for evaluating mobile usability for impaired and blind people [14].
Another study investigated mobile applications adopted by blind people based on the unified theory of acceptance and use of technology (UTAUT) model. Through an online survey, social influence, facilitating conditions, performance expectancy, and self-efficacy were examined across different age groups. The qualitative data analysis shows the accessibility of mobile phones for blind people, and the results have practical implications for mobile-app development and day-to-day activities; older participants were less affected by social influence. The study shows that the qualitative findings provide practical guidance for the design of mobile apps for persons with visual impairment [15]. Another paper describes the user-centered design of a mobile app that provides appropriate information about drugs and medicine to visually impaired people. The app was developed in four phases: identification, requirements, interface design, and usability. The results reveal the incorporation problems
identified during the usability test. Overall, the Farmaceutic App can deliver medical information efficiently and improve adherence therapy (AT) for visually impaired people [16]. Another paper addresses public transport systems through RF communication based on MOVIDIS and embedded systems, so that visually impaired people can use them more safely and efficiently. These systems assist people with visual disabilities as they move through public transport autonomously. The hardware comprises ATmega328P microcontrollers, HC-12 serial-communication modules, and TI CC1101 RF transceivers. With the help of radio frequency, the various modules communicate with each other, allowing users to interact successfully and making their lives easier [17].
Real and Araujo examined comprehensive perspectives on the multidisciplinary development of aids for blind people. An advanced, up-to-date analysis was presented with the help of previous reviews. The article covers recent navigation systems based on artificial vision and sensory-substitution devices. The study reveals the current technological gains toward self-sufficiency for BVI people, and the investigation discusses major flaws: the design of the navigation systems, the diversity of assistive technology, and sensory substitution that is insufficient to provide enough data to users [18]. Botzer and his co-workers developed a system that conveys distance to blind people through sounds of different frequencies and analyzed it with the Hebb-Williams maze. Eight blind people were selected for the experiment, and they completed the last three of five trials faster. The system showed the potential to assist blind people efficiently and was investigated for further navigation use [19]. Lin et al. proposed a simple smartphone-based guiding system to address the navigation problems of visually impaired people. This improved system enables blind people to travel safely and more conveniently. A combination of an image-recognition system and a smartphone application was used to produce a simple and efficient assistive system for blind people, with operating modes chosen according to network availability. When the system is operating, the smartphone sends the captured image to a server that uses a ConvNet/CNN algorithm to recognize obstacles in the image and then sends back the results exceeding 60%, which is sufficient to assist visually impaired people [20]. A case study approached different user-friendly techniques to improve the ongoing experience, associating a unique communication experience with visually impaired people using camera applications. A usability analysis was performed with a number of participants using the retrospective think-aloud method, and their performance was tested. Different assistive-technology designs and related smartphone camera applications were analyzed to develop and help visually impaired people [21]. Another study attempts to develop accessible mobile-application design guidelines for visually impaired people. The authors gave a few participants a specific task in order to observe how visually impaired people operate mobile applications. Interview analysis was performed to find accessibility problems in typing and voice-over functions, and a heuristic walkthrough was then adopted to improve current conditions for disabled people and to derive accessibility guidelines for future mobile applications [22]. Another study concerns the development of mobile phones and other handheld devices accessible through different sensory channels
such as audio and touch to facilitate visually impaired people. Advanced and improved mobile assistive technology needs a successful association with computer science to realize the potential applications of such technology. The paper works toward more effective smartphone accessibility for people with visual impairment, and the results discuss distinct assistive applications designed on mainstream devices that can be used in everyday surroundings [23, 24].
3. METHODOLOGY
The proposed system is delivered as an Android application that detects different objects in real time.
3.1 System Working
In the Android application, the user has two camera options: an internal camera and an external camera. The internal camera is the one built into the Android phone, and the external camera is the ESP32-CAM mounted on the white cane. Object detection is performed on real-time video taken from either the phone camera or the ESP32-CAM. When a visually impaired person decides to begin object detection, he opens the camera of the Android phone or the ESP32-CAM and feeds the real-time video to the YOLOv4-tiny model. Detection is carried out by the YOLOv4-tiny algorithm, which is trained on both the custom and COCO datasets. After this, the distance is calculated using the distance formula, and the audio subsystem converts the object's label and distance into audio, which is played on the Android phone speaker as output for the visually impaired user (Fig. 1).
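As a rough illustration of this pipeline, the sketch below shows how frames could be read with OpenCV and passed through a YOLOv4-tiny network loaded via OpenCV's DNN module. It is a minimal sketch, not the authors' code: the file names (yolov4-tiny-custom.cfg, yolov4-tiny-custom.weights, classes.names) and the camera index are assumed placeholders.

```python
# Minimal sketch of the detection loop described above (not the authors' exact code).
# Assumes a YOLOv4-tiny config/weights pair and a class-name file are available locally.
import cv2

CFG = "yolov4-tiny-custom.cfg"          # assumed path to the trained network config
WEIGHTS = "yolov4-tiny-custom.weights"  # assumed path to the trained weights
NAMES = "classes.names"                 # assumed list of class labels, one per line

with open(NAMES) as f:
    class_names = [line.strip() for line in f if line.strip()]

net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

cap = cv2.VideoCapture(0)  # 0 = internal phone/webcam feed; an IP-stream URL selects the ESP32-CAM
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    class_ids, scores, boxes = model.detect(frame, confThreshold=0.5, nmsThreshold=0.4)
    for cid, score, box in zip(class_ids, scores, boxes):
        label = class_names[int(cid)]
        # The label and bounding box are handed to the distance and audio stages below.
        print(label, float(score), box)
cap.release()
```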
Custom and MS-COCO (Microsoft Common Objects in Context) image datasets are used [25]. Data acquired from the MS-COCO image dataset can be used to identify objects. The dataset contains about 330,000 images, of which approximately 200,000 are labeled, divided equally between training and testing; it has 80 classes and 640x480 image quality [26]. For the custom dataset, the images are taken from the Kaggle repository and the Chrome Web Store, giving 4,500 images for object detection, with 500 images collected for each object. The custom classes for object detection are bed, chair, couch/sofa, table, wall, door, machine, stairs, and person. Images collected from Kaggle are already arranged, but images collected from the Chrome Web Store are not, so they must first be arranged in order; this is done with a Python script. The images are then labeled using the labelImg tool: a bounding box is drawn around each object so that the needed area of the image is selected. The annotations are saved in YOLO format in the selected directory, producing two types of files: the annotated image file and a text file [27-42] (Fig. 2).
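For illustration, the sketch below shows one way the "arranging" step for the Chrome Web Store images could be scripted in Python; the folder layout (one sub-folder per custom class) and the file-naming pattern are assumptions, not details taken from the paper. After labelling in labelImg with the YOLO format selected, each image gets a companion .txt file containing one "class_id x_center y_center width height" line per bounding box, with coordinates normalized to the image size.

```python
# Illustrative sketch of arranging downloaded images into a sequential, per-class
# order before labelling them in labelImg. The raw_images/<class>/ layout is an
# assumption for demonstration only.
import os

RAW_DIR = "raw_images"  # assumed: one sub-folder per custom class (bed, chair, ...)

for class_name in sorted(os.listdir(RAW_DIR)):
    class_dir = os.path.join(RAW_DIR, class_name)
    if not os.path.isdir(class_dir):
        continue
    images = sorted(f for f in os.listdir(class_dir)
                    if f.lower().endswith((".jpg", ".jpeg", ".png")))
    for i, old_name in enumerate(images, start=1):
        ext = os.path.splitext(old_name)[1].lower()
        new_name = f"{class_name}_{i:04d}{ext}"   # e.g. chair_0001.jpg
        os.rename(os.path.join(class_dir, old_name),
                  os.path.join(class_dir, new_name))
```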
Figure 4: Image passes through a lens; it appears along with the relevant angles
The line "a" shows the exact distance of the object from the convex lens, while "b" describes where the real image appears. Imagine a triangle with a baseline of "a" on the left edge of the image (the newly refracted image), and then construct an identical triangle on the right side. The new base of the reversed triangle maintains the same vertical angle. Furthermore, if we examine the two triangles from the right side, we can see that "a" and "b" are identical and that the angles formed on each side of both vertices are acute with respect to one another. This leads us to believe that the two triangles at the upper edge are similar. Since they are similar, the ratio of their adjacent sides is always the same, so a/b = X/Y. If we compare the two triangles on the right edge of the image, one has a right angle (90°) while the other two angles are equal and opposite. As a result, X and Y are hypotenuses of similar right triangles. In this case, the relation can be expressed as:
a/b = X/Y = c/(b - c)     (1)
Rearranging this equation, we find:
1/c = 1/a + 1/b     (2)
and finally arrive at
Distance = c + Z/k     (3)
using the equations above, where c is the focal length, commonly known as the arc length:
c = (180 x 3.14 x 2) / 360     (4)
With this distance calculation, the final result is obtained in meters. The distance is calculated using the following formula [47]:
Distance = ((180 x 3.14 x 2) / (width + height x 360) x 1000 + 3) / 3.937     (5)
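A small sketch of how equation (5) could be applied to a detected bounding box is given below. The parenthesization follows the reconstruction above and should be read as an assumption; width and height are the bounding-box dimensions in pixels.

```python
# Sketch of the distance estimate in equation (5), applied to a detected bounding box.
# The grouping of operations is an assumption based on the formula as written above.
def estimate_distance_m(box_width: float, box_height: float) -> float:
    """Distance estimate per equation (5); inputs are bounding-box pixels."""
    raw = (180 * 3.14 * 2) / (box_width + box_height * 360) * 1000 + 3
    return raw / 3.937  # final scaling used by the paper to express the result in meters

# Example: a 120 x 200 px bounding box
print(round(estimate_distance_m(120, 200), 2))
```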
3.2.4 Audio Output:
In this project, the Google Text-to-Speech (gTTS) library is used to generate audio output for the visually impaired user. The library converts the text into speech and saves it as an MP3
file. The audio segment is the output of our system; it gives the name of the detected object as well as its distance from the person. This audio output helps visually impaired people understand their surrounding environment [48-51].
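A minimal sketch of this text-to-speech step with gTTS is shown below. The playsound call used for playback here is an illustrative assumption; in the Android application the saved MP3 is played through the phone speaker.

```python
# Minimal sketch of the text-to-speech step using gTTS (as named in the paper).
from gtts import gTTS
from playsound import playsound  # assumed helper for desktop playback only

def announce(label: str, distance_m: float) -> None:
    message = f"{label} detected at {distance_m:.1f} meters"
    tts = gTTS(text=message, lang="en")
    tts.save("announcement.mp3")   # gTTS writes the synthesized speech to an MP3 file
    playsound("announcement.mp3")  # play it back for the user

announce("chair", 1.4)
```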
3.2.5 ESP32-CAM Module
An ESP32-CAM mounted on the white cane is also used for the object detection process. In the proposed system, the model gets its feed from the camera via a web server running on the ESP32-CAM. When the ESP32-CAM is powered on, it searches for a Wi-Fi network named dlink-51C4. After the password is entered, the Android application obtains an IP address and connects to the ESP32-CAM. The application then takes the video stream from the ESP32-CAM and feeds it to the model through that IP address.
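As an illustration, the sketch below reads the ESP32-CAM video stream with OpenCV. The stream URL follows the common ESP32 CameraWebServer layout and the IP address is a placeholder; both are assumptions rather than values reported in the paper.

```python
# Sketch of pulling frames from the ESP32-CAM web server with OpenCV.
import cv2

STREAM_URL = "http://192.168.1.50:81/stream"  # assumed ESP32-CAM MJPEG endpoint (placeholder IP)

cap = cv2.VideoCapture(STREAM_URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # 'frame' would be passed to the same YOLOv4-tiny detection loop used for the
    # internal phone camera (see the sketch in Section 3.1).
    cv2.imshow("esp32-cam", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```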
Figure 6: Output of the camera detecting objects in front of the user and measuring their distance in meters
References
[1] Hakobyan, L., Lumsden, J., O’Sullivan, D., & Bartlett, H. (2013). Mobile assistive technologies for the
visually impaired. Survey of ophthalmology, 58(6), 513-528.
[2] Ahmetovic, D., Gleason, C., Kitani, K. M., Takagi, H., & Asakawa, C. (2016, April). NavCog: turn-by-turn
smartphone navigation assistant for people with visual impairments or blindness. In Proceedings of the
13th International Web for All Conference (pp. 1-2).
[3] Griffin-Shirley, N., Banda, D. R., Ajuwon, P. M., Cheon, J., Lee, J., Park, H. R., & Lyngdoh, S. N. (2017).
A survey on the use of mobile applications for people who are visually impaired. Journal of Visual
Impairment & Blindness, 111(4), 307-323.
[4] Budrionis, A., Plikynas, D., Daniušis, P., & Indrulionis, A. (2022). Smartphone-based computer vision
travelling aids for blind and visually impaired individuals: A systematic review. Assistive
Technology, 34(2), 178-194.
[5] Huang, H. (2018). Blind users’ expectations of touch interfaces: factors affecting interface accessibility
of touchscreen-based smartphones for people with moderate visual impairment. Universal Access in the
Information Society, 17(2), 291-304.
[6] Bandyopadhyay, S., & Rathod, B. B. (2017). The sound and feel of titrations: A smartphone aid for color-
blind and visually impaired students.
[7] Kuriakose, B., Shrestha, R., & Sandnes, F. E. (2022). Tools and technologies for blind and visually
impaired navigation support: a review. IETE Technical Review, 39(1), 3-18.
[8] See, A. R., Sasing, B. G., & Advincula, W. D. (2022). A Smartphone-Based Mobility Assistant Using
Depth Imaging for Visually Impaired and Blind. Applied Sciences, 12(6), 2802.
[9] Martiniello, N., Eisenbarth, W., Lehane, C., Johnson, A., & Wittich, W. (2022). Exploring the use of
smartphones and tablets among people with visual impairments: Are mainstream devices replacing the
use of traditional visual aids?. Assistive Technology, 34(1), 34-45.
[10] Khan, A., & Khusro, S. (2021). An insight into smartphone-based assistive solutions for visually
impaired and blind people: issues, challenges and opportunities. Universal Access in the Information
Society, 20(2), 265-298.
[11] Abraham, C. H., Boadi-Kusi, B., Morny, E. K. A., & Agyekum, P. (2021). Smartphone usage among
people living with severe visual impairment and blindness. Assistive Technology, 1-8.
[12] Nimmolrat, A., Khuwuthyakorn, P., Wientong, P., & Thinnukool, O. (2021). Pharmaceutical mobile
application for visually-impaired people in Thailand: development and implementation. BMC medical
informatics and decision making, 21(1), 1-19.
[13] Warnars, H. L., Nicholas, N., Raihan, M., Ramadhan, A., Mantoro, T., & Wan Adnan, W. A. (2021).
Mobile application for the blind and their family. TEM J, 10, 1039-1044.
[14] Hussain, A., & Omar, A. M. (2020). Usability evaluation model for mobile visually impaired applications.
[15] Moon, H., Cheon, J., Lee, J., Banda, D. R., Griffin-Shirley, N., & Ajuwon, P. M. (2020). Factors
influencing the intention of persons with visual impairment to adopt mobile applications based on the
UTAUT model. Universal Access in the Information Society, 1-15.
[16] Madrigal-Cadavid, J., Amariles, P., Pino-Marín, D., Granados, J., & Giraldo, N. (2020). Design and
development of a mobile app of drug information for people with visual impairment. Research in Social
and Administrative Pharmacy, 16(1), 62-67.
[17] Sáez, Y., Muñoz, J., Canto, F., García, A., & Montes, H. (2019). Assisting visually impaired people in
the public transport system through RF-communication and embedded systems. Sensors, 19(6), 1282.
[18] Real, S., & Araujo, A. (2019). Navigation systems for the blind and visually impaired: Past work,
challenges, and open problems. Sensors, 19(15), 3404.
[19] Botzer, A., Shvalb, N., & Ben Moshe, B. (2018, September). Using sound feedback to help blind people navigate. In Proceedings of the 36th European Conference on Cognitive Ergonomics (pp. 1-3).
[20] Lin, B. S., Lee, C. C., & Chiang, P. Y. (2017). Simple smartphone-based guiding system for visually
impaired people. Sensors, 17(6), 1371.
[21] Hamid, K., Muhammad, H., Iqbal, M. W., Bukhari, S., Nazir, A., & Bhatti, S. (2022). ML-based usability evaluation of educational mobile apps for grown-ups and adults. Jilin Daxue Xuebao (Gongxueban)/Journal of Jilin University (Engineering and Technology Edition), 41, 352-370. doi: 10.17605/OSF.IO/YJ2E5.
[22] Hussain, D., Rafiq, S., Haseeb, U., Iqbal, M. W., Hamid, K., Bhatti, S. U., & Aqeel, M. (2022). HCI empowered automobiles performance by reducing carbon-monoxide. Jilin Daxue Xuebao (Gongxueban)/Journal of Jilin University (Engineering and Technology Edition), 41(12), 48-74.
[23] Yousaf, A., Iqbal, M. W., Arif, M., Jaffar, A., Brezulianu, A., & Geman, O. (2022). Adoption of conceptual model for smartphones among older people. Applied Sciences, 12(24), 1-14.
[24] Kahraman, M., & Turhan, C. (2022). An intelligent indoor guidance and navigation system for the
visually impaired. Assistive Technology, 34(4), 478-486.
[25] Khalid, R., Iqbal, M. W., Samad, N., Ishfaq, M., Rashed, R., & Rafiq, S. (2022). Traffic light issues for visually impaired people. Jilin Daxue Xuebao (Gongxueban)/Journal of Jilin University (Engineering and Technology Edition), 41(12), 371-391.
[26] Naganjaneyulu Satuluri, G. A., Mounika, V., & Shalini, R. (2022). Real-Time Object Detection Using
Various Yolo Algorithms with Audio Feedback. Mathematical Statistician and Engineering Applications,
71(3), 774-783.
[27] “Kaggle,” [Online]. Available: https://www.kaggle.com/competitions/day-3-kaggle-competition.