
Proceedings of the 7th International Conference on Inventive Computation Technologies (ICICT 2024)

Xplore Part Number: CFP24F70-ART; ISBN: 979-8-3503-5929-9

Mouse Cursor Control with Eye Gestures


1st A. Sree Rama Chandra Murthy
Department of Computer Science and Engineering
Lakireddy Bali Reddy College of Engineering (Autonomous)
Mylavaram, India
[email protected]

2nd Daggula Anitha
Department of Computer Science and Engineering
Lakireddy Bali Reddy College of Engineering (Autonomous)
Mylavaram, India
anithareddy1215@gmail.com

3rd Narra Manidhar Chowdary
Department of Computer Science and Engineering
Lakireddy Bali Reddy College of Engineering (Autonomous)
Mylavaram, India
[email protected]

4th Srinivasan Siddhartha Sai Kumar
Department of Computer Science and Engineering
Lakireddy Bali Reddy College of Engineering (Autonomous)
Mylavaram, India
[email protected]

2024 International Conference on Inventive Computation Technologies (ICICT) | 979-8-3503-5929-9/24/$31.00 ©2024 IEEE | DOI: 10.1109/ICICT60155.2024.10544549

Abstract—It can be quite difficult for a person with physical disabilities to control the mouse. This research study suggests employing eye movements to control the mouse pointer, providing a solution for individuals who are unable to utilize a physical mouse. An alternate method of utilizing a computer is eye gazing, which uses eye movements to operate the mouse. Eye gazing lets a person control their computer with their eyes if they find touchscreens and mice difficult. Eye movement is an essential real-time input method for human-computer interaction, and it is crucial for those with physical disabilities. This system proposes an eye control method that uses a webcam and requires no additional hardware, in order to increase the eye tracking technique's mobility, usability, and reliability in user-computer interaction. The main goal of the suggested system is to offer an easy-to-use interactive mode that requires only the user's eyes. The suggested system's usage flow is designed to closely mimic people's regular routines. The suggested method explains how to use a webcam and Python to detect the iris and move the pointer on the monitor according to the iris position.

Keywords—Eye gaze, Mouse cursor control, Assistive technology, Eye tracking, Human-computer communication, Accessibility, HCI, User-computer interaction.

I. INTRODUCTION

In an era of rapid technological advancement, the imperative to ensure universal access to the benefits of computer technology becomes increasingly evident. Unfortunately, the standard input devices of mice and keyboards pose significant challenges for individuals with physical disabilities, limiting their ability to engage with computers effectively. This project responds to the pressing need for alternative communication and interaction methods, particularly tailored to the diverse needs of disabled individuals, thereby fostering their inclusion in the digital landscape. By focusing on those with limited hand mobility, the project seeks to pioneer a system that enables computer interaction solely through eye movements, providing greater independence. The approach of replacing traditional input devices with eye movements as the primary means of control addresses the unique challenges faced by paralyzed or physically challenged individuals, aiming to make computer usage not only effective but also universally accessible.

Computers are easily accessible to anyone who can use their hands to control a mouse and keyboard and see the content on the monitor. Nowadays, numerous approaches allow even blind people to use computers via text-to-speech technology, which pronounces the words and content displayed on the screen. Nonetheless, people without the use of their hands face difficulties in computer use because they cannot operate the mouse. There are numerous approaches that employ cameras to capture faces and use them for a variety of purposes; however, there is no proper framework for assisting physically challenged people. People with disabilities and limited movement, particularly those who can only move their eyes, would benefit from a system that allows them to operate the mouse cursor using their gaze. Furthermore, persons without disabilities may also want such a system. The overarching goal of scientific visualization, virtual reality, and multimedia technology is to determine the most effective way for people to interact with computer systems. Initially, the camera captures an image, and OpenCV code detects the pupils, focusing on the eye region of the image. This mechanism establishes the center position of the human eye, specifically the pupil. Eye tracking technology, which uses an eye tracker to record eye movement and location, is becoming increasingly important in the fields of psychology [1], advertising [2], and graphical user interfaces [3]. Eye trackers have been around for a while, but they were mostly employed in laboratory tests to examine the nature of human eye movements, rather than as an actual control medium within a human-computer interface (HCI) [2]. The eyes are the window to the soul: eye motions give a rich and useful window into a person's thoughts and intentions, hence investigating eye movement has become a popular and challenging topic of Human-Computer Interaction (HCI) research. Eye-movement detection research encompasses eye tracking, gazing, blinking, and pupil movements; these research areas frequently overlap or are used interchangeably [3]. Eye trackers have a longstanding presence in research laboratories, where their primary application has been to study human eye movements.

979-8-3503-5929-9/24/$31.00 ©2024 IEEE 980


Authorized licensed use limited to: Dr. D. Y. Patil Educational Complex Akurdi. Downloaded on August 07,2024 at 04:52:29 UTC from IEEE Xplore. Restrictions apply.

However, they have historically been less utilized as an active control medium in human-computer interfaces (HCI) [4]. The system described here helps people who do not have the use of their hands to work with the mouse using their eyes.

II. LITERATURE SURVEY

The system designed by Michelle Alva, Neil Castellino, Rashmi Deshpande, Kavita Sonawane, and Monalisa Lopes utilizes Haar cascade classifiers for face detection and the Circular Hough Transform algorithm to precisely track the pupil via a standard webcam. Image selection is determined by gaze duration, confirmed through pupil tracking, triggering notifications for the aide upon patient confirmation. This non-intrusive, eye-based interface serves paralyzed individuals, streamlining communication. However, effectiveness may be compromised for those with impaired eye control, and webcam calibration and lighting conditions impact face and pupil detection accuracy, warranting attention to these factors for optimal system performance.[1]

The eye control system developed by Xuebai Zhang, Xiaolong Liu, Shyan-Ming Yuan, and Shu-Fan Lin seamlessly integrates mouse and keyboard functions, prioritizing simplicity and user convenience. Leveraging natural eye movements, the system includes a precision-enhancing magnifier module. Through interactive tasks such as article searches and multimedia web browsing, it outperforms existing systems. Perceived effectiveness is gauged using the Technology Acceptance Model, affirming the system's efficiency. However, user adaptation may pose initial challenges, impacting immediate acceptance, and the integration of mouse and keyboard functions, while beneficial, may introduce technical complexities to the system. Overall, the results underscore high usability and interface effectiveness.[2]

T. Soukupová and Jan Čech's real-time eye blink detection system makes use of landmark detectors trained on a variety of datasets, ensuring resistance to changes in head orientation, illumination, and facial expressions. Accurate landmark positions are obtained, and the eye aspect ratio (EAR) is used as a single measure that describes the eye opening in every frame. A Support Vector Machine (SVM) classifier identifies blink patterns within short temporal windows of EAR values, surpassing conventional techniques on standard datasets. Despite its effectiveness, challenges may arise in extreme lighting or occluded scenarios, affecting landmark detection accuracy. The reliance on SVM classifiers may also pose limitations in handling complex non-linear relationships, necessitating careful optimization for optimal performance.[3]

The system of Chairat Kraichan and Suree Pumrin integrates Haar-like-feature-based eye and face detection with Bilateral Total Variation super-resolution, enhancing low-resolution webcam images. Centering on eye-pair detection, it translates eye position to mouse coordinates, enabling clicks via left- and right-eye blinks. Evaluation encompasses sensitivity to distance, head pose, and eye-center variations, mirroring real-world usage. Challenges include the impact of low fluorescent lighting, the potential for unintentional actions due to the precise eye control required, calibration issues affecting accuracy, and limited adaptability to diverse facial features, hindering universal applicability.[4]

M. Mangaiyarkarasi and A. Geetha present a vision-based human-computer interface that detects faces using a webcam and matches templates to identify eye regions. The Haar feature technique extracts eye features, and SVM classification recognizes eye movements (open, close, left, right). Cursor control is achieved by correlating specific eye movements with directional cursor movements using the circular Hough transform. The system, operating on a notebook with a standard webcam, facilitates computer interaction for physically challenged individuals based on eye movements. However, accuracy may be affected by lighting variations and diverse facial features, while external factors such as background interference or sudden movements may impact reliability.[5]

Prithvi J introduces a Gesture-Controlled system that reshapes human-computer interaction through seamless integration of hand gestures and voice commands. Unlike traditional methods, the system eliminates the necessity for physical contact with a computer, providing a virtual interface for comprehensive control over input and output operations. Utilizing machine learning and computer vision algorithms, the system excels at recognizing both static and dynamic hand gestures, along with efficient processing of voice commands. Notably, no additional hardware is required, enhancing user convenience. The cursor is moved by hand movements. The system comprises two integral modules: the first employs MediaPipe hand detection, interacting directly with hand gestures, while the second interfaces with gloves of any uniform color.[15]

III. EXISTING SYSTEM

The Gesture Controlled Virtual Mouse system incorporates a module that improves the user experience by using speech automation. This component empowers people to easily perform cursor activities such as pointing, moving, scrolling, and selecting with the ease of voice commands. The voice automation module is crafted with cutting-edge speech recognition algorithms, ensuring precise identification of user voice commands. Its seamless interaction with the system's other two modules allows for a fluid transition between hand movements and voice commands. The drawback of this system is that speech recognition may face problems in noisy environments and with varying accents.[15]

• Input Layer: This layer takes the source image and performs preparation activities, such as normalization, to ensure optimal data input.

• Convolution Layer: Utilizing multiple filters, it performs convolution operations on the source image to extract pertinent features. The resulting output is known as a feature map.

• Activation Function: By adding nonlinearity to the feature maps, this step improves the network's ability to learn complex patterns.

• Pooling Layer: To diminish spatial dimensions and minimize computational complexity, this layer reduces the size of the feature maps through pooling operations.


• Stacked Layers: The convolution, activation, and pooling steps are repeated through multiple layers, enhancing the network's ability to discern hierarchical features.

• Flatten Layer: This layer converts the feature maps to vector format, preparing them for input into the subsequent connected layer.

• Fully Connected Layer: Performing the categorization process, this layer analyzes the flattened vector to make categorical predictions, completing the convolutional neural network architecture.
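The layer pipeline described above can be sketched in miniature. The following is an illustrative pure-Python sketch, not the implementation of the system under discussion; the image size, the filter values, and the 2×2 pooling window are assumptions chosen only to make the flow concrete:

```python
# Minimal sketch of the convolution -> activation -> pooling -> flatten pipeline.
# All sizes and filter values below are illustrative assumptions.

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation, as in most CNN libraries)."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            s = sum(image[i + u][j + v] * kernel[u][v]
                    for u in range(kh) for v in range(kw))
            row.append(s)
        out.append(row)
    return out

def relu(fmap):
    """Activation function: clamp negative responses to zero."""
    return [[max(0.0, x) for x in row] for row in fmap]

def maxpool2(fmap):
    """2x2 max pooling with stride 2, halving each spatial dimension."""
    return [[max(fmap[i][j], fmap[i][j + 1], fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]

def flatten(fmap):
    """Convert the feature map to a vector for the fully connected layer."""
    return [x for row in fmap for x in row]

# Toy 6x6 "image" and a 3x3 vertical-edge filter (illustrative values).
image = [[float((i + j) % 5) for j in range(6)] for i in range(6)]
kernel = [[1.0, 0.0, -1.0]] * 3

features = flatten(maxpool2(relu(conv2d(image, kernel))))
print(len(features))  # 6x6 input -> 4x4 feature map -> 2x2 pooled -> 4 values
```

In a real network these steps are stacked several times and the flattened vector feeds a trained fully connected layer; here the stages are shown once each to mirror the bullet list.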
IV. PROPOSED SYSTEM

The proposed system is developed to address the specific needs of individuals with limited hand mobility through an eye-based cursor movement approach. Leveraging OpenCV-based algorithms, the system aims to achieve precise and stable pupil position detection within the user's eyes, ensuring accurate cursor control. The initial step involves accessing the computer's webcam to capture real-time video or images, serving as the input source for eye tracking. The system employs computer vision techniques for eye detection, incorporating face detection to narrow down the region of interest. To enhance accuracy, the project includes a module for tracking head movement, adjusting the pointer position according to the user's head orientation. The core functionality lies in interpreting eye movements, particularly blinking, as input commands for actions such as selecting or clicking on-screen elements. Ultimately, this user-friendly solution is poised to foster independence and inclusivity in the digital realm, promoting equal opportunities for individuals with disabilities.

4.1 SYSTEM ARCHITECTURE

Fig. 1 System Design

4.1.1 CAPTURE REAL-TIME VIDEO

Capturing real-time video involves accessing a computer's webcam to obtain a continuous stream of live video frames. Using OpenCV in Python, the program initializes the camera, retrieves video frames, and displays them. This essential step provides the input source for subsequent image processing tasks in the eye-based cursor movement system.

4.1.2 EYE AND FACE DETECTION

Eye and face detection employ computer vision algorithms to locate eyes and faces within a video frame. Utilizing Haar cascades in OpenCV, the system identifies facial features by analyzing patterns, enabling precise localization. This process narrows down the region of interest for eye tracking, which is crucial for accurately interpreting user input in the cursor control system. OpenCV is used to read both the image and the Haar feature file. Blinks are characterized with the eye aspect ratio (EAR), computed from the six eye landmarks P1–P6:

EAR = (‖P2 − P6‖ + ‖P3 − P5‖) / (2‖P1 − P4‖)   (1)

4.1.3 INTERPRETING EYE MOVEMENTS

Interpreting eye movements involves analyzing the detected eyes, particularly focusing on blinking patterns. By leveraging computer vision techniques, the system recognizes specific eye gestures and translates them into actionable input commands. This functionality allows users to control the cursor with natural eye movements, enhancing the overall usability of the eye-based interface.

4.1.4 CURSOR CONTROL

Cursor control refers to altering the position of the on-screen cursor in response to interpreted eye movements. The system uses the information received from eye tracking to move the cursor smoothly across the screen. Integrating head movement tracking, if applicable, enhances precision. This dynamic interaction ensures users can navigate digital interfaces effectively using their eyes, fostering inclusivity and accessibility.

4.1.5 TESTING AND CALIBRATION

Testing and calibration involve assessing the system's performance and fine-tuning parameters to optimize accuracy and user experience. Users provide feedback during testing, allowing developers to refine algorithms and address any issues. Calibration processes, tailored to individual user preferences, ensure the system adapts to diverse needs, promoting reliability and personalization. Regular testing and calibration cycles enhance the robustness and responsiveness of the eye-based cursor movement system in various usage scenarios.

Fig. 2 Eyeball movement on X axis
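The blink measure of Eq. (1) and the gaze-to-screen mapping of Section 4.1.4 can be sketched as follows. This is an illustrative sketch rather than the paper's code: the landmark coordinates, the 0.2 blink threshold, and the 1920×1080 screen size are assumptions, and in a deployed system the landmarks would come from a facial landmark detector while the cursor itself would be moved with a library such as PyAutoGUI:

```python
import math

def ear(p):
    """Eye aspect ratio per Eq. (1); p is the list of six (x, y) eye landmarks P1..P6."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return (dist(p[1], p[5]) + dist(p[2], p[4])) / (2.0 * dist(p[0], p[3]))

def is_blink(landmarks, threshold=0.2):
    """A closed eye collapses vertically, so the EAR drops toward zero."""
    return ear(landmarks) < threshold

def gaze_to_screen(pupil, eye_box, screen=(1920, 1080)):
    """Linearly map the pupil position inside the detected eye region
    (x, y, w, h) to screen coordinates for the cursor."""
    x, y, w, h = eye_box
    fx = (pupil[0] - x) / w      # 0.0 = left edge of the eye, 1.0 = right edge
    fy = (pupil[1] - y) / h
    return (int(fx * screen[0]), int(fy * screen[1]))

# Open eye: vertical opening about two thirds of the eye width -> EAR well above threshold.
open_eye = [(0, 0), (1, -1), (2, -1), (3, 0), (2, 1), (1, 1)]
print(round(ear(open_eye), 3), is_blink(open_eye))   # 0.667 False

# Pupil at the center of a 40x20 eye region maps to the screen center.
print(gaze_to_screen((120, 60), (100, 50, 40, 20)))  # (960, 540)
```

In practice the threshold and the linear mapping would be tuned per user during the calibration step of Section 4.1.5, and the EAR would be smoothed over a few consecutive frames to avoid treating natural blinks as clicks.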


Fig. 3 Eyeball movement on Y axis

Fig. 4 Eyeball movement on X and Y axes

Fig. 5 Selecting the folder through eyes

V. CONCLUSION

The research work has achieved its primary goal by creating a system that allows people with disabilities to interact with computers effectively using their eye movements. This successful implementation means that individuals who may have limited or no use of their hands can still use computers independently. The use of advanced algorithms has significantly improved computer accessibility for individuals with disabilities. These algorithms ensure that the system can accurately and reliably track a user's eye movements, which is crucial for effective cursor control. By providing a means for disabled individuals to interact with computers more easily and independently, the project has the potential to greatly improve their overall quality of life. Being able to engage with the digital world effectively opens up opportunities for communication, education, work, and entertainment that might otherwise be limited. In conclusion, this research study is a significant step toward making technology more inclusive and accessible for all individuals, regardless of their physical abilities. It aligns with the broader societal goal of creating an Information Society where everyone can participate fully and equally in the digital age, bridging the gap between technology and disability.

References
[1] Neil Castellino and Michelle Alva, "An image-based eye controlled assistive system for paralytic patients", IEEE Conference Publications, 2017.
[2] Shu-Fan Lin, Xuebai Zhang and Shyan-Ming Yuan, "Eye Tracking Based Control System for Natural Human-Computer Interaction", Computational Intelligence and Neuroscience, 2017.
[3] J. Čech and T. Soukupová, "Real-Time Eye Blink Detection using Facial Landmarks", Center for Machine Perception, February 2016.
[4] Suree Pumrin and Chairat Kraichan, "Face and eye tracking for controlling computer functions", IEEE Conference Publications, 2014.
[5] A. Geetha and M. Mangaiyarkarasi, "Cursor Control System Using Facial Expressions for Human-Computer Interaction", International Journal of Emerging Technology in Computer Science & Electronics, vol. 8, issue 1, pp. 30-34, April 2014, ISSN: 0976-1353.
[6] Florina Ungureanu, Robert Gabriel Lupu and Valentin Siriteanu, "Eye tracking mouse for human-computer interaction", IEEE Conference Publications, 2013.
[7] Jilin Tu and Thomas Huang, "Face as Mouse through Visual Face Tracking", 2005.
[8] Eui Chul Lee and Kang Ryoung Park, "A robust eye gaze tracking method based on a virtual eyeball model", Springer, pp. 319-337, April 2008.
[9] John J. Magee, Margrit Betke, James Gips, Matthew R. Scott and Benjamin N. Waber, "A Human-Computer Interface Using Symmetry Between Eyes to Detect Gaze Direction", IEEE Trans., vol. 38, no. 6, pp. 1248-1259, November 2008.
[10] Sunita Barve, Dhaval Dholakiya, Shashank Gupta and Dhananjay Dhatrak, "Facial Feature Based Method For Real Time Face Detection and Tracking I-CURSOR", International Journal of Engg Research and App., vol. 2, pp. 1406-1410, April 2012.
[11] Yu-Tzu Lin, Ruei-Yan Lin, Yu-Chih Lin and Greg C. Lee, "Real-time eye-gaze estimation using a low-resolution webcam", Springer, pp. 543-568, August 2012.
[12] Samuel Epstein, Eric Missimer and Margrit Betke, "Using kernels for a video-based mouse-replacement interface", Springer Link, November 2012.
[13] Zakir Hossain, Md Maruf Hossain Shuvo and Prionjit Sarker, "Hardware and software implementation of real time electrooculogram (EOG) acquisition system to control computer cursor with eyeball movement", 2017 4th International Conference on Advances in Electrical Engineering (ICAEE), pp. 132-137, 2017.
[14] Jun-Seok Lee, Kyung-hwa Yu, Sang-won Leigh, Jin-Yong Chung and Sung-Goo Cho, "Method for controlling device on the basis of eyeball motion and device therefor", January 2018.
[15] Prithvi J and Suraj Nair, "Gesture Controlled Virtual Mouse with Voice Automation", ISSN: 2278-0181, IJERTV12IS040131, vol. 12, issue 04, April 2023.
[16] Po-Lei Lee, Jyun-Jie Sie, Yu-Ju Liu, Chi-Hsun Wu, Ming-Huan Lee, Chih-Hung Shu, et al., "An SSVEP-actuated brain computer interface using phase-tagged flickering sequences: a cursor system", Annals of Biomedical Engineering, vol. 38, no. 7, pp. 2383-2397, 2010.

