Human Computer Interaction Based Eye Controlled Mouse: Histogram of Oriented Gradients (HOG)

Abstract— With advanced technologies in this digital era, there is always scope for development in the field of computing. Hands-free computing is in demand today, as it addresses the needs of quadriplegics. This paper presents a human-computer interaction (HCI) system that is of great importance to amputees and those who have issues with using their hands. The system built is an eye-based interface that acts as a computer mouse by translating eye movements, such as blinking, gazing and squinting, into mouse cursor actions. The system uses a simple webcam, and its software requirements are Python (3.6), OpenCV, NumPy and a few other packages necessary for face recognition. The face detector is built using the HOG (Histogram of Oriented Gradients) feature along with a linear classifier and the sliding-window technique. The system is hands-free, and no external hardware or sensors are required.

Keywords: Python (3.6), OpenCV, human-computer interaction, NumPy, face recognition, Histogram of Oriented Gradients (HOG)

I. INTRODUCTION

The computer mouse, or moving a finger, has been the most common approach for moving the cursor along the screen in current technology. The system detects any movement of the mouse or the finger and maps it to the movement of the cursor. Some people whose arms are not operational, called 'amputees', are unable to make use of the current technology to operate a mouse. Hence, if the movement of their eyeball can be tracked and the direction in which the eye is looking can be determined, the movement of the eyeball can be mapped to the cursor, and the amputee will be able to move the cursor at will. An 'eye tracking mouse' would be of great use to an amputee. Currently, the eye tracking mouse is not available at a large scale; only a few companies have developed this technology and made it available. We intend to build an eye tracking mouse in which most of the functions of a mouse are available, so that the user can move the cursor with the eyes. We try to estimate the 'gaze' direction of the user and move the cursor along the direction in which the eye is trying to focus.

The pointing and clicking action of the mouse has remained a standard for quite some time. However, one may find it uncomfortable for various reasons, or, in the case of those who are unable to move their hands, there arises a need for a hands-free mouse. Usually, eye movements and facial movements are the basis for a hands-free mouse.

II. RELATED WORK

There are various methods by which this can be achieved. The camera mouse was proposed by Margrit Betke et al. [1] for people who are quadriplegic and nonverbal. The movements of the user are tracked using a camera and mapped to the movements of the mouse pointer visible on the screen. Another method was proposed by Robert Gabriel Lupu et al. [2] for human-computer interaction that made use of a head-mounted device to track eye movement and translate it on screen. Another technique, by Prof. Prashant Salunke et al. [3], presents eye tracking using the Hough transform.

A lot of work is being done to improve the characteristics of HCI. A paper by Muhammad Usman Ghani et al. [4] suggests that the movements of the eye can be read as input and used to help the user access interfaces without using any other hardware device such as a mouse or a keyboard [5]. This can be achieved using image processing algorithms and computer vision. One way to detect the eyes is by using the Haar cascade feature. The eyes can also be detected by matching against templates stored in advance, as suggested by Vaibhav Nangare et al. [6]. To get an accurate image of the iris, an IR sensor can be used, and a gyroscope can be used for the orientation of the head, as suggested by Anamika Mali et al. [7]. The click operation can be implemented by 'gazing', or staring, at the screen. Also, by gazing at the upper or lower portion of the screen, the scroll function can be implemented, as proposed by Zhang et al. [8].

Along with eye movements, it becomes easier if we incorporate some subtle movements of the face and its parts as well. Real-time eye blink detection using facial landmarks, as suggested by Tereza Soukupová and Jan Čech [9], brings out how the blink action can be detected using facial landmarks. This is a major aspect, as blink detection is necessary for translating blinks into clicking actions. Detecting eyes and other facial parts can be done using OpenCV and Python with dlib [10]; similarly, even a blink can be detected [11]. The paper by Christos Sagonas et al. [12] discusses the
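The HOG feature underlying the face detector can be illustrated with a minimal sketch. This is a pure-NumPy toy version of one building block (a single cell's orientation histogram), not dlib's actual implementation: gradients are computed by central differences, and their unsigned orientations are binned, weighted by gradient magnitude.

```python
import numpy as np

def hog_cell_histogram(cell, n_bins=9):
    """Histogram of oriented gradients for one image cell: gradient
    orientations (0-180 degrees, unsigned) are binned, weighted by
    gradient magnitude -- the core idea behind the HOG descriptor."""
    gx = np.zeros_like(cell, dtype=float)
    gy = np.zeros_like(cell, dtype=float)
    gx[:, 1:-1] = cell[:, 2:] - cell[:, :-2]  # horizontal central differences
    gy[1:-1, :] = cell[2:, :] - cell[:-2, :]  # vertical central differences
    mag = np.hypot(gx, gy)                    # gradient magnitude
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0  # unsigned orientation
    hist, _ = np.histogram(ang, bins=n_bins, range=(0, 180), weights=mag)
    return hist

# A cell containing a pure vertical edge: all gradient energy is
# horizontal, so it should land in the first (0-degree) orientation bin.
cell = np.zeros((8, 8))
cell[:, 4:] = 255.0
h = hog_cell_histogram(cell)
```

In the full detector, such per-cell histograms are block-normalised, concatenated into a descriptor, and scored by a linear classifier under a sliding window, which is the scheme the abstract refers to.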
2) Once the frame is extracted, the regions of the face need to be detected. Hence, the frames undergo a set of image-processing functions so that it is easy for the program to detect features such as the eyes, mouth, nose, etc.

Once the four arrays are prepared, boundaries, or 'contours', are drawn around the points using three of these arrays, by connecting the points with the 'drawContours' function; the shapes formed surround the two eyes and the mouth.
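As a sketch of the contour step above, assuming dlib's standard 68-point landmark model (left eye at indices 36-41, right eye at 42-47, outer mouth at 48-59), the three point arrays handed to OpenCV's `cv2.drawContours` could be built like this:

```python
import numpy as np

# Index ranges of dlib's standard 68-point facial landmark model
LEFT_EYE = slice(36, 42)
RIGHT_EYE = slice(42, 48)
MOUTH = slice(48, 60)  # outer mouth contour

def region_contours(landmarks):
    """Split a (68, 2) landmark array into the three contour arrays
    (left eye, right eye, mouth) in the shape cv2.drawContours expects."""
    pts = np.asarray(landmarks)
    return [pts[r].reshape(-1, 1, 2).astype(np.int32)
            for r in (LEFT_EYE, RIGHT_EYE, MOUTH)]

# In the full pipeline, one would then call per frame:
#   cv2.drawContours(frame, region_contours(landmarks), -1, (0, 255, 0), 1)
contours = region_contours(np.arange(136).reshape(68, 2))
```

The landmark array itself would come from dlib's `shape_predictor` applied to each face rectangle returned by the HOG detector.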
978-1-7281-0167-5/19/$31.00 ©2019 IEEE 363
Proceedings of the Third International Conference on Electronics Communication and Aerospace Technology [ICECA 2019]
IEEE Conference Record # 45616; IEEE Xplore ISBN: 978-1-7281-0167-5
iv) Mouth and eye aspect ratios: Once the contours are drawn, it is necessary to have a reference for the shapes which, when compared against, gives the program information about any action made by these regions, such as blinking, yawning, etc.

Mouth-Aspect-Ratio (MAR)
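These reference values are the eye and mouth aspect ratios. A sketch follows, using the EAR formula of Soukupová and Čech [9]; the MAR variant shown is an assumption (one common analogue over the outer mouth landmarks), not necessarily the authors' exact formula.

```python
import numpy as np

def eye_aspect_ratio(eye):
    """EAR = (|p2-p6| + |p3-p5|) / (2 |p1-p4|) over the six eye
    landmarks; roughly constant while the eye is open, dropping
    toward zero during a blink (Soukupova & Cech [9])."""
    a = np.linalg.norm(eye[1] - eye[5])  # first vertical distance
    b = np.linalg.norm(eye[2] - eye[4])  # second vertical distance
    c = np.linalg.norm(eye[0] - eye[3])  # horizontal distance
    return (a + b) / (2.0 * c)

def mouth_aspect_ratio(mouth):
    """Assumed analogue over the 12 outer mouth landmarks: vertical
    lip opening over mouth width; rises when the mouth opens."""
    vertical = np.linalg.norm(mouth[3] - mouth[9])    # top vs bottom lip
    horizontal = np.linalg.norm(mouth[0] - mouth[6])  # mouth corners
    return vertical / horizontal

# Illustrative eye shapes: an open eye and the same eye flattened shut
open_eye = np.array([[0, 0], [1, 1], [3, 1], [4, 0], [3, -1], [1, -1]], float)
closed_eye = open_eye * [1, 0]  # collapse the vertical coordinates
```

Comparing each frame's ratio against a fixed threshold is what lets the program classify blinks, winks and squints.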
IV. RESULT

The scroll mode is activated by squinting. Scrolling is then done by moving the head up and down, called pitching, or sideways, called yawing. Scroll mode is deactivated by squinting again. The clicking action takes place by winking: a right wink corresponds to a right click, and a left wink corresponds to a left click.

This work can be extended to improve the speed of the system by using better-trained models. Also, the system can be made more dynamic by making the change in the position of the cursor proportional to the amount of rotation of the user's head, i.e., the user can decide at what rate the position of the cursor changes. Future research can also make the ratios more accurate, since the range of values produced by the aspect ratios is usually small. Hence, to make the algorithm detect the actions more accurately, the formulae for the aspect ratios can be modified. Also, to make detection of the face easier, some image processing techniques can be applied before the model detects the face and its features.
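The mode and click behaviour described in the results could be organised as a small state machine. The sketch below is illustrative only; the action names and toggle logic are assumptions, not the authors' exact implementation.

```python
class EyeMouseController:
    """Squinting toggles scroll mode; in scroll mode head pitch/yaw
    scrolls, otherwise winks map to clicks."""

    def __init__(self):
        self.scroll_mode = False

    def handle(self, action):
        """Map one detected facial action to a cursor command."""
        if action == "squint":
            self.scroll_mode = not self.scroll_mode
            return "scroll_mode_on" if self.scroll_mode else "scroll_mode_off"
        if self.scroll_mode:
            return {"pitch_up": "scroll_up", "pitch_down": "scroll_down",
                    "yaw_left": "scroll_left",
                    "yaw_right": "scroll_right"}.get(action, "idle")
        return {"left_wink": "left_click",
                "right_wink": "right_click"}.get(action, "idle")

ctrl = EyeMouseController()
events = [ctrl.handle(a) for a in
          ["right_wink", "squint", "pitch_up", "squint", "left_wink"]]
```

In a full system, each returned command would be forwarded to a cursor-control library to actually move the pointer, scroll or click.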