Real Time Implementation of Eye Tracking System Using Arduino Uno Based Hardware Interface


International Journal of Engineering and Techniques - Volume 2 Issue 6, Nov – Dec 2016

RESEARCH ARTICLE OPEN ACCESS

Real Time Implementation of Eye Tracking System Using Arduino Uno Based Hardware Interface

B.K. Venugopal¹, Dilson D'souza²
¹Associate Professor, ²Master of Engineering
Dept. of Electronics and Communication, University Visvesvaraya College of Engineering, Bangalore

Abstract:
Eye tracking systems play a significant role in many of today's applications, ranging from military applications to the automotive industry and healthcare sectors. In this paper, a system for eye tracking and estimation of the eye's direction of movement is presented. The proposed system is implemented in real time using an Arduino Uno microcontroller and a ZigBee wireless device. Experimental results show successful eye tracking and movement estimation in a real-time scenario using the proposed hardware interface.

Keywords — Arduino based hardware, eye tracking system, binarization, eye region classification, Hough transform, Viola-Jones algorithm, ZigBee wireless device.

1. INTRODUCTION
Eye tracking is one of the principal means of measuring either the point of gaze (where one is looking) or the motion of an eye relative to the head. An eye tracker is a device for measuring eye positions and eye movement. Eye trackers are used in research on the visual system, in psychology, in psycholinguistics, in marketing, as an input device for human-computer interaction, and in product design. There are a number of techniques for measuring eye movement. The most popular variant uses video images from which the eye position is extracted. Other methods use search coils or rely on the electrooculogram.
The most widely used current designs are video-based eye trackers. A camera focuses on one or both eyes and records their movement as the viewer looks at some kind of stimulus. Most modern eye trackers use the centre of the pupil and infrared/near-infrared non-collimated light to create corneal reflections (CR). The vector between the pupil centre and the corneal reflections can be used to compute the point of regard on a surface or the gaze direction. A simple calibration procedure for the individual is usually required before using the eye tracker.
Two general types of infrared/near-infrared (also known as active light) eye-tracking techniques are used: bright pupil and dark pupil. Their difference is based on the location of the illumination source with respect to the optics. If the illumination is coaxial with the optical path, the eye acts as a retroreflector as the light reflects off the retina, creating a bright pupil effect similar to red eye. If the illumination source is offset from the optical path, the pupil appears dark because the retroreflection from the retina is directed away from the camera.
Bright pupil tracking creates greater iris/pupil contrast, allowing more robust eye tracking with all iris pigmentations, and greatly reduces interference caused by eyelashes and other occluding features. It also allows tracking in lighting conditions ranging from total darkness to very bright. However, bright pupil techniques are not effective for
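The pupil-centre/corneal-reflection (PCR) approach described above can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a simple per-axis linear calibration mapping the pupil-minus-CR vector to screen coordinates, and all pixel coordinates and calibration targets below are made up for illustration.

```python
# Minimal sketch of pupil-centre/corneal-reflection (PCR) gaze estimation.
# Assumes a linear per-axis calibration fitted from two known screen targets;
# all coordinates and calibration data here are illustrative, not from the paper.

def pcr_vector(pupil, cr):
    """Vector from corneal reflection to pupil centre (pixels)."""
    return (pupil[0] - cr[0], pupil[1] - cr[1])

def fit_linear_axis(v0, s0, v1, s1):
    """Fit s = a*v + b from two calibration samples on one axis."""
    a = (s1 - s0) / (v1 - v0)
    b = s0 - a * v0
    return a, b

# Calibration: user fixates two known screen points while PCR vectors are recorded.
v_left  = pcr_vector(pupil=(100, 60), cr=(90, 55))   # fixating screen x = 0
v_right = pcr_vector(pupil=(140, 60), cr=(90, 55))   # fixating screen x = 1920
ax, bx = fit_linear_axis(v_left[0], 0, v_right[0], 1920)

# Estimate gaze x for a new frame.
v = pcr_vector(pupil=(120, 60), cr=(90, 55))
gaze_x = ax * v[0] + bx
print(gaze_x)  # halfway between the calibration targets -> 960.0
```

Real trackers use more calibration points and a higher-order mapping, but the vector-then-map structure is the same.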

ISSN: 2395-1303 http://www.ijetjournal.org Page 140


tracking outdoors, as extraneous IR sources interfere with monitoring.
Another, less commonly used, method is known as passive light. It uses visible light for illumination, which may cause some distraction to users. Another challenge with this method is that the contrast of the pupil is lower than with the active light methods, so the centre of the iris is used for computing the vector instead. This computation needs to detect the boundary between the iris and the white sclera (limbus tracking), and it presents an additional challenge for vertical eye movements because of occlusion by the eyelids.
The remaining sections of this paper are organized as follows. This first section has given a brief introduction to the significance of the area of interest and the general problems associated with it. A review of literature and related works is given in section 2. The proposed system and its implementation, covering the architecture and workflow of the project, are given in sections 3 and 4 respectively. The obtained results are given in section 5, and finally the conclusion of the overall work is given along with the references.

2. LITERATURE SURVEY AND RELATED WORKS

Applications that make extensive use of eye tracking are found in the automotive industry, medical research, fatigue simulation, vehicle simulators, cognitive studies, computer vision, activity recognition, etc. The significance of eye detection and tracking in commercial applications has increased over time, and this has led to more efficient and robust designs, which are necessary in many of today's modern devices.
An extensive review of literature has been done in the field of healthcare applications concerning eye tracking systems. Some of these methods are mentioned below.

Z. Sharafi et al. [1] performed an evaluation of the metrics concerning eye tracking in software engineering. The authors bring the different eye-tracking metrics into one common standard to help researchers facing difficulty in analyzing eye-tracking experiments. The work also provides definitions of the different metrics used, along with suggestions for adopting metrics from other related fields.
S. Chandra et al. [2] proposed an application based on eye tracking and its interaction with human beings. The work mainly focuses on determining the direction of gaze along with its position and sequence of movement, and addresses three aspects of the eye-tracking experiment. The first objective was to give the user an insight into the underlying issues present in eye-movement technology. The second objective was to provide the user guidance concerning the development of eye-movement technology. The third objective was to identify the challenges and prospects in building Man And Machine Interfacing (MAMI) systems using the principle of eye tracking. Experimental results show a reduced computational time for gaze input as compared to mouse input.
Y. Zhang et al. [3] performed a comparative analysis of stationary and mobile eye-tracking wayfinding systems considering the design of EXITs. A mobile tracking system was used to detect eye movement with respect to the EXIT design used in the building, and a stationary eye-tracking technique was used for the same purpose. The comparative analysis yielded conclusions primarily concerning the appearance, design and placement of the EXITs in the building. Empirical methodologies were introduced to this field of research with respect to wayfinding and spatial decision-making.
D. Miyamoto et al. [4] proposed a method for enforcing phishing-prevention habits using an eye-tracking technique. An experiment was performed in which the eye movements of 23 participants were analyzed; it was also observed that novice participants did not tend to possess such habits. A prototype named EyeBit was developed, which required the participant to look at the address bar before entering information into web forms; the system deactivates and reactivates the forms (the control parameter) based on the participant's eye-movement pattern. Experimental results show the effectiveness of the proposal, which



was based on predicting the participants' eye-movement patterns, even though some inconvenience was observed in the EyeBit system.
R. G. Bozomitu et al. [5] proposed an eye-tracking system based on the circular Hough transform algorithm. The experiment was aimed at benefiting neuromotor-disabled patients. The signals were captured using an infrared video camera together with a PC, and the eye-tracking process implemented in this work uses a keyboard technology. Experimental results show that the optimal performance of pupil detection and movement tracking was based on a trade-off between the algorithm's computational time and the precision of detection of the pupil region during eye movement.
R. G. Bozomitu et al. [6] performed a comparative analysis of eye-tracking applications with respect to pupil detection. The experiment compared two eye-movement detection algorithms, the circular Hough transform and the Starburst algorithm: a parameter-based algorithm was implemented with the circular Hough transform, and a feature-based algorithm was implemented with Starburst. In order to improve cursor stability, a Gaussian smoothing filter along with an appropriate spike-removal technique was implemented. Experimental results showed that the Starburst-based approach had a higher accuracy than the circular Hough transform with respect to cursor movement; however, due to the notably high disparity of the pupil-centre position in successive frames, the Starburst-based approach had high levels of noise.
O. Mazhar et al. [7] proposed an eye-ball tracking system operating in a real-time environment. The experiment was aimed at people with disabled neuromotor systems, and was conducted by developing a human-computer interface facilitating neurological control for disabled patients by observing only the patients' eye movements. Serial communication was used to send data from the webcam to a MATLAB-based simulation tool. A segmentation process followed by a centroid calculation for the pupil was performed, which results in producing a control signal from the patient. The conducted experiment resulted in a real-time simulation of an eye-tracking system.
M. Kim et al. [8] proposed a system of eye-movement detection and tracking based on cardinal directions. The experiment was conducted such that the participants were asked to look towards north in a given cardinal direction, and the reaction time for each cardinal direction was observed. The experimental results demonstrate that the participants' eyes fixed on the character part, the corners of shapes and the cross-over point, and that the range of gaze movement for the cardinal direction was small. A significant observation made in this experiment was that the clarity of the cardinal direction confirmed the presence or absence of the character information.
C. Jin et al. [9] proposed a gaze-point compensation technique for eye-tracking systems. The work proposes a length compensation algorithm built on the gaze-point algorithm. Experimental results showed a deviation of less than 1 cm in both the vertical and horizontal directions.
Z. Zhang et al. [10] proposed a classification method to separate novice from experienced drivers based on gaze behaviour reflecting the drivers' attention and skill. The classifier used in this experiment is a binary Gaussian classifier operating on two-dimensional data obtained from the participants' gaze behaviour. A comparative analysis between the Gaussian-based classifier and Gaussian mixture models was performed. Experimental results show that the proposed Gaussian-based classifier was more efficient than the Gaussian mixture models.
F. B. Taher et al. [11] proposed a controller-based electric powered wheelchair (EPW) system for certain types of disabled persons, based on a combination of EEG (electroencephalogram) signals with an eye-tracking system. The proposed technique illustrates the significance of using a multi-source control process in the EPW. The first part involves the elaboration of separate control techniques involving EEG and eye tracking. The second part combines the two techniques by



considering a data-fusion algorithm. Experimental results show that a control signal based only on EEG, without the inclusion of the eye-tracking system, performed much better, but the use of EEG devices had the disadvantage of limited energy; by implementing the proposed fused system this limitation was resolved.
K. Kurzhals et al. [12] proposed an eye-tracking system based on a computer visualization system. This work mainly deals with the development of a three-dimensional fly-through animation system. The technique portrayed an accurate representation of the distribution of the galaxy which was visually and aesthetically pleasing to the observer.
F. Zhou et al. [13] proposed an eye-tracking system based on the particle filtering algorithm. First, an AdaBoost-based classifier was implemented to procure the information of the eye region. The eye-region information is then sought by comparing two consecutive frames in a video sequence, and for this purpose the particle filtering algorithm was implemented. Experimental results show an improvement in the computational time between consecutive frames due to the reduced search region for the human eye.
K. Kurzhals et al. [14] proposed an eye-tracking mechanism primarily for applications involving Personal Visual Analytics (PVA). This work primarily explains the challenges which arise in real-time scenarios in the context of PVA, along with the prospects for potential research in the respective application. The authors also present a technique for representing areas of interest across multiple videos.

3. PROPOSED SYSTEM

The purpose of the proposed system is to perform detection and tracking of the pupil region of the eye. Applications of this system can be found in areas ranging from military applications to healthcare sectors. The proposed system mainly consists of three general modules. In the first module, the video is captured in real time and eye detection is performed using the Viola-Jones algorithm. The second module performs binarization and applies the Hough transformation to detect the pupil region of the eye. The third module performs the detection and tracking of the movement of the pupil region of the eye and sets a direction accordingly. The obtained direction is then sent to a hardware experimental setup consisting of an Arduino Uno microcontroller with a ZigBee device for wireless transmission of data.
The general architecture of the proposed system is given in fig. 1. The purpose of the proposed system is the detection, identification and tracking of the pupil region of the eye in a real-time scenario. The video is captured from the image-capturing device and frames are extracted; for each frame, the three modules defined above are executed. The resulting direction data for the movement of the pupil region of the eye is sent to the hardware experimental setup as a control signal. The functionality of the proposed eye-tracking system is thus divided into three modules: image pre-processing, pupil region detection, and pupil movement detection.

4. IMPLEMENTATION

The procedure for the implementation of the eye-tracking system is given as follows.
In the first module, image pre-processing, the video is captured from an image-capturing device and the respective frame is extracted from the video; the image, which is in RGB format, is converted to a single-channel grayscale format. Gamma correction is applied to bring the image to an ideal brightness and contrast; it is computed by controlling a parameter called the gamma factor. The eye region is detected using the Viola-Jones algorithm: features are extracted using the Haar wavelet transformation, an integral image is created, the corresponding training is performed using the AdaBoost training method, and finally cascaded classifiers are used for the detection of the eye region in the face. A threshold is then set for which the pixel value of the image will be 0 for values below the threshold and 1 for values above it. The user adjusts this setting until the pupil region is clearly identified and defined in the eye. These parameters are then passed to the binarization process.

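The Viola-Jones step above relies on an integral image so that rectangular (Haar-like) feature sums can be evaluated in constant time. A minimal sketch of that data structure, independent of the authors' code and using illustrative values:

```python
# Minimal integral-image sketch, as used by Viola-Jones style detectors:
# once built, any rectangular sum costs four lookups, which is what makes
# Haar-like feature evaluation cheap.

def integral_image(img):
    """Summed-area table with a zero top row/left column for easy indexing."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            ii[y + 1][x + 1] = (img[y][x] + ii[y][x + 1]
                                + ii[y + 1][x] - ii[y][x])
    return ii

def box_sum(ii, top, left, bottom, right):
    """Sum of img[top:bottom][left:right] in O(1) via four lookups."""
    return (ii[bottom][right] - ii[top][right]
            - ii[bottom][left] + ii[top][left])

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
ii = integral_image(img)
print(box_sum(ii, 0, 0, 3, 3))  # whole image -> 45
print(box_sum(ii, 1, 1, 3, 3))  # bottom-right 2x2 block: 5+6+8+9 -> 28
```

A Haar-like feature is then just the difference of two or more such box sums.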


Fig 1: proposed system

The second module consists of the definition of the pupil region in the proposed system. Initially, minimum and maximum radii are defined, and any circle whose radius falls within this range is accepted. The set parameters (i.e. the two radii) are then passed to the Hough transformation, which detects and defines the circular region in the eye (fig. 3).

Fig 2: proposed methodology for eye detection and binarization

The third module consists of the detection of the movement of the pupil region in the eye. A reference point is set for the pupil region, which consists of a matrix having radii and centres as rows and columns. A comparative analysis is performed between the previous frame and the current frame considering the pixel position of the pupil region in the eye. The conditions given below are set for the direction of movement of the vehicle based on the pupil movement (fig. 4).

Fig 3: proposed methodology for Hough Transformation

Condition 1: if the row of the current frame is greater than the row of the previous frame, then move 'RIGHT'
Condition 2: if the row of the current frame is less than the row of the previous frame, then move 'LEFT'
Condition 3: if the column of the current frame is greater than the column of the previous frame, then move 'FORWARD'
Condition 4: if the column of the current frame is less than the column of the previous frame, then 'STOP'

Fig 4: proposed methodology for eye movement detection

The hardware experimental setup is represented as shown in fig. 5.
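The four movement conditions above, together with the control signal sent over the serial/ZigBee link to the Arduino, can be sketched as follows. The one-byte command encoding is an assumption made for illustration; the paper does not specify the wire format.

```python
# Sketch of the third module's decision logic (Conditions 1-4) plus an
# assumed one-byte command encoding for the serial/ZigBee link. The
# comparison uses the pupil centre's (row, column) position in two
# successive frames.

COMMANDS = {"RIGHT": b"R", "LEFT": b"L", "FORWARD": b"F", "STOP": b"S"}

def direction(prev, curr):
    """Map pupil centres (row, col) of two frames to a movement command."""
    if curr[0] > prev[0]:
        return "RIGHT"     # Condition 1: row increased
    if curr[0] < prev[0]:
        return "LEFT"      # Condition 2: row decreased
    if curr[1] > prev[1]:
        return "FORWARD"   # Condition 3: column increased
    return "STOP"          # Condition 4: column decreased (or unchanged)

prev, curr = (120, 300), (120, 340)
cmd = direction(prev, curr)
print(cmd, COMMANDS[cmd])  # column grew while row held -> FORWARD b'F'
```

The resulting byte would then be written to the serial port feeding the ZigBee module, with the Arduino decoding it on the other side.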



Fig 5: Hardware experimental setup

5. SIMULATION RESULTS AND PARAMETRIC EVALUATIONS

The evaluation performed with respect to the proposed eye-tracking system is presented as follows.

Database

Table 1: database for the proposed eye tracking system

Sl.no | Parameter              | Description (value)
1     | Video device           | Integrated webcam
2     | Video format           | mjpg 1024x768
3     | Returned colour space  | RGB
4     | Trigger                | Infinite
5     | Frames/trigger         | 1
6     | n                      | Number of frames (126)
7     | Image resolution       | 1024x768x3
8     | Number of frames       | 126

Parametric analysis

Table 2: parametric analysis for the proposed eye tracking system

Sl.no | Parameter     | Description
1     | Video device  | webcam image
2     | n             | Number of frames
3     | g             | Gamma factor
4     | th            | Threshold value for binarization
5     | min           | Minimum radius for pupil
6     | max           | Maximum radius for pupil
7     | dir           | Direction of pupil movement

1. Mean Square Error (MSE): The MSE is defined as the average of the squares of the errors or deviations between the obtained and reference data. It is mathematically defined as follows,

MSE = (1/N) Σᵢ (xᵢ − x̂ᵢ)² ………………… (1)

Where, xᵢ → reference data
x̂ᵢ → estimated data

2. Peak signal to noise ratio (PSNR): The PSNR is defined as the log of the ratio of the peak signal power to the noise power in the image. It is mathematically defined as,

PSNR = 10 log₁₀ (MAX² / MSE) ……………… (2)

Where, MAX → maximum possible pixel value of the image

3. Sensitivity: Sensitivity is defined as the ratio of true positives to the sum of true positives and false negatives. It is mathematically defined as shown in eq. 3,

Sensitivity = TP / (TP + FN) …………… (3)

Where, TP → True Positive
FN → False Negative

4. Specificity: It is defined as the ratio of true negatives to the sum of true negatives and false positives. It is mathematically represented as shown in eq. 4,

Specificity = TN / (TN + FP) …………… (4)

Where, TN → True Negative
FP → False Positive

Table 3: Experimental results

Parameter    | Value
Elapsed time | 33.82 s
PSNR         | 14.2 (avg.)
TP           | 65/126
TN           | 30/126
FP           | 20/126
FN           | 11/126
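The metrics defined in eqs. 1-4 can be checked against the frame counts reported in Table 3 (TP = 65, TN = 30, FP = 20, FN = 11 out of 126 frames). A minimal sketch, with the MSE/PSNR inputs being illustrative sequences rather than the paper's actual frame data:

```python
# Sensitivity (eq. 3) and specificity (eq. 4) computed from the frame
# counts reported in Table 3, plus MSE/PSNR helpers matching eqs. 1-2.

import math

def mse(reference, estimated):
    """Mean squared error between two equal-length sequences (eq. 1)."""
    n = len(reference)
    return sum((r - e) ** 2 for r, e in zip(reference, estimated)) / n

def psnr(reference, estimated, max_value=255.0):
    """Peak signal-to-noise ratio in dB (eq. 2)."""
    return 10.0 * math.log10(max_value ** 2 / mse(reference, estimated))

def sensitivity(tp, fn):
    return tp / (tp + fn)      # eq. 3

def specificity(tn, fp):
    return tn / (tn + fp)      # eq. 4

tp, tn, fp, fn = 65, 30, 20, 11          # Table 3 counts (of 126 frames)
print(round(sensitivity(tp, fn), 3))     # 65/76 -> 0.855
print(round(specificity(tn, fp), 3))     # 30/50 -> 0.6
```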



6. CONCLUSION

For the experiment considering the proposed eye-tracking system, a successful interface between the computer and the hardware was achieved. Binarization and the Hough transformation were performed for the successful detection, identification and tracking of the pupil region of the eye. The hardware setup was built using an Arduino Uno microcontroller and a ZigBee wireless device. Experimental results show successful detection and tracking (with respect to direction) of the pupil region of the eye, whose direction is sent serially to the hardware interface as a control signal.

REFERENCES

[1] Sharafi, Z., Shaffer, T. and Sharif, B., 2015. Eye-tracking metrics in software engineering. In 2015 Asia-Pacific Software Engineering Conference (APSEC) (pp. 96-103). IEEE.
[2] Chandra, S., Sharma, G., Malhotra, S., Jha, D. and Mittal, A.P., 2015, December. Eye tracking based human computer interaction: Applications and their uses. In 2015 International Conference on Man and Machine Interfacing (MAMI) (pp. 1-5). IEEE.
[3] Zhang, Y., Zheng, X., Hong, W. and Mou, X., 2015, December. A comparison study of stationary and mobile eye tracking on EXITs design in a wayfinding system. In 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA) (pp. 649-653). IEEE.
[4] Miyamoto, D., Iimura, T., Blanc, G., Tazaki, H. and Kadobayashi, Y., 2014, September. EyeBit: eye-tracking approach for enforcing phishing prevention habits. In 2014 Third International Workshop on Building Analysis Datasets and Gathering Experience Returns for Security (BADGERS) (pp. 56-65). IEEE.
[5] Bozomitu, R.G., Păsărică, A., Cehan, V., Lupu, R.G., Rotariu, C. and Coca, E., 2015, November. Implementation of eye-tracking system based on circular Hough transform algorithm. In 2015 E-Health and Bioengineering Conference (EHB) (pp. 1-4). IEEE.
[6] Păsărică, A., Bozomitu, R.G., Cehan, V., Lupu, R.G. and Rotariu, C., 2015, October. Pupil detection algorithms for eye tracking applications. In 2015 IEEE 21st International Symposium for Design and Technology in Electronic Packaging (SIITME) (pp. 161-164). IEEE.
[7] Mazhar, O., Shah, T.A., Khan, M.A. and Tehami, S., 2015, October. A real-time webcam based eye ball tracking system using MATLAB. In 2015 IEEE 21st International Symposium for Design and Technology in Electronic Packaging (SIITME) (pp. 139-142). IEEE.
[8] Kim, M., Morimoto, K. and Kuwahara, N., 2015, July. Using eye tracking to investigate understandability of cardinal direction. In 2015 3rd International Conference on Applied Computing and Information Technology / 2nd International Conference on Computational Science and Intelligence (ACIT-CSI) (pp. 201-206). IEEE.
[9] Jin, C. and Li, Y., 2015, August. Research of gaze point compensation method in eye tracking system. In 2015 7th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC) (Vol. 2, pp. 12-15). IEEE.
[10] Zhang, Z., Kubo, T., Watanabe, J., Shibata, T., Ikeda, K., Bando, T., Hitomi, K. and Egawa, M., 2015, July. A classification method between novice and experienced drivers using eye tracking data and Gaussian process classifier. In 2015 54th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE) (pp. 1409-1412). IEEE.
[11] Taher, F.B., Amor, N.B. and Jallouli, M., 2015, September. A multimodal wheelchair control system based on EEG signals and eye tracking fusion. In 2015 International Symposium on Innovations in Intelligent SysTems and Applications (INISTA) (pp. 1-8). IEEE.



[12] Kurzhals, K., Burch, M., Pfeiffer, T. and Weiskopf, D., 2015. Eye tracking in computer-based visualization. Computing in Science & Engineering, 17(5), pp. 64-71.
[13] Zhou, F., Chen, W. and Fang, H., 2014, November. Robust eye tracking and location method based on particle filtering algorithm. In 2014 IEEE 3rd International Conference on Cloud Computing and Intelligence Systems (pp. 247-252). IEEE.
[14] Kurzhals, K. and Weiskopf, D., 2015. Eye tracking for personal visual analytics. IEEE Computer Graphics and Applications, 35(4), pp. 64-72.

