
International Journal of Research p-ISSN: 2348-6848
Available at https://edupediapublications.org/journals e-ISSN: 2348-795X
Volume 04 Issue 02
February 2017

Real time Eye Gaze Tracking for Traffic Safety on Raspberry Pi

ABDUL RAHAM TULLA BAIG
Department of ECE
Holy Mary Institute of Technology and Science

Mr. Y. David Solomon Raju
Associate Professor, Department of ECE
Holy Mary Institute of Technology and Science

Abstract — Most automobile accidents are caused by distracted driving. Passively monitoring the driver's eyes can help in detecting the driver's state of mind and alertness, and can thus reduce the risk of accidents. The proposed system includes three main parts: 1) facial feature tracking, 2) eye gaze and 3D head pose estimation, and 3) eyes-off-the-road and fatigue detection. A video feed from a camera installed on the car dashboard tracks the driver's features in real time (25 FPS). An infrared illuminator is used at night to detect facial features clearly without distracting the driver. An image processing algorithm is developed in OpenCV to estimate where the driver is looking by combining 3D head pose estimation and eye gaze estimation. The algorithm is implemented on a Raspberry Pi board to make a compact embedded system. Different zones are defined, including the side mirrors, the rear-view mirror, the instrument board, and different zones in the windshield, a few of which are eyes-off-the-road points. An SVM is used to train on and classify the different combinations of gaze and head pose angles to determine the exact point of gaze. Based on the algorithm output, if the driver's eyes are off the road or closed due to fatigue, audio and steering vibration warnings are given to the driver accordingly.

Keywords: Eye gaze tracking, automobile safety, driver monitoring system, head pose estimation, Raspberry Pi.

I. INTRODUCTION

The main aim of an eye gaze tracking based driver monitoring system is to reduce accidents caused by distracted driving. Distraction is mentioned as the main cause in 78% of crashes and 65% of near-crashes in an NHTSA (National Highway Traffic Safety Administration) study (2013). Distraction is a major factor in more than 20% of all accidents, including fatalities and serious injuries.

Distracted drivers tend to pay less attention to the information needed for safe driving, which makes them prone to severe car accidents. NHTSA has classified driver distraction into four types: auditory, visual, biomechanical and cognitive. Reasons for distraction include driving after drinking alcohol, driving at night, driving without rest, aging, fatigue from continuous driving, long working hours, night shifts, etc. Nowadays another concern is the use of mobile phones and other electronic devices while driving. NHTSA has stated that texting, browsing and dialing cause the longest periods of drivers taking their Eyes Off the Road (EOR) and increase the risk of a crash threefold.

The proposed driver monitoring system, based on eye gaze tracking and 3D head pose estimation, can continuously monitor the driver and give an alert in case of eyes off the road (EOR), distraction or drowsiness. The system gives audio alerts and steering wheel vibration alerts depending on the situation. Reducing the number of accidents caused by distraction, and thereby improving traffic safety, is the motivation behind this project.

II. PREVIOUS WORK

A. Driver monitoring systems

Fatigue/distraction detection can be categorized into three main categories [10]: (1) approaches based on bioelectric measurements (e.g., ECG and EEG), (2) approaches based on steering motion and lane departure, and (3) approaches based on driver face monitoring (eye closure percentage, eyelid distance, blinking rate, gaze direction, head rotation, etc.).

TABLE I Comparison between Types of Driver Monitoring Systems
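The three-part system summarized in the abstract can be sketched as a simple per-frame loop. The sketch below is illustrative only: the function bodies are stubs and the on-road angle thresholds are assumptions, not values from the paper; only the 25 FPS frame rate and the 3-second closed-eye limit are taken from the text.

```python
# Illustrative per-frame loop: feature tracking -> gaze/head-pose estimation
# -> eyes-off-road / fatigue alerting. All function bodies are stubs.

FPS = 25                  # the paper reports real-time tracking at 25 FPS
CLOSED_LIMIT_S = 3.0      # eyes closed longer than ~3 s -> drowsiness alert

def track_features(frame):
    """Part 1: detect face and eyes in a frame (stub)."""
    return {"eyes_closed": frame.get("eyes_closed", False)}

def estimate_gaze(features):
    """Part 2: combined head-pose + eye-gaze direction in degrees (stub)."""
    return {"yaw": 5.0, "pitch": -2.0}

def alert_state(gaze, features, closed_frames):
    """Part 3: decide on alerts (on-road angle window is an assumption)."""
    on_road = abs(gaze["yaw"]) < 20 and abs(gaze["pitch"]) < 15
    closed_frames = closed_frames + 1 if features["eyes_closed"] else 0
    drowsy = closed_frames / FPS >= CLOSED_LIMIT_S
    return (not on_road) or drowsy, closed_frames

closed = 0
for _ in range(3):                       # three fake "frames"
    f = track_features({"eyes_closed": False})
    alert, closed = alert_state(estimate_gaze(f), f, closed)
print(alert)
```

In the real system the frame would come from the dashboard camera and the two estimation stages would be the OpenCV pipeline described in section III.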

Available online: https://edupediapublications.org/journals/index.php/IJR/ Page | 2471



Driver monitoring systems which use driver face monitoring can be classified into intrusive and non-intrusive techniques. Intrusive techniques need special attachments such as electrodes, goggles or head-mounted devices. These devices are attached to the skin and interfere with the user; being inconvenient, they are limited to laboratory testing. Systems that do not have any physical contact with the user are called remote or non-intrusive systems. These techniques are mostly based on image processing and can be passive image based or video feed based [10].

Combining another feature such as head pose estimation with eye gaze tracking can give better accuracy than eye gaze tracking alone, especially when the driver wears spectacles or sunglasses [3]. The detection of driver distraction mostly depends on the classification technique [2]. The Support Vector Machine (SVM) classifier is widely used for gaze estimation; one or more features are used to design the SVM. Real-time Hidden Markov Models (HMMs) are also used in some approaches, but SVMs are more common and accurate, with an average accuracy of 82%.

B. Eye gaze tracking methods

Eye-gaze tracking and estimation has been a hot research topic in recent years. Eye-gaze tracking methods can be categorized into two approaches.
• Appearance based model: These methods use the position of the pupils and the general shape of the eyes relative to the eye corners to find the point of gaze [8]. A pre-trained model of the appearance and shape of the eye region is fitted to a sequence of image frames.
• Feature based model: These methods use characteristics of the eye to identify a set of features like contours, eye corners, or reflections of NIR illuminators (LEDs) [7]. They can be further divided into model-based and regression-based methods. The eye-model-based techniques use a geometrical model of the eye along with NIR light sources. In the regression-based method, the vector between the pupil center and the corneal reflections is mapped and tracked geometrically with a polynomial regression function to find gaze coordinates on a virtual screen.

C. Head pose estimation methods

The methods for head orientation can be categorized into four categories [9].
• Methods based on shape features with eye position: These methods analyze the geometric arrangement of facial features to determine the head orientation, e.g. AAM (Active Appearance Model) [8].
• Methods based on shape features without eye position: These methods use simpler features such as the center of the face and the left and right borders, rather than detailed facial features such as the nose, eyes, and face contour. This method is therefore simpler [6].
• Methods based on texture features: These methods identify the driver's face in the image and analyze the intensity pattern to determine the head orientation. Many learning techniques such as KPCA, PCA, LDA, and kernel discriminant analysis (KDA) are used to extract texture features. These features are then classified to obtain the head orientation [11].
• Methods based on hybrid features: These methods combine texture and shape features to determine the head orientation. The initial head orientation is determined using texture based features, and the detailed head orientation is found by 3-D face model tracking and fitting [6].

III. SYSTEM DESCRIPTION


A. System specifications
The Raspberry Pi 2 single board computer is used as the central module of the complete embedded image capturing and processing system. The main signal processing chip used in the Raspberry Pi is a Broadcom 700 MHz chip whose CPU is a 32-bit ARM11 RISC processor. The camera used in the project is a Logitech C270 HD webcam. The Raspbian-wheezy operating system image, a version of Linux, is installed on the Raspberry Pi. It is a free operating system based on Debian, optimized specially for the Raspberry Pi hardware. The operating system is the set of basic utilities and programs that makes the Raspberry Pi run properly as a standalone computer. The operating system was installed on a class 4 8GB Samsung Micro SD card with a Micro SD adapter. The Micro SD card is preloaded with the Raspberry Pi NOOBS (New Out Of Box Software) package, and OpenCV is then installed on Raspbian-wheezy. Audio alarms can be generated through the audio output of the Raspberry Pi using either a speaker or headphones. A vibration motor is used to generate steering vibrations depending on the conditions.
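As a rough illustration of the vibration alert channel, the sketch below drives a motor pin with the RPi.GPIO library when it is available and falls back to a simulated message elsewhere. The pin number and duration are hypothetical assumptions; the paper does not specify the wiring.

```python
# Hedged sketch of triggering the steering-vibration alert via a GPIO-driven
# motor. VIBRATION_PIN is an invented wiring choice; off-device (no RPi.GPIO)
# the function simply reports a simulated vibration.

import time

VIBRATION_PIN = 18         # assumed BCM pin connected to the motor driver

try:
    import RPi.GPIO as GPIO   # present on the Raspberry Pi itself
except ImportError:
    GPIO = None               # e.g. when developing off-device

def vibrate(seconds=1.0):
    """Pulse the vibration motor for the given duration."""
    if GPIO is None:
        return f"simulated vibration for {seconds}s"
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(VIBRATION_PIN, GPIO.OUT)
    GPIO.output(VIBRATION_PIN, GPIO.HIGH)
    time.sleep(seconds)
    GPIO.output(VIBRATION_PIN, GPIO.LOW)
    return f"vibrated for {seconds}s"

print(vibrate(0.5))
```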
B. Image acquisition
The image acquisition module uses a low-cost USB webcam based on a CCD sensor. The camera is interfaced to the Raspberry Pi 2 through a USB port and is placed on the car dashboard above the steering wheel, approximately 35-40 cm away, pointing straight at the driver. Placing the camera in this way makes capturing the driver's face very easy. Operation of the camera at night is achieved using an IR illumination source to provide a clear image of the driver's face without distracting the driver. A 36-LED IR illuminator is used, fitted with an LDR for automatic on/off switching.

C. Facial feature tracking
The facial feature tracking algorithm is implemented in OpenCV on the Raspberry Pi. The Viola-Jones algorithm is used, which is a four-step algorithm: 1) Haar feature selection (eyes), 2) creating an integral image, 3) AdaBoost training on the integral image, and 4) cascading the classifiers. Eyes.xml is the library file used for eye detection, and rectangular frames are used to denote the eyes. The Haar classifier algorithm rapidly detects objects using AdaBoost classifier cascades with the help of Haar features [4].

D. Eye gaze tracking and estimation
The driver's eye gaze is constantly changing during driving depending on surrounding conditions, so detecting the eyes is not sufficient: the eyes need to be tracked in real time. The Continuously Adaptive Mean Shift (CAMSHIFT) algorithm is used for real-time eye tracking, and the pupils of the eyes are tracked.

E. Head pose estimation
Drivers tend to change their head pose while driving, so 3D head pose estimation is required to recognize in which direction the driver is looking. 3D head pose estimation is proposed by combining Active Appearance Models (AAM) and Pose from Orthography and Scaling with ITerations (POSIT). Of the three Euler angles, only the yaw and pitch angles are extracted and used, not the roll angle; yaw and pitch are sufficient to detect the head pose direction.
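The CAMSHIFT tracking in section D is built on the mean-shift step: the search window repeatedly re-centres on the centroid of the pixel weights (histogram back-projection scores) inside it. Below is a minimal pure-Python sketch of that step; the score map, window size and blob position are invented for illustration, and a real implementation would use OpenCV's CamShift on a back-projected frame.

```python
# Minimal mean-shift sketch: move a search window to the centroid of the
# weights it covers, repeating until the window stops moving.

def mean_shift_step(weights, cx, cy, half):
    """One iteration: weighted centroid of the window centred at (cx, cy)."""
    m00 = m10 = m01 = 0.0
    for y in range(max(0, cy - half), min(len(weights), cy + half + 1)):
        for x in range(max(0, cx - half), min(len(weights[0]), cx + half + 1)):
            w = weights[y][x]
            m00 += w
            m10 += w * x
            m01 += w * y
    if m00 == 0:
        return cx, cy          # empty window: stay put
    return round(m10 / m00), round(m01 / m00)

def track(weights, cx, cy, half=3, iters=10):
    """Iterate mean-shift until convergence (or iteration limit)."""
    for _ in range(iters):
        nx, ny = mean_shift_step(weights, cx, cy, half)
        if (nx, ny) == (cx, cy):
            break
        cx, cy = nx, ny
    return cx, cy

# A bright blob (standing in for a tracked pupil) near (7, 6) on a 10x10 map:
grid = [[0.0] * 10 for _ in range(10)]
grid[6][7] = 5.0
grid[5][7] = 2.0
print(track(grid, 4, 4))
```

CAMSHIFT additionally adapts the window size each iteration from the zeroth moment, which is what makes it "continuously adaptive".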

F. Eyes off the road detection and fatigue detection
Similar to the concept used in [3], [6] and [9], different zones are defined in the car as shown in Fig. 3. The zones are defined from the point of view of a driver in a left-hand-drive vehicle: 11 different gaze zones representing the dashboard, the centre console, the rear-view mirror, the two side mirrors and six zones on the

windshield. These defined zones cover most of the possible gaze directions involved in real-world driving.

TABLE II shows the different on-the-road and off-the-road points. The zone in which the driver's gaze direction lies depends on the combination of the eye gaze estimation and head pose estimation calculations. An SVM classifier is defined for the different combinations of eye and head pose directions. It is then decided whether the driver's gaze lies off the road, and alerts are generated accordingly.

A scientific definition of fatigue has not been established, but there are several known relations, among them the relation between eye movement, eye closure and fatigue. Eye movement is used to detect drowsiness, i.e. fatigue. If closed eyes are detected for longer than 3 seconds, the driver is assumed to be in a drowsy state and a wake-up alert is generated.

IV. RESULTS AND DISCUSSIONS

A. Facial feature tracking and fatigue detection
OpenCV allows the user to select a region of interest (ROI); here the region of interest is the eye region. The eyes are detected with a rectangular box, and the system is also able to detect eyes when the driver is wearing spectacles. Both open and closed eyes are detected by the system. The images shown are the rectangular extraction of the eye area from the face.

Fig. 6 shows open eyes being detected. Both the left and right eyes are detected with their x and y coordinate positions. For a straight-view face, the system also shows the correct head position.

B. Face detection and eye tracking
Face detection and eye detection can be observed in Fig. 8.
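The gaze-zone decision described in section III-F pairs head-pose angles with eye-gaze angles and classifies the combination into a zone. The paper trains an SVM for this; as a hedged stand-in, the sketch below uses a nearest-centroid rule over invented zone centroids purely to show the combination logic. All angle values and zone names are hypothetical.

```python
# Hedged stand-in for the SVM zone classifier: nearest-centroid over feature
# vectors (head_yaw, head_pitch, eye_yaw, eye_pitch), angles in degrees.
# Centroids below are invented for illustration.

import math

ZONE_CENTROIDS = {
    "windshield_centre": (0, 0, 0, 0),
    "left_mirror":       (-35, -5, -15, 0),
    "right_mirror":      (40, -5, 15, 0),
    "rear_view_mirror":  (15, 10, 10, 5),
    "instrument_board":  (0, -25, 0, -15),
}
OFF_ROAD = {"left_mirror", "right_mirror", "rear_view_mirror",
            "instrument_board"}

def classify_zone(sample):
    """Assign the sample to the zone with the nearest centroid."""
    return min(ZONE_CENTROIDS,
               key=lambda z: math.dist(sample, ZONE_CENTROIDS[z]))

def eyes_off_road(sample):
    """True when the classified zone is an eyes-off-the-road point."""
    return classify_zone(sample) in OFF_ROAD

print(classify_zone((-33, -4, -12, 1)))
print(eyes_off_road((1, -2, 0, 0)))
```

A trained SVM would replace the nearest-centroid rule with learned decision boundaries over the same four-angle feature vector.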


Eye pupil tracking is shown by blue circles and the face is shown by a yellow square.

V. CONCLUSION

The goal of the system is to detect the eye gaze direction and head pose direction in order to determine whether the eyes are off the road, and additionally to detect the driver's drowsiness and alert the driver in both conditions. The Viola-Jones algorithm is implemented in OpenCV for rapid face detection with eye extraction. Real-time eye gaze tracking is proposed with the CAMSHIFT algorithm, with which pupil tracking is achieved. Head pose estimation is proposed with AAM and POSIT. Different gaze zones are defined, and eyes off the road can be detected by combining eye gaze and head position. Fatigue detection is achieved by detecting closed eyes. If the driver's eyes are off the road or the driver is drowsy, an alert is generated. The system is robust because two methods are combined to find the gaze: if one method fails to detect properly, the other still works. The system is also robust under night or low-light conditions due to the use of IR illuminators, and it is built on a Raspberry Pi to be compact and low cost. Future work can include improving the driver monitoring system with automatic calibration and by determining vehicle states, weather conditions, vehicle speed, etc.

REFERENCES
[1] Gregory A. Maltz, "Eye gaze user interface and calibration method," US patent 20140049452, Feb 20, 2014.
[2] Y. Liao, S. E. Li, W. Wang, Y. Wang, G. Li and B. Cheng, "Detection of Driver Cognitive Distraction: A Comparison Study of Stop-Controlled Intersection and Speed-Limited Highway," in IEEE Transactions on Intelligent Transportation Systems, vol. PP, no. 99, pp. 1-10, Jan 2016.
[3] F. Vicente, Z. Huang, X. Xiong, F. De la Torre, W. Zhang and D. Levi, "Driver Gaze Tracking and Eyes Off the Road Detection System," in IEEE Transactions on Intelligent Transportation Systems, vol. 16, no. 4, pp. 2014-2027, Aug. 2015.
[4] O. Stan, L. Miclea and A. Centea, "Eye-Gaze Tracking Method Driven by Raspberry PI Applicable in Automotive Traffic Safety," in Artificial Intelligence, Modelling and Simulation (AIMS), 2014 2nd International Conference on, Madrid, pp. 126-130, Nov 2014.
[5] A. Dasgupta, A. George, S. L. Happy, and A. Routray, "A Vision-Based System for Monitoring the Loss of Attention in Automotive Drivers," IEEE Transactions on Intelligent Transportation Systems, vol. 14, no. 4, pp. 1825-1838, December 2013.
[6] X. Fu, X. Guan, E. Peli, H. Liu, and G. Luo, "Automatic Calibration Method for Driver's Head Orientation in Natural Driving Environment," IEEE Transactions on Intelligent Transportation Systems, vol. 14, no. 1, pp. 303-312, March 2013.
[7] S. Bazrafkan, A. Kar and C. Costache, "Eye Gaze for Consumer Electronics: Controlling and commanding intelligent systems," in IEEE Consumer Electronics Magazine, vol. 4, no. 4, pp. 65-71, Oct. 2015.
[8] F. Lu, T. Okabe, Y. Sugano, and Y. Sato, "Learning gaze biases with head motion for head pose-free gaze estimation," Image Vis. Comput., vol. 32, no. 3, pp. 169-179, 2014.
[9] S. J. Lee, J. Jo, H. G. Jung, K. R. Park and J. Kim, "Real-Time Gaze Estimator Based on Driver's Head Orientation for Forward Collision Warning System," in IEEE Transactions on Intelligent Transportation Systems, vol. 12, no. 1, pp. 254-267, March 2011.
[10] Mohamad S., Muhammad P., Mohsen S., Mahmood F., "A Review on Driver Face Monitoring Systems for Fatigue and Distraction Detection," International Journal of Advanced Science and Technology, vol. 64, pp. 73-100, 2014.
[11] P. M. Corcoran, F. Nanu, S. Petrescu, and P. Bigioi, "Real-time eye gaze tracking for gaming design and consumer electronics systems," IEEE Trans. Consum. Electron., vol. 58, no. 2, pp. 347-355, 2012.
[12] https://www.raspberrypi.org
[13] https://opencv.org
