Real-Time Eye Gaze Tracking for Traffic Safety on Raspberry Pi
Keywords: Eye gaze tracking, automobile safety, driver monitoring system, head pose estimation, Raspberry Pi.

I. INTRODUCTION

The main aim of an eye gaze tracking based driver monitoring system is to reduce accidents caused by distracted driving. Distraction is cited as the main cause in 78% of crashes and 65% of near-crashes in the NHTSA (National Highway Traffic Safety Administration) study (2013). Distraction is a major factor in more than 20% of all accidents, including those involving fatalities and serious injuries. Distracted drivers pay less attention to the information needed for safe driving, which makes crashes more likely.

II. PREVIOUS WORK

A. Driver monitoring systems

Fatigue/distraction detection approaches fall into three main categories [10]: (1) approaches based on bioelectric measurements (e.g., ECG and EEG), (2) approaches based on steering motion and lane departure, and (3) approaches based on driver face monitoring (percentage of eye closure, eyelid distance, blinking rate, gaze direction, head rotation, etc.).

TABLE I Comparison between Types of Driver Monitoring Systems
A. System specifications

A Raspberry Pi 2 single-board computer is used as the central module of the complete embedded image capturing and processing system. The main signal processing chip on the Raspberry Pi is a Broadcom 700 MHz chip whose CPU is a 32-bit ARM11 RISC processor. The camera used in the project is a Logitech C270 HD webcam. The Raspbian-wheezy operating system image, a version of Linux, is installed on the Raspberry Pi. It is a free operating system based on Debian, optimized specifically for the Raspberry Pi hardware, and it provides the set of basic utilities and programs that let the Raspberry Pi run properly as a standalone computer. The operating system was installed on a class 4 8 GB Samsung micro SD card with a micro SD adapter, preloaded with the Raspberry Pi NOOBS (New Out Of Box Software) package. OpenCV is then installed on Raspbian-wheezy from the micro SD card. Audio alarms can be generated through the audio output of the Raspberry Pi using either a speaker or headphones, and a vibration motor is used to generate steering-wheel vibrations depending on the detected conditions.
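As an illustration of how these alerts could be driven from Python on the Raspberry Pi, a minimal sketch is given below. The BCM pin number, the alarm WAV file, and the use of RPi.GPIO together with aplay are assumptions for the sketch, not details specified in this paper.

# Minimal alert sketch (assumptions: vibration motor driver on BCM pin 18,
# an "alarm.wav" file present, RPi.GPIO and aplay available on Raspbian).
import subprocess
import time

import RPi.GPIO as GPIO

VIBRATION_PIN = 18          # hypothetical BCM pin driving the motor transistor
ALARM_WAV = "alarm.wav"     # hypothetical alarm sound file


def setup():
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(VIBRATION_PIN, GPIO.OUT, initial=GPIO.LOW)


def alert(duration_s=1.0):
    """Play an audio alarm and pulse the steering-wheel vibration motor."""
    subprocess.Popen(["aplay", "-q", ALARM_WAV])   # non-blocking audio alarm
    GPIO.output(VIBRATION_PIN, GPIO.HIGH)          # start vibration
    time.sleep(duration_s)
    GPIO.output(VIBRATION_PIN, GPIO.LOW)           # stop vibration


if __name__ == "__main__":
    setup()
    try:
        alert()
    finally:
        GPIO.cleanup()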
B. Image acquisition

The image acquisition module uses a low-cost USB web camera based on a CCD mechanism, interfaced to the Raspberry Pi 2 through a USB port. The camera is placed on the car dashboard above the steering wheel, approximately 35-40 cm away and pointing straight at the driver, which makes capturing the driver's face easy. Night-time operation is achieved with an IR illumination source that provides a clear image of the driver's face without distracting the driver. A 36-LED IR illuminator fitted with an LDR for automatic on/off switching is used.
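A minimal OpenCV capture loop of the kind this module implies is sketched below; the camera index (0) and the 640x480 resolution are assumptions for the sketch.

# Frame-capture sketch for the USB webcam (assumption: the Logitech C270
# enumerates as device index 0 and 640x480 is an acceptable frame size).
import cv2

cap = cv2.VideoCapture(0)                      # webcam on the USB port
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

while True:
    ok, frame = cap.read()                     # grab one BGR frame
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # detection runs on grayscale
    # ... face/eye detection and tracking would operate on `gray`/`frame` here ...
    cv2.imshow("driver camera", frame)         # preview (requires a display)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()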
C. Facial feature tracking

The facial feature tracking algorithm is implemented in OpenCV on the Raspberry Pi. The Viola-Jones algorithm is used, which consists of four steps: 1) Haar feature selection (eyes), 2) creation of an integral image, 3) AdaBoost training on the integral image, and 4) cascading of the classifiers. Eyes.xml is the library file used for eye detection, and rectangular frames are drawn around the detected eyes. The Haar classifier algorithm rapidly detects objects using AdaBoost classifier cascades built from Haar features [4].

D. Eye gaze tracking and estimation

The driver's eye gaze changes constantly during driving depending on the surrounding conditions, so detecting the eyes once is not sufficient; the eyes need to be tracked in real time. The Continuously Adaptive Mean Shift (CAMSHIFT) algorithm is used for real-time eye tracking, with the pupils of the eyes as the tracked targets. A combined detection-and-tracking sketch is given after subsection E.

E. Head pose estimation

Drivers tend to change their head pose while driving, so 3D head pose estimation is required to recognize in which direction the driver is looking. 3D head pose estimation is proposed by combining Active Appearance Models (AAM) and Pose from Orthography and Scaling with ITerations (POSIT). Of the three Euler angles, only yaw and pitch are extracted and used; the roll angle is not. Yaw and pitch are sufficient to determine the head pose direction.
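As a concrete illustration of subsections C and D, the sketch below first detects an eye with a Haar cascade and then tracks it with CAMSHIFT. The stock haarcascade_eye.xml shipped with OpenCV is used as a stand-in for the Eyes.xml file mentioned above, and a hue back-projection is assumed as the CAMSHIFT probability image, since those details are not specified here.

# Detection-then-tracking sketch (assumptions: OpenCV's bundled eye cascade
# stands in for Eyes.xml; hue back-projection drives CAMSHIFT).
import cv2

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")   # stand-in for Eyes.xml


def detect_eye(gray):
    """Viola-Jones eye detection; returns the first eye rectangle or None."""
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return tuple(eyes[0]) if len(eyes) else None


def make_histogram(frame, window):
    """Hue histogram of the detected eye region, used for back-projection."""
    x, y, w, h = window
    hsv = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [32], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist


cap = cv2.VideoCapture(0)
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
window, hist = None, None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if window is None:
        window = detect_eye(gray)               # (re-)detect when not tracking
        if window is not None:
            hist = make_histogram(frame, window)
    else:
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
        track_box, window = cv2.CamShift(backproj, window, term_crit)  # track eye
        if window[2] == 0 or window[3] == 0:    # target lost: fall back to detection
            window = None
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()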
windshield. These defined zones cover most of the possible gaze directions involved in real-world driving. TABLE II shows the different on-the-road and off-the-road points. The zone in which the driver's gaze direction lies is determined by combining the eye gaze estimation and head pose estimation calculations: an SVM classifier is defined over the different combinations of eye and head pose directions, it is then decided whether the driver's gaze lies off the road, and alerts are generated accordingly. A sketch of this combination step is given below.

area from face. Fig. 6 shows that open eyes are detected; both the left and right eyes are detected with their x and y coordinate positions. For the straight-view face, the correct head position is also reported.
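As a sketch of the combination step just described, the example below treats the head-pose yaw/pitch angles and a normalized pupil offset as SVM features. scikit-learn is used as a stand-in for the (unspecified) SVM implementation, and the calibration samples and zone labels are purely illustrative assumptions.

# Gaze-zone decision sketch (assumptions: yaw/pitch from the head-pose module
# and a normalized pupil offset are the features; scikit-learn SVC stands in
# for the SVM implementation; training samples are illustrative only).
import numpy as np
from sklearn import svm

# Illustrative calibration samples: [yaw_deg, pitch_deg, pupil_dx, pupil_dy]
X_train = np.array([
    [  0.0,   0.0,  0.00,  0.00],   # looking straight at the road
    [  5.0,  -2.0,  0.05, -0.02],   # windshield, slightly right
    [-35.0,   0.0, -0.30,  0.00],   # left mirror
    [ 40.0,   5.0,  0.35,  0.05],   # right mirror
    [ 20.0, -30.0,  0.10, -0.40],   # instrument cluster / phone
])
y_train = ["on_road", "on_road", "off_road", "off_road", "off_road"]

classifier = svm.SVC(kernel="rbf", gamma="scale")
classifier.fit(X_train, y_train)


def gaze_zone(yaw, pitch, pupil_dx, pupil_dy):
    """Combine head pose and eye gaze features into an on/off-road decision."""
    return classifier.predict([[yaw, pitch, pupil_dx, pupil_dy]])[0]


if __name__ == "__main__":
    # e.g. head turned far left while the pupils also shift left -> off the road
    print(gaze_zone(-30.0, 2.0, -0.25, 0.01))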
Eye pupil tracking is shown by blue circles and the face is shown by a yellow square.

V. CONCLUSION

The goal of the system is to detect the eye gaze direction and head pose direction in order to determine whether the driver's eyes are off the road. A further goal is to detect driver drowsiness and to alert the driver in both conditions. The Viola-Jones algorithm is implemented in OpenCV for rapid face detection with eye extraction. Real-time eye gaze tracking is proposed with the CAMSHIFT algorithm, through which pupil tracking is achieved. Head pose estimation is proposed with AAM and POSIT. Different gaze zones are defined, and eyes off the road can be detected by combining eye gaze and head position. Fatigue detection can be achieved by detecting closed eyes. If the driver's eyes are off the road or if the driver is drowsy, an alert is generated. The system is robust because two methods are combined to find the gaze: if one method fails to detect properly, the other still works. The system is also robust under night or low-light conditions due to the use of IR illuminators and build on

REFERENCES

[1] G. A. Maltz, "Eye gaze user interface and calibration method," US Patent 20140049452, Feb. 20, 2014.
[2] Y. Liao, S. E. Li, W. Wang, Y. Wang, G. Li, and B. Cheng, "Detection of driver cognitive distraction: A comparison study of stop-controlled intersection and speed-limited highway," IEEE Transactions on Intelligent Transportation Systems, vol. PP, no. 99, pp. 1-10, Jan. 2016.
[3] F. Vicente, Z. Huang, X. Xiong, F. De la Torre, W. Zhang, and D. Levi, "Driver gaze tracking and eyes off the road detection system," IEEE Transactions on Intelligent Transportation Systems, vol. 16, no. 4, pp. 2014-2027, Aug. 2015.
[4] O. Stan, L. Miclea, and A. Centea, "Eye-gaze tracking method driven by Raspberry Pi applicable in automotive traffic safety," in Proc. 2014 2nd International Conference on Artificial Intelligence, Modelling and Simulation (AIMS), Madrid, Nov. 2014, pp. 126-130.
[5] A. Dasgupta, A. George, S. L. Happy, and A. Routray, "A vision-based system for monitoring the loss of attention in automotive drivers," IEEE Transactions on Intelligent Transportation Systems, vol. 14, no. 4, pp. 1825-1838, Dec. 2013.
[6] X. Fu, X. Guan, E. Peli, H. Liu, and G. Luo, "Automatic calibration method for driver's head orientation in natural driving environment," IEEE Transactions on Intelligent Transportation Systems, vol. 14, no. 1, pp. 303-312, Mar. 2013.
[7] S. Bazrafkan, A. Kar, and C. Costache, "Eye gaze for consumer electronics: Controlling and commanding intelligent systems," IEEE Consumer Electronics Magazine, vol. 4, no. 4, pp. 65-71, Oct. 2015.
[8] F. Lu, T. Okabe, Y. Sugano, and Y. Sato, "Learning gaze biases with head motion for head pose-free gaze estimation," Image and Vision Computing, vol. 32, no. 3, pp. 169-179, 2014.
[9] S. J. Lee, J. Jo, H. G. Jung, K. R. Park, and J. Kim, "Real-time gaze estimator based on driver's head orientation for forward collision warning system," IEEE Transactions on Intelligent Transportation Systems, vol. 12, no. 1, pp. 254-267, Mar. 2011.
[10] Mohamad S., Muhammad P., Mohsen S., and Mahmood F., "A review on driver face monitoring systems for fatigue and distraction detection," International Journal of Advanced Science and Technology, vol. 64, pp. 73-100, 2014.
[11] P. M. Corcoran, F. Nanu, S. Petrescu, and P. Bigioi, "Real-time eye gaze tracking for gaming design and consumer electronics systems," IEEE Transactions on Consumer Electronics, vol. 58, no. 2, pp. 347-355, 2012.
[12] https://www.raspberrypi.org
[13] http://opencv.org