
Driver Behavior Analysis for Safe Driving

A seminar report
Submitted in partial fulfilment of the requirements for the
award of degree of
MASTER OF TECHNOLOGY
in
Computer Science and Engineering
by
Mohit Kumar Jain
(2022PCP5434)
Under the supervision of:
Dr. Arka Prokash Mazumdar
Assistant Professor
Department of Computer Science and Engineering

(MNIT Jaipur logo)

Malaviya National Institute of Technology Jaipur


(December 2023)
Candidate Declaration

I, Mohit Kumar Jain, declare that the work presented in this seminar report entitled “Driver Behavior Analysis for Safe Driving” is a record of my own work carried out under the supervision of Dr. Arka Prokash Mazumdar, Assistant Professor, Department of Computer Science and Engineering, Malaviya National Institute of Technology. I confirm that:

• This work was done wholly or mainly in fulfilment of the requirements for the award of the degree of “Master of Technology” in Computer Science and Engineering at Malaviya National Institute of Technology Jaipur.

• Where I have consulted the published work of others, this is always clearly at-
tributed.

• Where I have quoted from the work of others, the source is always given.

• I have acknowledged all main sources of help.

Date: Mohit Kumar Jain


Place: MNIT, Jaipur 2022PCP5434
Certificate

This is to certify that the seminar report titled “Driver Behavior Analysis for Safe Driving” is being submitted by Mohit Kumar Jain, bearing Enrollment No. 2022PCP5434, in partial fulfilment of the requirements for the award of the degree of “Master of Technology” in Computer Science and Engineering at Malaviya National Institute of Technology Jaipur. It is a record of authentic work carried out by him under my supervision.

Date: Dr. Arka Prokash Mazumdar


Place: MNIT, Jaipur Assistant Professor
Department of Computer Science and Engineering
Malaviya National Institute of Technology, Jaipur
Acknowledgment

I would like to express my deepest appreciation to my supervisor, Dr. Arka Prokash Mazumdar, Assistant Professor, Department of Computer Science and Engineering, MNIT Jaipur, for his motivating support, guidance, and valuable suggestions.
I would like to express my heartfelt gratitude to Dr. Namita Mittal, Head of Department
of Computer Science and Engineering, MNIT Jaipur, for her encouragement, suggestions
and support.

Abstract

Driver drowsiness and distraction are two main reasons for traffic accidents and the related financial losses. Therefore, researchers have been working for more than a decade on designing driver inattention monitoring systems. As a result, several detection techniques for both drowsiness and distraction have been proposed in the literature. Some of these techniques were successfully adopted and implemented by leading car companies. This report discusses and provides a comprehensive insight into the well-established techniques for driver inattention monitoring and introduces the most recent and futuristic solutions exploiting mobile technologies such as smartphones and wearable devices. Then, a proposal is made for the integration of such systems into car-to-car communication to support the vehicular ad hoc network's (VANET's) primary aim of safe driving. We call this approach the dissemination of driver behavior via C2C communication. Throughout this report, the most remarkable studies of the last five years were examined thoroughly in order to reveal the recent driver monitoring techniques and demonstrate their basic pros and cons. In addition, the studies were categorized into two groups: driver drowsiness and driver distraction. Research on driver drowsiness was then further divided into two main subgroups based on the exploitation of either visual or non-visual features. A comprehensive compilation, including the used features, classification methods, accuracy rates, system parameters, and environmental details, is presented in tables to highlight the (dis)advantages and/or limitations of the aforementioned categories. A similar approach was also taken for the methods used for the detection of driver distraction.

Contents

List of Figures iv

1 Introduction 1

2 DRIVER FATIGUE DETECTION TECHNIQUES 2


2.1 Techniques Based on Visual Features . . . . . . . . . . . . . . . . . . . . . 3
2.2 Techniques Based on Non-Visual Features . . . . . . . . . . . . . . . . . . 5

3 Driver Distraction Detection Techniques 8

4 ENABLING MOBILE TECHNOLOGIES FOR DRIVER MONITORING 10


4.1 Smartphone Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
4.2 Wearable Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11

5 DRIVER BEHAVIOR DISSEMINATION FOR SAFE DRIVING 12


5.1 Commercial Driver Assistance Systems . . . . . . . . . . . . . . . . . . . . 12

6 FUTURE DIRECTIONS AND CONCLUSION 13

References 14

List of Figures

2.1 THE LIST OF MOST REMARKABLE AND LEADING SOLUTIONS FOR DRIVER DROWSINESS DETECTION [1] . . . . . . . . . . . . . . . 3
2.2 (a) Normal eye region, (b) binary image after thresholding the image in (a), (c) blinked eye region, (d) binary image after thresholding the image in (c) . . . . . 5
2.3 The system diagram of eye blinking frequency analysis [2] . . . . . . . . . 6
2.4 The general system diagram of mouth and yawning analysis [3] . . . . . . 7
2.5 (a) The normal state of the mouth. (b) The initial state of the yawning. (c) The mouth gets wider while yawning compared to speaking. (d) The mouth state after completing yawning. [2] . . . . . . . . . . . 7

3.1 Head images. (a) Left-front. (b) Front. (c) Up. (d) Right-front. (e) Left-side. (f) Rear. (g) Down. (h) Right-side. [2] . . . . . . . . . . . 9

4.1 The system design of smartphone solution[4] . . . . . . . . . . . . . . . . . 11

Chapter 1

Introduction

• A significant number of serious accidents occur all over the world. Many of them are caused by human mistakes, such as typing a text message, speaking on the phone, or eating while driving.

• Drowsiness, sleepiness or distraction could also result in critical accidents.

• In order to prevent vehicle accidents, researchers aim at developing systems that monitor drivers and measure their level of distraction. These kinds of systems are broadly called Advanced Driver Assistance Systems (ADAS), Driver Inattention Monitoring Systems, and Driver Alert Control Systems.

• Systems designed for the analysis and detection of drowsiness can be broadly divided into two categories: visual-feature-based and non-visual-feature-based.

• Visual-feature-based techniques focus on extracting facial features such as the face, eyes and mouth.

• Techniques based on non-visual features can be divided into two categories: driver
physiological analysis and vehicle parameter analysis.

Chapter 2

DRIVER FATIGUE DETECTION TECHNIQUES

Fatigue is a term used to describe the general overall feeling of tiredness and a lack of energy [5]. It is also referred to as drowsiness, exhaustion, lethargy, and listlessness, and it describes a physical and/or mental state of being tired and weak. Although physical and mental fatigue are different from each other, the two often exist together: if a person is physically exhausted for long enough, he/she will also be mentally tired.
It is important to understand that fatigue is not a disease and can be overcome by taking a rest or sleeping. However, fatigue may cause serious accidents, especially while a person is driving a car, a bus, a railroad train or another vehicle that requires constant attention [2].

• Facial Expressions: A drowsy person can be detected from his/her facial expressions, such as yawning, happiness, and anger. A drowsy person shows fewer facial expressions and exhibits more frequent yawning.

• Head Movements: A drowsy person can be detected from his/her head movements. A drowsy person exhibits certain characteristic head movements, such as head nodding.

• Gaze (Pupil) Movements: A drowsy person can be detected from his/her gaze (pupil) movements. It is observable that a drowsy person has a narrower gaze region than when he/she is alert. A drowsy person also makes fewer saccadic movements than when alert.

• Eyelid Movements: A drowsy person can be detected from his/her eyelid movements. It is observable that a drowsy person blinks distinctly more slowly than when alert. Besides, a drowsy person closes his/her eyes for a longer time than when alert. Put simply, a drowsy person has a longer eye closure duration than an alert person.

Figure 2.1: THE LIST OF MOST REMARKABLE AND LEADING SOLUTIONS FOR DRIVER DROWSINESS DETECTION [1]

Several studies have been thoroughly examined; their proposed methods, the features used for classification along with the respective classifiers, their limitations, (dis)advantages and accuracy, as well as their experimental setups, are summarized in Fig. 2.1.

2.1 Techniques Based on Visual Features


The eye state, eye blinking frequency, mouth state and yawning frequency of a driver are the key factors for detecting drowsiness [6]. Eye closure duration is an important parameter for detecting driver drowsiness. Systems that use this technique usually monitor the eye states and the position of the iris over a specific time period to estimate the eye blinking frequency and the eye closure duration.
PERCLOS (Percentage of Eye Closure) is a reliable and valid metric to determine the alertness level of the driver [7]. Eye state analysis mostly exploits the PERCLOS value as the drowsiness metric; it measures the percentage of time within a minute during which the eyes are at least 80 percent closed. If the driver is tired, the eye closure duration, and hence the PERCLOS value, will increase [1]. (A minimal sketch of this computation is given after the list below.) The author [4] mainly classified the computer-vision-based techniques for detecting driver drowsiness as follows:

1. Eye State Analysis:

• The systems applying this technique focus on the states of the eyes [7]. The system warns the driver by generating an alarm if the driver closes his/her eyes for a particular duration, as shown in Fig. 2.2.

• Some systems based on this technique use a database where both closed and
open templates of eye are stored.

• There are some limitations, such as lighting conditions and sunglasses that
affect the accuracy of the template matching technique.

2. Eye Blinking Analysis:

• Whenever a driver is tired or feels sleepy, his/her eye blinking frequency changes and the eyelid closure duration involuntarily starts to prolong.

• To be more specific, when the driver is alert, his/her eye blinking frequency is low and his/her eyelid closure duration is short.

• When the driver is exhausted, his/her eye blinking frequency gets higher (more closed-eye images are captured) and his/her eyelid closure duration becomes longer.

• The system diagram of eye blinking analysis is given in Fig. 2.3.

3. Mouth and Yawning Analysis

• In this approach detecting drowsiness involves two main phases to analyze the
changes in facial expressions properly that imply drowsiness.

• First, the driver’s face is detected by using cascade classifiers and tracked in
the series of frame shots taken by the camera.

Figure 2.2: (a) Normal eye region, (b) binary image after thresholding the image in (a), (c) blinked eye region, (d) binary image after thresholding the image in (c).

• After locating the driver’s face, the next step is to detect and track the location
of the mouth.

• For mouth detection, the researchers reuse the face detection algorithm; the process is shown in Fig. 2.4.

• Afterwards, yawning is analyzed to determine the level of drowsiness.

• While yawning, the mouth opens wide and the distance between its contours gets larger, as shown in Fig. 2.5.
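As an illustration of how the eye-state outputs described above can be turned into PERCLOS and blink-frequency measures, the following minimal Python sketch assumes an upstream eye-state module that delivers one eye-openness value per frame; the frame rate, the "closed" threshold and the alarm level are illustrative choices rather than values taken from the surveyed systems.

from collections import deque

class DrowsinessMetrics:
    """Sliding-window PERCLOS and blink-frequency estimator.

    Assumes an upstream eye-state module supplies one eye-openness value
    per frame in the range [0, 1] (1.0 = fully open). All thresholds are
    illustrative assumptions, not parameters from the surveyed systems.
    """

    def __init__(self, fps=30, window_seconds=60, closed_threshold=0.2):
        self.window = deque(maxlen=fps * window_seconds)  # roughly one minute of frames
        self.closed_threshold = closed_threshold  # openness < 0.2 means "at least 80 percent closed"
        self._prev_closed = False
        self.blink_count = 0

    def update(self, eye_openness):
        closed = eye_openness < self.closed_threshold
        self.window.append(closed)
        if closed and not self._prev_closed:
            self.blink_count += 1  # count a blink on each open -> closed transition
        self._prev_closed = closed

    def perclos(self):
        # Fraction of frames in the window during which the eyes are closed.
        if not self.window:
            return 0.0
        return sum(self.window) / len(self.window)

    def is_drowsy(self, perclos_limit=0.15):
        # 0.15 is an illustrative alarm level only.
        return self.perclos() > perclos_limit

# Example with synthetic openness values (one value per frame):
metrics = DrowsinessMetrics()
for openness in [1.0, 0.9, 0.1, 0.05, 0.8, 1.0]:
    metrics.update(openness)
print(metrics.perclos(), metrics.blink_count, metrics.is_drowsy())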

2.2 Techniques Based on Non-Visual Features


Driver-based features usually refer to the brain activity and heart rate of a driver, whereas vehicle-based features include the pressure exerted on the brake, the fluctuation of the vehicle speed, the angle of the wheels, the steering wheel movement, etc. In this section, these techniques are briefly discussed in terms of detecting driver fatigue.

Figure 2.3: The system diagram of eye blinking frequency analysis[2]

• Non-visual features for driver behavior analysis can be categorized into driver-
based and vehicle-based features. Driver-based features include brain activity, heart
rate, and physiological indexes like EEG, ECG, and EOG. Vehicle-based features
include brake pressure, vehicle speed fluctuation, wheel angle, and steering wheel
movement.

• Physiological indexes like EEG, ECG, and EOG can be used to detect driver drowsiness and fatigue. Signs of fatigue can either be observed visually through eye and mouth activity and head nodding, or registered digitally via physiological monitoring devices.

• Non-visual methods for fatigue detection, such as EEG measurement, show promise
in terms of reliability and feasibility.

• Vehicle-based features like brake pressure and wheel angle can also provide reliable
information for detecting driver fatigue[1].
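
To illustrate how a vehicle-based feature such as steering wheel movement could be turned into a simple fatigue indicator, the following sketch tracks the short-term variability of the steering angle; the window size and the alarm threshold are hypothetical values chosen for illustration, not parameters reported in the cited studies.

import statistics
from collections import deque

class SteeringFatigueMonitor:
    """Rough vehicle-based fatigue indicator from steering-wheel movement.

    Fatigued drivers tend to make fewer small corrections followed by
    large, abrupt ones, which increases the short-term variability of the
    steering angle. Window size and threshold are illustrative only.
    """

    def __init__(self, window_size=200, variability_limit=4.0):
        self.angles = deque(maxlen=window_size)  # recent steering angles in degrees
        self.variability_limit = variability_limit

    def add_sample(self, angle_deg):
        self.angles.append(angle_deg)

    def variability(self):
        if len(self.angles) < 2:
            return 0.0
        return statistics.stdev(self.angles)

    def fatigue_suspected(self):
        return self.variability() > self.variability_limit

In practice, such an indicator would be fused with other vehicle-based cues, such as brake pressure and lane-keeping behavior, rather than used on its own.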

Figure 2.4: The general system diagram of mouth and yawning analysis.[3]

Figure 2.5: (a) The normal state of the mouth. (b) The initial state of the yawning. (c) The mouth gets wider while yawning compared to speaking. (d) The mouth state after completing yawning. [2]

Chapter 3

Driver Distraction Detection Techniques

Distraction is anything that diverts the driver's attention from the primary tasks of navigating the vehicle and responding to critical events, such as the presence of obstacles or other road users. To put it another way, distraction refers to anything that takes your eyes off the road (visual distraction), your mind off the road (cognitive distraction), or your hands off the wheel (manual distraction) [6]. Driver distraction is one of the leading causes of vehicle crashes throughout the world.
A common way of detecting driver distraction or inattention is to monitor the driver's head pose and gaze direction, which provide quite reliable information about distraction. The head pose and gaze direction of the driver can be measured by properly applying computer vision techniques. Different articles and studies were examined with respect to several key factors, including accuracy, advantages, disadvantages, limitations, methods, classifiers and system implementation.
The most common methods for the detection of head pose and gaze direction are shape-based, appearance-based and hybrid methods. According to the comparison of these methods in [8], the shape-based methods use a prior model of the eye shape and its surrounding structures, whereas the appearance-based methods rely on models built directly on the appearance of the eye region. The appearance-based approach is conceptually related to template matching: an image patch model is constructed and eye detection is performed by matching the model against the image using a similarity measure.
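
The following minimal sketch shows what such an appearance-based eye detector could look like when implemented with OpenCV template matching; the grayscale frame, the stored eye template and the acceptance threshold are assumptions made for illustration, not details taken from the surveyed systems.

import cv2

def locate_eye(frame_gray, eye_template, match_threshold=0.7):
    """Appearance-based eye localisation via normalised template matching.

    frame_gray and eye_template are single-channel images; the 0.7
    acceptance threshold is an illustrative choice.
    """
    result = cv2.matchTemplate(frame_gray, eye_template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < match_threshold:
        return None  # no region similar enough to the eye template
    height, width = eye_template.shape[:2]
    x, y = max_loc
    return (x, y, width, height)  # bounding box of the best match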

Figure 3.1: Head images. (a) Left-front. (b) Front. (c) Up. (d) Right-front. (e) Left-side. (f) Rear. (g) Down. (h) Right-side. [2]

Chapter 4

ENABLING MOBILE
TECHNOLOGIES FOR DRIVER
MONITORING

Systems that detect driver drowsiness and sleepiness mostly come embedded into car systems, which restricts their reach due to the associated high costs. On the other hand, considering today's mobile technologies, porting the available driver inattention monitoring systems onto mobile platforms would speed up their proliferation. Thus, people would start to exploit these non-intrusive, affordable systems. We believe that all types of vehicles, even old ones, could have their own driver safety solutions via mobile equipment, such as smartphones and wearable devices. The widespread usage of these low-cost and lightweight mobile devices, equipped with high-speed processors, high-quality cameras and several sensors, makes them a new alternative for safe driving.

4.1 Smartphone Solutions


A mobile application for Android smartphones was developed that detects both driver drowsiness and sleepiness by applying image-processing techniques to video frames obtained via the front camera and alerts the driver.
The system suffers from the same limitations as the in-vehicle visual techniques, such as sunglasses and rapidly changing lighting conditions.
A new driver safety app, CarSafe, was introduced, which aims at detecting and warning drivers against dangerous driving conditions and behaviors (Fig. 4.1).

Figure 4.1: The system design of smartphone solution[4]


This application exploits computer vision and machine learning algorithms in order to monitor and detect whether the driver is tired or distracted using the front-facing camera, while at the same time tracking road conditions using the rear-facing camera.
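
A schematic sketch of such a dual-camera monitoring loop is given below. It is not the actual CarSafe implementation; the frame sources, classifiers and alert callback are hypothetical placeholders used only to show how the driver-facing and road-facing analysis channels could be combined.

import time

def run_monitoring_loop(get_driver_frame, get_road_frame,
                        driver_classifier, road_classifier, alert):
    """Schematic dual-camera monitoring loop (illustrative, not CarSafe code).

    get_driver_frame / get_road_frame return the latest frames from the
    front- and rear-facing cameras; the classifiers return labels such as
    'drowsy', 'distracted', 'tailgating' or 'ok'. All callables here are
    hypothetical placeholders.
    """
    while True:
        driver_state = driver_classifier(get_driver_frame())
        road_state = road_classifier(get_road_frame())
        if driver_state in ("drowsy", "distracted"):
            alert("Driver appears " + driver_state)
        if road_state != "ok":
            alert("Road hazard: " + road_state)
        time.sleep(0.1)  # roughly ten analysis cycles per second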

4.2 Wearable Solutions


One of the most popular wearable devices is Google Glass. Google Glass has four sensors that could be used for activity recognition: a camera, a microphone, an inertial measurement unit (IMU), and an infrared proximity sensor facing towards the user's eye that can be used for blink detection.
The authors presented how the information about eye blinking frequency and head motion patterns gathered from the Google Glass sensors can be used to categorize and analyze different types of high-level user activities. Taking Google Glass as a prototype, other companies have developed smart glasses that effectively monitor a user's level of fatigue or drowsiness and then act accordingly to ensure safety and protection. Sensors thus play an important role in gathering the data on which these smart glasses rely.
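
As a rough illustration, the sketch below counts blinks in an eye-facing proximity-sensor signal by detecting upward threshold crossings; the adaptive threshold is an assumption made for illustration and is not the method actually used by Google Glass.

def count_blinks(proximity_samples, threshold=None):
    """Count blinks in an eye-facing proximity-sensor signal.

    A blink briefly raises the measured proximity value as the eyelid
    sweeps past the sensor, so counting upward threshold crossings gives
    an approximate blink count. The adaptive threshold is illustrative.
    """
    if not proximity_samples:
        return 0
    if threshold is None:
        baseline = sum(proximity_samples) / len(proximity_samples)
        spread = max(proximity_samples) - baseline
        threshold = baseline + 0.5 * spread  # halfway between baseline and peak
    blinks = 0
    above = False
    for value in proximity_samples:
        if value > threshold and not above:
            blinks += 1  # rising edge: one eyelid sweep detected
        above = value > threshold
    return blinks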

Chapter 5

DRIVER BEHAVIOR
DISSEMINATION FOR SAFE
DRIVING

The available studies mostly focus on in-car solutions. However, vehicular communication enables cars to disseminate data collected from the car and the driver. Thus, we believe that safe driving solutions will evolve in the near future, and more sophisticated systems will emerge to warn drivers against other potentially threatening drivers.

5.1 Commercial Driver Assistance Systems


• Driver behavior dissemination refers to the sharing of data collected from cars and drivers to enhance safe driving practices.

• Vehicular communication enables cars to exchange information about potentially threatening drivers, allowing for early warnings and proactive measures.

• The integration of driver behavior analysis systems into car-to-car communication can support safe driving by alerting drivers to potential risks from other drivers.

• The use of sensors and communication technologies can enable real-time monitoring and dissemination of driver behavior data to prevent accidents.

• More sophisticated systems will evolve in the future, leveraging car-to-car communication to enhance safe driving practices.
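
To make the idea of driver behavior dissemination concrete, the following minimal sketch broadcasts a periodic driver-state beacon over plain UDP; a real deployment would use a VANET/DSRC or C-V2X stack, and the message fields shown here are illustrative rather than any standardized format.

import json
import socket
import time

def broadcast_driver_state(vehicle_id, driver_state, position, port=50000):
    """Broadcast a driver-behaviour beacon to nearby vehicles.

    Plain UDP broadcast stands in for a real vehicular communication
    stack; the field names are illustrative, not a standardized message.
    """
    beacon = {
        "vehicle_id": vehicle_id,
        "driver_state": driver_state,   # e.g. "alert", "drowsy", "distracted"
        "position": position,           # (latitude, longitude)
        "timestamp": time.time(),
    }
    payload = json.dumps(beacon).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(payload, ("255.255.255.255", port))

# Example: warn nearby cars that this driver appears drowsy.
# broadcast_driver_state("car-42", "drowsy", (26.91, 75.78))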

Chapter 6

FUTURE DIRECTIONS AND CONCLUSION

• Several detection techniques for driver drowsiness and distraction have been proposed in the literature, and some of these techniques have been successfully adopted by leading car companies.

• The research categorizes studies on driver drowsiness and distraction, examining the features, classification methods, accuracy rates, system parameters, and environmental details of these studies.

• Driver behavior analysis systems can be integrated into car-to-car communication to support safe driving in vehicular ad hoc networks.

• The conclusion suggests that combining multiple techniques in hybrid solutions can enhance the accuracy and confidence of driver fatigue and distraction detection systems (a minimal score-fusion sketch is given after this list).

• Future systems should include both in-car and inter-car alerting mechanisms to warn inattentive drivers.
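
As a rough illustration of such a hybrid solution, the sketch below fuses the scores of several detectors into a single inattention estimate; the detector names, equal default weights and alarm level are illustrative assumptions, not values from the surveyed systems.

def fuse_inattention_scores(scores, weights=None, alarm_level=0.6):
    """Combine per-technique scores into one inattention estimate.

    `scores` maps detector names (e.g. 'perclos', 'yawning', 'steering',
    'head_pose') to values in [0, 1]. Returns the fused score and whether
    it exceeds the (illustrative) alarm level.
    """
    if not scores:
        return 0.0, False
    if weights is None:
        weights = {name: 1.0 for name in scores}  # equal weighting by default
    total_weight = sum(weights.get(name, 0.0) for name in scores)
    if total_weight == 0:
        return 0.0, False
    fused = sum(value * weights.get(name, 0.0) for name, value in scores.items()) / total_weight
    return fused, fused >= alarm_level

# Example: fuse_inattention_scores({"perclos": 0.7, "steering": 0.4, "head_pose": 0.5})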

References

[1] S. R. Langton, H. Honeyman, and E. Tessler, "The influence of head contour and nose angle on the perception of eye-gaze direction," Perception & Psychophysics, vol. 66, pp. 752–771, 2004.

[2] S. Kaplan, M. A. Guvensan, A. G. Yavuz, and Y. Karalurt, "Driver behavior analysis for safe driving: A survey," IEEE Transactions on Intelligent Transportation Systems, vol. 16, no. 6, pp. 3017–3032, 2015.

[3] A. B. Albu, B. Widsten, T. Wang, J. Lan, and J. Mah, "A computer vision-based system for real-time detection of sleep onset in fatigued drivers," in 2008 IEEE Intelligent Vehicles Symposium, pp. 25–30, IEEE, 2008.

[4] C.-W. You, N. D. Lane, F. Chen, R. Wang, Z. Chen, T. J. Bao, M. Montes-de Oca, Y. Cheng, M. Lin, L. Torresani, et al., "CarSafe app: Alerting drowsy and distracted drivers using dual cameras on smartphones," in Proceedings of the 11th Annual International Conference on Mobile Systems, Applications, and Services, pp. 13–26, 2013.

[5] L. M. Bergasa, J. Nuevo, M. A. Sotelo, R. Barea, and M. E. Lopez, "Real-time system for monitoring driver vigilance," IEEE Transactions on Intelligent Transportation Systems, vol. 7, no. 1, pp. 63–77, 2006.

[6] J. Krajewski, D. Sommer, U. Trutschel, D. Edwards, and M. Golz, "Steering wheel behavior based estimation of fatigue," in Driving Assessment Conference, vol. 5, University of Iowa, 2009.

[7] C. Sun, J. H. Li, Y. Song, and L. Jin, "Real-time driver fatigue detection based on eye state recognition," Applied Mechanics and Materials, vol. 457, pp. 944–952, 2014.

[8] A. Williamson and T. Chamberlain, "Review of on-road driver fatigue monitoring devices," NSW Injury Risk Management Research Centre, University of New South Wales, 2005.
