Advanced Autonomous Vehicle with Emergency Braking Algorithm Based on Multi-Sensor Fusion and Super Twisting Speed Controller
Abstract: The automobile revolution and growth in the number of cars have produced several issues, and vehicle accidents remain one of the most serious road-related problems. Human error and a failure to brake quickly are the main causes of accidents, and distracted driving can have serious consequences. To address these issues, an autonomous emergency braking system (AEBS) was developed. To support such an AEBS, scalable, reliable, secure, fault-tolerant, and interoperable technologies are required. An advanced emergency braking system (EBS) with sensor fusion is proposed in this paper that can autonomously identify a probable forward collision and activate the vehicle braking system to brake the vehicle to avoid or mitigate a collision. Additionally, it provides a non-linear speed controller that facilitates the AEBS in applying the brakes in an emergency. Sensor fusion using lidar, radar, and vision sensors makes the AEBS more efficient and more reliable in detecting vehicles or obstacles and reduces the chance of collision to a minimum level. A MATLAB/Simulink environment was used for the simulation experiments, and the results demonstrated the stable operation of the AEBS in avoiding forward collisions even in the event of an error in the measurement of any one sensor while a vehicle is detected. The presented work establishes that the EBS sensor fusion unit is a highly reliable solution for detecting the leading vehicle at the proper time and that the AEBS controller can apply the brakes upon forward obstacle detection.

Keywords: advanced emergency braking; autonomous emergency braking; sensor fusion; super twisting controller; fault tolerance
1. Introduction
A key advantage of the AEBS is that it helps the driver avoid car accidents and reduces the severity of those that are unavoidable. However, there are some disadvantages of conventional autonomous emergency braking (AEB) to consider; one of them is the possibility of an error by a sensor or the system. Figure 1 provides an overview of the AEBS process.
An advanced emergency braking system (EBS) with sensor fusion is proposed in this
paper that can autonomously identify a probable forward collision and activate the vehi‐
cle forward collision warning (FCW) and braking system to brake the vehicle to avoid or
mitigate a collision. If the system detects a risk of collision with vehicles or pedestrians at
the front of the car, the driver will be notified by visual and aural alarms, and light, auto‐
matic brakes will be applied. This is to obtain the driver’s attention and get them to act
quickly to prevent collisions. If the driver fails to decelerate and the likelihood of an acci‐
dent increases, the system will automatically deploy emergency braking just before the
crash. This will aid in avoiding or reducing the collision's damage. If the driver applies the brakes but not forcefully enough to avoid a crash, the system can also enhance the braking force. Vehicles are detected by all AEBSs, and many of them can also identify pedestrians and bicycles.
The nomenclature contains a list of abbreviations and symbols.
The AEBS can automatically apply the brakes to prevent a potential collision. An automated braking system can also communicate with a car's GPS and use its database of traffic signs and other data to apply the brakes quickly if the driver does not [7].
The vehicle's electronic control unit (ECU) gives the AEBS access to additional data about the vehicle. By factoring in the vehicle's speed and measuring the distance to the object detected in front of it, the AEBS can evaluate whether the current speed has the potential to result in a collision. When an impending collision is identified, the AEBS examines the braking system. The AEBS will not step in if the driver has already applied the brakes and is slowing down sufficiently to avoid the crash. However, if the driver has not applied the brakes, or has not done so with enough force given the approaching obstacle or vehicle, the AEBS takes control and applies the brakes [8].
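This decision rule can be sketched numerically. The following MATLAB fragment is only an illustration of the reasoning described above, not the paper's controller; the v²/(2d) stopping rule, the braking limits, and all numeric values are assumptions.

```matlab
% Hypothetical check of whether the AEBS must intervene, given the ego
% speed, the measured gap, and the driver's current braking effort.
egoSpeed    = 15.0;   % m/s, ego vehicle speed from the ECU (assumed)
gap         = 18.0;   % m, distance to the detected object (assumed)
driverDecel = 2.0;    % m/s^2, deceleration already applied by the driver
maxBrake    = 9.8;    % m/s^2, assumed full-braking capability

% Constant deceleration needed to stop exactly within the gap: v^2/(2*d)
requiredDecel = egoSpeed^2 / (2 * gap);

if driverDecel >= requiredDecel
    fprintf('Driver braking is sufficient; the AEBS does not step in.\n');
elseif requiredDecel <= maxBrake
    fprintf('AEBS intervenes: commanding %.1f m/s^2 of braking.\n', requiredDecel);
else
    fprintf('Full avoidance impossible; AEBS applies maximum braking.\n');
end
```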
To create a more thorough and accurate environmental model, sensor fusion com‐
bines the data from all of these different types of sensors, or a highly variable collection of
sensor modalities. By using a technique called internal and external sensor fusion, it may
also correlate with the information obtained from the environment [11]. The information
from numerous sensors of the same sort, such as radars, might also be combined by a
vehicle via sensor fusion. Taking advantage of slightly overlapping fields of view enhances detection: when more than one radar scans the area around a vehicle, more than one sensor will pick up objects simultaneously [12].
The detection likelihood and reliability of things nearby the vehicle can be increased
by fusing or overlapping the detections from those various sensors when interpreted by
global 360° perception software, which also produces a more accurate and trustworthy
picture of the environment. The most fundamental distinction between centralized and decentralized sensor fusion lies in the data being used: the raw sensor data, the features extracted from that data, or the judgments made by utilizing the extracted features and other information. Depending on how it is used, sensor fusion may have the following advantages (a minimal numerical illustration follows the list):
- Improved data dependability
- Improved data quality
- Estimation of unmeasured states
- Expanded coverage
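As a minimal numerical illustration of the first two benefits, the sketch below fuses two overlapping range measurements by inverse-variance weighting; the sensor readings and variances are hypothetical, and this generic textbook estimator is not the fusion algorithm used later in the paper.

```matlab
% Inverse-variance fusion of two overlapping range measurements of the
% same object. All readings and variances are hypothetical.
rRadar = 25.4;  varRadar = 0.50;   % radar range (m) and variance (m^2)
rLidar = 25.1;  varLidar = 0.10;   % lidar range (m) and variance (m^2)

wRadar = 1/varRadar;               % weight = inverse variance
wLidar = 1/varLidar;
rFused   = (wRadar*rRadar + wLidar*rLidar) / (wRadar + wLidar);
varFused = 1/(wRadar + wLidar);    % smaller than either input variance

fprintf('Fused range %.2f m, variance %.3f m^2 (radar %.2f, lidar %.2f)\n', ...
        rFused, varFused, varRadar, varLidar);
```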
Radar- or vision-based systems were frequently employed in previously built systems, and the implementation of an AEBS can be conducted in MATLAB and Simulink. The suggested technique uses sensor data fusion from radar, vision, and lidar. The sensor models provided by MATLAB and Simulink form the fundamental basis used to track automobiles, people, or any obstacle in good or bad weather. Sensor fusion of the three advanced sensors is proposed and incorporated into the MATLAB model of the emergency braking procedure. In contrast to earlier linear systems, our advanced system also features a non-linear super twisting speed controller that supports the EBS.
This paper aims to demonstrate stable AEBS operation that avoids or minimizes forward collisions and prevents accidents whenever a vehicle is identified. Section 2 discusses the existing AEBS and fusion methods that have been applied in this field. The research methodology is covered in Section 3, the simulation is discussed in Section 4, Section 5 presents the results and discussion, and Sections 6 and 7 provide a comparison with the existing research and the conclusion, respectively.
2. Literature Review
The article [13] presents a recognition model for the intention of the front vehicle's driver, based on a backpropagation (BP) neural network and a hidden Markov model. In this proposed system, the inputs are the brake pedal, accelerator pedal, and vehicle speed data, which are used to examine the driver's intention. An AEB model for the
following vehicle is suggested that may dynamically adjust the critical braking distance
under varied driving circumstances to avoid rear‐end collisions, according to the recog‐
nized driver’s intention provided through the Internet of Vehicles. In [14], based on model
predictive control (MPC) theory, a variable time headway autonomous emergency brak‐
ing (AEB) control method is suggested. This kind of interference is addressed by the var‐
iable time headway‐based safety distance model. The AEB controller is designed with col‐
lision avoidance, driver characteristics, and ride comfort variables in mind. The variables
used to build the state equation are the separation between the two vehicles, their relative
velocities, and the ego vehicle’s velocities and accelerations.
The authors in [15] describe the new moving object identification and tracking sys‐
tem, which builds on and improves the previous system developed for the DARPA Urban
Challenge in 2007. The authors updated the previous motion and observation models to
include active sensors and vision sensors. The vision module in the new system recognizes
bikers, pedestrians, and cars and generates a vision target for them. This visual recogni‐
tion information is used by the proposed system to improve the tracking prediction, data
association, and motion categorization of the previous system.
The purpose of [16] was to investigate the design and operation of an AEBS using the
many fundamentals of mechanical and electronic engineering, commonly referred to as
Mechatronics. In this system, an ultrasonic sensor combined with a stereo camera identifies an impediment in front of the car and provides the relative distance between the object and the vehicle. The ECU then determines whether or not an accident is likely to occur, and the brake is deployed autonomously as a result of this approach. Time-to-collision (TTC), presented in [17], is one of the most extensively used time-based approaches, and it was designed to evaluate the time it will take for an accident to occur between a preceding and a following vehicle. In Figure 2, a summary of
the TTC algorithm is presented.
The TTC was used to calculate the remaining time for two vehicles to collide [18].
The TTC algorithm is calculated as follows:

$$\mathrm{TTC} = \frac{h - L}{V_F - V_P} \qquad (1)$$

In Equation (1), h stands for the distance between the preceding and following vehicles, $V_F$ and $V_P$ stand for the speeds of the following and preceding vehicles, respectively, and L stands for the length of the preceding vehicle.
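A short worked example of Equation (1), with hypothetical values:

```matlab
% Worked example of Equation (1); all values are hypothetical.
h  = 30.0;   % m, distance between the preceding and following vehicles
L  = 4.5;    % m, length of the preceding vehicle
VF = 20.0;   % m/s, speed of the following vehicle
VP = 15.0;   % m/s, speed of the preceding vehicle

TTC = (h - L) / (VF - VP);          % Equation (1)
fprintf('TTC = %.2f s\n', TTC);     % prints 5.10 s for these values
```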
To address the shortcomings of the previous method, the researchers proposed new
stopping distance‐based algorithms (SDAs) [19]. Calculating a safe stopping distance is
one of the most effective ways to keep an eye on the possibility of a rear‐end accident.
SDA-based techniques for assessing rear-end collision risk are based on the assumption that, in a car-following situation, the leading car's friction coefficient must be higher than the pursuing car's. An overview of the SDA is shown below in Figure 3 [20].
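A minimal sketch of a stopping-distance check in the spirit of the SDA is given below; the exact algorithm of [19,20] is not reproduced here, and the reaction time, decelerations, and gap are assumed values.

```matlab
% Stopping-distance check in the spirit of the SDA; values are assumed.
vF = 25.0;  vL = 20.0;   % m/s, follower and leader speeds
tReact = 1.2;            % s, driver perception-reaction time
aF = 6.0;   aL = 7.0;    % m/s^2, decelerations (leader brakes harder)
gap = 35.0;              % m, current spacing between the vehicles

dStopF = vF*tReact + vF^2/(2*aF);   % thinking distance + braking distance
dStopL = vL^2/(2*aL);               % leader assumed to brake without delay

if dStopF > gap + dStopL
    fprintf('Rear-end collision risk: warn the driver or brake.\n');
else
    fprintf('Spacing is safe under the SDA criterion.\n');
end
```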
When stopping at a given speed, longer reaction times increase the thinking distance; however, collisions can occur in this algorithm with shorter reaction times and faster decision times [21]. Because the prior parametric deterministic approaches have the drawback that they do not represent the influence of the perception-reaction time (PRT), numerous researchers have sought to construct collision-warning systems (CWSs) based on non-parametric techniques. Artificial neural network (ANN)-based technology is one of the most important new technologies for rear-end CWSs, and its advantage is that it can deal with complex, unobservable problems. A vehicle control strategy for multi-directional collision avoidance was initially offered in [22,23], using a multi-layer perceptron neural network and fuzzy logic algorithms, respectively. More recently, a method integrating fuzzy logic and neural networks was used to create an algorithm that operates a car on the highway based on highly accurate GPS data [24].
The authors in [25] developed an algorithm for making decisions for AEB pedestrians
using radar and camera sensors for the data fusion technique. The EuroNCAP protocol’s
potential collision avoidance scenarios were examined and a reliable pedestrian tracking
method was suggested. By creating the system activation zone from the relative speed and the distance needed to stop for pedestrians, as well as by utilizing a brake model to anticipate the collision avoidance time, the performance of the AEB system was improved.
In [26], a MATLAB‐created AEBS module, based on two radar sensors, was suggested.
The proposed AEBS depends on a short‐range and long‐range radar, the time between the
detection of the obstacle in front of the autonomous vehicle, and the speed of the autono‐
mous vehicle.
In [27], an autonomous vehicle collision avoidance and pedestrian detection system
was proposed based on stereo vision. It monitors the area using two cameras that are
placed at a set distance apart. When a pedestrian is detected, the system estimates the
stopping distance, and the suggested controller algorithm will initiate braking if the cal‐
culated distance is less than the safe driving distance. Using an infrared camera that can
detect heat and lidar sensors in RC cars for obstacle detection, ref. [28] proposed a new
obstacle recognition method for accurately identifying front cars and pedestrians and re‐
ducing the danger of vehicle collisions in bad weather. By using RC cars as testbeds for
autonomous vehicles, the author showed that the suggested technique is feasible by inte‐
grating lidars and a thermal infrared camera on vehicles.
In [29], a new method to determine the distance (Hurdle Detection) was proposed
for a secure environment within a moving vehicle. Eight ultrasonic sensors are employed
in this system to detect various object types. By putting a potential increase in the vehicle's safety system into practice, the car and sensor function normally until the sensor detects a potential risk. The literature review suggests that the existing
system employed one or two (mostly radar and vision) types of sensors for data fusion
with various EBS controller algorithms. Existing obstacle identification fusion technolo‐
gies have resulted in major accidents because they cannot effectively identify vehicles or
pedestrians at night and in bad weather. False positives of any sensor are also drawbacks
of the previously existing AEB systems that may cause disruptions or accidents.
In this paper, our contribution is to introduce a dependable multi‐sensor fusion ar‐
chitecture and a reliable decision‐making algorithm for the AEB controller to perform au‐
tonomous emergency braking and protect pedestrians. The proposed multi‐sensor fusion
architecture has three different types of sensors: radar, lidar, and vision sensors. The TTC
and stopping time calculation approaches are used to implement the AEB controller and
FCW algorithm. A non-linear speed controller that supports the AEB is included in place of the existing linear controllers. Additionally, a suitable collision decision and prediction algorithm is created and carefully investigated for the EuroNCAP AEB pedestrian scenarios. This system also managed the trade-off between AEB performance and false
positives by setting the threshold for AEB activation and cautiously preventing false pos‐
itives to achieve an accurate performance of the system.
3. Research Methodology
An advanced active safety system to protect vehicles from collisions is the AEBS. It
is made to assist drivers in preventing or lessening crashes with other drivers on the road.
The AEBS was implemented in the Simulink model of MATLAB. Radar- and vision-based systems were widely used in previously implemented systems, but the proposed system fuses lidar, radar, and vision sensors. The sensor models provided by MATLAB and Simulink were used as the basic form to track vehicles, pedestrians, or any hurdle in both good and bad weather. For the emergency braking process, sensor fusion of the three advanced sensors was proposed and incorporated into the model provided by MATLAB [30,31].
At the start, three sensors (radar, vision, and lidar) detect and track objects near the
ego vehicle. When an ego vehicle approaches a leading vehicle, the sensor measures the
ego vehicle’s distance from the leading vehicle, calculates the speeds of both vehicles, cal‐
culates the time to collision, activates the forward collision alert, and then calculates the
stopping distance. When the lead vehicle’s TTC is less than the TFCW, the FCW alert is
activated. Due to distractions, the driver may fail to engage the brakes; in this situation,
the AEBS operates autonomously to avoid or lessen the impact. All the processes are
shown below in the flow chart and block diagram in Figures 4 and 5.
Our proposed system is divided into two parts: sensor fusion and AEB control. First,
the AEB algorithm’s control system is divided into two primary subsystems, the AEB con‐
trol subsystem, and the speed control subsystem, both of which are illustrated in this sec‐
tion. The AEB control subsystem is responsible for conducting vehicle braking, while the
speed control subsystem is in charge of accelerating the vehicle. This study only examines
the scenario in which the leading and ego cars both move in the same lane.
(Figure: the staged AEB logic along the time-to-collision (TTC) axis — the forward collision warning (FCW) is issued at TFCW, first- and second-stage partial braking are triggered at TPB1 and TPB2, and full braking at TFB.)
The following circumstances, which are indicated in Figure 8 and given in Equations (8)–(11), cause the ego vehicle's state to change:

$$0 > \mathrm{TTC} \ \text{and} \ T_{FCW} > |\mathrm{TTC}| \qquad (8)$$
$$0 > \mathrm{TTC} \ \text{and} \ T_{PB1} > |\mathrm{TTC}| \qquad (9)$$
$$0 > \mathrm{TTC} \ \text{and} \ T_{PB2} > |\mathrm{TTC}| \qquad (10)$$
$$0 > \mathrm{TTC} \ \text{and} \ T_{FB} > |\mathrm{TTC}| \qquad (11)$$
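The four conditions can be collected into a small state-selection function. The sketch below (saved as aebState.m) follows the reconstructed Equations (8)–(11); the threshold ordering T_FB < T_PB2 < T_PB1 < T_FCW is an assumption consistent with the staged logic.

```matlab
function state = aebState(TTC, T_FCW, T_PB1, T_PB2, T_FB)
% Reconstructed conditions (8)-(11): TTC is negative while the gap is
% closing, and each stage activates when |TTC| falls below its
% stopping-time threshold (assumed ordering T_FB < T_PB2 < T_PB1 < T_FCW).
state = 0;                      % 0 = normal driving
if TTC < 0
    if abs(TTC) < T_FB
        state = 4;              % full braking, Eq. (11)
    elseif abs(TTC) < T_PB2
        state = 3;              % second-stage partial braking, Eq. (10)
    elseif abs(TTC) < T_PB1
        state = 2;              % first-stage partial braking, Eq. (9)
    elseif abs(TTC) < T_FCW
        state = 1;              % forward collision warning, Eq. (8)
    end
end
end
```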
The ego vehicle accelerates to attain the defined velocity after the driving scenario
has been started, and if the defined velocity is reached, it maintains the velocity. The car
in front of the ego vehicle is detected by the radar, lidar, and vision sensors, which extract
data. As soon as requirement 1A is satisfied, the FCW is turned on. Until condition 2A is
satisfied, the ego vehicle remains in State 1 in motion. Starting at this point, the ego vehicle
applies cascaded brakes. Similar to this, the ego car is in Stage 3 or 4 when condition 3A
or 4A has been satisfied.
The ego vehicle begins to decelerate as soon as it enters the braking stage and contin‐
ues to do so until its speed is less than 0.1 m/s. There are two PID controllers in this research, as indicated below. During the acceleration phase (State 0), one is in charge of regulating the ego vehicle's speed. The other manages the ego vehicle's deceleration during the braking stages (States 2–4). The second PID controller is the part of the AEB controller which controls the different stages of braking, as shown below in Figure 9. This controller was implemented in a simulation using a signal flow graph, where the controller decides which deceleration rate to apply for the desired velocity and braking condition.
is available and is only present during the reaching phase, whereas the first component is characterized by the discontinuity of its derivative with respect to time. The control algorithm is defined by the following control law [32]:
(Figure: block diagram of the super twisting PI controller — the sliding variable σ feeds a ±1 relay and gain K with integrator paths that produce the control signal u.)
$$u(t) = u_1(t) + u_2(t) \qquad (12)$$

$$\dot{u}_1(t) = \begin{cases} -u, & \text{if } |u| > 1 \\ -W\,\mathrm{sign}(\sigma), & \text{if } |u| \le 1 \end{cases} \qquad (13)$$

$$u_2(t) = \begin{cases} -\lambda\,|\sigma_0|^{\rho}\,\mathrm{sign}(\sigma), & \text{if } |\sigma| > \sigma_0 \\ -\lambda\,|\sigma|^{\rho}\,\mathrm{sign}(\sigma), & \text{if } |\sigma| \le \sigma_0 \end{cases} \qquad (14)$$
where u is the bounded control value, σ₀ is the boundary layer surrounding the sliding surface σ, and W, λ, and ρ are control gains. In our controller, u is the output that controls the acceleration, and the inputs of the controller are σ and the error, or distance, between the sliding surface and the current value. The proposed speed controller, based on the ST-SMC, accelerates the vehicle to its desired velocity in normal conditions, and when braking is applied, the speed controller decelerates the vehicle using the throttle to support the EBS.
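The control law of Equations (12)–(14) can be exercised as a speed controller in a few lines. In the sketch below, the sliding variable is taken as the speed error, and the gains W, λ, ρ, the boundary layer σ₀, and the unit-mass plant are all assumptions.

```matlab
% Super twisting speed controller per Equations (12)-(14); the sliding
% variable is the speed error. Gains and the plant are assumptions.
dt = 0.001;  N = 5000;
W = 4;  lambda = 3;  rho = 0.5;  sigma0 = 1;   % assumed controller gains
vRef = 8.3;  v = 0;  u1 = 0;  u = 0;

for k = 1:N
    sigma = v - vRef;                    % sliding variable (speed error)
    if abs(u) > 1
        u1 = u1 - u*dt;                  % Eq. (13), |u| > 1 branch
    else
        u1 = u1 - W*sign(sigma)*dt;      % Eq. (13), |u| <= 1 branch
    end
    if abs(sigma) > sigma0
        u2 = -lambda*abs(sigma0)^rho*sign(sigma);   % Eq. (14), outside layer
    else
        u2 = -lambda*abs(sigma)^rho*sign(sigma);    % Eq. (14), inside layer
    end
    u = u1 + u2;                         % Eq. (12), total control
    v = v + u*dt;                        % unit-mass speed dynamics
end
fprintf('Speed after %.0f s: %.2f m/s (target %.1f m/s)\n', N*dt, v, vRef);
```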
(Figure: radar detections, camera images, and lidar laser points each pass through their own detection stage to produce object lists; together with vehicle odometry passed through a moving-object filter, these feed the fusion, classification, and object-state tracking stages that output a fused representation of moving objects.)
Figure 11. The system with multiple sensors for perception [33].
The relative distance and relative velocity of the item are the details needed in this
study for data fusion. Equations (15)–(18) explain the measured data z(k) at time step k:

$$z(k) = \left[ z_v(k), z_r(k), z_l(k) \right] \qquad (15)$$
$$z_v(k) = \{v_1, v_2, \dots, v_p\}, \quad v_i = [x \; y \; \dot{x}]^T, \quad i = 1, \dots, p \qquad (16)$$
$$z_r(k) = \{r_1, r_2, \dots, r_q\}, \quad r_i = [x \; y \; \dot{x}]^T, \quad i = 1, \dots, q \qquad (17)$$
$$z_l(k) = \{l_1, l_2, \dots, l_s\}, \quad l_i = [x \; y \; \dot{x}]^T, \quad i = 1, \dots, s \qquad (18)$$
$$\begin{bmatrix} x \\ y \\ \dot{x} \\ \dot{y} \end{bmatrix} = \begin{bmatrix} x \\ x\tan\theta \\ \dot{x} \\ \dot{y} \end{bmatrix} \qquad (20)$$
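The sketch below assembles one detection per sensor into z(k) as in Equation (15) and recovers the lateral offset from a range–azimuth pair as in the reconstructed Equation (20); all detection values, the [x y ẋ] detection format, and the radar geometry are hypothetical.

```matlab
% Assemble z(k) from one detection per sensor (Eq. (15)) and recover the
% lateral offset from range-azimuth geometry (Eq. (20)). All values and
% the [x y xdot] detection format are assumptions from the text.
vDet = [25.0 0.4 -5.0];          % vision detection: [x y xdot]
rDet = [25.3 0.5 -5.2];          % radar detection:  [x y xdot]
lDet = [25.1 0.4 -5.1];          % lidar detection:  [x y xdot]
zk = {vDet, rDet, lDet};         % z(k) = [z_v(k), z_r(k), z_l(k)]

x = 25.3;                        % m, longitudinal distance (assumed)
theta = deg2rad(1.1);            % rad, azimuth of the detection (assumed)
y = x*tan(theta);                % lateral position, as in Eq. (20)
fprintf('Recovered lateral offset: %.2f m\n', y);
```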
4. Simulation
The Simulink diagram of the suggested model is shown in Figure 13. The sensors and actuators, roads, lidars, cameras, and radar sensors utilized in the simulation were all specified in this subsystem. The raw data were pre-processed to extract objects in the lidar
tracking algorithm subsystem model. The AEBS with a sensor fusion block had a compu‐
tation system that included vehicle detection systems from the camera, radar, and lidar
sensors, and it specified the longitudinal and lateral evaluation logic that gave the control
system acceleration controls, information about the ego vehicle reference path, infor‐
mation about the most important object, and the steering angle. The ego vehicle was modeled by the vehicle dynamics subsystem using a bicycle model, and its state was updated using commands from the AEBS controller model.
Figure 13. The advanced emergency braking system with sensor fusion.
A scenario was created and it included one leading vehicle and an ego vehicle. A
straight road was implemented with a standard width, a custom figure window to plot
the scenario, and the stop time for the scenario was set to the required seconds. The cars
were set up in a driving scenario, the all‐set driving scenario object was imported into the
Driving Scenario Designer App, and the results were exported. In the proposed system, a
dashboard display in the vehicle similar to that in Figure 14 was present. The purpose of the dashboard is to warn drivers when the FCW state activates and to inform them about the braking status of the vehicle during first-stage, second-stage, and full braking.
The track fuser block receives data from the cuboid lidar, rectangular lidar, rectangular radar, and rectangular vision detections and outputs fused tracks. The data concatenation block creates a single track bus by combining the detections from all the sources. The fuser source configuration for the radar, lidar, and vision is set using the Source-Config variable through the Pre-Load-Function callback. Track fusion is represented in this graph at a
single time step. The fact that the fused tracks are more accurate and precise than the
individual sensor detections shows that the fusing of the detection estimates from all three
sensors improves track accuracy. The fused detections are utilized for additional computations.
In this scenario, all the sensors are healthy; no error is found in any sensor. The stopping
time is the period from when an ego vehicle first decelerates by using its brakes until it
comes to a complete stop. The stopping time is calculated mathematically using Equation
(2). When the FCW system alerts a driver that a collision with the lead vehicle is imminent,
they are required to react to the alarm and to apply the brake during the delay period.
Equation (3) gives the total distance that the ego vehicle must travel before colliding with
the lead vehicle.
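Equations (2) and (3) are not reproduced in this excerpt; the sketch below assumes their standard kinematic forms (stopping time v/a under constant deceleration, and delay distance plus braking distance for the total travel), with hypothetical parameter values.

```matlab
% Assumed standard kinematic forms for stopping time and total distance;
% Equations (2) and (3) themselves are not reproduced in this excerpt.
v      = 8.33;   % m/s, ego speed when the FCW alert is raised
tDelay = 1.2;    % s, assumed driver reaction delay after the alert
aBrake = 3.8;    % m/s^2, assumed braking deceleration

tStop  = v/aBrake;                    % time from brake onset to standstill
dTotal = v*tDelay + v^2/(2*aBrake);   % distance covered before standstill
fprintf('Stops %.1f s after braking, covering %.1f m in total\n', tStop, dTotal);
```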
Figure 15. Sensor-measured detections and detections calculated with sensor fusion.
The TTC of the leading vehicle must be smaller than the TFCW for the FCW alert to
be activated. The AEBS reacts automatically to prevent or decrease the impact of the col‐
lision when a driver fails to apply the brakes. AEBSs typically use a step‐by‐step braking
technique that alternates between partial braking in multiple stages and full braking. The
following Figure 16 describes the autonomous emergency braking logic utilized by the
AEBS controller to initiate the FCW and show the AEBS status.
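The staged logic can be exercised in a small closed-loop sketch that reuses the aebState function from the earlier sketch; the scenario, per-stage decelerations, and thresholds are all assumed values, not the calibrated ones used in the simulations below.

```matlab
% Closed-loop sketch of the cascaded braking: deceleration escalates as
% |TTC| crosses successive thresholds. Reuses aebState.m from the sketch
% above; the scenario, decelerations, and thresholds are all assumed.
dt = 0.01;  v = 8.33;  vLead = 0;  gap = 40;           % hypothetical scenario
decel = [0 0 3.8 5.3 9.8];                             % m/s^2 for states 0..4
T_FCW = 2.0;  T_PB1 = 1.6;  T_PB2 = 1.0;  T_FB = 0.6;  % s, assumed thresholds

for t = 0:dt:10
    TTC   = -gap/max(v - vLead, eps);                  % negative while closing
    state = aebState(TTC, T_FCW, T_PB1, T_PB2, T_FB);
    v     = max(v - decel(state + 1)*dt, 0);           % apply stage deceleration
    gap   = gap - (v - vLead)*dt;                      % update spacing
    if v < 0.1, break; end                             % ego vehicle has stopped
end
fprintf('Ego stopped at t = %.2f s with %.1f m to spare\n', t, gap);
```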
Figure 16. Status of AEB, FCW, ego car stop, and collision at the normal scenario.
Figures 16 and 17 show how the vehicle’s speed increased and reached its maximum
speed of 8.3 m/s in one second, as well as how the FCW and PB influenced the vehicle’s
speed. The leading car was identified as soon as the system started, and the ego vehicle
used the sensor fusion algorithm to identify it. The first FCW state became true at 1.2 s, at which point the FCW was immediately triggered. As the driver did not apply the brake during the warning period, at 2.2 s State 2 of the first PB became true, the ego vehicle started to slow down, the first stage of partial braking was engaged, and at 4.7 s the vehicle came to a complete stop. The speed of the
ego vehicle is depicted here. The proposed braking system has different reaction times in
different scenarios, relative velocities, and relative distances. In the given example sce‐
nario, our ego vehicle velocity was 8.33 m/s, the brake was applied at 2.2 s, and the vehicle
stopped at 4.7 s. The reaction time of the first partial braking was almost 2.5 s when the vehicle speed was 8.33 m/s.
Figures 18 and 19 show how the vehicle’s speed increased until it reached its top
speed of 12.33 m/s in 1.8 s. The ego vehicle used the sensor fusion technique to identify
the leading vehicle once it was recognized in front after achieving its top speed. At 1.8 s, when the leading vehicle was detected, the first FCW state became true and the FCW was immediately triggered. Next, in State 2, the first PB became true, the ego vehicle started to slow down, and the first stage of partial braking was engaged. At 2 s, State 3 and the second stage of partial braking occurred, and finally, at 2.1 s, State 4 and full braking were applied. The ego vehicle came to a complete stop at 3.4 s. In this case, the AEB completely avoided the rear-end collision. When the ego vehicle velocity was 12.33 m/s, the brake was applied at 1.8 s and the vehicle stopped at 3.4 s; the reaction time was almost 1.6 s. Figure 19 shows the change in vehicle speed as the brakes
were applied at various stages.
Figure 18. Status of AEBS, FCW, ego car stop, and collision at high speed.
Figure 19. The velocity of the vehicle when various stages of brakes are applied.
Figure 20 shows the time‐to‐collision behavior and stop times for the FCW, first and
second stage partial braking, and full braking time when the speed of the vehicle changed
with time.
Figure 20. The behavior of time‐to‐collision and the stopping times for FCW, first stage partial, sec‐
ond stage partial, and full brake.
Figure 22. Results of earlier sensor fusion technique when using two sensors.
Table 2. Detailed properties of radar, vision, lidar, the proposed fusion, and the previous fusion.

| Property | Radar | Vision | Lidar | Proposed Fusion | Previous Fusion |
|---|---|---|---|---|---|
| Weather Conditions | Strength | Capability | Weakness | Strength | Capability |
| Lighting Conditions | Strength | Strength | Weakness | Strength | Capability |
| Dirt | Strength | Capability | Weakness | Strength | Capability |
| Velocity | Strength | Capability | Capability | Strength | Strength |
| Distance Accuracy | Capability | Strength | Capability | Strength | Capability |
| Distance Range | Strength | Capability | Capability | Strength | Strength |
| Data Density | Weakness | Capability | Strength | Strength | Strength |
| Classification | Weakness | Capability | Strength | Strength | Capability |
| Packaging | Strength | Weakness | Capability | Strength | Strength |
7. Conclusions
To achieve autonomous emergency braking and protect pedestrians, a decision‐mak‐
ing algorithm and a dependable sensor fusion architecture were proposed in this paper.
The proposed model had three types of sensors: radar, lidar, and vision sensors. The im‐
plementation of a simple trajectory prediction and a data‐association technique was per‐
formed with an emphasis on the effective and trustworthy tracking of various target
kinds. To find pedestrians that were either obscured or not yet spotted, a provisional track based on a momentary assessment was used. The suggested multi-sensor fusion system provided stable and dependable track management. Furthermore, the
EuroNCAP AEB pedestrian scenarios were thoroughly examined, and suitable collision
decision and prediction algorithms were provided. The failure or false positives of any
sensor may cause disturbances or accidents; therefore, to obtain an accurate performance
of the system, the proposed system managed the trade‐off between AEB performance and
false positives by setting the threshold for AEB activation and cautiously preventing false
positives. This study demonstrated that the suggested AEBS based on sensor fusion is a very reliable option for emergency braking in autonomous vehicles, since it prevents the system from failing in the case of a false-positive detection by any one sensor.
To obtain more robust and reliable detection from sensors, advanced fault‐tolerant
approaches to the sensor fusion part of the system may be used in the future. To increase
the system’s accuracy and efficiency, the processing delays caused by the environment
may also be considered.
Nomenclature
Abbreviation Description
AEBS Autonomous Emergency Braking System
EBS Emergency Braking System
AEB Autonomous Emergency Braking
FCW Forward Collision Warning
References
1. Road Traffic Injuries. Available online: https://www.who.int/news‐room/fact‐sheets/detail/road‐traffic‐injuries (accessed on 16 July 2022).
2. Pollard, J.K. Evaluation of the Vehicle Radar Safety Systems’ Rashid Radar Safety Brake Collision Warning System; U.S. Department of
Transportation, National Highway Traffic Safety Administration, Office of Crash Avoidance: Washington, DC, USA, 1988.
3. Ucińska, M.; Pełka, M. The effectiveness of the AEB system in the context of the safety of vulnerable road users. Open Eng. 2021,
11, 977–993. https://doi.org/10.1515/eng‐2021‐0097.
4. Isaksson‐Hellman, I.; Lindman, M. The Effect of a Low‐Speed Automatic Brake System Estimated From Real Life Data. Ann. Adv. Automot. Med. 2012, 56, 10.
5. Shahbaz, M.H.; Amin, A.A. Design of Active Fault Tolerant Control System for Air Fuel Ratio Control of Internal Combustion
Engines Using Artificial Neural Networks. IEEE Access 2021, 9, 46022–46032. https://doi.org/10.1109/ACCESS.2021.3068164.
6. Ivanov, M.; Kristalniy, S.R.; Popov, N.V.; Toporkov, M.A.; Isakova, M.I. New testing methods of automatic emergency braking
systems and the experience of their application. IOP Conf. Ser. Mater. Sci. Eng. 2018, 386, 012019. https://doi.org/10.1088/1757‐
899X/386/1/012019.
7. Yeong, D.J.; Velasco‐Hernandez, G.; Barry, J.; Walsh, J. Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Re‐
view. Sensors 2021, 21, 2140. https://doi.org/10.3390/s21062140.
8. Amin, A.; Mahmood‐ul‐Hasan, K. Hybrid fault tolerant control for air–fuel ratio control of internal combustion gasoline engine
using Kalman filters with advanced redundancy. Meas. Control. 2019, 52, 473–492. https://doi.org/10.1177/0020294019842593.
9. What Is Sensor Fusion? Available online: https://www.aptiv.com/en/insights/article/what‐is‐sensor‐fusion (accessed on 16 July
2022).
10. Amin, A.; Mahmood‐Ul‐Hasan, K. Advanced Fault Tolerant Air‐Fuel Ratio Control of Internal Combustion Gas Engine for
Sensor and Actuator Faults. J. Mag. 2019, 7, 17634–17643. https://doi.org/10.1109/ACCESS.2019.2894796.
11. Alatise, M.B.; Hancke, G.P. A Review on Challenges of Autonomous Mobile Robot and Sensor Fusion Methods. IEEE Access
2020, 8, 39830–39846, https://doi.org/10.1109/ACCESS.2020.2975643.
12. Zhang, R.; Li, K.; He, Z.; Wang, H.; You, F. Advanced Emergency Braking Control Based on a Nonlinear Model Predictive
Algorithm for Intelligent Vehicles. Appl. Sci. 2017, 7, 504. https://doi.org/10.3390/app7050504.
13. Yang, W.; Liu, J.; Zhou, K.; Zhang, Z.; Qu, X. An Automatic Emergency Braking Model considering Driver’s Intention Recogni‐
tion of the Front Vehicle. J. Adv. Transp. 2020, 2020, 5172305. https://doi.org/10.1155/2020/5172305.
14. Guo, L.; Ge, P.; Sun, D. Variable Time Headway Autonomous Emergency Braking Control Algorithm Based on Model Predic‐
tive Control. In Proceedings of the 2020 Chinese Automation Congress (CAC), Shanghai, China, 7–8 November 2020; pp. 1794–
1798. https://doi.org/10.1109/CAC51589.2020.9327238.
15. Cho, H.; Seo, Y.‐W.; Kumar, B.V.K.V.; Rajkumar, R.R. A multi‐sensor fusion system for moving object detection and tracking in
urban driving environments. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA),
Hong Kong, China, 31 May–5 June 2014; pp. 1836–1843. https://doi.org/10.1109/ICRA.2014.6907100.
16. Khan, M.S.; Rao, A.K.; Choudhary, N.; Sharma, J.K.; Tejeshwar; Jha, N.; Mishra, G. Electromagnetic System using Ultrasonic
Sensor. Int. J. Civ. Mech. Energy Sci. 2021, 7, 1–4. https://doi.org/10.22161/ijcmes.74.1.
17. Sharizli; Rahizar, R.; Karim, M.R.; Saifizul, A.A. New Method for Distance‐based Close Following Safety Indicator. Traffic Inj.
Prev. 2015, 16, 190–195. https://doi.org/10.1080/15389588.2014.921913.
18. Kusano, K.D.; Gabler, H. Method for Estimating Time to Collision at Braking in Real‐World, Lead Vehicle Stopped Rear‐End
Crashes for Use in Pre‐Crash System Design. SAE Int. J. Passeng. Cars Mech. Syst. 2011, 4, 435–443. https://doi.org/10.4271/2011‐
01‐0576.
19. Amin, A.; Mahmood‐ul‐Hasan, K. Robust active fault‐tolerant control for internal combustion gas engine for air–fuel ratio con‐
trol with statistical regression‐based observer model. Meas. Control. 2019, 52, 1179–1194.
https://doi.org/10.1177/0020294018823031.
20. Flanagan, S.K.; Tang, Z.; He, J.; Yusoff, I. Investigating and Modeling of Cooperative Vehicle‐to‐Vehicle Safety Stopping Dis‐
tance. Future Internet 2021, 13, 68. https://doi.org/10.3390/fi13030068.
21. Amin, A.; Hasan, K.M. A review of Fault Tolerant Control Systems: Advancements and applications. Measurement 2019, 143, 58–68. https://doi.org/10.1016/j.measurement.2019.04.083.
22. Lee, D.; Yeo, H. A study on the rear‐end collision warning system by considering different perception‐reaction time using multi‐
layer perceptron neural network. In Proceedings of the 2015 IEEE Intelligent Vehicles Symposium (IV), Seoul, South Korea, 29
June–1 July 2015; pp. 24–30. https://doi.org/10.1109/IVS.2015.7225657.
23. Nijhuis, J.; Neusser, S.; Spaanenburg, L.; Heller, J.; Sponnemann, J. Evaluation of fuzzy and neural vehicle control. In Proceed‐
ings of the CompEuro 1992 Proceedings Computer Systems and Software Engineering, The Hague, The Netherlands, 4–8 May
1992; pp. 447–452. https://doi.org/10.1109/CMPEUR.1992.218442.
24. GPS Vehicle Collision Avoidance Warning and Control System and Method—Patent US‐6275773‐B1—PubChem. Available
online: https://pubchem.ncbi.nlm.nih.gov/patent/US‐6275773‐B1 (accessed on 10 August 2022).
25. Lee, H.‐K.; Shin, S.‐G.; Kwon, D.‐S. Design of emergency braking algorithm for pedestrian protection based on multi‐sensor
fusion. Int. J. Automot. Technol. 2017, 18, 1067–1076. https://doi.org/10.1007/s12239‐017‐0104‐7.
26. Carabulea, L.; Pozna, C.; Antonya, C.; Husar, C.; Băicoianu, A. The influence of the Advanced Emergency Braking System in
critical scenarios for autonomous vehicles. IOP Conf. Ser. Mater. Sci. Eng. 2022, 1220, 012045. https://doi.org/10.1088/1757‐899X/1220/1/012045.
27. Rajendar, S.; Rathinasamy, D.; Pavithra, R.; Kaliappan, V.K.; Gnanamurthy, S. Prediction of stopping distance for autonomous
emergency braking using stereo camera pedestrian detection. Mater. Today Proc. 2022, 51, 1224–1228.
https://doi.org/10.1016/j.matpr.2021.07.211.
28. Cho, M. A Study on the Obstacle Recognition for Autonomous Driving RC Car Using LiDAR and Thermal Infrared Camera. In
Proceedings of the 2019 Eleventh International Conference on Ubiquitous and Future Networks (ICUFN), Zagreb, Croatia, 2–5
July 2019; pp. 544–546. https://doi.org/10.1109/ICUFN.2019.8806152.
29. Tahir, M.B.; Abdullah, M. Distance Measuring (Hurdle detection System) for Safe Environment in Vehicles through Ultrasonic
Rays. Glob. J. Res. Eng. 2012, 12, 8.
30. Saeed, R.B.; Usman, M.H.; Amin, A.A. Reliable speed control of a separately excited DC motor using advanced modified triple
modular redundancy scheme in H‐bridges. Adv. Mech. Eng. 2022, 14, 168781322211062.
https://doi.org/10.1177/16878132221106289.
31. Autonomous Emergency Braking with Sensor Fusion—MATLAB & Simulink. Available online: https://www.mathworks.com/help/driving/ug/autonomous‐emergency‐braking‐with‐sensor‐fusion.html (accessed on 16 July 2022).
32. Zeb, K.; Busarello, T.D.C.; Islam, S.U.; Uddin, W.; Raghavendra, K.V.G.; Khan, M.A.; Kim, H.-J. Design of Super Twisting Sliding Mode Controller for a Three‐Phase Grid‐connected Photovoltaic System under Normal and Abnormal Conditions. Energies 2020, 13, 3773. https://doi.org/10.3390/en13153773.
33. Lee, D.; Kim, B.; Yi, K.; Lee, J. Development of an Integrated Driving Path Estimation Algorithm for ACC and AEBS Using
Multi‐Sensor Fusion. In Proceedings of the 2012 IEEE 75th Vehicular Technology Conference (VTC Spring), Yokohama, Japan,
6–9 May 2012; pp. 1–5. https://doi.org/10.1109/VETECS.2012.6240284.
34. Dixit, A.; Devangbhai, P.D.; Kumar, C.R. Modelling and Testing of Emergency Braking in Autonomous Vehicles. In Proceedings
of the 2021 Innovations in Power and Advanced Computing Technologies (i‐PACT), Kuala Lumpur, Malaysia, 27–29 November
2021; pp. 1–6. https://doi.org/10.1109/i‐PACT52855.2021.9696552.