Autonomous Cars Report - Part 2
For the past hundred years, innovation within the automotive sector has brought major technological advances, leading to safer, cleaner and more affordable vehicles. In the early decades of the 21st century, the industry appears to be on the cusp of revolutionary change, with the potential to dramatically reshape not just the competitive landscape but also the way we interact with vehicles, and indeed the future design of our roads and cities. The revolution, when it comes, will be engendered by the advent of the autonomous, or self-driving, car.
A glance at the communications of many car manufacturers, suppliers and technology companies
shows that they usually refer to ‘automated driving’, and only rarely to ‘autonomous driving’. The
former term is the umbrella term that includes several phases of automation beginning with driver-
assistance systems. The latter describes the final stage of automation, the situation in which the
system takes over all the steering, accelerating and braking manoeuvres. In this phase, a person is
out of the loop and the car is able to act totally alone at all times and in all traffic situations. Vehicles
that have reached this step can be described as autonomous, driverless or self-driving. In short, the
term automation is used to express the self-acting aspect of a machine. Autonomy goes beyond that: it means the self-governing ability of an entire system.
So far, we have only been talking about autonomous cars, the furthest and most advanced stage of
vehicle automation in which the system is responsible for all driving tasks everywhere and at all
times. With this full automation, there is no driver anymore; all the occupants of a vehicle are just
passengers. It will take quite a few years until a lot of these vehicles are seen on the roads, but some vehicles are already equipped with considerable automation today.
Level 0 is the starting point – there is no automation and the driver steers, accelerates and chooses
to apply the brakes on the vehicle without any system support. With levels 1 and 2, the system takes
over more and more of these driving tasks. However, the driver is obliged to permanently monitor
the system, the traffic and the surroundings, and must be able to take over the driving function at
any time. Level 3 means that the driver no longer has to continuously monitor the activity, because the system indicates to him or her when it is necessary to take control of the vehicle. With Level 4,
the system masters all driving functions in normal operation and in defined surroundings, but the
driver can overrule the system’s decisions. The last stage is Level 5, where the car is capable of driving on its own, even when the driver is not in the vehicle.
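The level descriptions above can be summarized as data. A minimal Python sketch follows; the level names are the commonly cited SAE names, and the role descriptions paraphrase the paragraph above:

```python
# A compact summary of the levels described above. The role descriptions
# paraphrase the preceding paragraph, not the official SAE wording.
SAE_LEVELS = {
    0: ("No Automation", "driver steers, accelerates and brakes alone"),
    1: ("Driver Assistance", "system assists; driver monitors everything"),
    2: ("Partial Automation", "system steers and brakes; driver monitors"),
    3: ("Conditional Automation", "system asks the driver to take over"),
    4: ("High Automation", "system drives in defined surroundings"),
    5: ("Full Automation", "no driver needed, anywhere, at any time"),
}

def driver_must_monitor(level: int) -> bool:
    """Up to Level 2, the driver must permanently monitor the system."""
    return level <= 2

for level, (name, role) in SAE_LEVELS.items():
    print(f"Level {level} - {name}: {role}")
```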
Some manufacturers’ autonomous vehicles have already emerged from the concept phase and
passed thorough tests to take their place on the roads. Currently, they operate in controlled
environments but will soon be found in normal traffic.
For example, an Audi car drove itself from San Francisco to Las Vegas, and another driverless Audi
reached a maximum of 149.1 miles per hour (240 km per hour) on a racetrack (see Figure 1.1).
NIO’s new sports car, the EP9, completed the Circuit of the Americas Formula 1 racetrack in Austin in a spectacular 2:40.33 minutes. Mercedes has presented its F015, which provides an impression of the
autonomous mobility of the coming years with its futuristic design and many innovative features
(see Figure 1.2).
1|Page
Department of Computer Engineering
Autonomous Cars
Figure 1.1 (Source: Audi AG)
Figure 1.2 (Source: Daimler AG)
For several years, Google has been testing its well-known vehicles in California and other US states.
Tesla has equipped some of its cars with software, cameras and radar, enabling them to drive
autonomously in certain traffic situations. Volvo plans to put cars that can drive in autonomous
mode on the beltway around Gothenburg, Sweden. Many other car manufacturers such as Ford,
General Motors, BMW, Toyota, Kia, Nissan and Volkswagen are working on prototypes of self-driving
cars or have already tested them in road traffic.
The CTOs of many car manufacturers and technology companies agree that achieving the first 90 per
cent of autonomous driving is nothing special. It’s the last 10 per cent – in the most challenging
traffic situations and especially in urban areas – that makes the difference. That’s why the vehicles
have to be tested in as many traffic situations as possible so that experience is gained of their reactions. A similar argument is presented by Jen-Hsun Huang, CEO of Nvidia, who demands accuracy of 99.999999 per cent in the development of autonomous cars, whereby the last
percentage point can only be achieved at very great expense. Toyota and Rand Corporation have
published calculations of the number of miles over which self-driving cars have to be tested before they can be assessed as roadworthy, because the algorithms required for driverless cars undergo self-learning in
multiple road traffic situations. The more traffic situations these algorithms are exposed to, the
better prepared they are to master a new situation. Designing this training process so that the accuracy demanded by Jen-Hsun Huang is obtained will be a crucial challenge in the development of autonomous vehicles.
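The RAND-style calculation mentioned above can be sketched in a few lines. With zero observed failures, demonstrating with confidence C that the per-mile failure rate is below p requires roughly -ln(1 - C) / p failure-free test miles. The target rate and confidence below are illustrative assumptions, not figures taken from the cited studies:

```python
import math

# Rough sketch of the statistical argument behind the RAND/Toyota
# calculations: how many failure-free miles does it take to claim, with
# confidence C, that the failure rate is below some target p per mile?
def miles_needed(target_failures_per_mile: float, confidence: float) -> float:
    return -math.log(1.0 - confidence) / target_failures_per_mile

# e.g. fewer than one failure per 100 million miles, at 95% confidence
n = miles_needed(1e-8, 0.95)
print(f"{n:,.0f} failure-free miles")  # on the order of 300 million
```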
When discussing what fault tolerance might be acceptable, it should be borne in mind that people
are more likely to forgive mistakes made by other people than mistakes made by machines. This also
applies to driving errors, which are more likely to be overlooked if they were committed by a human driver rather than by a machine. This means that autonomous vehicles will only be accepted if they cause significantly fewer errors than human drivers. For this reason alone, the precision demanded by Jen-Hsun Huang is indispensable.
Studying autonomous driving is quite a challenge, because new findings on the subject – often
contradictory – are appearing every day. Ideas, concepts and technologies relating to self-driving
vehicles are emerging all over the world and it is hardly possible to gain a detailed overview of them
all. So, this report cannot aim to be an entirely consistent description that is accurate in every detail
but is more like the collected journals of an expedition that is not yet completed. It was worthwhile
setting out on this expedition, because there is probably no other technology that will so
fundamentally transform our economic and social lives. The time has come to address the subject of
autonomous mobility and to make it the subject of social discourse, thus contributing to changing
our lives for the better.
Automobiles have fascinated me since my childhood. I love driving cars: it gives me control and power over the machine – a feeling of freedom, movement, pride and pleasure. To me, a car means freedom. Cars are a form of self-expression; they are part of my identity. Alongside my love of cars, I also have a passion for Artificial Intelligence. Combining the two gives rise to the field of self-driving cars, and that is what made me take it up as my seminar topic.
The technologies upon which autonomous driving is based blur the boundaries between the
automotive industry and the robotic, electronic and software industries. Software with programming
codes and algorithms, as well as cameras and sensors using radar, ultrasound and lasers, are all
gaining importance. Meanwhile, the hardware of a vehicle – the chassis, suspension and other mechanical parts – is losing importance. So, it’s not surprising that technology companies such as
Microsoft, Apple, Google, Nvidia, Mobileye, NuTonomy and Qualcomm, are occupied with
autonomous driving and have developed their own driverless vehicles. Even the traditional
automotive suppliers such as Aisin, Delphi, Bosch, Denso, Continental, TRW, Schaeffler and Magna are either preparing their own prototypes of self-driving cars or working on the key components for
autonomous driving.
The essence of autonomous driving is the development of vehicles into cyber-physical systems that comprise a combination of mechanical and electronic components. A vehicle’s hardware and software exchange data with the infrastructure (the Internet of Things), and the vehicle is controlled or monitored by a processing unit.
Technically, an autonomous car is a vehicle capable of sensing its environment and moving with little
or no human input. These cars use a variety of techniques to detect their surroundings, such as
radar, lidar, laser light, odometry, GPS and computer vision. These vehicles require a broad
range of technologies and infrastructures to operate properly. Each vehicle is required to
continuously collect and interpret vast amounts of information. Every system of the car must work
with the surrounding environment.
It may come as a surprise to learn that vehicle automation is not an either/or proposition. And not
everyone wants complete automation. Putting your car on automatic pilot in fair weather along a
standard stretch of highway may be fine, but any driver would want the option to regain control of
the car should circumstances demand it. The autonomy of vehicles is graded on a scale from zero to
five, with zero meaning no autonomy and five signifying complete autonomy.
The standards organization Society of Automotive Engineers (SAE) International has defined these
levels in a report on “Taxonomy and Definitions” regarding automated driving systems. SAE
descriptions for the levels are given in the following subtitles.
Level 0 – No Automation
“A vehicle that fits into this category relies on a human to dictate every driving action.” The driver has full control.
1. Road Conditions
Road conditions can be highly unpredictable and vary from place to place. In some cases, there are no smooth, marked highways. In other cases, road conditions are highly
deteriorated: there are no lane markings, lanes are not defined, and there are potholes, as well as mountainous and tunnel roads where external signals for direction are not very clear.
2. Weather Conditions
Weather conditions play another spoilsport: the weather could be sunny and clear, or rainy and stormy. Autonomous cars should work in all sorts of weather conditions; there is absolutely no scope for failure or downtime.
3. Traffic Conditions
Autonomous cars would have to get onto the road and drive in all sorts of traffic conditions. They would have to drive alongside other autonomous cars, and at the same time, there would also be a lot of human drivers. Wherever humans are involved, a lot of emotion is involved. Traffic could be highly moderated and self-regulated, but there are often cases where people break traffic rules, or an object turns up under unexpected conditions. In dense traffic, even movement of a few centimetres per minute matters; a car cannot wait endlessly for traffic to clear on its own before it starts moving. If many such cars on the road are waiting for traffic to clear, that may ultimately result in a traffic deadlock.
4. Accident Liability
The most important aspect of autonomous cars is accident liability. In the case of autonomous cars, the software will be the main component that drives the car and makes all the important decisions. While initial designs have a person physically placed behind the steering wheel, newer designs showcased by Google do not have a dashboard or a steering wheel. Additionally, due to the nature of autonomous cars, the occupants will mostly be in a relaxed state and may not be paying close attention to the traffic conditions. In situations where their attention is needed, by the time they need to act, it may be too late to avert the situation.
5. Radar interference
Autonomous cars use lasers and radar for navigation. The lasers are mounted on the roof, while the sensors are mounted on the body of the vehicle. The principle of radar is to detect reflections of radio waves from surrounding objects. When on the road, a car will continuously emit radio-frequency waves, which are reflected from the surrounding cars and other objects near the road. Appropriate action is then taken based on the radar readings. When this technology is used by hundreds of vehicles on the road, a car won’t be able to distinguish its own signal from the signal of another vehicle. Even if multiple frequencies are available for radar, this frequency range is unlikely to be sufficient for all the vehicles manufactured.
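One mitigation often discussed for this interference problem is to give each transmitter its own pseudo-random code, so that a receiver can pick out its own echo by correlation. The sketch below is purely illustrative and not any manufacturer's actual scheme:

```python
import random

# Illustrative code-division idea: each radar modulates with its own
# pseudo-random +/-1 code. Correlating the received signal against one's
# own code recovers one's own echo and suppresses other transmitters.
def prn_code(seed: int, length: int = 256) -> list:
    rng = random.Random(seed)
    return [rng.choice((-1, 1)) for _ in range(length)]

def correlate(received, code):
    return sum(r * c for r, c in zip(received, code)) / len(code)

own = prn_code(seed=1)
other = prn_code(seed=2)
own_echo = [0.5 * s for s in own]                 # own pulse, attenuated
mixed = [a + b for a, b in zip(own_echo, other)]  # plus another car's signal

print(round(correlate(mixed, own), 2))  # close to 0.5: own echo recovered
print(round(correlate(other, own), 2))  # close to 0.0: foreign signal rejected
```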
Challenges to rolling out autonomous cars on the road are many even today. But so is the determination of our scientists, engineers and problem solvers from various disciplines. The collective effort of the industry will one day make the autonomous car on the road a reality, and the benefits will be huge. Not only will it save fuel and encourage efficient transportation and shared services, but it will also help save the many lives that are regularly lost in road accidents.
The solution to most of the challenges mentioned above lies in the choice of the AI-powered supercomputer in the car. To this end, NVIDIA is coming up with ever more powerful and intelligent AI hardware and motherboards. One such motherboard is the NVIDIA DRIVE AGX Pegasus.
It’s hard to argue with calling NVIDIA the market leader in providing integrated compute platforms
for automated driving development. Since the launch of the original Drive PX system in 2015, NVIDIA
has continued cranking up the performance and capability of its development platform. One of the
key factors to NVIDIA’s success in this space has been the integrated platform that includes a board
with processing chips and all the necessary input/output for connecting sensors and communicating
to other vehicle systems. This makes it a lot easier for developers to buy the system and connect it in
the vehicle without having to assemble a bespoke solution from parts. NVIDIA also provides an array
of software frameworks for deep learning that enable developers to get started quickly. To get to so-
called Level 5 automation that can operate in any environment at any time, NVIDIA is now offering
the Drive AGX Pegasus. The Pegasus system utilizes two Xaviers and two of the company’s next-
generation GPUs to crank out 320 TOPS.
Drive AGX Pegasus (selected specifications):
Accelerator: 2x DLA
Memory: LPDDR4
TDP: 500 W
Source: NVIDIA
B. Camera
All autonomous cars utilize some form of camera technology, but the actual camera
technology and setup on each driverless car varies. The cameras are used to identify road
markings and traffic signals. Some self-driving cars can operate using just a single camera
embedded in the windshield. Other autonomous car cameras require several cameras
mounted to the vehicle’s exterior with slight separation in order to give an overlapping view
of the car’s surroundings. The goal of a driverless car camera is to help the car’s computer
build up a composite picture of the surrounding world in order to drive safely. The technology behind these cameras functions like the human eye, which provides overlapping
images to the vehicle’s computer before determining things like depth of field, peripheral
movement, and dimensionality of objects.
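The overlapping-camera idea can be made concrete with the standard pinhole stereo model: the disparity (pixel shift) of an object between two cameras separated by a known baseline yields its depth. The focal length, baseline and disparity below are illustrative numbers:

```python
# Pinhole stereo model for the overlapping cameras described above:
# depth Z = focal_length * baseline / disparity. Values are illustrative.
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    if disparity_px <= 0:
        raise ValueError("no disparity: object unmatched or at infinity")
    return focal_px * baseline_m / disparity_px

# 700 px focal length, cameras 0.3 m apart, object shifted 30 px
print(depth_from_disparity(700, 0.3, 30))  # 7.0 metres
```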
C. Radar Sensor
Autonomous cars typically have bumper-mounted radar sensor units (two in front and two
in back). These help the vehicle detect road dynamics such as detours, traffic delays, vehicle
collisions, and other obstacles, by sending a signal to the on-board processor to apply the
brakes and/or move out of the way. This technology works in conjunction with other
features on the car such as inertial measurement units, gyroscopes, and a wheel encoder to
send accurate signals to the processing unit (i.e. brain) of the vehicle.
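The ranging principle behind these radar units is time of flight: the radio pulse travels to the obstacle and back at the speed of light, so distance is the round-trip time times c, divided by two. A minimal sketch:

```python
# Radar ranging by time of flight: the pulse travels out and back at the
# speed of light, so distance = c * t / 2.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def radar_range_m(round_trip_s: float) -> float:
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# an echo arriving 200 nanoseconds after transmission is ~30 m away
print(round(radar_range_m(200e-9), 2))  # 29.98
```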
The satellite positioning that a vehicle relies on is judged by four criteria:
1. Accuracy: the difference between a receiver’s measured and real position, speed or time;
2. Integrity: a system’s capacity to provide a threshold of confidence and, in the event of
an anomaly in the positioning data, an alarm;
3. Continuity: a system’s ability to function without interruption;
4. Availability: the percentage of time a signal fulfils the above accuracy, integrity and
continuity criteria.
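These criteria can be checked per epoch of positioning data; availability is then the share of epochs that pass the other three checks. The field names and thresholds below are illustrative assumptions:

```python
# Availability as the share of epochs meeting the accuracy, integrity
# and continuity criteria listed above. Data and thresholds are made up
# for illustration.
def availability(epochs, max_error_m=2.0):
    ok = [e for e in epochs
          if e["error_m"] <= max_error_m   # accuracy criterion
          and not e["integrity_alarm"]     # integrity criterion
          and e["signal"]]                 # continuity criterion
    return 100.0 * len(ok) / len(epochs)

samples = [
    {"error_m": 1.2, "integrity_alarm": False, "signal": True},
    {"error_m": 0.8, "integrity_alarm": False, "signal": True},
    {"error_m": 3.5, "integrity_alarm": False, "signal": True},  # too inaccurate
    {"error_m": 1.0, "integrity_alarm": True,  "signal": True},  # alarm raised
]
print(availability(samples))  # 50.0
```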
An inertial measurement unit (IMU) is an electronic device that measures and reports a body’s specific force, angular rate, and sometimes the magnetic field surrounding the body, using a combination of accelerometers and gyroscopes, and sometimes also magnetometers. IMUs are typically used to maneuver aircraft, including unmanned aerial vehicles (UAVs), spacecraft, including satellites and landers, and autonomous vehicles.
Source: Wired
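What an IMU contributes can be sketched as dead reckoning: integrating the measured acceleration twice gives a position estimate between satellite fixes. One axis, simple Euler integration, illustrative data:

```python
# Dead reckoning from an IMU's accelerometer stream: integrating the
# measured acceleration twice estimates position between satellite fixes.
def dead_reckon(accels_ms2, dt=0.1, v0=0.0, x0=0.0):
    v, x = v0, x0
    for a in accels_ms2:
        v += a * dt  # acceleration -> velocity
        x += v * dt  # velocity -> position
    return x, v

# one second of constant 1 m/s^2 acceleration from rest
x, v = dead_reckon([1.0] * 10)
print(round(v, 2), round(x, 2))  # Euler integration overshoots the ideal x = 0.5
```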
1. Computer Vision
Computer Vision is how we use cameras to see the road. Humans demonstrate the power of
vision by handling a car with basically just two eyes and a brain. For a self-driving car, we can
use camera images to find lane lines, or track other vehicles on the road.
Source: Medium
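A toy illustration of the camera idea: lane paint is brighter than asphalt, so even naive thresholding of one grayscale scanline reveals candidate lane-line columns. Real pipelines use gradients, colour spaces and perspective transforms; this is only a sketch:

```python
# Toy lane finding: bright pixels in one scanline of a grayscale image
# are candidate lane-paint columns. Threshold and data are illustrative.
def find_lane_columns(scanline, threshold=180):
    return [i for i, px in enumerate(scanline) if px >= threshold]

# synthetic 20-pixel scanline: dark road with two bright lane markings
row = [60] * 20
row[4] = row[5] = 220    # left lane line
row[14] = row[15] = 210  # right lane line
print(find_lane_columns(row))  # [4, 5, 14, 15]
```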
2. Sensor Fusion
Sensor Fusion is how we integrate data from other sensors, like radar and lasers – together with camera data – to build a comprehensive understanding of the vehicle’s environment. As
good as cameras are, there are certain measurements – like distance and velocity – at which
other sensors excel, and other sensors can work better in adverse weather, too. By
combining all of our sensor data, we get a richer understanding of the world.
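A classic way to combine two such measurements is inverse-variance weighting, the core of a one-dimensional Kalman update: the precise radar range pulls the fused estimate toward itself, and the fused uncertainty is smaller than either sensor's alone. The variances below are illustrative assumptions:

```python
# One-step, one-dimensional fusion in the spirit of a Kalman update:
# each range measurement is weighted by the inverse of its variance.
def fuse(z1, var1, z2, var2):
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

camera_m, camera_var = 22.0, 4.0  # camera: rough range estimate
radar_m, radar_var = 20.0, 0.25   # radar: precise range estimate
est, var = fuse(camera_m, camera_var, radar_m, radar_var)
print(round(est, 2), round(var, 3))  # estimate sits near the radar reading
```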
3. Localization
Localization is how we figure out where we are in the world, which is the next step after we
understand what the world looks like. We all have cell phones with GPS, so it might seem
like we know where we are all the time already. But in fact, GPS is only accurate to within
about 1–2 meters. Think about how big 1–2 meters is! If a car were wrong by 1–2 meters, it
could be off on the sidewalk hitting things. So we have much more sophisticated
mathematical algorithms that help the vehicle localize itself to within 1–2 centimetres.
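The sharpening from metres to centimetres comes from Bayesian filtering against a map. A toy one-dimensional histogram filter over five road cells shows the mechanism; the map and sensor probabilities are illustrative:

```python
# A toy histogram (Bayes) filter: a uniform GPS-like prior over five
# road cells is sharpened by observing a landmark ("sign").
world = ["lane", "sign", "lane", "sign", "lane"]
prior = [0.2] * 5  # no idea which cell we are in

def sense(p, world, measurement, p_hit=0.9, p_miss=0.1):
    q = [pi * (p_hit if cell == measurement else p_miss)
         for pi, cell in zip(p, world)]
    total = sum(q)
    return [qi / total for qi in q]

posterior = sense(prior, world, "sign")
print([round(x, 3) for x in posterior])  # mass concentrates on the "sign" cells
```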
4. Path Planning
Path planning is the next step, once we know what the world looks like, and where in it we
are. In the path planning phase, we chart a trajectory through the world to get where we
want to go. First, we predict what the other vehicles around us will do. Then we decide
which maneuver we want to take in response to those vehicles. Finally, we build a trajectory,
or path, to execute that maneuver safely and comfortably.
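The "build a trajectory" step can be sketched as a search over an occupancy grid: breadth-first search finds a shortest collision-free route around an obstacle. Real planners also weigh comfort, speed and the predictions mentioned above:

```python
from collections import deque

# Breadth-first search on a small occupancy grid: 0 = free, 1 = blocked.
# Returns the shortest path as a list of (row, col) cells, or None.
def shortest_path(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],   # blocked row: the obstacle to drive around
        [0, 0, 0]]
print(shortest_path(grid, (0, 0), (2, 0)))  # route detours via column 2
```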
5. Control
Control is the final step in the pipeline. Once we have the trajectory from our path planning
block, the vehicle needs to turn the steering wheel and hit the throttle or the brake, in order
to follow that trajectory.
Source: Audi AG
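The control step can be sketched as a feedback loop that steers against the cross-track error (the car's offset from the planned path). Real controllers use full PID and a proper vehicle model; this toy proportional loop with made-up kinematics just shows the error dying away:

```python
# Toy control loop: proportional steering against the cross-track error.
# The "kinematics" here are deliberately naive and purely illustrative.
def follow_trajectory(error_m, kp=0.5, dt=0.1, steps=100):
    for _ in range(steps):
        steering = -kp * error_m   # steer back toward the trajectory
        error_m += steering * dt   # naive kinematics: offset shrinks
    return error_m

residual = follow_trajectory(1.0)  # start 1 m off the planned path
print(f"residual offset: {abs(residual):.4f} m")  # well under 1 cm
```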
Currently, there are many different technologies available that can assist in creating autonomous
vehicle systems. Items such as GPS, automated cruise control, and lane keeping assistance are
available to consumers on some luxury vehicles. The combination of these technologies and other
systems such as video-based lane analysis, steering and brake actuation systems, and the programs
necessary to control all of the components will together form a fully autonomous system. The problem is winning people’s trust in allowing a computer to drive a vehicle for them; because of this, research and testing must be done repeatedly to assure a near foolproof final product. The product will not be accepted instantly, but over time, as the systems become more widely used, people will realize its benefits. The implementation of autonomous vehicles will also raise the problem of replacing human workers with computers that can do the work for them. There will not be an instant change in society, but the change will become more apparent over time as these vehicles are integrated into society.
[1] X. Du, M. H. Ang and D. Rus, "Car detection for autonomous vehicle: LIDAR and vision fusion
approach through deep learning framework", 2017 IEEE/RSJ International Conference on Intelligent
Robots and Systems (IROS), Vancouver, BC, 2017, pp. 749-754.
[2] Mariusz Bojarski, Davide Del Testa, Daniel Dworakowski, Bernhard Firner, Beat Flepp, Prasoon
Goyal, Lawrence D. Jackel, Mathew Monfort, Urs Muller, Jiakai Zhang, Xin Zhang, Jake Zhao and Karol Zieba, “End to End Learning for Self-Driving Cars”, NVIDIA Corporation, Holmdel, NJ, 2016.
[3] T. Banerjee, S. Bose, A. Chakraborty, T. Samadder, B. Kumar and T. K. Rana, "Self-driving cars: A
peep into the future", 2017 8th Annual Industrial Automation and Electromechanical Engineering
Conference (IEMECON), Bangkok, 2017, pp. 33-38.
[4] A. R. Fayjie, S. Hossain, D. Oualid and D. Lee, "Driverless Car: Autonomous Driving Using Deep
Reinforcement Learning in Urban Environment", 2018 15th International Conference on Ubiquitous
Robots (UR), Honolulu, HI, 2018, pp. 896-901.
[5] Lex Fridman, Daniel E. Brown, Michael Glazer, William Angell, Spencer Dodd, Benedikt Jenik, Jack
Terwilliger, Julia Kindelsberger, Li Ding, Sean Seaman, Hillary Abraham, Alea Mehler, Andrew
Sipperley, Anthony Pettinato, Bobbie Seppelt, Linda Angell, Bruce Mehler, Bryan Reimer, “MIT Autonomous Vehicle Technology Study: Large-Scale Deep Learning Based Analysis of Driver Behavior and Interaction with Automation”, MIT, 77 Massachusetts Ave, Cambridge, MA 02139, USA, 2018.
[6] “WIRED.” Wired, Conde Nast, www.wired.com.
[7] “Medium – a Place to Read and Write Big Ideas and Important Stories.” Medium, medium.com.
[8] books.emeraldinsight.com, https://books.emeraldinsight.com/resources/pdfs/chapters/9781787148345-TYPE23-NR2.pdf, accessed 18 Apr. 2019.
CHAPTER 7: ANNEXURE 1
Plagiarism Report