
Volume 8, Issue 3, March – 2023 International Journal of Innovative Science and Research Technology

ISSN No:-2456-2165

The Development of a Self-Driving Bus


Yu Sun
Department of Industrial Education
National Taiwan Normal University
Taipei, Taiwan

Chi-Yo Huang
Department of Industrial Education
National Taiwan Normal University
Taipei, Taiwan

Abstract:- Emerging technologies have led to significant advancements in vehicle technology, particularly in the development and commercialization of autonomous vehicles. Self-driving technology aims to reduce collisions, save energy, and improve traffic conditions, and major car manufacturers and technology companies have invested nearly $80 billion in research and development. Experts predict that by 2030, self-driving cars will be reliable and affordable enough to replace most traditional vehicles. The market for autonomous driving is expected to reach $800 billion by 2035 and $7 trillion by 2050, potentially saving 585,000 lives between 2035 and 2045. The future of mobility will bring significant changes to the environment, with smarter and more connected roads, towns, and cities. Most people will likely experience autonomous driving through privately rented or shared ride vehicles. In response to this trend, governments and industry leaders have proposed self-driving bus solutions. Supported by the government, Taiwanese firms have successfully developed a self-driving bus by integrating automotive parts and systems and using artificial intelligence and deep learning for human-like vision and driving behavior. The self-driving bus has been verified through field tests, and its platform has been commercialized.

Keywords:- Self-Driving Bus, Autonomous Vehicle, Artificial Intelligence, Innovation.

I. INTRODUCTION

In recent years, emerging technologies in the fields of information, communication, and automatic control have led to changes in vehicle technology. One of the major changes is the development and commercialization of autonomous vehicle technology. Self-driving technology aims to reduce collisions, save energy and carbon, and improve traffic conditions [1], and is bound to become the mainstream of transportation technology in the future.

In the self-driving era, cars will be more efficient and comfortable to use. In order to realize the vision of autonomous driving and seize business opportunities, the world's major car manufacturers (such as Tesla and Toyota), technology companies (such as Google [2, 3], Apple, and Nvidia), and new companies (such as Uber, Lyft, and Zoox) have invested in research and development. So far, the cumulative investment amount has been nearly 80 billion dollars [1, 4, 5]. In recent years, the Consumer Electronics Show (CES) held in Las Vegas has become the best venue for many companies to explore automotive technology and artificial intelligence. Therefore, experts predict optimistically that by 2030, self-driving cars will be sufficiently reliable, the price will be acceptable to consumers, and they will be good enough to replace most traditional vehicles that require human driving [6]. According to a recent analysis by Lanctot [7], autonomous driving is a huge opportunity, with the market estimated to reach $800 billion by 2035 and $7 trillion by 2050. Due to the reduction in collisions, studies have also predicted that autonomous driving could save an estimated 585,000 lives between 2035 and 2045 [7].

According to Marr [8], the future of mobility will bring significant changes to the environment we live in, with roads, towns, and cities becoming smarter and more connected by up to 25%. These advancements will lead to some of the most exciting changes in mobility, both in the coming year and beyond. For most people, their first experience with autonomous driving is likely to be in a fleet of privately rented or shared ride vehicles, rather than a car they own themselves.

Facing this megatrend, over the past years national governments and leaders in the automobile industry have proposed the use of self-driving buses. Supported by the government, Taiwanese firms have also initiated the development of self-driving vehicles in general, and self-driving buses in particular. The research has integrated automotive parts and systems, made by both local and global firms, for electric and autonomous vehicles. Artificial intelligence, mimicking human vision and driving behavior, is achieved by way of deep learning from data collected by high-definition (HD) cameras. After continuous training and validation, the vehicle can recognize the planned path and surroundings, allowing for an autonomous driving model. A self-driving bus has been successfully demonstrated in numerous field tests in Taiwan. Meanwhile, derivatives of the self-driving bus platform have been successfully commercialized.

The development of international automatic driving systems and the development of self-driving buses in advanced countries will be reviewed in Section II. Subsequently, the design of a self-driving bus will be introduced in Section III. Results of verifications and field tests of the self-driving bus will be introduced in Section IV. Problems and challenges faced by the self-driving bus will be discussed in Section V. Section VI concludes the work, with suggestions for future development.

IJISRT23MAR1727 www.ijisrt.com 2296


II. LITERATURE REVIEW

A. The Development of International Automatic Driving Systems
This section analyzes the current status of self-driving bus development in the world and summarizes the policy planning and operational promotion of autonomous bus development in the United States, Europe, and Asia respectively. Finally, individual development cases are compared and analyzed as an important reference for developing the automatic driving system.

The main reason for the development of autonomous vehicles is that the future of transportation increasingly faces severe challenges, including safety and efficiency concerns, limited energy resources, the impact of mobile pollution sources on the environment, and changes to the population structure, which have greatly impacted the use of driving vehicles. Moreover, with the revolutionary advances in hardware and software technology, the "Internet of everything" is gradually being popularized, computing power is rapidly increasing, big data is ubiquitous, and artificial intelligence is on the rise, which together provide the technical foundation and capability to develop autonomous driving.

According to the US Department of Transportation, autonomous vehicles can lead to safer roads, more efficient mobility, reduced congestion, improved fuel efficiency, reduced energy consumption, a cleaner environment, better land use, more positive social impacts, a better quality of life, and stronger partnerships. Based on the classification by the US National Highway Traffic Safety Administration (NHTSA) and the Society of Automotive Engineers (SAE), autonomous driving technology can be classified from Level 0 to Level 5. Table 1 characterizes each level, including steering operation, driving environment monitoring, dynamic driving task fallback, system capability (driving modes), etc. Fully autonomous (Level 5) means fully automated and fully supported by the system.

Table 1 "Levels of Driving Automation" Standard for Self-Driving Vehicles

SAE Level | SAE Name | SAE Narrative Definition | Execution of Steering / Acceleration / Deceleration | Monitoring of Driving Environment | Fallback Performance of Dynamic Driving Task | System Capability (Driving Modes) | BASt Level | NHTSA Level

Human driver monitors the driving environment:
0 | No Automation | The full-time performance by the human driver of all aspects of the dynamic driving task | Human driver | Human driver | Human driver | N/A | Driver only | 0
1 | Driver Assistance | The driving mode-specific execution by a driver assistance system of either steering or acceleration/deceleration | Human driver and system | Human driver | Human driver | Some driving modes | Assisted | 1
2 | Partial Automation | The driving mode-specific execution by one or more driver assistance systems of both steering and acceleration/deceleration, with the human driver performing all other aspects of the dynamic driving task | System | Human driver | Human driver | Some driving modes | Partially automated | 2

Automated driving system ("system") monitors the driving environment:
3 | Conditional Automation | The driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, with the expectation that the human driver will respond appropriately to a request to intervene | System | System | Human driver | Some driving modes | Highly automated | 3
4 | High Automation | The driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, even if a human driver does not respond appropriately to a request to intervene | System | System | System | Some driving modes | Fully automated | 3/4
5 | Full Automation | The full-time performance by an automated driving system of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver | System | System | System | All driving modes | N/A | 3/4

Source: [9]
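The taxonomy in Table 1 can be made concrete with a small lookup structure. The Python sketch below is purely illustrative: the field values are transcribed from the table, while the dictionary layout and the helper function are our own additions, not part of the standard or of the paper's system.

```python
# Sketch: the SAE level taxonomy from Table 1 as a lookup table.
# Field names follow the table columns; values are transcribed from it.

SAE_LEVELS = {
    0: {"name": "No Automation",          "steering": "human",        "monitoring": "human",  "fallback": "human"},
    1: {"name": "Driver Assistance",      "steering": "human+system", "monitoring": "human",  "fallback": "human"},
    2: {"name": "Partial Automation",     "steering": "system",       "monitoring": "human",  "fallback": "human"},
    3: {"name": "Conditional Automation", "steering": "system",       "monitoring": "system", "fallback": "human"},
    4: {"name": "High Automation",        "steering": "system",       "monitoring": "system", "fallback": "system"},
    5: {"name": "Full Automation",        "steering": "system",       "monitoring": "system", "fallback": "system"},
}

def human_must_take_over(level: int) -> bool:
    """True if the human driver is the fallback for the dynamic driving task."""
    return SAE_LEVELS[level]["fallback"] == "human"

# Levels where the human remains the fallback:
print([lvl for lvl in SAE_LEVELS if human_must_take_over(lvl)])  # [0, 1, 2, 3]
```

As the lookup makes visible, the practical boundary sits between Levels 3 and 4: up to Level 3 the human driver remains the fallback for the dynamic driving task, which is why Level 4 is commonly treated as the first truly driverless tier.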

B. Development of Self-Driving Buses in Advanced Countries

 Development of Self-Driving Buses in the United States

 Mcity
Mcity is an autonomous driving system testing base jointly created by the Government of Michigan, the University of Michigan, and several research institutions. It is committed to creating a base that simulates real road conditions and provides an area for the testing and research of unmanned autonomous electric vehicles. It covers an area of 32 acres, including various intersections, signals, curves, roundabouts, and other elements. It is the best place for many teams to test in private.

 Local Motors
Local Motors, an American company founded in 2007, originally designed private vehicles for small-scale production and open sourcing. In 2016, Local Motors launched the Olli, an all-electric, fully autonomous self-driving bus. Olli is built by Local Motors in partnership with IBM's Watson AI, which supports the autonomous driving system. Local Motors is an innovative company that has used 3D printing in the past to produce vehicle parts, such as parts for the Olli. Powered by Watson, Olli also has an artificial intelligence system similar to Siri to enhance its ability to serve travelers.

 Proterra
Proterra, an American company, is already a leader in providing electric buses in the United States. Proterra electric buses have been used throughout the United States since 2009. Proterra has achieved remarkable results in developing and improving electric bus technologies, breaking records for range on a single charge, energy conversion efficiency, load, climbing, and acceleration performance. In mid-2017, Proterra announced that it would be working on self-driving buses, starting with the initial stages of collecting data and building a Light Detection and Ranging (LiDAR) system. Working with the University of Nevada, Reno, Proterra hopes to test self-driving buses in Las Vegas in the near future. With its established reputation and advanced electric bus technology, Proterra has the potential to create the first self-driving bus in the United States.

 Development of Self-Driving Buses in Europe

 Easymile, France
Easymile is a joint venture between French car brand Ligier and robotics brand Robosoft. Vehicle bodies are built by Ligier, software and backstage functions are provided by Easymile, and the current main model is the EZ10 shuttle. The EZ10 has been tested on the road in many cities around the world, and even put into commercial use. With adaptive programming, the EZ10 can function in many countries, and cities around the world are eager to cooperate with Easymile for testing.

 German Benz Future Bus
In addition to developing small vehicles equipped with autonomous driving systems, Benz is also developing self-driving buses. In 2016, a Benz Future Bus successfully drove 20 kilometers from Amsterdam Airport to Haarlem. The Benz Future Bus drove on town roads as well as the Dutch motorway. What made this test different is that the bus was connected to the city's infrastructure, such as traffic lights, so it could exchange information with local authorities. It even has cameras that can scan for potholes and send the data back to the government. In the past, Benz's "Highway Pilot" automatic bus system was used to assist the basic functions of the bus, such as keeping to a fixed speed on the highway. Future Bus's software, City Pilot, is based on Highway Pilot. As the name implies, this autonomous driving system is designed to be used in cities. There are still drivers in the vehicle, but generally they do not have to intervene. The Future Bus is also able to stop automatically, with its position accurate to within 10 centimeters.

 French Navya
Navya, a famous French self-driving bus company, was founded in 2014. Their fully electric, fully automatic small bus ARMA was launched in 2015. Until recently, France has been one of the most difficult countries in which to trial self-driving vehicles. Navya is one of the very few unmanned self-driving systems that truly reaches the fifth level of autonomous driving. The maximum operating speed of their 15-person small bus can reach 45 km/hr, making it the fastest of the unmanned self-driving vehicles.
 PostAuto Switzerland
PostAuto is a Swiss company owned by Postbus, the Swiss passenger transport company. PostAuto began practical testing of self-driving buses in 2016. Testing was based on the ARMA model of the Navya team in France, coordinated with the software systems of the PostAuto team; the result is named the SmartShuttle autonomous bus. Since the second half of 2016, it has been undergoing trials in the Sion area, and has also been sent to other countries and regions for testing. SmartShuttle also conducted a market survey to analyze the public's acceptance of self-driving buses. In Switzerland, 51% of people have no or very few concerns about autonomous driving systems. However, in Sion, after trial operations, the proportion increased to 62%, indicating that after riding experience, the public not only has a better understanding of the autonomous self-driving system, but also has more confidence in it.

 Swiss SBB
SBB, the Swiss Federal Railways company, began piloting self-driving buses in 2017. The SBB tests were conducted within the Zug area, with two self-driving buses in this trial operation. The self-driving bus system selected by SBB is Olli, developed by Local Motors in the United States, coordinated with the data and systems of SBB's self-driving research plan. The goal is to put the self-driving bus system into use in 2020.

 British TRL
TRL, a British transport laboratory, began testing driverless self-propelled buses in Greenwich in 2017. The team is led by TRL, with participation from a number of local universities, local institutions, and companies such as Shell. TRL's self-driving bus, called the GATEway Project, has no steering wheel or gas pedal inside, and aims to achieve Level 5 autonomy through self-driving technology. GATEway aims to begin a larger trial operation in 2019.

 Netherlands
The Dutch Automated Vehicle Research and Development Demonstration Program (Dutch Automated Vehicle Initiative) is a project organized in 2015 by TU Delft University of Technology (TU Delft), the Netherlands Applied Science Research Organization (TNO), AutomotiveNL, and other teams to investigate, improve, and demonstrate the feasibility of autonomous driving technology for general road use. The project includes verification of technical feasibility, behavioral studies, safety proofing, legal compliance review, and public awareness building. The program uses a variety of vehicles, including Toyota Priuses, to test autonomous driving technology imported from TNO and TU Delft. Testing includes situations such as automatic steering and flow entry. The plan also classifies five levels of autonomous driving, including acceleration and deceleration controls, environmental monitoring, and driving situations that can be controlled by the system.

The project plan of the Dutch automatic vehicle research and development (DAVI) is very wide-ranging, from technical verification and behavior analysis to legal review, and is worth further investigation.

C. Development of Self-Driving Buses in Asia

 NTU, Singapore
Nanyang Technological University in Singapore has been developing driverless bus technology since 2013 and began testing it in multiple locations in 2016. Singapore aims to fully implement self-driving bus technology by 2020. Apart from the mature conditions for technological development, government support is also a key factor. NTU's self-driving electric vehicle, developed in partnership with Navya, is among the fastest driverless self-driving buses in Asia.

 ST Kinetics, Singapore
Another Singaporean company, ST Kinetics, has been contracted by the Land Transport Authority to develop an autonomous bus system. The plan is for three and a half years of testing to be completed by October 2020. Unlike other schemes in development, ST Kinetics plans to produce an autonomous bus that can carry up to 40 people and is expected to reach a top speed of 60 kph. The Land Transport Authority said it would like to develop high-capacity driverless buses in the hope of transporting rush-hour passengers on larger buses in the future. It aims to have the project on the road after 2020. In addition to the 40-seater buses, ST Kinetics has also signed separate contracts with Singapore's Transport Ministry and Sentosa Development Authority to test 20-seater autonomous buses for two years. The project aims to enable tourists in Sentosa to connect to the driverless bus operating system via a smartphone app in the future. The system will determine the route of the driverless bus according to travel needs and monitor it with closed-circuit television. Self-driving buses with seats for 20 people will travel at speeds of up to 80 kilometers per hour.

 Hyundai of South Korea
Hyundai Motor, a major South Korean automaker, is also developing self-driving bus technology in addition to self-driving minibuses. The South Korean Government pushed hard to have self-driving buses in trial operation in 2018 as it hosted the Pyeongchang Winter Olympics, and Hyundai is seen as the company most likely to be selected in time for the games. Hyundai is the leader in the process, holding six of the 12 paper test permits issued by the Korean government. The pilot route will connect Pangyo, known as Korea's Silicon Valley, with Pangyo Station (a 2.5-kilometer distance) and the company complex.

 Japanese DeNA
DeNA, a Japanese company, has a self-driving bus system called Robot Shuttle. Robot Shuttle is DeNA's autonomous self-propelled bus model, based on the EZ10 of Easymile of France and operated with the software of a Japanese team. At present, Robot Shuttle has carried out several tests. Japan aims to provide unmanned self-driving services for the Tokyo Olympic Games in 2020. Robot Shuttle is also expected to provide simple services with fixed routes, such as short shuttle connections.
 Comparisons
To sum up the above-mentioned cases, statistics for passenger capacity, maximum speed, mileage, test speed, time to market, and other information of the various autonomous driving vehicles are presented in the following Table 2, which further remarks whether there have been actual road tests and the actual outcomes of road environment tests in Asia, as a reference for the site selection of the subsequent test operation environment of this plan. In the future, this plan will further discuss the legal environment of autonomous driving vehicles from domestic and foreign cases, as a reference for our related traffic and vehicle regulations.

Table 2 Comparisons of Solutions for Self-Driving Buses

Type | Capacity (People) | Maximum Speed (km/h) | Single-Charge Mileage | Trial Operations | Test Speed (km/h) | Road Test in Asia | Technology Readiness | Time of Introduction
Proterra Project | N/A | N/A | N/A | No | N/A | No | Low | N/A
Local Motors – Olli | 12 | 40 | 58 km | Yes | 19 | No | Middle | 2019
Navya – ARMA | 15 | 45 | 5–13 hr | Several | 25 | Yes | High | 2018
Easymile – EZ10 | 12 | 40 | 14 hr | Several | 20 | Yes | High | 2017
Benz – Future Bus | N/A | 70 | N/A | One | 70 | No | Low | N/A
TRL – GATEway | 3 | 25 | N/A | One | 16 | No | Middle | 2019
Postauto – SmartShuttle | 11 | 45 | 130 km | One | 20 | No | High | 2018
SBB – Olli | 10 | 40 | 58 km | One | N/A | No | Middle | 2020
DeNA – Robot Shuttle | 12 | 40 | 14 hr | One | 10 | Yes | High | 2020
Hyundai Project | 12 | N/A | N/A | One | 30 | Yes | Low | 2019
NTU – ARMA | 15 | 45 | 130 km | One | 25 | Yes | High | 2020
ST Kinetics Project | 40 | 60 | 30–50 km | No | N/A | Yes | Low | 2020
Remark: N/A means not available.
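Comparisons such as Table 2 are easy to query once transcribed into structured records. The Python sketch below is illustrative only: it carries a subset of the table's rows and shortlists the solutions that combine high technology readiness with road-test experience in Asia, the two criteria most relevant to the site selection discussed above.

```python
# Sketch: a subset of Table 2 as records; None would stand for "N/A".
solutions = [
    {"type": "Local Motors - Olli",  "capacity": 12, "max_speed": 40, "asia": False, "readiness": "Middle"},
    {"type": "Navya - ARMA",         "capacity": 15, "max_speed": 45, "asia": True,  "readiness": "High"},
    {"type": "Easymile - EZ10",      "capacity": 12, "max_speed": 40, "asia": True,  "readiness": "High"},
    {"type": "DeNA - Robot Shuttle", "capacity": 12, "max_speed": 40, "asia": True,  "readiness": "High"},
    {"type": "NTU - ARMA",           "capacity": 15, "max_speed": 45, "asia": True,  "readiness": "High"},
    {"type": "ST Kinetics Project",  "capacity": 40, "max_speed": 60, "asia": True,  "readiness": "Low"},
]

# Shortlist: high readiness AND already road-tested in Asia.
shortlist = [s["type"] for s in solutions if s["asia"] and s["readiness"] == "High"]
print(shortlist)  # ['Navya - ARMA', 'Easymile - EZ10', 'DeNA - Robot Shuttle', 'NTU - ARMA']
```

Notably, the four shortlisted solutions are all either Navya's ARMA or Easymile's EZ10 platforms (or derivatives of them), which is consistent with the dominance of those two vehicles in the case studies above.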

III. DESIGN OF THE SELF-DRIVING BUS

In this section, details regarding the automatic driving system and vehicle sensing equipment, including the automatic driving system, positioning navigation point tracking, environment awareness, the autopilot deep learning method, and infrastructure and operation planning, will be introduced. When fully loaded, the maximum speed of the vehicle is 28 km/h, the climbing ability is more than 20%, endurance is more than 100 km, and the charging time is about 10-12 hours. The performance of the vehicle is sufficient for at least six hours of daily operation at the test site. Vehicle specifications are detailed in Table 3 below.

Table 3 Specification of the Self-Driving Bus
Item | Contents
Length × width × height | 4,800 × 1,500 × 1,965 mm
Seat number | 8
Wheelbase | 2,580 mm
Empty weight | 1,200 kg
Top speed (full load) | 28 km/h
Climbing ability (full load) | 20%
Motor | 7.5 kW / 72 V AC motor
Battery capacity | 14 kWh
Cruising endurance | ≥100 km
Brake system | Double-circuit hydraulic brake; front disc and rear drum

A. Automatic Driving System
The project uses Nvidia DRIVE™ PX, an open artificial-intelligence vehicle computing platform that instantly understands the vehicle's surroundings, accurately locates the vehicle on HD maps, and plans a safe route. Combined with deep learning, sensor fusion, and surround vision, the architecture can be expanded to support various settings. Through an integrated architecture, deep neural networks can be trained on data center systems before being deployed on vehicles.

Among existing chip manufacturers, Nvidia has the strongest command of artificial intelligence technology: from the computing module at the bottom to neural network training at the top, the application has complete hardware and software support. According to Nvidia (2018), DRIVE PX "merges data from multiple cameras, radar, and ultrasonic sensors, allowing the algorithm to accurately interpret the full 360-degree environment around the vehicle, presenting the most complete picture possible, including static and dynamic objects. Using deep neural networks to detect and classify objects greatly improves the accuracy of the sensor data after fusion."

The autonomous driving of vehicles in this project is mainly achieved through an artificial intelligence control system and deep learning core technology. The artificial intelligence control is completed by positioning and navigation (global planning) and environment awareness (local planning), which are described below.

B. Positioning Navigation Point Tracking
The first step is positioning and navigation. This plan uses an Inertial Measurement Unit (IMU)-enhanced GPS combined with high-precision maps (HD maps) to conduct accurate positioning and overall path planning through map assets and positioning components. This completes the first step of automatic driving, which determines the general direction of the vehicle's path planning and follows the path.
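The positioning-and-navigation step can be sketched in a few lines of Python. The code is illustrative only: the waypoint coordinates are invented, the 2-metre tolerance is an assumed parameter, and a flat-earth distance approximation stands in for the HD-map machinery described above. It shows the basic check performed during waypoint tracking: is the fused GPS/IMU fix still within tolerance of the planned path?

```python
import math

# Sketch of waypoint tracking: given a fused GPS/IMU position estimate,
# find the nearest waypoint on the planned path and flag a deviation when
# the distance exceeds a tolerance. Distances use a local flat-earth
# approximation, adequate over a few hundred metres.

EARTH_R = 6_371_000.0  # mean Earth radius in metres

def local_distance_m(p, q):
    """Approximate ground distance between two (lat, lon) points in degrees."""
    lat0 = math.radians((p[0] + q[0]) / 2)
    dx = math.radians(q[1] - p[1]) * math.cos(lat0) * EARTH_R
    dy = math.radians(q[0] - p[0]) * EARTH_R
    return math.hypot(dx, dy)

def nearest_waypoint(position, path):
    """Return (index, distance_m) of the waypoint closest to the position."""
    dists = [local_distance_m(position, wp) for wp in path]
    i = min(range(len(path)), key=dists.__getitem__)
    return i, dists[i]

def on_planned_path(position, path, tolerance_m=2.0):
    """True if the vehicle is within tolerance of the planned route."""
    _, d = nearest_waypoint(position, path)
    return d <= tolerance_m

# Illustrative path (coordinates are made up for the example).
path = [(25.0330, 121.5654), (25.0332, 121.5656), (25.0334, 121.5658)]
print(on_planned_path((25.033005, 121.565405), path))  # under 1 m off: True
```

A production system would project onto the path segments rather than the waypoints themselves and would fuse the IMU at a much higher rate than the GPS, but the on-path/off-path decision has this shape.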

The waypoint tracking module is the driving route planned by the navigation system according to the starting and ending points and the pre-established static map, as well as the general direction of the traveling route. During driving, dynamic obstacle avoidance is carried out by the local planner, which integrates various sensor information to reconstruct the dynamic obstacle scene.

The main work of waypoint tracking is to enable vehicles to locate themselves in real time to confirm whether they are still on the planned path. This project adopts the IMU-enhanced GPS combined with high-precision maps for real-time, precise positioning and path planning.

C. Environment Awareness
The second step is environment awareness, that is, the real-time route adjustments and small corrective decisions made by identifying objects and judging lanes while the vehicle is running. This part is controlled by AI and depends on the information provided by sensors. This plan incorporates different sensors, including a camera (image), radar, and LiDAR, to judge the target and lane, and integrates sensor feedback information for local path planning (lane selection, obstacle avoidance, etc.).

The perception model can be divided into two key functions, namely target identification and driveable-space identification. The core logic of the perception model is to identify the target and driveable space through continuous operation, so that the autonomous vehicle can understand the driveable area and the obstacles to avoid, and thus make correct route selections. The proposed sensors are a combination of image, radar echo, and optical radar sensors, so as to make up for the shortcomings of each sensor and greatly improve the sensing accuracy.

Image recognition using a photographic lens can simultaneously target identifiable objects in view and determine a specific object's possible behavior after tracking it, but it is difficult to reach sufficient accuracy to judge the distance and actual size of the object.

At this point, the radar is used to judge the distance and size of the object; the radar can also track the direction of the object, and this information can be fed back to the imaging system to assist in tracking the object.

The vehicles in the scheme are equipped with an obstacle detection system using optical sweep sensors that can detect obstacles as narrow as 7 cm at a range of 40 to 100 metres and use fail-safe detection to scan the area in front of the vehicle. The obstacle detection system plugs directly into the low-level motion controller and can interrupt navigational tasks. Obstacle detection can distinguish between actual detection targets and "ghost" detection targets, such as rain, snow, or falling leaves.

The sensor has a wide aperture and four layers of scanning, a scanning base (the vehicle's scanning path), and an extended body. Both body sections are divided into cells, and each cell is scanned several times per second to determine the presence of a target probability factor. Based on the calculated probability of the target's reaction, it is converted into an adaptive velocity curve. The vehicle will slow down when an obstacle is detected and stop if necessary. The following Table 4 lists the sensing equipment used in the project.

D. Deep Learning Method for Autopilot
Automated driving requires continuous deep learning to accumulate experience of various scenes, learn to cope with different scenarios, and establish a Deep Neural Network (DNN) model — that is, to gradually reduce the dependence of the system on human intervention, thereby upgrading the level of automation.

Table 4 Devices Adopted by the Self-Driving Bus
Device | Brand/Model | Application
Camera-A | Whetron WS CAMERA-100 | Image recognition
Camera-B | Sekonix AR0231-SF3322/AR0231-SF3323 | Auxiliary image identification
Sonic sensor (plus 1 ECU) | Whetron IRC-APA-12S-000 | Detection of objects and obstacles
Touch screen | GeChic On-Lap 1002 | Vehicle equipment operation and setting
GPS | Xsens MTi-G-710-2A8G4 (MTi-G-710 GPS/AHRS, RS232, USB) | Positioning and navigation
Lidar | Velodyne VLP-16 | Object and obstacle detection

The process of deep learning in an autonomous vehicle involves implementing various deep neural networks to sense and understand the environment, locate the vehicle on a high-resolution map, predict the behavior and location of other objects, calculate vehicle dynamics, and plan safe driving paths. The project introduces Nvidia's newly announced DGX-2 deep learning server system. Through an integrated architecture and enhanced computing power, deep neural networks can be trained in a central data center and then deployed on the vehicle, reducing the neural network training time in the data center from several months to several days.

During the process, data such as information from the controller area network (CAN), surrounding image data, and distance traveled must be collected, and the data processed (interpreted and labeled). The inference testing phase is based on the training in the inference center, and the model can then be validated (refer to Figure 1).
Fig 1 Deep Learning Process of DNNs (pipeline blocks: Image, CAN Data, Mapping, Labelling, Labelled Data, PilotNet Neural Network Training, Object Detection)

E. Infrastructure and Operation Planning

➢ High Precision Maps
HD maps contain high-precision coordinates, accurate road markers, road signs, and the number of lanes, and include mapping-layer information such as vertical slope, curvature, roll, and surrounding objects. Activity layers can be updated with real-time traffic conditions, weather, obstacles, etc.

3D video geographic information system (GIS) technology is used in high-precision map production. This technology is derived from CV (Camera Vector) technology, which combines continuous images into one and forms a real stereo space. It can be fully integrated with 2D layers of ArcGIS and GIS databases to achieve real-time interactive applications. At the same time, with LiDAR point cloud information, three-dimensional high-precision maps can be constructed.

The generation process of a high-precision map is to first integrate the 1/1000 topographic map and electronic map along each operation route of the autonomous vehicles in the field-test area. The results serve as the basic map. An image acquisition vehicle (with 2 high-level 64-128-beam LiDAR units and more than 4 cameras) is used for measurement along the route, and the 3D Video GIS technology is employed. Combined with the three-dimensional circumnavigation GIS map (LiDAR point cloud), the survey results are finally integrated based on the Image Mobile Mapping System (IMMS), and the measurement accuracy is mainly based on a three-dimensional image map of at least 10 cm resolution. After manual editing, de-identification, and image blur processing, a three-dimensional high-precision map can be constructed (the integration result of the image, point cloud, and two-dimensional high-precision electronic map). The information contains dynamic and static data. The dynamic data consists of a real-time road condition layer and a traffic event layer, while the static data includes an update layer, a traffic facility layer, a lane layer, and a road layer.

High-precision images can be obtained by the above method. The data processing technology and POS trajectory calculation require automatic image tone correction, automatic ground dead-angle compensation of the surround-view image, GNSS satellite signal processing, POS trajectory calculation, and CV image (Camera Vector) fusion. In this way, the efficiency of signal calculation against high-rise occlusion and multipath effects can be improved, the need for ground control point layouts to achieve the required accuracy is greatly reduced, and references are provided for AI learning of unmanned-vehicle navigation, obstacle avoidance, and stopping.

IV. FIELD EXPERIMENTS AND RESULTS

Field tests of the self-driving buses were initiated in February 2017 at the Taoyuan Exposition, Taiwan. During the test ride, we planned to survey passenger satisfaction by questionnaires: 1,242 questionnaires were collected before the ride and 1,223 after the ride. The sampling error was less than 2.83% at a 95% confidence level. In the answers, passengers gave 4 points for being very satisfied, 3 points for satisfied, 2 points for average, 1 point for dissatisfied, and 0 points for very dissatisfied. In terms of the level of satisfaction with safety, comfort, and overall service, the actual satisfaction with the self-driving buses was higher than the expectation before riding. In terms of specifics, 78% of passengers considered "ride stability" an important factor before riding, and 28% of passengers suggested that "ride stability" should be further improved after riding.

In July 2018, four medium-sized buses and two shuttles were produced as prototypes and tested in Lihpao Amusement Park, Taichung, Taiwan. They served on three routes connecting the parking lot, a hotel, the amusement park, and a racing track.

After that, a field test in Taoyuan MRT stations started in September 2019. A 6-meter autonomous driving bus offered regular passenger service within the train depot and traveled over 5,000 km. A traffic light was installed to test vehicle-to-infrastructure synchronization. In February 2020, 6-meter and 4-meter prototypes passed 45 scenarios in the national AV testing circuit, certified by TÜV Rheinland [10].

Starting in May 2020, two autonomous driving buses could be called at all bus stops along the 8.3 km bus lane in downtown Taipei at off-peak hours. Roadside units were developed and installed at major road intersections for comprehensive safety assessment. A 5G private network and an operation and control center were established. Traffic light status is calibrated and integrated with the system. The total mileage exceeds 2,000 km, and over 3,000 people have been onboard [10].

Then, in February 2021, a 6-meter autonomous driving bus was put into service in a residential area, on the open road with mixed traffic flow, including scooters. Seven roadside units were installed. The AV has traveled more than 3,000 km and carried over 1,000 passengers. Recently, the smart golf cart
fleet was first successfully commercially deployed in May 2022. The golf carts give golfers the safety, privacy, and freedom to take control of their pace of play. Real-time information about vehicle statuses can be accessed by the golf course owner or operator via administrative systems [10].

V. DISCUSSION

The design of the self-driving bus has been verified as successful through the road tests. Meanwhile, the platform of the self-driving bus has been successfully implemented in various categories of self-driving buses. However, problems faced in the field tests merit further discussion and investigation.

A. End-to-End Deep Learning
In terms of technology, as the driving location is constantly changing and there is no fixed scene to be used as a reference, the LiDAR cannot be used for Simultaneous Localization and Mapping (SLAM), and GPS combined with a high-precision map cannot be used to realize positioning and navigation. In addition, the vehicle dynamic control algorithm was not perfect at that time, and it was difficult to accurately control the steering. In order to make the vehicle run automatically, the plan decided to adopt "end-to-end deep learning".

"End-to-end deep learning" means that, given the input and output data, a deep neural network can be trained to "automatically learn" how to go from one location to another without having to manually establish an end-to-end processing pipeline. This project uses the PilotNet neural network built by Nvidia. The input data are the actual road images, and the output data are the steering control data of human drivers. The trained PilotNet model can judge how to control the vehicle according to new input images.

The fascinating thing about deep learning is that models surprise, sometimes learning more than expected. Because we train the model on human driving behavior in real road conditions, the model may learn more than the intended goal. For example, after the program ended, the vehicles were moved to a new location for testing. Not only did they recognize the white lane markings and follow them without any advance work, but they also recognized the parking lanes and parked in them. They also slightly avoided pedestrians, which means they could draw parallels.

However, relying solely on the results of deep learning has its drawbacks. When the data are not diverse enough, the model cannot really exert its power. For example, the "road" defined in this plan is white lane markings, so the model only recognizes white lines; if there is no white line to refer to, it will fail, but an actual road may contain no markings, different road materials, and so on. This is the limitation of the deep learning model.

In the approach to artificial intelligence, rule-based decision models and deep learning models both excel, and the transition from the former to the latter should be a gradual process, much like the human concept of learning new things: advanced learners use their knowledge to figure out how to handle new situations and formulate rules within that framework, while the younger generation will use textbooks to get started. After accumulating enough basic knowledge, they will naturally find that there are too many things that textbooks cannot cover, so they begin to read extensively, collect a lot of information, and finally learn from one another. On the other hand, if starting from scratch means taking in a lot of information aimlessly, humans will have no control over what they learn.

B. Obstacle Avoidance and Detour Functions
At that time, this plan did not achieve the functions of obstacle avoidance and rerouting. During the Nongbo operation, the vehicle decided to stop after detecting obstacles and only continued to drive after the obstacles were cleared. The decision-making process for making a detour is quite complicated. It is necessary to first detect obstacles in front of the predetermined route, then judge all alternative routes that can be taken (map information, traffic rules, etc., must be checked at the same time), and then judge whether there are obstacles on each alternative path, so as to select the most appropriate alternative path and control vehicle steering and speed. Machine learning alone is far from enough; a high degree of integration of positioning, navigation, perception, decision-making, and other modules is required. These are the key items of follow-up research and development for the implementation team of this plan.

C. Future Development Proposals and Demonstrations on the Open Road
The operation field of the project is a closed area, and there are still many challenges to be overcome in future operations on actual roads with mixed traffic flow. Closed areas may replicate all the static elements of a real road, such as roads, markers, and signals, but they still lack the most difficult elements: pedestrians and vehicles. In other words, if a self-driving car is to operate on the open road, it must first be able to deal with people and cars.

After this plan was completed, the team continued to develop the autonomous driving system (refer Table 2), gradually building full autonomous driving capabilities to enable vehicles to achieve SAE Level 4 and even Level 5 on the open road. Many of the projects in the above architecture could and should be validated first in closed or simulated fields. For example, localization, partial decision-module functions (lane following, bypass), and vehicle control, as well as some basic perception and corresponding decisions, can also be tested in closed areas. The experience of testing in closed areas can be used as a basis for application on open roads.

As for how to deal with the complex mixed traffic flow, especially scooters, it is still necessary to identify scooters visually through deep learning artificial intelligence, and then try to explore the behavior patterns of scooters in simulators. Through thousands of simulations and models, real-world data will continue to accumulate. There are no shortcuts to this learning process. In addition, the speed of system identification is limited due to the
limitations of the computing power of the vehicle computing platform (or rather, platforms with high computing power are highly expensive and not commercially competitive). Over time, as performance improves and prices fall, devices will become more capable of instantly identifying large numbers of moving objects at high speeds.

In addition, Vehicle to Everything (V2X) technology can be used to assist autonomous vehicles in coping with people and vehicle dynamics on the road. The installation of V2X devices at intersections with heavy traffic and complex vehicular and human behavior should help autonomous vehicles detect the movement of vehicles in blind spots, out of sight, and coming from the side. The content of V2X is extensive and diverse. However, for autonomous vehicles, if roadside Infrastructure-to-Vehicle (I2V) communication can be realized, and cameras, radars, and other devices on the roadside can sense people and vehicles and then instantly transmit that data to the computing platforms of autonomous vehicles, the safety of self-driving cars in mixed traffic flow will be improved.
Fig 2 Autonomous Driving System Development Project
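The roadside-to-vehicle (I2V) relay described above can be illustrated with a minimal sketch. The JSON message schema, field names, and lane-filter thresholds here are invented for illustration; production deployments use standardized message sets such as SAE J2735 carried over DSRC or C-V2X.

```python
import json
import time

def encode_rsu_message(rsu_id, detections):
    """Serialize roadside detections (hypothetical schema) for broadcast.
    Each detection: object class, position in metres (local frame), speed in m/s."""
    return json.dumps({
        "rsu_id": rsu_id,
        "timestamp": time.time(),  # a real system would check message age
        "objects": [
            {"cls": c, "x": x, "y": y, "speed": v} for (c, x, y, v) in detections
        ],
    }).encode("utf-8")

def objects_in_path(payload, lane_half_width=1.75, horizon=50.0):
    """Vehicle-side filter: keep objects near the ego lane within `horizon` metres."""
    msg = json.loads(payload.decode("utf-8"))
    return [o for o in msg["objects"]
            if abs(o["y"]) <= lane_half_width and 0.0 <= o["x"] <= horizon]

# An RSU sees a scooter cutting in ahead and a pedestrian far down the road.
payload = encode_rsu_message("RSU-07", [("scooter", 18.0, 1.2, 8.3),
                                        ("pedestrian", 80.0, 0.5, 1.4)])
print([o["cls"] for o in objects_in_path(payload)])  # ['scooter']
```

A real deployment would additionally authenticate the roadside unit, discard stale messages by timestamp, and fuse the relayed detections with the vehicle's own perception rather than acting on them directly.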

VI. CONCLUSIONS

The development and commercialization of autonomous vehicles have been accelerated by emerging technologies. Almost $80 billion has been invested in self-driving technology to reduce collisions, save energy, and improve traffic conditions. In response to this trend, governments and industry leaders have proposed self-driving buses. With government support, Taiwanese firms have developed self-driving buses that integrate automotive parts and systems and use artificial intelligence and deep learning for human-like driving behavior.

In the future, the government should grasp the momentum of the global development of self-driving cars, integrate relevant ministerial resources, improve the testing environment and regulations, build an autonomous vehicle platform, and discuss with relevant industry, education, and research institutions the application of self-driving cars in services such as park and rural-area connections and public transport station connections.

REFERENCES

[1]. S. A. Bagloee, M. Tavana, M. Asadi, and T. Oliver, "Autonomous vehicles: challenges, opportunities, and future implications for transportation policies," Journal of Modern Transportation, vol. 24, no. 4, pp. 284-303, 2016.
[2]. E. Guizzo, "How Google's self-driving car works," IEEE Spectrum, vol. 18, no. 7, pp. 1132-1141, 2011.
[3]. J. Markoff. (2010, March 20). Google Cars Drive Themselves, in Traffic. Available: https://www.nytimes.com/2010/10/10/science/10google.html
[4]. European Automobile Manufacturers Association, The Automobile Industry Pocket Guide 2015-2016. Brussels, Belgium: European Automobile Manufacturers' Association, 2015.
[5]. J. A. H. Nieuwenhuijsen, Diffusion of Automated Vehicles: A Quantitative Method to Model the Diffusion of Automated Vehicles with System Dynamics. Delft, Netherlands: Delft University of Technology, 2015.
[6]. T. Litman, Autonomous Vehicle Implementation Predictions. Victoria, Canada: Victoria Transport Policy Institute, 2017.
[7]. R. Lanctot, Accelerating the Future: The Economic Impact of the Emerging Passenger Economy. Boston, MA, USA: Strategy Analytics, 2017.
[8]. B. Marr. (2021, March 20). The 5 Biggest Connected and Autonomous Vehicle Trends in 2022. Available: https://www.forbes.com/sites/bernardmarr/2021/12/20/the-5-biggest-connected-and-autonomous-vehicle-trends-in-2022/amp/
[9]. G. Rudolph and U. Voelzke. (2017, March 27). Three Sensor Types Drive Autonomous Vehicles. Available: https://www.fierceelectronics.com/components/three-sensor-types-drive-autonomous-vehicles
[10]. Turing Drive. (2022, March 27). Milestones. Available: https://turing-drive.com/en/about-en/