The Development of A Self-Driving Bus
ISSN No:-2456-2165
Abstract:- Emerging technologies have led to significant advancements in vehicle technology, particularly in the development and commercialization of autonomous vehicles. Self-driving technology aims to reduce collisions, save energy, and improve traffic conditions, and major car manufacturers and technology companies have invested nearly $80 billion in research and development. Experts predict that by 2030, self-driving cars will be reliable and affordable enough to replace most traditional vehicles. The market for autonomous driving is expected to reach $800 billion by 2035 and $7 trillion by 2050, potentially saving 585,000 lives between 2035 and 2045. The future of mobility will bring significant changes to the environment, with smarter and more connected roads, towns, and cities. Most people will likely experience autonomous driving through privately rented or shared-ride vehicles. In response to this trend, governments and industry leaders have proposed self-driving bus solutions. Supported by the government, Taiwanese firms have successfully developed a self-driving bus by integrating automotive parts and systems and using artificial intelligence and deep learning for human-like vision and driving behavior. The self-driving bus has been verified through field tests, and its platform has been commercialized.

Keywords:- Self-Driving Bus, Autonomous Vehicle, Artificial Intelligence, Innovation.

I. INTRODUCTION

In recent years, emerging technologies in the fields of information, communication, and automatic control have led to major changes in vehicle technology. One of the most significant is the development and commercialization of autonomous vehicle technology. Self-driving technology aims to reduce collisions, save energy, cut carbon emissions, and improve traffic conditions [1], and it is bound to become the mainstream of transportation technology in the future.

In the self-driving era, cars will be more efficient and comfortable to use. To realize the vision of autonomous driving and seize the associated business opportunities, the world's major car manufacturers (such as Tesla and Toyota), technology companies (such as Google [2, 3], Apple, and Nvidia), and new entrants (such as Uber, Lyft, and Zoox) have invested heavily in research and development. So far, the cumulative investment has been nearly 80 billion dollars [1, 4, 5]. In recent years, the Consumer Electronics Show (CES) held in Las Vegas has become the leading venue for companies to showcase automotive technology and artificial intelligence. Experts therefore predict optimistically that by 2030, self-driving cars will be sufficiently reliable, acceptably priced, and good enough to replace most traditional vehicles that require human driving [6]. According to a recent analysis by Lanctot [7], autonomous driving is a huge opportunity, with the market estimated to reach $800 billion by 2035 and $7 trillion by 2050. Because of the reduction in collisions, studies have also predicted that autonomous driving could save an estimated 585,000 lives between 2035 and 2045 [7].

According to Marr [8], the future of mobility will bring significant changes to the environment we live in, with roads, towns, and cities becoming smarter and more connected. These advancements will lead to some of the most exciting changes in mobility, both in the coming years and beyond. For most people, their first experience with autonomous driving is likely to be in a fleet of privately rented or shared-ride vehicles rather than in a car they own themselves.

Facing this megatrend, national governments and leaders in the automobile industry have, over the past years, proposed the use of self-driving buses. Supported by the government, Taiwanese firms have also initiated the development of self-driving vehicles in general, and self-driving buses in particular. The research has integrated automotive parts and systems, made by both local and global firms, into electric and autonomous vehicles. Artificial intelligence that mimics human vision and driving behavior is achieved by way of deep learning from data collected by high-definition (HD) cameras. After continuous training and validation, the vehicle can recognize the planned path and its surroundings, enabling an autonomous driving model. The self-driving bus has been successfully demonstrated in numerous field tests in Taiwan. Meanwhile, derivatives of the self-driving bus platform have been successfully commercialized.

The development of international automatic driving systems and of self-driving buses in advanced countries will be reviewed in Section II. Subsequently, the design of the self-driving bus will be introduced in Section III. Results of verifications and field tests of the self-driving bus will be presented in Section IV. Problems and challenges faced by the self-driving bus will be discussed in Section V. Section VI concludes the work with suggestions for future development.
III. DESIGN OF THE SELF-DRIVING BUS

In this section, the automatic driving system and the vehicle sensing equipment are described, covering the automatic driving system itself, positioning and navigation (waypoint tracking), environment awareness, the deep learning method for the autopilot, and infrastructure and operation planning. When fully loaded, the maximum speed of the vehicle is 28 km/h, the climbing ability is more than 20%, the endurance is more than 100 km, and the charging time is about 10-12 hours. This performance is sufficient for at least six hours of daily operation at the test site. Vehicle specifications are detailed in Table 3 below.

Table 3 Specification of the Self-Driving Bus
Item | Contents
Length × width × height | 4,800 × 1,500 × 1,965 mm
Seat number | 8
Wheelbase | 2,580 mm
Empty weight | 1,200 kg
Top speed (full load) | 28 km/h
Climbing ability (full load) | 20%
Motor | 7.5 kW / 72 V AC motor
Battery capacity | 14 kWh
Cruising endurance | ≥100 km
Brake system | Double-circuit hydraulic brake; front disc and rear drum

A. Automatic Driving System
The project uses Nvidia DRIVE™ PX, an open artificial-intelligence vehicle computing platform that instantly understands the vehicle's surroundings, accurately locates the vehicle on HD maps, and plans a safe route. Combined with deep learning, sensor fusion, and surround vision, the architecture can be expanded to support various settings. Through an integrated architecture, deep neural networks can be trained on data-center systems before being deployed on vehicles.

Among existing chip manufacturers, Nvidia has the most complete mastery of artificial-intelligence technology: from the computing module at the bottom to neural-network training at the top, the application has full hardware and software support. According to Nvidia (2018), DRIVE PX "merges data from multiple cameras, radar, and ultrasonic sensors, allowing the algorithm to accurately interpret the full 360-degree environment around the vehicle, presenting the most complete picture possible, including static and dynamic objects. Using deep neural networks to detect and classify objects greatly improves the accuracy of the sensor data after fusion."

The autonomous driving of the vehicles in this project is mainly achieved through an artificial-intelligence control system with deep learning as the core technology. The artificial-intelligence control is accomplished by positioning and navigation (global planning) and environment awareness (local planning), which are described below.

B. Positioning Navigation Point Tracking
The first step is positioning and navigation. This plan uses an Inertial Measurement Unit (IMU)-enhanced GPS combined with high-precision maps (HD maps) to conduct accurate positioning and overall path planning through map assets and positioning components. This completes the first step of automatic driving, which determines the general direction of the vehicle's path planning and allows the vehicle to follow the path.
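The waypoint-tracking idea above, checking in real time whether the fused GPS/IMU position still lies on the planned path, can be illustrated with a minimal sketch. The function names, coordinates, and the 0.5 m tolerance below are illustrative assumptions, not values from the project:

```python
import math

def cross_track_error(position, path):
    """Shortest distance (m) from the vehicle position to the planned
    path, given as a polyline of waypoints in local x/y metres."""
    px, py = position
    best = float("inf")
    for (ax, ay), (bx, by) in zip(path, path[1:]):
        # Project the position onto segment a-b, clamped to the segment.
        dx, dy = bx - ax, by - ay
        seg_len_sq = dx * dx + dy * dy
        t = 0.0 if seg_len_sq == 0 else max(
            0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
        cx, cy = ax + t * dx, ay + t * dy
        best = min(best, math.hypot(px - cx, py - cy))
    return best

def on_planned_path(position, path, tolerance_m=0.5):
    """True if the fused GPS/IMU fix is within tolerance of the path."""
    return cross_track_error(position, path) <= tolerance_m

# Example: a straight 100 m path, one fix 0.3 m off the centreline
# and one fix 2.0 m off it.
path = [(0.0, 0.0), (50.0, 0.0), (100.0, 0.0)]
print(on_planned_path((42.0, 0.3), path))  # -> True (still on path)
print(on_planned_path((42.0, 2.0), path))  # -> False (off path)
```

In a real system the deviation would feed a steering controller rather than a boolean check, but the projection-and-threshold structure is the same.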
The main work of waypoint tracking is to enable the vehicle to locate itself in real time and confirm whether it is still on the planned path. This project adopts the IMU-enhanced GPS combined with high-precision maps for real-time positioning, precise localization, and path planning.

C. Environment Awareness
The second step is environment awareness, that is, the real-time route adjustments and small-amplitude change decisions made by identifying objects and judging lanes while the vehicle is running. This part is controlled by AI and depends on the information provided by the sensors. The plan incorporates different sensors, including a camera (image), radar, and LiDAR, to judge targets and lanes, and integrates the sensor feedback for local path planning (lane selection, obstacle avoidance, etc.).

The perception model can be divided into two key functions, namely target identification and drivable-space identification. The core logic of the perception model is to identify targets and the drivable space through continuous operation, so that the autonomous vehicle understands the drivable area and the obstacles to avoid and can make correct route selections. The proposed sensors are a combination of image, radar echo, and optical radar (LiDAR) sensors, which compensate for the shortcomings of each individual sensor and greatly improve the sensing accuracy.

Image recognition using a photographic lens can simultaneously identify objects in view and, after tracking a specific object, predict its possible behavior; however, it is difficult for a camera alone to judge the distance and actual size of an object with sufficient accuracy.

The radar is therefore used to judge the distance and size of the object. It can also track the direction of the object, and this information can be fed back to the imaging system to assist in tracking.

The vehicles in the scheme are additionally equipped with an obstacle detection system using optical sweep sensors that can detect obstacles as small as 7 cm wide at a range of 40 to 100 metres and use fail-safe detection to scan the area in front of the vehicle. The obstacle detection system plugs directly into the low-level motion controller and can interrupt navigational tasks. Obstacle detection can distinguish between actual detection targets and "ghost" detection targets such as rain, snow, or falling leaves.

D. Deep Learning Method for the Autopilot
Automated driving requires continuous deep learning to accumulate experience of various scenes, learn to cope with different scenarios, and establish a deep neural network (DNN) model, that is, to gradually reduce the system's dependence on human intervention and thereby raise the level of automation.

Table 4 Devices Adopted by the Self-Driving Bus
Device | Brand/Model | Application
Camera-A | Whetron WS CAMERA-100 | Image recognition
Camera-B | Sekonix AR0231-SF3322/AR0231-SF3323 | Auxiliary image identification
Sonic sensor (plus 1 ECU) | Whetron IRC-APA-12S-000 | Detection of objects and obstacles
Touch screen | GeChic On-Lap1002 | Vehicle equipment operation and setting
GPS | Xsens MTi-G-710 (MTi-G-710-2A8G4) GPS/AHRS, RS232/USB | Positioning and navigation
Lidar | Velodyne VLP-16 | Object and obstacle detection

The process of deep learning in an autonomous vehicle involves implementing various deep neural networks to sense and understand the environment, locate the vehicle on a high-resolution map, predict the behavior and location of other objects, calculate vehicle dynamics, and plan safe driving paths. The project introduces Nvidia's newly announced DGX-2 deep learning server system. Through its integrated architecture and enhanced computing power, deep neural networks can be trained in a central data center and then deployed on the vehicle, reducing the neural-network training time in the data center from several months to several days.

During this process, data such as information from the controller area network (CAN) bus, surrounding image data, and distance traveled must be collected, and the data must be processed (interpreted and labeled). The inference testing phase builds on the data-center training, and the model can then be validated (refer to Figure 1).
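The camera/radar complementarity described above, where the camera classifies objects but cannot judge distance and the radar supplies range, can be sketched as a minimal association step. All class names, bearings, ranges, and the 3-degree matching threshold below are hypothetical illustrations, not values from the project:

```python
from dataclasses import dataclass

@dataclass
class CameraDetection:
    label: str           # object class from image recognition
    bearing_deg: float   # direction of the object relative to the vehicle

@dataclass
class RadarReturn:
    bearing_deg: float
    range_m: float       # distance the camera alone cannot judge

def fuse(camera_dets, radar_returns, max_bearing_gap_deg=3.0):
    """Pair each camera detection with the radar return closest in
    bearing, so a classified object also gets an accurate distance."""
    fused = []
    for det in camera_dets:
        candidates = [r for r in radar_returns
                      if abs(r.bearing_deg - det.bearing_deg) <= max_bearing_gap_deg]
        if candidates:
            nearest = min(candidates,
                          key=lambda r: abs(r.bearing_deg - det.bearing_deg))
            fused.append((det.label, nearest.range_m))
        else:
            fused.append((det.label, None))  # camera-only: distance unknown
    return fused

cams = [CameraDetection("pedestrian", 10.2), CameraDetection("car", -25.0)]
radar = [RadarReturn(9.8, 18.5), RadarReturn(-24.5, 42.0)]
print(fuse(cams, radar))  # -> [('pedestrian', 18.5), ('car', 42.0)]
```

A production perception stack would associate tracks over time and in full 3-D rather than by a single bearing angle, but the pairing logic conveys how the two sensors cover each other's weaknesses.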
The development and commercialization of autonomous vehicles have been accelerated by emerging technologies. Almost $80 billion has been invested in self-driving technology to reduce collisions, save energy, and improve traffic conditions. In response to this trend, governments and industry leaders have proposed self-driving buses. With government support, Taiwanese firms have developed self-driving buses that integrate automotive parts and systems and use artificial intelligence and deep learning for human-like driving behavior.

In the future, the government should grasp the momentum of the global development of self-driving cars, integrate the resources of the relevant ministries, improve the testing environment and regulations, build an autonomous vehicle platform, and discuss with the relevant industry, education, and research institutions the application of self-driving cars to services such as connections within parks and rural areas, connections to public transport stations, and other services.

REFERENCES

[1]. S. A. Bagloee, M. Tavana, M. Asadi, and T. Oliver, "Autonomous vehicles: challenges, opportunities, and future implications for transportation policies," Journal of Modern Transportation, vol. 24, no. 4, pp. 284-303, 2016.
[2]. E. Guizzo, "How Google's self-driving car works," IEEE Spectrum, vol. 18, no. 7, pp. 1132-1141, 2011.
[3]. J. Markoff. (2010, March 20). Google Cars Drive Themselves, in Traffic. Available: https://www.nytimes.com/2010/10/10/science/10google.html
[4]. European Automobile Manufacturers' Association, The Automobile Industry Pocket Guide 2015-2016. Brussels, Belgium: European Automobile Manufacturers' Association, 2015.
[5]. J. A. H. Nieuwenhuijsen, Diffusion of Automated Vehicles: A Quantitative Method to Model the Diffusion of Automated Vehicles with System Dynamics. Delft, Netherlands: Delft University of Technology, 2015.