US011392132B2

(12) United States Patent
Song et al.

(10) Patent No.: US 11,392,132 B2
(45) Date of Patent: Jul. 19, 2022

(54) GENERATIVE ADVERSARIAL NETWORK ENRICHED DRIVING SIMULATION

(71) Applicant: Pony AI Inc., Grand Cayman (KY)

(72) Inventors: Hao Song, Sunnyvale, CA (US); Jun Peng, Fremont, CA (US); Nengxiu Deng, Fremont, CA (US); Sinan Xiao, Mountain View, CA (US); Tao Qin, Sunnyvale, CA (US); Tiancheng Lou, Milpitas, CA (US); Tianyi Li, Milpitas, CA (US); Xiang Yu, Santa Clara, CA (US); Yubo Zhang, Los Gatos, CA (US)

(73) Assignee: Pony AI Inc., Grand Cayman (KY)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 31 days.

(21) Appl. No.: 17/014,818

(22) Filed: Sep. 8, 2020

(65) Prior Publication Data: US 2020/0409380 A1, Dec. 31, 2020

Related U.S. Application Data
(63) Continuation of application No. 16/043,706, filed on Jul. 24, 2018, now Pat. No. 10,768,629.

(51) Int. Cl.:
G05D 1/00 (2006.01)
G06N 20/00 (2019.01)
(Continued)

(52) U.S. Cl.:
CPC ... G05D 1/0221 (2013.01); B60W 50/06 (2013.01); G05D 1/0088 (2013.01); (Continued)

(58) Field of Classification Search:
CPC ... G05D 1/0221; G05D 1/0088; G05D 2201/0213; G06N 3/0454; G06N 3/088; (Continued)

(56) References Cited

U.S. PATENT DOCUMENTS
10,346,450 B2 7/2019 Noguero et al.
10,423,647 B2 9/2019 Llagostera et al.
(Continued)

FOREIGN PATENT DOCUMENTS
CA 3045439 A1 11/2017
CN 109190648 A 1/2019 ... G06K 9/6256
(Continued)

OTHER PUBLICATIONS
Junxiao Shen et al.'s "Imaginative Generative Adversarial Network ...," Department of Engineering, University of Cambridge, United Kingdom (Year: 2021).
(Continued)

Primary Examiner – Cuong H Nguyen
(74) Attorney, Agent, or Firm — Sheppard Mullin Richter & Hampton LLP

(57) ABSTRACT

A computer-implemented method and a system for training a computer-based autonomous driving model used for an autonomous driving operation by an autonomous vehicle are described. The method includes: creating time-dependent three-dimensional (3D) traffic environment data using at least one of real traffic element data and simulated traffic element data; creating simulated time-dependent 3D traffic environmental data by applying a time-dependent 3D generative adversarial network (GAN) model to the created time-dependent 3D traffic environment data; and training a computer-based autonomous driving model using the simulated time-dependent 3D traffic environmental data.

20 Claims, 5 Drawing Sheets
[Representative drawing on the front page: FIG. 1, reproduced as Sheet 1 of 5 below.]
(51) Int. Cl.:
G05D 1/02 (2020.01)
G06F 30/27 (2020.01)
G06N 3/04 (2006.01)
G06N 3/08 (2006.01)
B60W 50/06 (2006.01)
B60W 50/00 (2006.01)

(52) U.S. Cl.:
CPC ... G06N 3/0454 (2013.01); G06N 3/088 (2013.01); G06N 20/00 (2019.01); B60W 2050/0028 (2013.01); B60W 2050/0088 (2013.01); G05D 2201/0213 (2013.01)

(58) Field of Classification Search:
CPC ... G06N 20/00; B60W 2050/0088; B60W 2050/0028; B60W 50/06; G01C 21/3602; G01C 21/3617; G01C 21/3638; G01C 21/3655
USPC ... 701/23
See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS
10,678,244 B2* 6/2020 Iandola ... B60W 40/02
10,768,629 B2* 9/2020 Song ... G06N 3/0472
11,270,165 B2* 3/2022 Atsmon ... G06K 9/6257
2018/0165554 A1 6/2018 Zhang et al.
2018/0284745 A1* 10/2018 Celia ... G01M 13/04
2018/0349526 A1 12/2018 Atsmon et al.
2019/0132343 A1 5/2019 Chen et al.
2019/0147582 A1 5/2019 Lee et al.
2019/0223571 A1 7/2019 Atsmon
2019/0228110 A1 7/2019 Yan et al.
2019/0303759 A1* 10/2019 Farabet ... G06F 9/455
2019/0311298 A1* 10/2019 Kopp ... G06K 9/6256
2019/0325264 A1 10/2019 Keserich et al.
2019/0339684 A1 11/2019 Cella et al.
2019/0356588 A1 11/2019 Shahraray et al.
2019/0392596 A1 12/2019 Yang
2020/0033866 A1* 1/2020 Song ... G06N 3/088
2020/0074266 A1* 3/2020 Peake ... G05D 1/0231
2020/0129780 A1 4/2020 Lachaine et al.
2020/0129784 A1 4/2020 Beriault et al.
2020/0133255 A1 4/2020 Cella et al.
2020/0150643 A1 5/2020 Cella et al.
2021/0056863 A1* 2/2021 Stefanescu
2021/0221404 A1* 7/2021 Reiner ... G05D 1/0055
2021/0286923 A1* 9/2021 Kristensen ... G06N 3/088
2021/0294944 A1* 9/2021 Nassar ... G06F 11/3688
2021/0302169 A1* 9/2021 Xie ... G01C 21/32
2021/0312244 A1* 10/2021 Atsmon ... G06T 11/00

FOREIGN PATENT DOCUMENTS
CN 112256589 A* 1/2021 ... G06F 11/3668
CN 112529208 A 3/2021 ... G06K 9/6267
WO 2017196821 A1 11/2017
WO 2018209894 A1 11/2018
WO WO-2020205655 A1 10/2020 ... G05D 1/0088

OTHER PUBLICATIONS
18 Impressive Applications of Generative Adversarial Networks (GANs), by Jason Brownlee, Jun. 14, 2019, in Generative Adversarial Networks (Year: 2019).
Deep Learning — an Opportunity and a Challenge for Geo- and Astrophysics, Christian Reimers MSc and Christian Requena-Mesa MSc, in Knowledge Discovery in Big Data from Astronomy and Earth Observation, 2020, Chapter 13.4.3.2 Generative Adversarial Networks (Year: 2020).
C. Wu et al., "Spatiotemporal Scenario Generation of Traffic Flow Based on LSTM-GAN," in IEEE Access, vol. 8, pp. 186191-186198, 2020, doi: 10.1109/ACCESS.2020.3029230 (Year: 2020).
Z. Wang, H. Zhu, M. He, Y. Zhou, X. Luo and N. Zhang, "GAN and Multi-Agent DRL Based Decentralized Traffic Light Signal Control," in IEEE Transactions on Vehicular Technology, vol. 71, no. 2, pp. 1333-1348, Feb. 2022, doi: 10.1109/TVT.2021.3134329 (Year: 2022).
I. Gilitschenski, G. Rosman, A. Gupta, S. Karaman and D. Rus, "Deep Context Maps: Agent Trajectory Prediction Using Location-Specific Latent Maps," in IEEE Robotics and Automation Letters, vol. 5, no. 4, pp. 5097-5104, Oct. 2020, doi: 10.1109/LRA.2020.3004800 (Year: 2020).
Q. Sun and L. Cai, "Multi-AUV Target Recognition Method Based on GAN-meta Learning," 2020 5th International Conference on Advanced Robotics and Mechatronics (ICARM), 2020, pp. 374-379, doi: 10.1109/ICARM49381.2020.9195289 (Year: 2020).

* cited by examiner
[FIG. 1 (Sheet 1 of 5): System 100 for training a computer-based autonomous driving model. A 4D Traffic Environmental Data Generating System 104 (Real Traffic Data Processing Engine 110, Simulated Traffic Data Processing Engine 112), a 4D GAN Model Management System 106 (4D GAN Discriminating Engine 114, 4D GAN Generating Engine 116), and an Autonomous Driving Simulating System 108 (Virtual Traffic Provisioning Engine 118, Virtual Drive Simulating Engine 120) are connected through Network 102.]
[FIG. 2 (Sheet 2 of 5): Flowchart 200. Start; create 4D traffic environment data using real and/or simulated traffic element data (202); create/update a 4D GAN model through an adversarial machine learning process (204); create simulated 4D traffic environmental data by applying the created 4D GAN model to the created 4D traffic environmental data (206); train an autonomous driving model using the created simulated 4D traffic environmental data (208); perform a real-world autonomous driving operation using the trained autonomous driving model (210).]
[FIG. 3 (Sheet 3 of 5): Flowchart 300 of the adversarial machine learning process. 4D GAN Discriminating Engine path: Start; receive 4D GAN discriminator training data (302); perform discrimination analysis of the received training data to generate a discrimination result (304); match the generated discrimination result with supervisory data to generate a training result (306); modify parameter values of the 4D GAN discriminator sub-model based on the training result (308). 4D GAN Generating Engine path: Start; generate simulated 4D traffic environmental data using the 4D GAN generator sub-model (312); provide the simulated 4D traffic environmental data for creating 4D GAN discriminator training data (314); receive the training result generated by executing the 4D GAN discriminator sub-model (316); modify parameter values of the 4D GAN generator sub-model (318).]
[FIG. 4 (Sheet 4 of 5): Flowchart 400. Start; receive simulated 4D traffic environmental data (402); render the received simulated 4D traffic environmental data to generate a virtual photorealistic 4D traffic environment (404); carry out a virtual autonomous driving operation in the generated virtual photorealistic 4D traffic environment (406); obtain a virtual autonomous driving result of the virtual autonomous driving operation (408); modify parameter values of an autonomous driving model based on the virtual autonomous driving result (410).]
[FIG. 5 (Sheet 5 of 5): Block diagram of computer system 500: processor(s) 504 and network interface(s) 518 coupled to bus 502, together with main memory 506, ROM 508, storage 510, output device(s) 512, input device(s) 514, and cursor control 516.]
GENERATIVE ADVERSARIAL NETWORK ENRICHED DRIVING SIMULATION

CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation of U.S. application Ser. No. 16/043,706, filed Jul. 24, 2018, the content of which is incorporated herein by reference in its entirety.

BACKGROUND

Today many researchers are conducting research on autonomous driving, and autonomous driving has been developing rapidly as a result. One of the core technologies of autonomous driving is an autonomous driving model, such as an autonomous driving algorithm, configured to make decisions about the behavior of a vehicle depending on the surrounding traffic environment. To improve the autonomous driving model for safer and more efficient driving, machine learning is typically employed: test autonomous driving is carried out in a real or virtual traffic environment using the autonomous driving model to be tested, and the autonomous driving model is improved according to a machine learning algorithm based on the test result. Test autonomous driving in a virtual traffic environment may be advantageous in cost, public safety, and time efficiency compared to test autonomous driving in a real traffic environment. However, a virtual traffic environment produced by current technology may not be identical to the real traffic environment, and the virtual traffic environment may need to be brought closer to the real traffic environment to reflect real environmental conditions.

These and other issues are addressed, resolved, and/or reduced using techniques described herein. The foregoing examples of the related art and limitations related therewith are intended to be illustrative and not exclusive. Other limitations of the related art will become apparent to those of skill in the relevant art upon a reading of the specification and a study of the drawings.

SUMMARY

Described herein are a method and a system for training a computer-based autonomous driving model, which can be used for an autonomous driving operation by an autonomous vehicle. The system includes one or more processors and a memory storing instructions that, when executed by the one or more processors, cause the one or more processors to perform the method.

In one embodiment, the disclosure describes a computer-implemented method including: creating time-dependent three-dimensional (3D) traffic environment data using at least one of real traffic element data and simulated traffic element data; creating simulated time-dependent 3D traffic environmental data by applying a time-dependent 3D generative adversarial network (GAN) model to the created time-dependent 3D traffic environment data; and training a computer-based autonomous driving model using the simulated time-dependent 3D traffic environmental data.

In some embodiments, the method may further comprise creating the time-dependent 3D generative adversarial network (GAN) model through an adversarial machine learning process of a time-dependent 3D (also referred to simply as "four-dimensional (4D)") GAN discriminator sub-model and a time-dependent 3D GAN generator sub-model of the time-dependent 3D GAN model.

In some embodiments, the adversarial machine learning process of the time-dependent 3D GAN discriminator sub-model may comprise: receiving time-dependent 3D GAN discriminator training data from the time-dependent 3D GAN generator sub-model; and performing, using the time-dependent 3D GAN discriminator sub-model, discrimination analysis of the received time-dependent 3D GAN discriminator training data to generate a discrimination result indicating whether the time-dependent 3D GAN discriminator sub-model determined that the time-dependent 3D GAN discriminator training data represents real-world time-dependent 3D traffic environmental data or simulated time-dependent 3D traffic environmental data. The adversarial machine learning process of the time-dependent 3D GAN discriminator sub-model may further comprise: performing matching of the generated discrimination result with supervisory data indicating whether the time-dependent 3D GAN discriminator training data represents real-world time-dependent 3D traffic environmental data or simulated time-dependent 3D traffic environmental data, to generate a training result indicating a trained level of the time-dependent 3D GAN discriminator sub-model; and modifying parameter values of the time-dependent 3D GAN discrimination sub-model based on the training result.

In some embodiments, the adversarial machine learning process of the time-dependent 3D GAN generator sub-model may comprise: generating, using the time-dependent 3D GAN generator sub-model, simulated time-dependent 3D traffic environmental data; and providing the generated simulated time-dependent 3D traffic environmental data for creating time-dependent 3D GAN discriminator training data to be used by the time-dependent 3D GAN discriminator sub-model. The adversarial machine learning process of the time-dependent 3D GAN generator sub-model may further comprise: receiving the training result from the time-dependent 3D GAN discriminator sub-model; and modifying parameter values of the time-dependent 3D GAN generator sub-model based on the training result.

In some embodiments, the method may further comprise performing a real-world autonomous driving operation using the trained computer-based autonomous driving model.

In some embodiments, the simulated time-dependent 3D traffic environmental data may include object movement data indicating irregular movement of objects around roads. In some embodiments, the real traffic element data is used to create the time-dependent 3D traffic environment data, and the real traffic element data may include at least one of geographical mapping data, traffic sign data, and traffic signal data of a real-world geographical region. In some embodiments, the simulated traffic element data is used to create the time-dependent 3D traffic environment data, and the simulated traffic element data may include at least one of simulated weather data, simulated traffic signal change data, simulated pedestrian data, and simulated obstacles data.
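For illustration only, the adversarial update summarized above can be sketched in code. The following sketch assumes a PyTorch setup in which time-dependent 3D traffic environment data have been flattened into fixed-size feature vectors; the layer sizes and names are hypothetical and are not taken from this disclosure.

```python
# Minimal, illustrative sketch of the adversarial update described above,
# assuming PyTorch and time-dependent 3D traffic environment data flattened
# into fixed-size feature vectors. Sizes and names are hypothetical.
import torch
import torch.nn as nn

FEATURES = 256   # assumed length of a flattened time-dependent 3D scene descriptor
LATENT = 64      # assumed latent vector size for the generator sub-model

# Stand-ins for the discriminator and generator sub-models.
discriminator = nn.Sequential(nn.Linear(FEATURES, 128), nn.ReLU(),
                              nn.Linear(128, 1), nn.Sigmoid())
generator = nn.Sequential(nn.Linear(LATENT, 128), nn.ReLU(),
                          nn.Linear(128, FEATURES))

d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-4)
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-4)
bce = nn.BCELoss()

def adversarial_round(real_batch: torch.Tensor) -> tuple:
    """One adversarial round: update the discriminator sub-model against
    supervisory labels, then update the generator sub-model to fool it."""
    n = real_batch.size(0)
    real_label = torch.ones(n, 1)
    fake_label = torch.zeros(n, 1)

    # Discriminator step: its discrimination result is matched against the
    # supervisory labels (real-world vs. simulated) and its parameters modified.
    fake_batch = generator(torch.randn(n, LATENT)).detach()
    d_loss = bce(discriminator(real_batch), real_label) + \
             bce(discriminator(fake_batch), fake_label)
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator step: its parameters are modified so the discriminator's
    # accuracy on freshly simulated data decreases.
    fake_batch = generator(torch.randn(n, LATENT))
    g_loss = bce(discriminator(fake_batch), real_label)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
    return d_loss.item(), g_loss.item()

# Example usage with a random batch standing in for real recorded data.
print(adversarial_round(torch.randn(8, FEATURES)))
```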
BRIEF DESCRIPTION OF THE DRAWINGS

Certain features of various embodiments of the present technology are set forth with particularity in the appended claims. A better understanding of the features and advantages of the technology will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:

FIG. 1 is a schematic diagram depicting an example of a system for training a computer-based autonomous driving model according to some embodiments.

FIG. 2 depicts a flowchart of an example of a method for training a computer-based autonomous driving model according to some embodiments.

FIG. 3 depicts a flowchart of an example of a method for performing an adversarial machine learning process for a four-dimensional (4D) generative adversarial network (GAN) model according to some embodiments.

FIG. 4 depicts a flowchart of an example of specific processes for training a computer-based autonomous driving model according to some embodiments.

FIG. 5 is a block diagram illustrating a computer system upon which any of the embodiments described herein may be implemented.

DETAILED DESCRIPTION

In the following description, certain specific details are set forth in order to provide a thorough understanding of various embodiments of the invention. However, one skilled in the art will understand that the invention may be practiced without these details. Moreover, while various embodiments of the invention are disclosed herein, many adaptations and modifications may be made within the scope of the invention in accordance with the common general knowledge of those skilled in this art. Such modifications include the substitution of known equivalents for any aspect of the invention in order to achieve the same result in substantially the same way.

Unless the context requires otherwise, throughout the present specification and claims, the word "comprise" and variations thereof, such as "comprises" and "comprising," are to be construed in an open, inclusive sense, that is, as "including, but not limited to." Recitation of numeric ranges of values throughout the specification is intended to serve as a shorthand notation referring individually to each separate value falling within the range, inclusive of the values defining the range, and each separate value is incorporated in the specification as if it were individually recited herein. Additionally, the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise.

Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment, but may be in some instances. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.

Various embodiments described herein are directed to a computer-implemented method and a system for training a computer-based autonomous driving model to be employed in an autonomous-driving vehicle (or simply an autonomous vehicle). In a specific implementation, the computer-implemented method and the system are intended to provide a training scheme for a computer-based autonomous driving model without carrying out high-cost and less-safe real-world test driving operations with a vehicle in a real-world traffic environment. Further, the technology in certain implementations of the present disclosure can create simulated time-dependent three-dimensional (3D) (hereinafter referred to simply as "four-dimensional (4D)") traffic environments using a computer-based model created as a result of performing an adversarial machine learning process, which enables the creation of photorealistic simulated 4D traffic environments sufficiently close to real 4D traffic environments. Real 4D traffic environments involve irregular events, such as irregular movement of objects (e.g., a pedestrian's irregular movement in response to a loud noise, wind, road bumps, etc.). By reflecting such irregular events in simulated 4D traffic environments, photorealistic simulated 4D traffic environments can be created and driving simulation can be carried out in more realistic traffic environments. It is noted here that "4D" is intended to represent time-dependent three-dimensional (3D) space, where objects in the 3D space are capable of moving and changing their shapes as time passes. For example, leaves of a tree and the skirt of a pedestrian may flap in a 4D environment as time passes, as opposed to a 3D environment where all objects are static and there is no movement or change of shape of objects.
One embodiment provides a computer-implemented method including: creating four-dimensional (4D) traffic environment data using at least one of real traffic element data and simulated traffic element data; creating simulated 4D traffic environmental data by applying a 4D generative adversarial network (GAN) model to the created four-dimensional (4D) traffic environment data; and training a computer-based autonomous driving model using the simulated 4D traffic environmental data. Another embodiment provides a system for an autonomous-driving vehicle, comprising: one or more processors; and memory storing instructions that, when executed by the one or more processors, cause the one or more processors to perform the computer-implemented method.

FIG. 1 is a schematic diagram 100 depicting an example of a system for training a computer-based autonomous driving model according to some embodiments. In the example depicted in FIG. 1, the system for training a computer-based autonomous driving model includes a network 102, a 4D traffic environmental data generating system 104, a 4D GAN model management system 106, and an autonomous driving simulating system 108, connected through the network 102.

In the example depicted in FIG. 1, the system for training a computer-based autonomous driving model represents a system primarily dedicated to training a computer-based autonomous driving model to be mounted on an autonomous-driving vehicle, which is capable of sensing its environment and navigating with limited human input or without human input. The "vehicle" discussed in this paper typically includes a vehicle that drives on the ground, such as wheeled vehicles, and may also include a vehicle that flies in the sky (e.g., drones, helicopters, airplanes, and so on). The "vehicle" discussed in this paper may or may not accommodate one or more passengers therein.

In one embodiment, the autonomous-driving vehicle includes a vehicle that controls braking and/or acceleration without real time human input. In another embodiment, the autonomous-driving vehicle includes a vehicle that controls steering without real time human input based on inputs from one or more lens mount units. In another embodiment, the autonomous-driving vehicle includes a vehicle that autonomously controls braking, acceleration, and steering without real time human input specifically for parking the vehicle at a specific parking space, such as a parking lot, the curb side of a road (e.g., parallel parking), a home garage, and so on. Further, "real time human input" is intended to represent a human input that is needed to concurrently control movement of a non-autonomous-driving vehicle, such as gear shifting, steering control, braking pedal control, accelerator pedal control, clutch pedal control, and so on.

In one embodiment, the autonomous-driving vehicle is capable of sensing its environment based on inputs from one or more imaging devices (e.g., cameras) mounted on the autonomous-driving vehicle. In an embodiment, the autonomous-driving vehicle is configured to analyze image data obtained from the one or more imaging devices and identify objects (e.g., traffic signals, road signs, other vehicles, pedestrians, and obstacles) included in images of the analyzed image data. In one embodiment, the autonomous-driving vehicle is also capable of performing an autonomous-driving operation based on the identified objects. In an embodiment, the autonomous-driving vehicle is also capable of driving so as to follow a traffic stream without hitting the identified objects. For example, the autonomous-driving vehicle follows traffic signals identified based on image data, follows traffic signs identified based on image data, and drives with a sufficient distance from preceding vehicles.

In the example of FIG. 1, the autonomous-driving vehicle is also capable of communicating with systems or devices connected to the autonomous-driving vehicle through a network. In an embodiment, the autonomous-driving vehicle communicates with a server via the network. For example, the autonomous-driving vehicle pulls up from the server map information (e.g., local maps, parking structure maps, floor plans of buildings, etc.) of a region around the autonomous-driving vehicle. In another example, the autonomous-driving vehicle periodically notifies the server of information about the autonomous-driving vehicle, such as its location and direction.

In some embodiments, the network 102 represents a variety of potentially applicable technologies. For example, the network 102 can be used to form a network or part of a larger network. Where two components are co-located on a device, the network can include a bus or other data conduit or plane. Depending upon implementation-specific or other considerations, the network 102 can include wired communication interfaces and wireless communication interfaces for communicating over wired or wireless communication channels. Where a first component is located on a first device and a second component is located on a second (different) device, the network can include a wireless or wired back-end network or LAN. The network 102 can also encompass a relevant portion of a WAN or other network, if applicable. Enterprise networks can include geographically distributed LANs coupled across WAN segments. For example, a distributed enterprise network can include multiple LANs (each LAN is sometimes referred to as a Basic Service Set (BSS) in IEEE 802.11 parlance, though no explicit requirement is suggested here) separated by WAN segments. An enterprise network can also use VLAN tunneling (the connected LANs are sometimes referred to as an Extended Service Set (ESS) in IEEE 802.11 parlance, though no explicit requirement is suggested here). Depending upon implementation or other considerations, the network 102 can include a private cloud under the control of an enterprise or third party, or a public cloud.

In an embodiment, the autonomous-driving vehicle may communicate with one or more other autonomous-driving vehicle systems via the network 102. For example, the autonomous-driving vehicle sends information about its vehicle route to the one or more other autonomous-driving vehicle systems, such that traffic incidents such as collisions can be prevented. In another example, the autonomous-driving vehicle commands one or more other autonomous-driving vehicle systems to proceed to a particular location so as to avoid traffic incidents.

In the example depicted in FIG. 1, the 4D traffic environmental data generating system 104 represents a system for generating 4D traffic environmental data. The 4D traffic environment data may represent a basic time-dependent traffic environment in a virtual or real geographical region, which is to be applied to a 4D GAN model to generate a virtual photorealistic 4D traffic environment. In the example depicted in FIG. 1, the 4D traffic environmental data generating system 104 includes a real traffic data processing engine 110 and a simulated traffic data processing engine 112. In the example depicted in FIG. 1, the real traffic data processing engine 110 represents a hardware module configured to generate real traffic element data. In some embodiments, the real traffic element data include one or more of mapping data in real-world geographical locations, traffic indicator data indicating traffic signs and traffic signals in real-world environments, traffic regulatory data indicating traffic regulations in specific real-world regions, and so on. In some embodiments, the real traffic data processing engine 110 generates the real traffic element data based on data obtained from public resources or commercial resources.

In the example depicted in FIG. 1, the simulated traffic data processing engine 112 represents a hardware module configured to generate simulated traffic element data. In some embodiments, the simulated traffic element data include one or more of simulated weather data in real/virtual geographical locations, simulated time-frame data indicating environmental conditions (e.g., brightness, congestion, noise, etc.) at a specific time of day, signal change data indicating timing of changing traffic signals in real/virtual geographical locations, simulated outside object data indicating outside objects (e.g., pedestrians and other road-side objects) and movement thereof, and so on. In some embodiments, the simulated traffic data processing engine 112 is further configured to generate 4D traffic environment data using real and/or simulated traffic element data.
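As a rough, hypothetical sketch of how real and simulated traffic element data could be combined into 4D traffic environment data, the following function merges the two kinds of element data into one record; the field names and dictionary-based format are assumptions, not part of this disclosure.

```python
# Rough, hypothetical sketch of combining real and simulated traffic element
# data into a single 4D traffic environment record. The field names and the
# dictionary-based format are assumptions, not part of this disclosure.
from typing import Optional

def build_4d_environment(real_elements: Optional[dict] = None,
                         simulated_elements: Optional[dict] = None) -> dict:
    """Merge real and/or simulated traffic element data into one record."""
    env = {
        "mapping": None,           # real-world geographical mapping data
        "traffic_indicators": [],  # traffic signs and traffic signals
        "regulations": [],         # region-specific traffic regulations
        "weather": None,           # simulated weather data
        "signal_timing": {},       # timing of traffic signal changes
        "outside_objects": [],     # pedestrians and other road-side objects
    }
    for source in (real_elements or {}, simulated_elements or {}):
        for key, value in source.items():
            if key in env:
                env[key] = value
    return env

# Example: real mapping data plus simulated weather and a simulated pedestrian.
env = build_4d_environment(
    real_elements={"mapping": "city_grid_v1", "traffic_indicators": ["stop_sign_12"]},
    simulated_elements={"weather": "light_rain",
                        "outside_objects": [{"type": "pedestrian", "speed_mps": 1.2}]},
)
print(env["weather"], len(env["outside_objects"]))
```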
In the example depicted in FIG. 1, the 4D GAN model management system 106 represents a system for managing a 4D GAN model. In some embodiments, the 4D GAN model represents computer instructions configured to cause a photorealistic effect to be applied to a 4D traffic environment represented by 4D traffic environment data to generate a photorealistic 4D traffic environment. To generate the photorealistic 4D traffic environment, the 4D GAN model includes a 4D GAN discrimination sub-model and a 4D GAN generator sub-model configured to perform an adversarial machine learning process. An example of an adversarial machine learning process according to some embodiments will be described below with reference to FIG. 3. In the example depicted in FIG. 1, the 4D GAN model management system 106 includes a 4D GAN discriminating engine 114 and a 4D GAN generating engine 116. In the example depicted in FIG. 1, the 4D GAN discriminating engine 114 represents a hardware module configured to train the 4D GAN discrimination sub-model through the adversarial machine learning process. In the example depicted in FIG. 1, the 4D GAN generating engine 116 represents a hardware module configured to train the 4D GAN generator sub-model.

In the example depicted in FIG. 1, the autonomous driving simulating system 108 represents a system for training an autonomous driving model through a virtual autonomous driving operation in a virtual photorealistic 4D traffic environment. In the example depicted in FIG. 1, the autonomous driving simulating system 108 includes a virtual traffic presentation engine 118 and a virtual drive simulating engine 120 to achieve the functionality thereof. In the example depicted in FIG. 1, the virtual traffic presentation engine 118 represents a hardware module configured to generate a virtual photorealistic 4D traffic environment using simulated 4D traffic environmental data. In some embodiments, the generated virtual photorealistic 4D traffic environment is a time-dependent three-dimensional virtual traffic environment in which a virtual vehicle is capable of performing a virtual driving operation. An example of specific processes to generate the virtual photorealistic 4D traffic environment will be described below with reference to FIG. 4.

In the example depicted in FIG. 1, the virtual drive simulating engine 120 represents a hardware module configured to carry out a virtual autonomous driving operation in a virtual photorealistic 4D traffic environment generated by the virtual traffic presentation engine 118. In some embodiments, the virtual autonomous driving operation includes a virtual autonomous driving along a certain route from a certain departing point in the virtual photorealistic 4D traffic environment to a certain destination point in the virtual photorealistic 4D traffic environment. An example of specific processes to train an autonomous driving model through the virtual autonomous driving operation will be described below with reference to FIG. 4.

FIG. 2 depicts a flowchart 200 of an example of a method for training a computer-based autonomous driving model according to some embodiments. This flowchart and other flowcharts described in this paper illustrate modules (and potentially decision points) organized in a fashion that is conducive to understanding. It should be recognized, however, that the modules can be reorganized for parallel execution, reordered, or modified (changed, removed, or augmented), where circumstances permit. In the example of FIG. 2, the flowchart 200 starts at module 202 with creating 4D traffic environment data using real and/or simulated traffic element data. An applicable engine for creating 4D traffic environment data, such as a real traffic data processing engine (e.g., the real traffic data processing engine 110 in FIG. 1) and/or a simulated traffic data processing engine (e.g., the simulated traffic data processing engine 112 in FIG. 1), creates the 4D traffic environment data. In some embodiments, the real traffic element data include one or more of mapping data in real-world geographical locations, traffic indicator data indicating traffic signs and traffic signals in real-world environments, traffic regulatory data indicating traffic regulations in specific real-world regions, and so on. In some embodiments, the simulated traffic element data include one or more of simulated weather data in real/virtual geographical locations, simulated time-frame data indicating environmental conditions (e.g., brightness, congestion, etc.) at a specific time of day, signal change data indicating timing of changing traffic signals in real/virtual geographical locations, simulated outside object data indicating outside objects (e.g., pedestrians and other road-side objects) and movement thereof, and so on.

In the example of FIG. 2, the flowchart 200 continues to module 204, with creating/updating a 4D GAN model through an adversarial machine learning process. An applicable engine for creating/updating a 4D GAN model, such as a 4D GAN generating engine (e.g., the 4D GAN generating engine 116 in FIG. 1) described in this paper, can create/update the 4D GAN model through an adversarial machine learning process. A specific example of an adversarial machine learning process according to some embodiments will be described below with reference to FIG. 3. As a result of the adversarial machine learning process, the 4D GAN model, in particular a 4D GAN generator sub-model, can be configured to create simulated 4D traffic environmental data sufficiently close to real 4D traffic environmental data.

In the example of FIG. 2, the flowchart 200 continues to module 206, with creating simulated 4D traffic environmental data by applying the created 4D GAN model to the created 4D traffic environmental data. An applicable engine for creating simulated 4D traffic environmental data, such as a 4D GAN generating engine (e.g., the 4D GAN generating engine 116 in FIG. 1) described in this paper, can create the simulated 4D traffic environmental data. In some embodiments, the simulated 4D traffic environmental data include expressions of time-dependent locations and time-dependent characteristics of objects, such as roads, vehicles, pedestrians, road-side objects, and so on, with 4D functions.

In the example of FIG. 2, the flowchart 200 continues to module 208, with training an autonomous driving model using the created simulated 4D traffic environmental data. An applicable engine for training an autonomous driving model, such as a virtual drive simulating engine (e.g., the virtual drive simulating engine 120 in FIG. 1) described in this paper, can train the autonomous driving model. A specific example of a process for training the autonomous driving model according to some embodiments will be described below with reference to FIG. 4. As a result of the training process, the autonomous driving model can be configured to perform an autonomous driving operation in a safer, more time-efficient, and/or more cost-efficient manner.

In the example of FIG. 2, the flowchart 200 continues to module 210, with performing a real-world autonomous driving operation using the trained autonomous driving model. An applicable engine or module for performing a real-world autonomous driving operation, mounted in a real-world autonomous driving vehicle, can perform the real-world autonomous driving operation. In some embodiments, an engine that is the same as or substantially similar to the virtual drive simulating engine (e.g., the virtual drive simulating engine 120 in FIG. 1) described in this paper can be employed as the engine or module for performing a real-world autonomous driving operation. As a result of the real-world autonomous driving operation using the trained autonomous driving model, the real-world autonomous driving vehicle can perform a real-world autonomous driving operation in a safer, more time-efficient, and/or more cost-efficient manner.
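The overall flow of flowchart 200 (modules 202-210) can be summarized with the following illustrative sketch, in which every function body is a placeholder standing in for the corresponding engine; only the ordering of the steps follows the flowchart.

```python
# Illustrative end-to-end sketch of flowchart 200 (modules 202-210). Every
# function body below is a placeholder standing in for the engines described
# in this paper; only the ordering of the steps follows the flowchart.
def create_4d_environment_data(real: dict, simulated: dict) -> dict:            # module 202
    return {"real": real, "simulated": simulated}

def train_4d_gan(environment_data: dict, rounds: int = 10) -> dict:             # module 204
    return {"generator": "trained", "rounds": rounds}

def create_simulated_4d_data(gan_model: dict, environment_data: dict) -> dict:  # module 206
    return {"frames": 1000, "source": environment_data}

def train_driving_model(simulated_4d_data: dict) -> dict:                       # module 208
    return {"policy": "updated", "frames_seen": simulated_4d_data["frames"]}

def perform_real_world_driving(driving_model: dict) -> None:                    # module 210
    print("driving with:", driving_model)

env = create_4d_environment_data(real={"mapping": "city"}, simulated={"weather": "fog"})
gan = train_4d_gan(env)
sim = create_simulated_4d_data(gan, env)
model = train_driving_model(sim)
perform_real_world_driving(model)
```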
FIG. 3 depicts a flowchart 300 of an example of a method for performing an adversarial machine learning process for a four-dimensional (4D) generative adversarial network (GAN) model according to some embodiments. In the example of FIG. 3, the modules 302-308 in the flowchart 300 are carried out by an applicable engine such as a 4D GAN discriminating engine (e.g., the 4D GAN discriminating engine 114 in FIG. 1) described in this paper, to train a 4D GAN discriminator sub-model of the 4D GAN model. The modules 312-318 in the flowchart 300 are carried out by an applicable engine such as a 4D GAN generating engine (e.g., the 4D GAN generating engine 116 in FIG. 1) described in this paper, to train a 4D GAN generator sub-model of the 4D GAN model.

In the example of FIG. 3, the modules of the flowchart 300 carried out by the 4D GAN discriminating engine start at module 302, with receiving 4D GAN discriminator training data. In some embodiments, the 4D GAN discriminator training data include pieces of simulated 4D traffic environmental data and pieces of real 4D traffic environmental data. The simulated 4D traffic environmental data includes 4D traffic environmental data generated through computer-based simulation using the 4D GAN generator sub-model by an applicable engine, such as a 4D GAN generating engine (e.g., the 4D GAN generating engine 116 in FIG. 1) described in this paper. The real 4D traffic environmental data includes 4D traffic environmental data corresponding to recorded data of a traffic environment in the real world.

In the example of FIG. 3, the flowchart 300 continues to module 304, with performing a discrimination analysis of the received 4D GAN discriminator training data to generate a discrimination result. In some embodiments, the discrimination analysis of the received 4D GAN discriminator training data includes computer-based analysis of 4D traffic environmental data included in the received 4D GAN discriminator training data, and prediction of whether the analyzed 4D traffic environmental data is simulated 4D traffic environmental data or real 4D traffic environmental data. To distinguish the simulated 4D traffic environmental data from the real 4D traffic environmental data, the 4D GAN discriminating engine executes a 4D GAN discriminator sub-model. In some embodiments, the discrimination result includes whether the 4D GAN discriminating engine predicts the received 4D GAN discriminator training data as simulated 4D traffic environmental data or real 4D traffic environmental data.

In some embodiments, the discrimination result may further include a basis or ground on which the 4D GAN discriminating engine made the prediction, and the basis or ground may include one or more specific objects in a 4D traffic environment represented by the 4D GAN discriminator training data, and specific characteristics of the one or more specific objects that enabled the prediction. For example, a constant-pace, non-meandering walking pattern of a pedestrian in a 4D traffic environment represented by the 4D GAN discriminator training data may be a basis to predict that the 4D GAN discriminator training data is simulated 4D traffic environmental data. In another example, non-flapping clothes of pedestrians while leaves of roadside trees are flapping may be a basis to predict that the 4D GAN discriminator training data is simulated 4D traffic environmental data.

In the example of FIG. 3, the flowchart 300 continues to module 306, with performing matching of the generated discrimination result with supervisory data to generate a training result. In some embodiments, the supervisory data may include a correct answer indicating whether the 4D GAN discriminator training data is in fact simulated 4D traffic environmental data or real 4D traffic environmental data. In some embodiments, the training result includes the accuracy of the generated discrimination result and the basis or ground on which the 4D GAN discriminating engine made the prediction to generate the discrimination result. In some embodiments, the training result may be a cumulative training result of a plurality of operations of the discrimination analysis with respect to a plurality of pieces of 4D traffic environmental data. In that case, the training result may include an accuracy rate calculated from the accuracy of each matching of each discrimination result with corresponding supervisory data.

In the example of FIG. 3, the flowchart 300 continues to module 308, with modifying parameter values of the 4D GAN discrimination sub-model based on the training result. In some embodiments, the parameters of the 4D GAN discrimination sub-model may include various applicable parameters, including one or more parameters indicating irregularity of movement of objects (e.g., vehicles, pedestrians, roadside objects, etc.) in a traffic environment, one or more parameters indicating conformity of light reflection in the traffic environment, one or more parameters indicating abnormality of ambient sound (e.g., engine sound, tire noise, honking, human voices, bicycle noise, audio sounds from vehicles, etc.), and so on. In some embodiments, the 4D GAN discriminating engine increases one or more parameter values of the parameters of the 4D GAN discrimination sub-model associated with the basis of the prediction and/or decreases one or more parameter values of the parameters of the 4D GAN discrimination sub-model not associated with the basis of the prediction when the generated discrimination result matches the corresponding supervisory data. To the contrary, the 4D GAN discriminating engine may increase one or more parameter values of the parameters of the 4D GAN discrimination sub-model not associated with the basis of the prediction and/or decrease one or more parameter values of the parameters of the 4D GAN discrimination sub-model associated with the basis of the prediction when the generated discrimination result does not match the corresponding supervisory data.

In the example of FIG. 3, the modules of the flowchart 300 carried out by the 4D GAN generating engine start at module 312, with generating simulated 4D traffic environmental data using a 4D GAN generator sub-model. In some embodiments, the simulated 4D traffic environmental data includes 4D traffic environmental data in a virtual environment. For example, the 4D traffic environmental data in a virtual environment may include the arrangement of roads (e.g., geographical mapping), vehicles on the roads, pedestrians and other road-side objects, and atmospheric settings (e.g., weather, time, etc.) in the virtual environment. Depending on the specific implementation, part of the 4D traffic environmental data, such as the arrangement of roads and atmospheric settings, may be imported from real 4D traffic environmental data.

In the example of FIG. 3, the flowchart continues to module 314, with providing simulated 4D traffic environmental data for creating 4D GAN discriminator training data. In a specific implementation, the simulated 4D traffic environmental data generated using the 4D GAN generator sub-model are provided to the 4D GAN discriminating engine as a piece of 4D GAN discriminator training data, such that the 4D GAN discriminating engine can perform the discrimination analysis. The simulated 4D traffic environmental data includes some noise to make a simulated 4D traffic environment more similar to a real 4D traffic environment. For example, the noise to be included in the simulated 4D traffic environmental data may be irregular movement of objects, such as pedestrians, clothes (e.g., skirts) worn by pedestrians, road-side trees and their leaves, and so on. In another example, the noise to be included in the simulated 4D traffic environmental data may be irregular movement (e.g., vertical vibration) of on-road vehicles caused by specific road conditions (e.g., obstacles on roads, bumps or dents in roads, etc.).
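For illustration, the kind of "noise" described above, irregular pedestrian motion and vertical vibration of vehicles over road bumps, might be injected into simulated trajectories as in the following sketch; the amplitudes and function names are assumptions rather than values from this disclosure.

```python
# Illustrative sketch of injecting the kind of "noise" described above into
# simulated 4D trajectories: irregular pedestrian motion and vertical vibration
# of a vehicle over road bumps. Amplitudes and names are assumptions.
import math
import random

def pedestrian_position(t: float) -> tuple:
    """Nominal constant-pace walk plus small irregular lateral jitter."""
    jitter = random.gauss(0.0, 0.05)  # irregular meandering, in meters
    return (1.2 * t, 0.4 * math.sin(0.8 * t) + jitter, 0.0)

def vehicle_height(t: float, bump_times=(2.0, 5.5)) -> float:
    """Vertical vibration triggered each time the vehicle crosses a road bump."""
    z = 0.0
    for tb in bump_times:
        if t >= tb:
            dt = t - tb
            z += 0.03 * math.exp(-4.0 * dt) * math.cos(20.0 * dt)  # damped oscillation
    return z

# Sample the perturbed trajectories over a few simulated time steps.
for step in range(5):
    t = step * 0.25
    print(round(t, 2), pedestrian_position(t), round(vehicle_height(t), 4))
```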
In the example of FIG. 3, the flowchart continues to module 316, with receiving a training result generated by executing a 4D GAN discriminator sub-model. In a specific implementation, the received training result includes results of matching of discrimination results with respect to one or more pieces of simulated 4D traffic environmental data generated by the 4D GAN generating engine, and may not include results of matching of discrimination results with respect to one or more pieces of real 4D traffic environmental data not generated by the 4D GAN generating engine.

In the example of FIG. 3, the flowchart continues to module 318, with modifying parameter values of the 4D GAN generator sub-model. In some embodiments, the parameter values of the 4D GAN generator sub-model are modified so as to cause the training result to indicate less accuracy. In other words, the parameter values of the 4D GAN generator sub-model are modified such that the 4D GAN discriminating engine is not able to distinguish simulated 4D traffic environmental data from real 4D traffic environmental data. In some embodiments, the parameter values of the 4D GAN generator sub-model may include the same parameters as and/or different parameters from the parameters of the 4D GAN discrimination sub-model.

According to the adversarial machine learning process for a four-dimensional (4D) generative adversarial network (GAN) model of some embodiments, the 4D GAN discrimination sub-model is modified so as to increase the accuracy of discrimination analysis through the machine learning process, and the 4D GAN generating sub-model is modified so as to decrease the accuracy of discrimination analysis by the 4D GAN discrimination sub-model through the machine learning process. By repeating the machine learning process, the 4D GAN discrimination sub-model may be improved to distinguish simulated 4D traffic environmental data from real 4D traffic environmental data with high accuracy, and the 4D GAN generating sub-model may be improved to generate simulated 4D traffic environmental data that is close enough to real 4D traffic environmental data not to be easily distinguishable.

FIG. 4 depicts a flowchart of an example of specific processes for training a computer-based autonomous driving model according to some embodiments. The modules in the flowchart 400 are carried out by one or more applicable engines such as a virtual traffic provisioning engine (e.g., the virtual traffic provisioning engine 118 in FIG. 1) and a virtual drive simulating engine (e.g., the virtual drive simulating engine 120 in FIG. 1) described in this paper, to train a computer-based autonomous driving model.

In the example of FIG. 4, the flowchart 400 starts at module 402, with receiving simulated 4D traffic environmental data. In some embodiments, the simulated 4D traffic environmental data is received from an applicable engine such as a 4D GAN generating engine (e.g., the 4D GAN generating engine 116 in FIG. 1). In some embodiments, the simulated 4D traffic environmental data is generated by the applicable engine executing a well-trained 4D GAN generating sub-model of a 4D GAN model.

In the example of FIG. 4, the flowchart 400 continues to module 404, with rendering the received simulated 4D traffic environmental data to generate a virtual photorealistic 4D traffic environment. In some embodiments, the rendering of the simulated 4D traffic environmental data includes expressing time-dependent locations and time-dependent characteristics of objects, such as roads, vehicles, pedestrians, road-side objects, and so on, with 4D functions, and generating the virtual photorealistic 4D traffic environment using the 4D functions.

In the example of FIG. 4, the flowchart 400 continues to module 406, with carrying out a virtual autonomous driving operation in the generated virtual photorealistic 4D traffic environment. In some embodiments, the virtual autonomous driving operation includes a virtual autonomous driving along a certain route from a certain departing point in the virtual photorealistic 4D traffic environment to a certain destination point in the virtual photorealistic 4D traffic environment. Depending on a specific implementation, the certain route may further include one or more roads to be used and one or more intermediate check points to be passed. Further depending on a specific implementation, the certain route may be determined based on a selected one of several driving modes, which may include an economic mode (e.g., lowest cost), a fastest mode, and so on. Moreover, the certain route may dynamically change as the virtual autonomous driving operation proceeds. In some embodiments, during the virtual autonomous driving operation, a virtual vehicle that performs the virtual autonomous driving operation is controlled according to the computer-based autonomous driving model so as to be safe for passengers and for people and animals outside the virtual vehicle.

In the example of FIG. 4, the flowchart 400 continues to module 408, with obtaining a virtual autonomous driving result of the virtual autonomous driving operation. In some embodiments, the virtual autonomous driving result of the virtual autonomous driving operation includes the actual route of the virtual vehicle that has been used, the time-dependent position of the virtual vehicle along the actual route, a cost of the virtual autonomous driving operation, any risks and/or incidents involved in the virtual autonomous driving operation, the difference between the planned operation and the actual operation that has been carried out, and so on.

In the example of FIG. 4, the flowchart 400 continues to module 410, with modifying parameter values of an autonomous driving model based on the virtual autonomous driving result. In some embodiments, the parameters of the autonomous driving model may include various applicable parameters, including one or more parameters indicating movement behavior of a vehicle (e.g., acceleration, braking, steering, idling, etc.), exterior non-movement behavior of the vehicle (e.g., lights, flashing, honking, etc.), and interior non-movement behavior of the vehicle (e.g., navigation, audio, warnings, etc.). In some embodiments, the parameters of the autonomous driving model may include parameters associated with route selection to achieve selection of a time-efficient and/or cost-efficient route. In some embodiments, the autonomous driving model is modified so as to improve the virtual autonomous driving operation that has been carried out in various applicable aspects, such as safety, time efficiency, cost efficiency, etc. For example, to improve the safety of the virtual autonomous driving operation, one or more parameters of the autonomous driving model associated with acceleration, braking, and/or steering levels are modified. In another example, to improve the time efficiency and/or cost efficiency of the virtual autonomous driving operation, one or more parameters of the autonomous driving model associated with route selection and one or more parameters of the autonomous driving model associated with acceleration and braking are modified.
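As a purely illustrative sketch of module 410 (and not the algorithm of this disclosure), the following function nudges a few driving-model parameter values based on fields of a virtual autonomous driving result; the parameter names, result fields, and step sizes are assumptions.

```python
# Purely illustrative sketch of module 410 (not the algorithm of this
# disclosure): nudging driving-model parameter values based on a virtual
# autonomous driving result. Parameter names, result fields, and step sizes
# are assumptions.
def update_driving_parameters(params: dict, result: dict) -> dict:
    """Adjust acceleration, braking, and route parameters from a virtual drive result."""
    updated = dict(params)
    if result.get("incidents", 0) > 0:
        # Safety: drive less aggressively after any virtual incident.
        updated["max_acceleration"] *= 0.9
        updated["braking_distance_margin"] *= 1.1
    if result.get("trip_time_s", 0) > result.get("planned_time_s", 0):
        # Slower than planned: bias route selection toward time efficiency.
        updated["route_time_weight"] += 0.05
    return updated

# Example usage with a hypothetical result from one virtual driving operation.
params = {"max_acceleration": 2.5, "braking_distance_margin": 1.0, "route_time_weight": 0.5}
result = {"incidents": 1, "trip_time_s": 640, "planned_time_s": 600}
print(update_driving_parameters(params, result))
```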
US 11,392,132 B2
13 14
simulating a variety of conditions that cannot be obtained at a single geographical location (e.g., weather, congestion, pedestrian density, etc.).

The foregoing description of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. The breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments. Many modifications and variations will be apparent to the practitioner skilled in the art. The modifications and variations include any relevant combination of the disclosed features. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to understand the invention for various embodiments and with various modifications that are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.

Hardware Implementation

The techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include circuitry or digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field-programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, server computer systems, portable computer systems, handheld devices, networking devices, or any other device or combination of devices that incorporates hard-wired and/or program logic to implement the techniques.

Computing device(s) are generally controlled and coordinated by operating system software, such as iOS, Android, Chrome OS, Windows XP, Windows Vista, Windows 7, Windows 8, Windows 10, Windows Server, Windows CE, Unix, Linux, SunOS, Solaris, iOS, Blackberry OS, VxWorks, or other compatible operating systems. In other embodiments, the computing device may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide user interface functionality, such as a graphical user interface ("GUI"), among other things.

FIG. 5 is a block diagram that illustrates a computer system 500 upon which any of the embodiments described herein may be implemented. The computer system 500 includes a bus 502 or other communication mechanism for communicating information, and one or more hardware processors 504 coupled with bus 502 for processing information. Hardware processor(s) 504 may be, for example, one or more general-purpose microprocessors.

The computer system 500 also includes a main memory 506, such as a random access memory (RAM), cache and/or other dynamic storage devices, coupled to bus 502 for storing information and instructions to be executed by processor 504. Main memory 506 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 504. Such instructions, when stored in storage media accessible to processor 504, render computer system 500 into a special-purpose machine that is customized to perform the operations specified in the instructions.

The computer system 500 further includes a read only memory (ROM) 508 or other static storage device coupled to bus 502 for storing static information and instructions for processor 504. A storage device 510, such as a magnetic disk, optical disk, or USB thumb drive (Flash drive), etc., is provided and coupled to bus 502 for storing information and instructions.

The computer system 500 may be coupled via bus 502 to output device(s) 512, such as a cathode ray tube (CRT) or LCD display (or touch screen), for displaying information to a computer user. Input device(s) 514, including alphanumeric and other keys, are coupled to bus 502 for communicating information and command selections to processor 504. Another type of user input device is cursor control 516, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 504 and for controlling cursor movement on display 512. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane. In some embodiments, the same direction information and command selections as cursor control may be implemented via receiving touches on a touch screen without a cursor.

The computing system 500 may include a user interface module to implement a GUI that may be stored in a mass storage device as executable software codes that are executed by the computing device(s). This and other modules may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.

In general, the word "module," as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, C or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, magnetic disc, or any other tangible medium, or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression or decryption prior to execution). Such software code may be stored, partially or fully, on a memory device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules or computing device functionality described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical
modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.
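As a purely illustrative aside on the "module" notion described above, that is, a collection of software instructions with entry and exit points that can be called from other modules or invoked in response to detected events, a minimal Python module might look like the sketch below; the file name and function names are invented for this example and are not part of the disclosed system.

# traffic_logger.py: a tiny, hypothetical module with explicit entry and exit points.

_buffer = []  # module-level state shared by the entry points below


def on_event(event: dict) -> None:
    """Entry point invoked in response to a detected event (e.g., a sensor interrupt)."""
    _buffer.append(event)


def flush() -> list:
    """Entry point callable from other modules; returns and clears the buffered events."""
    events, _buffer[:] = list(_buffer), []
    return events


if __name__ == "__main__":
    on_event({"type": "honk", "t": 0.5})
    print(flush())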
The computer system 500 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 500 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 500 in response to processor(s) 504 executing one or more sequences of one or more instructions contained in main memory 506. Such instructions may be read into main memory 506 from another storage medium, such as storage device 510. Execution of the sequences of instructions contained in main memory 506 causes processor(s) 504 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.

The term "non-transitory media," and similar terms, as used herein refers to any media that store data and/or instructions that cause a machine to operate in a specific fashion. Such non-transitory media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 510. Volatile media includes dynamic memory, such as main memory 506. Common forms of non-transitory media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge, and networked versions of the same.

Non-transitory media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between non-transitory media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 502. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.

Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 504 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 500 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 502. Bus 502 carries the data to main memory 506, from which processor 504 retrieves and executes the instructions. The instructions received by main memory 506 may optionally be stored on storage device 510 either before or after execution by processor 504.

The computer system 500 also includes a communication interface 518 coupled to bus 502. Communication interface 518 provides a two-way data communication coupling to one or more network links that are connected to one or more local networks. For example, communication interface 518 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 518 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN). Wireless links may also be implemented. In any such implementation, communication interface 518 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.

A network link typically provides data communication through one or more networks to other data devices. For example, a network link may provide a connection through a local network to a host computer or to data equipment operated by an Internet Service Provider (ISP). The ISP in turn provides data communication services through the world wide packet data communication network now commonly referred to as the "Internet". Local network and Internet both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on the network link and through communication interface 518, which carry the digital data to and from computer system 500, are example forms of transmission media.

The computer system 500 can send messages and receive data, including program code, through the network(s), network link and communication interface 518. In the Internet example, a server might transmit a requested code for an application program through the Internet, the ISP, the local network and the communication interface 518.

The received code may be executed by processor 504 as it is received, and/or stored in storage device 510, or other non-volatile storage for later execution.

Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware. The processes and algorithms may be implemented partially or wholly in application-specific circuitry.

The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and sub-combinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.

Conditional language, such as, among others, "can," "could," "might," or "may," unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features,
elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.

Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, or executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.

It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated. The scope of the invention should therefore be construed in accordance with the appended claims and any equivalents thereof.

Engines, Components, and Logic

Certain embodiments are described herein as including logic or a number of components, engines, or mechanisms. Engines may constitute either software engines (e.g., code embodied on a machine-readable medium) or hardware engines. A "hardware engine" is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware engines of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware engine that operates to perform certain operations as described herein.

In some embodiments, a hardware engine may be implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware engine may include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware engine may be a special-purpose processor, such as a Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC). A hardware engine may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware engine may include software executed by a general-purpose processor or other programmable processor. Once configured by such software, hardware engines become specific machines (or specific components of a machine) uniquely tailored to perform the configured functions and are no longer general-purpose processors. It will be appreciated that the decision to implement a hardware engine mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.

Accordingly, the phrase "hardware engine" should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, "hardware-implemented engine" refers to a hardware engine. Considering embodiments in which hardware engines are temporarily configured (e.g., programmed), each of the hardware engines need not be configured or instantiated at any one instance in time. For example, where a hardware engine comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware engines) at different times. Software accordingly configures a particular processor or processors, for example, to constitute a particular hardware engine at one instance of time and to constitute a different hardware engine at a different instance of time.

Hardware engines can provide information to, and receive information from, other hardware engines. Accordingly, the described hardware engines may be regarded as being communicatively coupled. Where multiple hardware engines exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware engines. In embodiments in which multiple hardware engines are configured or instantiated at different times, communications between such hardware engines may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware engines have access. For example, one hardware engine may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware engine may then, at a later time, access the memory device to retrieve and process the stored output. Hardware engines may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).

The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented engines that operate to perform one or more operations or functions described herein. As used herein, "processor-implemented engine" refers to a hardware engine implemented using one or more processors.

Similarly, the methods described herein may be at least partially processor-implemented, with a particular processor or processors being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented engines. Moreover, the one or more processors may also operate to support performance of the relevant operations in a "cloud computing" environment or as a "software as a service" (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet)
and via one or more appropriate interfaces (e.g., an Application Program Interface (API)).

The performance of certain of the operations may be distributed among the processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processors or processor-implemented engines may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processors or processor-implemented engines may be distributed across a number of geographic locations.

LANGUAGE

Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.

Although an overview of the subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader scope of embodiments of the present disclosure. Such embodiments of the subject matter may be referred to herein, individually or collectively, by the term "invention" merely for convenience and without intending to voluntarily limit the scope of this application to any single disclosure or concept if more than one is, in fact, disclosed.

The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.

It will be appreciated that an "engine," "system," "data store," and/or "database" may comprise software, hardware, firmware, and/or circuitry. In one example, one or more software programs comprising instructions capable of being executed by a processor may perform one or more of the functions of the engines, data stores, databases, or systems described herein. In another example, circuitry may perform the same or similar functions. Alternative embodiments may comprise more, fewer, or functionally equivalent engines, systems, data stores, or databases, and still be within the scope of the present embodiments. For example, the functionality of the various systems, engines, data stores, and/or databases may be combined or divided differently.

"Open source" software is defined herein to be source code that allows distribution as source code as well as compiled form, with a well-publicized and indexed means of obtaining the source, optionally with a license that allows modifications and derived works.

The data stores described herein may be any suitable structure (e.g., an active database, a relational database, a self-referential database, a table, a matrix, an array, a flat file, a document-oriented storage system, a non-relational NoSQL system, and the like), and may be cloud-based or otherwise.

As used herein, the term "or" may be construed in either an inclusive or exclusive sense. Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various embodiments of the present disclosure. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within a scope of embodiments of the present disclosure as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Conditional language, such as, among others, "can," "could," "might," or "may," unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.

Although the invention has been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred implementations, it is to be understood that such detail is solely for that purpose and that the invention is not limited to the disclosed implementations, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present invention contemplates that, to the extent possible, one or more features of any embodiment can be combined with one or more features of any other embodiment.

What is claimed is:

1. A computer-implemented method comprising:
  creating time-dependent three-dimensional (3D) traffic environment data using at least one of real traffic element data and simulated traffic element data;
  creating simulated time-dependent 3D traffic environmental data by applying a time-dependent 3D generative adversarial network (GAN) model to the created simulated time-dependent 3D traffic environment data, wherein the simulated time-dependent 3D traffic environmental data includes:
    honking sounds and tire sounds; and
    at least one of simulated weather data, simulated traffic signal data, simulated pedestrian data, and simulated obstacle data; and
  training a computer-based autonomous driving model using the simulated time-dependent 3D traffic environmental data.
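For orientation, the three steps recited in claim 1 can be read as a create, enrich, and train pipeline. The Python sketch below is illustrative only: the TrafficEnvironment4D container, the helper functions, and the enrichment logic are hypothetical placeholders and are not taken from the patent's implementation.

from dataclasses import dataclass, field
from typing import List


@dataclass
class TrafficEnvironment4D:
    """Time-dependent 3D ("4D") traffic environment: a sequence of 3D scene frames."""
    frames: List[dict] = field(default_factory=list)


def create_environment(real_elements: List[dict], simulated_elements: List[dict]) -> TrafficEnvironment4D:
    # Step 1: build time-dependent 3D traffic environment data from real and/or
    # simulated traffic element data (vehicles, signals, pedestrians, obstacles, ...).
    return TrafficEnvironment4D(frames=[{"real": r, "sim": s} for r, s in zip(real_elements, simulated_elements)])


def apply_gan(environment: TrafficEnvironment4D) -> TrafficEnvironment4D:
    # Step 2: stand-in for the time-dependent 3D GAN model, which would enrich each
    # frame (e.g., with weather, honking sounds, tire sounds, obstacles).
    return TrafficEnvironment4D(frames=[dict(frame, enriched=True) for frame in environment.frames])


def train_driving_model(model_params: dict, simulated_env: TrafficEnvironment4D) -> dict:
    # Step 3: placeholder training step for the computer-based autonomous driving model.
    model_params["trained_on_frames"] = model_params.get("trained_on_frames", 0) + len(simulated_env.frames)
    return model_params


if __name__ == "__main__":
    env = create_environment(real_elements=[{"car": 1}], simulated_elements=[{"rain": True}])
    driving_model = train_driving_model({}, apply_gan(env))
    print(driving_model)

In a full system, apply_gan would be the trained time-dependent 3D GAN, and train_driving_model would run the simulation-based training recited in claim 6.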
2. The computer-implemented method of claim 1, further comprising:
  creating the time-dependent 3D generative adversarial network (GAN) model through an adversarial machine learning process of a time-dependent 3D GAN discriminator sub-model and a time-dependent 3D GAN generator sub-model of the time-dependent 3D GAN model.
3. The computer-implemented method of claim 2, wherein the adversarial machine learning process of the time-dependent 3D GAN discriminator sub-model comprises:
  receiving time-dependent 3D GAN discriminator training data from the time-dependent 3D GAN generator sub-model;
  performing, using the time-dependent 3D GAN discriminator sub-model, discrimination analysis of the received time-dependent 3D GAN discriminator training data to generate a discrimination result indicating whether the time-dependent 3D GAN discriminator sub-model determined that the time-dependent 3D GAN discriminator training data represents real-world time-dependent 3D traffic environmental data or simulated time-dependent 3D traffic environmental data;
  performing matching of the generated discrimination result with supervisory data indicating whether the time-dependent 3D GAN discriminator training data represents real-world time-dependent 3D traffic environmental data or simulated time-dependent 3D traffic environmental data, to generate a training result indicating a trained level of the time-dependent 3D GAN discriminator sub-model; and
  modifying parameter values of the time-dependent 3D GAN discrimination sub-model based on the training result.
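The discriminator-side process of claim 3 corresponds to a conventional GAN discriminator update. The sketch below is a minimal illustration under stated assumptions: each time-dependent 3D sample is flattened into a fixed-length feature vector, the supervisory data is a binary real/simulated label, and a binary cross-entropy loss with a backward pass stands in for the matching and parameter-modification steps. PyTorch, the network sizes, and the feature dimension are illustrative choices, not the patent's implementation.

import torch
from torch import nn

FEATURE_DIM = 128  # assumed size of a flattened time-dependent 3D sample

# Toy stand-in for the time-dependent 3D GAN discriminator sub-model.
discriminator = nn.Sequential(nn.Linear(FEATURE_DIM, 64), nn.LeakyReLU(0.2), nn.Linear(64, 1))
optimizer = torch.optim.Adam(discriminator.parameters(), lr=1e-4)
criterion = nn.BCEWithLogitsLoss()


def discriminator_step(real_batch: torch.Tensor, generated_batch: torch.Tensor) -> float:
    """One discriminator update; comments paraphrase the steps of claim 3."""
    optimizer.zero_grad()

    # Receive discriminator training data (including generator output) and perform
    # discrimination analysis to predict real-world vs. simulated for each sample.
    logits_real = discriminator(real_batch)
    logits_fake = discriminator(generated_batch.detach())

    # Match the discrimination result against supervisory data (1 = real-world,
    # 0 = simulated); the resulting loss serves as the training result.
    loss = (criterion(logits_real, torch.ones_like(logits_real))
            + criterion(logits_fake, torch.zeros_like(logits_fake)))

    # Modify the discriminator sub-model's parameter values based on that result.
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    real = torch.randn(8, FEATURE_DIM)   # placeholder real-world samples
    fake = torch.randn(8, FEATURE_DIM)   # placeholder generator output
    print(discriminator_step(real, fake))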
4. The computer-implemented method of claim 3, wherein the adversarial machine learning process of the time-dependent 3D GAN generator sub-model further comprises:
  generating, using the time-dependent 3D GAN generator sub-model, the simulated time-dependent 3D traffic environmental data;
  providing the generated simulated time-dependent 3D traffic environmental data for creating time-dependent 3D GAN discriminator training data to be used by the time-dependent 3D GAN discriminator sub-model;
  receiving the training result from the time-dependent 3D GAN discriminator sub-model; and
  modifying parameter values of the time-dependent 3D GAN generator sub-model based on the training result.
5. The computer-implemented method of claim 2, wherein the adversarial machine learning process of the time-dependent 3D GAN generator sub-model comprises:
  generating, using the time-dependent 3D GAN generator sub-model, the simulated time-dependent 3D traffic environmental data;
  providing the generated simulated time-dependent 3D traffic environmental data for creating time-dependent 3D GAN discriminator training data;
  receiving a training result indicating a trained level of the time-dependent 3D GAN discriminator sub-model from the time-dependent 3D GAN discriminator sub-model; and
  modifying parameter values of the time-dependent 3D GAN generator sub-model based on the training result.
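Claims 4 and 5 recite the generator side of the same adversarial loop. The sketch below mirrors those steps under the same illustrative assumptions (flattened feature vectors, PyTorch-style networks, and the discriminator's loss on generated samples standing in for the training result returned to the generator); it is not the patent's implementation. In practice the discriminator and generator updates alternate, which is one common reading of the adversarial machine learning process of claim 2.

import torch
from torch import nn

NOISE_DIM, FEATURE_DIM = 32, 128  # assumed latent and sample sizes

# Toy stand-ins for the generator and discriminator sub-models.
generator = nn.Sequential(nn.Linear(NOISE_DIM, 64), nn.ReLU(), nn.Linear(64, FEATURE_DIM))
discriminator = nn.Sequential(nn.Linear(FEATURE_DIM, 64), nn.LeakyReLU(0.2), nn.Linear(64, 1))
g_optimizer = torch.optim.Adam(generator.parameters(), lr=1e-4)
criterion = nn.BCEWithLogitsLoss()


def generator_step(batch_size: int = 8) -> float:
    """One generator update; comments paraphrase the steps of claims 4 and 5."""
    g_optimizer.zero_grad()

    # Generate simulated time-dependent 3D traffic environmental data (here from noise).
    simulated = generator(torch.randn(batch_size, NOISE_DIM))

    # Provide the generated data for discriminator training and receive a training
    # result; here the discriminator's judgment on the simulated batch plays that role
    # (the generator is rewarded when its output is judged to be real-world data).
    logits = discriminator(simulated)
    training_result = criterion(logits, torch.ones_like(logits))

    # Modify the generator sub-model's parameter values based on the training result.
    training_result.backward()
    g_optimizer.step()
    return training_result.item()


if __name__ == "__main__":
    print(generator_step())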
6. The computer-implemented method of claim 1, wherein the training the computer-based autonomous driving model comprises:
  rendering the simulated time-dependent 3D traffic environmental data to generate a virtual photorealistic time-dependent 3D traffic environment;
  carrying out, using the computer-based autonomous driving model, a virtual autonomous driving operation in the generated virtual photorealistic time-dependent 3D traffic environment;
  obtaining a virtual autonomous driving result of the virtual autonomous driving operation; and
  modifying parameter values of the computer-based autonomous driving model based on the virtual autonomous driving result.
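Claim 6 closes the loop from GAN output to driving-model updates: render a photorealistic environment, carry out a virtual drive, obtain a result, and modify the model. The control-flow sketch below is illustrative only; the renderer, virtual-drive simulator, and scalar driving result are hypothetical placeholders, since the patent does not specify those interfaces here.

import random


def render_photorealistic(simulated_env: list) -> list:
    # Render the simulated time-dependent 3D traffic environmental data into a
    # virtual photorealistic time-dependent 3D traffic environment (placeholder).
    return [{"frame": f, "rendered": True} for f in simulated_env]


def run_virtual_drive(driving_model: dict, rendered_env: list) -> float:
    # Carry out a virtual autonomous driving operation and obtain a driving result
    # (here reduced to a single score between 0 and 1).
    return random.uniform(0.0, 1.0) * driving_model.get("skill", 0.5)


def update_model(driving_model: dict, result: float, lr: float = 0.1) -> dict:
    # Modify parameter values of the computer-based autonomous driving model
    # based on the virtual autonomous driving result.
    driving_model["skill"] = driving_model.get("skill", 0.5) + lr * (result - 0.5)
    return driving_model


if __name__ == "__main__":
    model = {"skill": 0.5}
    for episode in range(3):
        rendered = render_photorealistic([{"t": t} for t in range(10)])
        model = update_model(model, run_virtual_drive(model, rendered))
    print(model)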
7. A system for an autonomous-driving vehicle, comprising:
  one or more processors; and
  memory storing instructions that, when executed by the one or more processors, cause the one or more processors to:
    create time-dependent three-dimensional (3D) traffic environment data using at least one of real traffic element data and simulated traffic element data;
    create simulated time-dependent 3D traffic environmental data by applying a time-dependent 3D generative adversarial network (GAN) model to the created simulated time-dependent 3D traffic environment data, wherein the simulated time-dependent 3D traffic environmental data includes:
      honking sounds and tire sounds; and
      at least one of simulated weather data, simulated traffic signal data, simulated pedestrian data, and simulated obstacle data; and
    train a computer-based autonomous driving model using the simulated time-dependent 3D traffic environmental data.
8. The computer-implemented method of claim 3, wherein the discrimination result further indicates a basis on which the 3D GAN discriminator sub-model made the prediction.
9. The computer-implemented method of claim 8, wherein the basis includes one or more specific objects represented by the 3D GAN discriminator training data and specific characteristics of the one or more specific objects that enabled the prediction.
10. The system of claim 7, wherein the simulated time-dependent 3D traffic environmental data includes a noise from trees and leaves.
11. The system of claim 7, wherein the simulated time-dependent 3D traffic environmental data includes a noise from pedestrians.
12. The system of claim 7, wherein the instructions cause the one or more processors to create the time-dependent 3D generative adversarial network (GAN) model through an adversarial machine learning process of a time-dependent 3D GAN discriminator sub-model and a time-dependent 3D GAN generator sub-model of the time-dependent 3D GAN model.
13. The system of claim 12, wherein the adversarial machine learning process of the time-dependent 3D GAN discriminator sub-model comprises:
  receiving time-dependent 3D GAN discriminator training data from the time-dependent 3D GAN generator sub-model;
  performing, using the time-dependent 3D GAN discriminator sub-model, discrimination analysis of the received time-dependent 3D GAN discriminator training data to generate a discrimination result indicating whether the time-dependent 3D GAN discriminator sub-model determined that the time-dependent 3D GAN discriminator training data represents real-world time-dependent 3D traffic environmental data or simulated time-dependent 3D traffic environmental data;
  performing matching of the generated discrimination result with supervisory data indicating whether the time-dependent 3D GAN discriminator training data represents real-world time-dependent 3D traffic environmental data or simulated time-dependent 3D traffic environmental data, to generate a training result indicating a trained level of the time-dependent 3D GAN discriminator sub-model; and
  modifying parameter values of the time-dependent 3D GAN discrimination sub-model based on the training result.
14. The system of claim 13, wherein the adversarial machine learning process of the time-dependent 3D GAN generator sub-model further comprises:
  generating, using the time-dependent 3D GAN generator sub-model, the simulated time-dependent 3D traffic environmental data;
  providing the generated simulated time-dependent 3D traffic environmental data for creating time-dependent 3D GAN discriminator training data to be used by the time-dependent 3D GAN discriminator sub-model;
  receiving the training result from the time-dependent 3D GAN discriminator sub-model; and
  modifying parameter values of the time-dependent 3D GAN generator sub-model based on the training result.
15. The system of claim 12, wherein the adversarial machine learning process of the time-dependent 3D GAN generator sub-model comprises:
  generating, using the time-dependent 3D GAN generator sub-model, the simulated time-dependent 3D traffic environmental data;
  providing the generated simulated time-dependent 3D traffic environmental data for creating time-dependent 3D GAN discriminator training data;
  receiving a training result indicating a trained level of the time-dependent 3D GAN discriminator sub-model from the time-dependent 3D GAN discriminator sub-model; and
  modifying parameter values of the time-dependent 3D GAN generator sub-model based on the training result.
16. The system of claim 7, wherein the training the computer-based autonomous driving model comprises:
  rendering the simulated time-dependent 3D traffic environmental data to generate a virtual photorealistic time-dependent 3D traffic environment;
  carrying out, using the computer-based autonomous driving model, a virtual autonomous driving operation in the generated virtual photorealistic time-dependent 3D traffic environment;
  obtaining a virtual autonomous driving result of the virtual autonomous driving operation; and
  modifying parameter values of the computer-based autonomous driving model based on the virtual autonomous driving result.
17. The system of claim 7, wherein the simulated time-dependent 3D traffic environmental data includes vertical vibration of vehicles caused by obstacles, bumps, or dents on a road.
18. The system of claim 13, wherein the discrimination result further indicates a basis on which the 3D GAN discriminator sub-model made the prediction.
19. The system of claim 18, wherein the basis includes one or more specific objects represented by the 3D GAN discriminator training data and specific characteristics of the one or more specific objects that enabled the prediction.
20. The system of claim 7, wherein the tire sounds comprise screeching and the simulated time-dependent 3D traffic environmental data includes engine sounds.