Manufacturing Assembly Simulations in Virtual and Augmented Reality
Abstract:
Virtual Reality (VR) and Augmented Reality (AR) technologies have been well
researched for decades, and recently they have been introduced to the consumer
market and are being applied to many fields. This chapter focuses on utilizing
VR/AR technologies for assembly simulations in advanced manufacturing. First,
some basic terminologies and concepts are clarified, and the VR/AR technologies
are outlined to provide a brief introduction to this topic. State-of-the-art
methodologies including modeling, sensing, and interaction that enable the
VR/AR assembly simulations are then reviewed and discussed. This is followed
by assembly examples applying the technologies, and a hands-on case study is
used to provide a demonstration of and practical guide to implementing a VR/AR
assembly simulation application. Finally, the limitations and challenges of current
VR/AR technologies and future research needs are discussed.
1.1 Introduction
Virtual Reality (VR) and Augmented Reality (AR) technologies have been studied
and prototyped in labs for decades without much public attention. However, in
recent years, they have been introduced to the consumer market and are being
applied to a wide range of fields, owing to reductions in VR/AR hardware costs
and improvements in software algorithms. In the manufacturing assembly area,
VR/AR technologies have been used to simulate costly processes beforehand and
to train the workforce in a more interactive way. This section introduces
some basic terminologies and concepts to provide a brief introduction to this topic.
4 Augmented, Virtual, and Mixed Reality Applications in Advanced Manufacturing
machine that could provide users with a realistic experience. From then on, many
researchers have worked in this area, and various prototypes have been developed.
The term VR refers to a set of techniques that allow humans to visualize,
manipulate, and interact with computers in a realistic virtual environment. VR
provides an interactive graphics interface enhanced by non-visual modalities such
as auditory, haptic, and olfactory feedback to give the user a sense of presence in
a real physical environment [9, 15, 39]. As shown in Fig. 1, an ideal VR system
can generate a virtual environment that engages the five senses,
including sight, hearing, touch, smell, and taste. There are different types of VR
systems, such as CAVE VR systems [21, 81], desktop VR systems, and head-mounted VR
systems.
Figure 1: An ideal VR system stimulates the five human senses (sight, hearing, touch, smell, and taste) in a real/virtual environment.
For VR, a virtual environment is generated to provide users with fully immersive
experiences. For Augmented Reality (AR), instead of generating a fully virtual
environment, virtual content is superimposed onto the user's view of the real
environment. Mixed Reality (MR) covers the continuum between the fully real and
the fully virtual environment.
To keep pace with the technology revolution of the Industry 4.0 era, more and
more manufacturers have been reconsidering their assembly systems. More flexible
and efficient assembly methods and strategies have to be developed to meet the
dynamic needs of customers and the shortened product lifecycle. In this context,
current assembly systems must be upgraded for products to succeed and remain
competitive in the market.
These goals could be achieved through assembly simulation in a virtual
environment before launching a real factory, identifying potential problems without
the use of physical mockups, thus shortening the design cycle and improving
product quality.
With the help of VR/AR technologies, which put humans in the loop and place
the human's experience first, the assembly process can be simulated in a more
immersive, interactive manner, especially for worker-involved assembly tasks.
Assembly training can also be conducted in VR/AR environments to train workers
and improve their skills. The main building blocks needed for assembly
simulation in VR/AR are briefly described below.
Modeling. One of the primary tasks in developing a Manufacturing Assembly
Simulation (MAS) application is to create the digital models. The 3D model of a
physical part can be generated using CAD software, or through a 3D reconstruction
process that acquires the geometric data of an existing physical part in digital
form. Unlike gaming VR/AR, the digital 3D models used in MAS should be accurate
in dimensions, realistic in physics, and as functional in the virtual environment
as the real parts.
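Dimensional accuracy can be checked directly on the digital model. As a minimal illustration (plain NumPy, with a hand-built cube standing in for an exported part mesh), the signed-volume formula from the divergence theorem verifies that a mesh encloses the expected volume:

```python
import numpy as np

# A unit cube as a triangle mesh: 8 vertices, 12 outward-facing triangles
# (an illustrative stand-in for a CAD-exported part model).
V = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)], float)
F = np.array([
    [0, 1, 3], [0, 3, 2],   # x = 0 face
    [4, 7, 5], [4, 6, 7],   # x = 1 face
    [0, 5, 1], [0, 4, 5],   # y = 0 face
    [2, 3, 7], [2, 7, 6],   # y = 1 face
    [0, 2, 6], [0, 6, 4],   # z = 0 face
    [1, 5, 7], [1, 7, 3],   # z = 1 face
])

def mesh_volume(V, F):
    """Signed volume via the divergence theorem: sum the signed volumes of
    tetrahedra formed by each triangle and the origin."""
    a, b, c = V[F[:, 0]], V[F[:, 1]], V[F[:, 2]]
    return np.sum(np.einsum('ij,ij->i', a, np.cross(b, c))) / 6.0

print(mesh_volume(V, F))  # 1.0 for a correctly oriented unit cube
```

A negative or wrong result from such a check typically signals flipped triangle windings or holes, exactly the defects the post-processing step of reconstruction is meant to repair.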
Environment Sensing & Pose Estimation. In VR, we need to track the
user’s pose to change the digital contents in the user’s view accordingly to make
the user feel that he/she is in the real world performing a real task. In AR, sensing
the environment and tracking the user’s pose is even more crucial, because it
requires real-time registration and tracking to realize the optimal mapping between
the virtual world and the real world.
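The registration mentioned above amounts to estimating a rigid transform between tracked points in the real world and their virtual counterparts. One classic closed-form solution is the Kabsch algorithm; the NumPy sketch below (synthetic points, not tied to any particular tracking system) recovers a known rotation and translation from point correspondences:

```python
import numpy as np

def rigid_register(src, dst):
    """Find rotation R and translation t minimizing ||R @ src_i + t - dst_i||
    over corresponding (N, 3) point sets, via the Kabsch algorithm."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Example: virtual anchor points and the same points observed by a tracker
virtual = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
theta = np.pi / 6                            # known 30-degree rotation about z
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
observed = virtual @ Rz.T + np.array([0.5, -0.2, 1.0])
R, t = rigid_register(virtual, observed)
print(np.allclose(R, Rz), np.allclose(virtual @ R.T + t, observed))  # True True
```

Real systems solve this continuously, with noisy correspondences, but the same least-squares alignment sits at the core of marker-based registration.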
Interface and Interaction. In VR/AR, the visual interface is the most
important one. In addition, auditory and haptic interfaces, and even smell and taste
interfaces, can be used to augment the visual interface and make the user feel
fully immersed in a VR/AR environment. In terms of interaction, methods should
be developed that let the user manipulate and interact with virtual objects as
he/she does with physical objects.
This section reviews and discusses the state-of-the-art methodologies that enable
VR/AR manufacturing assembly simulations.
Creating digital models for virtual objects is one of the primary tasks in MAS.
The conventional approach for building 3D models is to use commercial CAD
software, such as NX [47], SolidWorks [25], and CATIA [24], to create CAD
models for virtual objects. Another approach is reconstruction of 3D models from
real objects, which starts with data acquisition from physical objects in a real
environment and ends with digital models representing these objects on the
computer [64]. With this approach, the often time-consuming modeling process
done by human designers can be automated, which significantly reduces the
development time and cost. As shown in Fig. 3, the process of 3D model
reconstruction includes data acquisition, data processing, positional registration,
modeling, and rendering [27, 28].
Figure 3: The 3D model reconstruction pipeline: data acquisition, preprocessing, positional registration, modeling, and rendering.
To create a 3D model from a real object, the first step is to acquire the digital
data of the 3D object’s surface. Many data acquisition techniques have been
developed, and they can be categorized as mechanical and optical approaches.
There are some other methods, such as ultrasonography, radiography, and Magnetic
Resonance Imaging (MRI), for data acquisition purposes. They are not discussed
here as they are costly and not commonly used for MAS applications. A
classification of these techniques is illustrated in Fig. 4. Mechanical data
acquisition is a mature and well-established method that has been used for many
years in reverse engineering and industrial inspection. It utilizes a Coordinate
Measuring Machine (CMM) or a robotic arm to measure the coordinates of points
on the object’s surface to acquire the shape. Currently, optical methods are more
widely used as they can provide non-contact, relatively more efficient data
acquisition. In addition to acquiring the shape, they can capture the appearance as
well, which further reduces the time consumed for modeling and rendering. The
optical methods can be divided into passive and active ones. Passive methods do
not emit light onto the target object, and the shape of the object is estimated from
the perceived images, using techniques such as shape from shading [36], shape
from silhouette [18, 51], and stereo vision [30, 31, 69]. Active methods often
project light onto the target object before sensing. The structured light based
method [67] projects a certain pattern onto the object and then retrieves the depth
information by analyzing the distorted pattern from the captured image. The Time-
of-Flight based method [22] measures the time taken for the light to travel from
the transmitter to the object surface and back to the receiver, based on which the
depth information can be calculated [46]. The light source can be a laser or a near-
infrared LED.
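Both active relations reduce to simple geometry: a Time-of-Flight sensor recovers depth as c·t/2, and a stereo pair as f·B/d (focal length times baseline over disparity). A small Python sketch with purely illustrative constants:

```python
C = 299_792_458.0  # speed of light in m/s

def tof_depth(round_trip_s):
    """Time-of-Flight: light covers the camera-to-object distance twice."""
    return C * round_trip_s / 2.0

def stereo_depth(focal_px, baseline_m, disparity_px):
    """Stereo vision: depth is inversely proportional to disparity."""
    return focal_px * baseline_m / disparity_px

# A 10 ns round trip corresponds to roughly 1.5 m
print(round(tof_depth(10e-9), 3))        # 1.499
# f = 700 px, baseline = 0.1 m, disparity = 35 px
print(stereo_depth(700, 0.1, 35))        # 2.0
```

The inverse relation in the stereo case is why depth resolution degrades quadratically with distance, a practical consideration when choosing a sensor for a given working volume.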
Figure 4: A classification of data acquisition techniques: mechanical methods (CMM, robotic arm) and optical methods, with the optical methods divided into passive (shape from shading, shape from silhouette, stereo vision) and active (structured light, Time-of-Flight).
After the procedure of data acquisition, the 3D model of the object can be
generated by surface reconstruction, the objective of which is to determine the
object’s surface geometry from a given finite set of measurements. Usually, the
acquired data are disorganized and noisy. Furthermore, the surface may not have
a specific topological type and could be arbitrary in shape. The steps to generate
3D models from the acquired digital data include the following [64]: (i) Pre-
processing: Erroneous data are removed, and the noise in the data is smoothed out;
(ii) Determination of the surface’s global topology; (iii) Generation of the
polygonal surface; and (iv) Post-processing: Edge correction, triangle insertion,
hole filling, and polygon editing are used to optimize and smooth the shape.
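Step (i) is often implemented as a statistical outlier filter: points whose mean distance to their nearest neighbors is anomalously large are dropped. A brute-force NumPy sketch (illustrative k and threshold values, synthetic data):

```python
import numpy as np

def remove_outliers(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbors exceeds
    the global mean by std_ratio standard deviations."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)        # skip self-distance (0)
    keep = mean_knn <= mean_knn.mean() + std_ratio * mean_knn.std()
    return points[keep]

rng = np.random.default_rng(0)
cloud = rng.normal(size=(200, 3))                 # a dense cluster of samples
noisy = np.vstack([cloud, [[50, 50, 50]]])        # plus one far-away outlier
clean = remove_outliers(noisy)
print(len(noisy), len(clean))  # the spurious point is filtered out
```

Production tools use spatial indices rather than the O(N²) distance matrix above, but the statistical criterion is the same.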
The data acquisition devices are usually called 3D scanners. Besides
expensive industrial scanners, there are some low-cost 3D scanners in the market
that are affordable and suitable for MAS development, such as Microsoft Kinect
[53], MakerBot Digitizer [50], and 3D Systems Sense [1], which are shown in
Figure 5. Reconstruction software is provided with these scanners to generate the
digital models from 3D scanning.
Figure 5: Example low-cost 3D scanners for data acquisition [53, 50, 1].
This section reviews different kinds of sensors and sensing technologies for
environment perception and pose tracking, including marker-based and
marker-less tracking technologies, as well as human pose estimation methods.
Human-computer interfaces provide the user with different visual, haptic and
auditory sensations, which play a vital role in a virtual assembly system for
increasing the degree of immersion in the VR/AR environment. Figure 7 shows a
typical VR/AR system configuration with physical input and output devices that
transmit information between the user and the environment. These sensing
technologies are essential to the realism of the VR/AR system. The critical
technologies addressed here include visual interface, auditory modeling and
rendering, and haptic modeling and rendering.
The visual interface is the most important interface in VR/AR. Different
hardware provides visual interfaces, such as desktop displays, hand-held displays,
projection displays, and Head Mounted Displays (HMDs), which have become
mainstream due to the fully immersive experiences they provide. VR HMDs can
be divided into three classes: tethered HMDs that need to be connected to a PC or
console, mobile-dependent HMDs that attach to a smartphone, and
standalone headsets that do not need to be connected to other devices. Figure 8
shows three VR glasses available in the market that fall into these three classes and
their costs range from $100 to $500. Some of the mobile-dependent products cost
less than $100, which are affordable to most consumers. AR HMDs are relatively
more expensive than VR HMDs because they require more complicated optical
designs to realize the see-through ability. Figure 9 shows three AR HMDs in the
market. The selection of visual hardware for a given application should take into
consideration the resolution, update rate, field-of-view, head tracking accuracy,
latency, etc.
Audio cues can be used to augment visual interfaces for assembly simulation in
VR/AR. Auditory rendering is especially helpful when haptic feedback is not
available. Physics-based sound modeling is too computationally expensive for
real-time rendering required in virtual assembly. Synthetic sound can be used to
approximate the real sound generated from the physical assembly, making the
simulation more realistic. Spectral modeling can be used as the basis for sound
synthesis in virtual assembly [57, 87]. Sound generation using deep learning
methods also can be used in this case [79].
In the virtual environment, a sound generator can be attached to a virtual
object to generate 3D audio for auditory rendering. 3D audio can provide listeners
with a spatial hearing experience that enables them to sense where they are relative
to the sound sources, further immersing them in the virtual world.
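A simple form of spectral-modeling synthesis approximates a contact sound as a sum of decaying sinusoidal partials. The sketch below uses made-up partial frequencies and amplitudes purely for illustration:

```python
import numpy as np

def spectral_synth(partials, duration_s=0.5, sr=44_100):
    """Spectral-modeling synthesis: a sound approximated as a sum of
    sinusoidal partials, each given as (frequency_hz, amplitude)."""
    t = np.arange(int(duration_s * sr)) / sr
    return sum(a * np.sin(2 * np.pi * f * t) for f, a in partials)

# A rough metallic 'clink' for a part snapping into place: a few
# inharmonic partials with decreasing amplitude (illustrative values).
clink = spectral_synth([(1200, 1.0), (3140, 0.5), (5830, 0.25)])
envelope = np.exp(-8 * np.arange(len(clink)) / 44_100)  # fast exponential decay
clip = clink * envelope
print(clip.shape)  # (22050,)
```

In a virtual assembly scene, the partial list and decay rate would be chosen per material or estimated from recordings, and the resulting buffer fed to a spatializing audio engine.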
Haptic devices, which provide force or tactile feedback to users, can further
improve realism beyond the visual and auditory sensations. Such devices usually are
embedded with sensors and actuators to measure the user’s contact position with a
virtual object and apply force or other haptic feedback to the user at the contact
position. The current haptic devices can be classified into three categories: (1)
handheld devices such as the Touch from 3D Systems [2] that has been used widely
as a force feedback device; (2) hand wearable devices such as the Sense Glove [68]
which provides feedback to individual fingers to simulate how the users interact
with the real objects; and (3) haptic suits such as the Teslasuit [75] which can
provide haptic feedback to the body areas covered by the suit. These haptic devices
have some drawbacks, such as strict geometry, placement, and workspace
requirements, and being cumbersome to wear, which limit them from being
widely used.
Figure 10: Haptic devices of 3D System Touch, Sense Glove, and Teslasuit (from
left to right) [2, 68, 75].
Figure 11: An operator wearing an Oculus headset for immersive VR and attached
with multiple markers for motion tracking (Image courtesy of OPTIS) [11].
AR technology allows the operator to see the real environment with virtual
information superimposed upon the physical world. This has been popular in AR-
assisted assembly training and guidance [66, 89, 90, 84]. AR also has been applied
to other assembly related tasks, such as manual assembly station planning [63],
assembly workplace design [59], and assembly constraint analysis [60].
There are various tools that can be used for developing AR MAS applications.
Some commonly used ones are introduced as follows:
Unity. Unity is a game engine developed by Unity Technologies that allows
its users to develop games [77]. The engine can provide various visual effects
including 2D and 3D graphics, textures, lighting and shading. The users can create
a scene by importing customized CAD models with texture rendering. The engine
platform supports scripting via programming languages such as C# and JavaScript.
Unreal Engine. Unreal Engine is a game engine that was first brought to the
gaming platform in 1998 and has been widely used [78]. The engine is
programmed in C++ and is available for different operating systems.
ARToolKit. ARToolKit is a library that can be utilized for AR application
development. It was originally developed by Dr. Hirokazu Kato and is now
supported by a multidisciplinary community of institutions [42]. The library
provides AR rendering through camera pose estimation and target tracking.
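Whatever SDK performs the tracking, AR rendering ultimately projects virtual 3D points into the camera image using the estimated pose and camera intrinsics. A bare-bones pinhole projection in NumPy (made-up intrinsics and marker geometry, independent of any specific SDK):

```python
import numpy as np

def project(points_3d, K, R, t):
    """Project world points into pixel coordinates with a pinhole camera.
    K: 3x3 intrinsics; R, t: estimated camera pose (world -> camera)."""
    cam = points_3d @ R.T + t          # transform into the camera frame
    uvw = cam @ K.T                    # apply the intrinsic matrix
    return uvw[:, :2] / uvw[:, 2:3]    # perspective division

K = np.array([[800.0, 0, 320],         # focal lengths and principal point
              [0, 800.0, 240],
              [0, 0, 1]])
R = np.eye(3)                          # camera looking down the world z-axis
t = np.array([0.0, 0.0, 2.0])          # marker plane 2 m in front of the camera
corners = np.array([[-0.1, -0.1, 0], [0.1, -0.1, 0],
                    [0.1, 0.1, 0], [-0.1, 0.1, 0]])  # a 20 cm square marker
print(project(corners, K, R, t))
```

Tracking libraries solve the inverse problem, recovering R and t from the observed marker corners, and then use exactly this forward projection to draw the virtual overlay in the right place.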
Vuforia. Vuforia is a Software Development Kit (SDK) for AR development
[82]. It provides the functionalities of target recognition and tracking, enabling its
users to superimpose 3D digital content, registered to 2D image targets in world
coordinates, for a composite view. The user can define a trackable marker for
marker-based AR by importing the desired image target for augmentation with
different configurations.
Wikitude. Wikitude SDK combines target recognition, tracking, and
simultaneous localization and mapping (SLAM), as well as Geo-location, for
various AR development engines and configurations across Android and iOS [86].
ARKit. ARKit from Apple is a platform for building AR iOS applications. It
provides object detection and mapping functions for marker-less tracking, enabling
the user to directly overlay information onto the environment with different
geometries [5].
ARCore. ARCore is an AR platform from Google. It integrates three main
elements: motion tracking, environmental understanding, and light
estimation [32]. ARCore provides a wide variety of features and functionalities
with different APIs and can be deployed on both Android and iOS devices.
In this demonstration project, the application is developed using Unity with
the support of Vuforia SDK due to its fundamental usability and accessibility for
AR implementation.
1.4.2 Modeling
In this case study, NX is chosen as the CAD software to build the 3D digital model
for the spindle carriage. Figure 14 shows the created CAD model of the spindle
carriage in NX [104].
1.4.3 AR realization
The developer configures the “Image Target Behavior” of the “ImageTarget” prefab by setting the dataset and
image target. After the setting is finished, the augmented view can be visualized
through the display while the marker is being captured by the camera. Figure 17
shows the AR scene with generated CAD models superimposed.
Figure 17: An assembly scene running in real time with AR instructions. Multiple
types of instructions are rendered on the display, including text, graphics, and
3D animations.
Figure: The experimental setup, including a webcam, an AR onsite display, a subassembly of the CNC machine, a toolkit, and a marker.
Figure 20: A comparison of two subjects performing the experiment with two
different methods: (left) manual and (right) AR.
The 20 subjects are divided into two groups with AR and manual instructions
provided, respectively. During the experiment, each subject is asked to perform
the assembly task by following the instructions step by step; see Fig. 20. To assess
the performance, the completion time and number of errors are recorded
throughout the experiment. The performance comparison is shown in Figure 21;
with AR guidance, the completion time and the number of errors show reductions
of 33.2% and 32.4%, respectively. This considerable improvement demonstrates the
great potential of applying AR technology in industry for manufacturing
assembly training.
Figure 21: The comparison between manual and AR guidance: (left) completion
time and (right) number of errors.
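The reported reductions are relative improvements over the manual baseline. As a trivial check of the metric (the numbers below are illustrative, not the experiment's raw data):

```python
def reduction(manual, ar):
    """Relative improvement (%) of AR guidance over the manual baseline."""
    return (manual - ar) / manual * 100

# Illustrative values only: a baseline of 10.0 units dropping to 6.68 units
# reproduces the chapter's reported 33.2% completion-time reduction.
print(round(reduction(10.0, 6.68), 1))   # 33.2
```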
This section discusses the limitations and challenges of current VR/AR and MAS
technologies and future research needs towards MAS realism improvement,
worker behavior understanding, and sharing and collaborative MAS.
In a MAS task, understanding the worker’s behavior can be used for quantification
and evaluation of the worker’s performance, as well as to provide onsite
instructions in VR/AR [74]. Studies in several related aspects need to be conducted.
1.6 Conclusion
References
9. Aukstakalnis, S. and Blatner, D., 1992, “Silicon Mirage: the Art and
Science of Virtual Reality,” Peachpit Press, Inc., Berkeley, CA.
10. Bentley, 2017. Bentley reinvents the design of future vehicles with
Virtual Reality. https://www.optis-world.com/Showcase/News-
Release/Customer-focus/Bentley-reinvents-the-design-of-future-
vehicles-with-Virtual-Reality.
11. Bentley, 2017. Prelude to GTC 2017: Touch and feel the pixels in your
virtual car. http://www.digitaleng.news/virtual_desktop/2017/04/prelude-
gtc-2017-touch-feel-pixels-virtual-car/.
12. Boeing, 2018. Boeing tests Augmented Reality in the factory.
https://www.boeing.com/features/2018/01/augmented-reality-01-
18.page.
13. Bruce, C. 2015. Volvo and Microsoft partner on virtual reality and
autonomous tech. https://www.autoblog.com/2015/11/20/volvo-
microsoft-virtual-reality-autonomous-tech/.
14. Bruce, C., 2016. Tesla uses augmented reality to improve EV
manufacturing. https://www.autoblog.com/2016/03/22/tesla-augmented-
reality-factory/.
15. Burdea, G. and Coiffet, P., 1994, Virtual Reality Technology, John Willey
& Sons, INC., Chap. 1.
16. Cadena, C., Carlone, L., Carrillo, H., Latif, Y., Scaramuzza, D., Neira, J.,
Reid, I. and Leonard, J.J., 2016. Past, present, and future of simultaneous
localization and mapping: Toward the robust-perception age. IEEE
Transactions on Robotics, 32(6), pp.1309-1332.
17. Cao, Z., Simon, T., Wei, S.E. and Sheikh, Y., 2017, July. Realtime multi-
person 2d pose estimation using part affinity fields. In CVPR (Vol. 1,
No. 2, p. 7).
18. Cheung, G., Baker, S., Kanade, T., 2003a, Visual hull alignment and
refinement across time: a 3D reconstruction algorithm combining shape-
from-silhouette with stereo, Proceedings of IEEE conference on computer
vision and pattern recognition (S. II - 375-82, vol. 2 ).
19. Constine, J., 2014. Facebook’s $2 billion acquisition of Oculus closes,
now official, https://techcrunch.com/2014/07/21/facebooks-acquisition-
of-oculus-closes-now-official/.
20. Covington, P., 2017. Inside Ford’s Virtual Reality labs.
https://www.triplepundit.com/2017/01/ford-virtual-reality-labs/.
21. Cruz-Neira, C., Sandin, D.J. and DeFanti, T.A., 1993, September.
Surround-screen projection-based virtual reality: the design and
implementation of the CAVE. In Proceedings of the 20th annual conference
on Computer graphics and interactive techniques, pp. 135-142.
66. Saaski, J., Salonen, T., Hakkarainen, M., Siltanen, S., Woodward, C.,
Lempiainen, J., 2008, Integration of design and assembly using
augmented reality, Micro-assembly technologies and applications, 260:
395–404.
67. Salvi, J., Pages, J., Batlle, J., 2004, Pattern codification strategies in
structured light systems, Pattern recognition, 37(4): 827-849.
68. Sense Glove, 2018. Homepage. https://www.senseglove.com/.
69. Shapiro, L.S., Brady, J.M., 1992, Feature-based correspondence: an
eigenvector approach. Image and vision computing, 10(5): 283-288.
70. Simon, T., Joo, H., Matthews, I. and Sheikh, Y., 2017, July. Hand
keypoint detection in single images using multiview bootstrapping. In
The IEEE Conference on Computer Vision and Pattern Recognition
(CVPR) (Vol. 2).
71. Steinicke, F., Visell, Y., Campos, J. and Lécuyer, A., 2013. Human
walking in virtual environments (pp. 199-219). New York: Springer.
72. Taketomi, T., Uchiyama, H. and Ikeda, S., 2017. Visual SLAM
algorithms: a survey from 2010 to 2016. IPSJ Transactions on Computer
Vision and Applications, 9(1), p.16.
73. Tao, W., Lai, Z., Leu, M.C. and Yin Z., 2018. American Sign Language
Alphabet Recognition Using Leap Motion Controller. Proceedings of the
2018 IISE Annual Conference.
74. Tao, W., Lai, Z., Leu, M.C. and Yin, Z., 2018. Worker Activity
Recognition in Smart Manufacturing Using IMU and sEMG Signals with
Convolutional Neural Networks, 46th SME North American
Manufacturing Research Conference, NAMRC 46, Texas, USA
75. Teslasuit, 2018. Ultimate tech in smart clothing. https://teslasuit.io/.
76. Toyota, 2017. Toyota assembly line VR training – The training center of
the future. http://realitymatters.eu/project/virtual-assembly-line-training/.
77. Unity, 2018. Homepage. https://unity3d.com/.
78. Unreal Engine, 2018. What is unreal engine 4.
https://www.unrealengine.com/en-US/what-is-unreal-engine-4
79. Van Den Oord, A., Dieleman, S., Zen, H., Simonyan, K., Vinyals, O.,
Graves, A., Kalchbrenner, N., Senior, A. and Kavukcuoglu, K., 2016.
Wavenet: A generative model for raw audio. arXiv preprint
arXiv:1609.03499.
80. Vicon, 2018. Homepage. https://www.vicon.com/.
81. Visbox, 2018. CAVE. http://www.visbox.com/products/cave/.
82. Vuforia, 2018. Homepage. https://www.vuforia.com/.
83. Wang, X., Ong, S.K. and Nee, A.Y.C., 2016. A comprehensive survey of
augmented reality assembly research. Advances in Manufacturing, 4(1),
pp.1-22.
84. Webel, S., Bockholt, U., Engelke, T., Gavish, N., Olbrich, M. and
Preusche, C., 2013. An augmented reality training platform for assembly
and maintenance skills. Robotics and Autonomous Systems, 61(4),
pp.398-403.
85. Wei, S.E., Ramakrishna, V., Kanade, T. and Sheikh, Y., 2016.
Convolutional pose machines. In Proceedings of the IEEE Conference on
Computer Vision and Pattern Recognition (pp. 4724-4732).
86. Wikitude, 2018. Wikitude homepage. https://www.wikitude.com/
87. Wu, S., Tao, W., Leu, M.C. and Suzanna, L., 2018. Engine sound
simulation and generation in driving simulator. Proceedings of the 2018
IISE Annual Conference.
88. Xsens Corporation, 2018. Homepage. https://www.xsens.com/.
89. Yuan, M.L., Ong, S.K., Nee, A.Y.C., 2008, Augmented reality for
assembly guidance using a virtual interactive tool, International journal
of production research, 46(7):1745-1767.
90. Zhang, J., Ong S.K., Nee, A.Y.C., 2011, RFID-assisted assembly
guidance system in an augmented reality environment, International
journal of production research, 49(13): 3919-3938.
91. Zhu, W., Chadda, A., Leu, M.C., Liu, X.F., 2011a, Real-time automated
simulation generation based on CAD modeling and motion capture,
Journal of computer aided design and applications, PACE(1), 103-121.
92. Zhu, W., Daphalapurkar, C.P., Puthenveetil, S.C., Leu, M.C., Liu, X.F.,
Chang, A.M., Gilpin-Mcminn, J.K., Hu, P.H., Snodgrass, S.D., 2012,
Motion capture of fastening operation using Wiimotes for ergonomic
analysis, Proceedings of international symposium on flexible automation,
St. Louis, USA.
93. Zhu, W., Vader, A., Chadda, A., Leu, M.C., Liu, X.F, Vance, J., 2010,
Low-cost versatile motion tracking for assembly simulation, Proceedings
of international symposium on flexible automation, Tokyo, Japan.
94. Zhu, W., Vader, A., Chadda, A., Leu, M.C., Liu, X.F, Vance, J., 2011b,
Wii Remote based low-cost motion capture for automated assembly
simulation, Virtual Reality, 1-12.
95. Carlton, B., 2018. Volkswagen Will Train 10,000 Employees Using VR
This Year. https://vrscout.com/news/volkswagen-employee-training/
96. Innoactive GmbH, 2019. https://innoactive.de/
97. Siemens PLM, 2018. Digital Mockup, Virtual & Augmented Reality.
https://www.plm.automation.siemens.com/global/en/industries/automotiv
e-transportation/automotive-oems/digital-mockup-virtual-augmented-
reality.html
98. Bosch Sensortec GmbH, 2018. Augmented & Virtual Reality.
https://www.bosch-
sensortec.com/bst/applicationssolutions/ar_vr/overviewarvr#
99. Hills-Duty, R., 2018. Dassault Systems SOLIDWORKS 2019 Introduces
VR and AR Functionality. https://www.vrfocus.com/2018/10/dassault-
systemes-solidworks-2019-introduces-vr-and-ar-functionality/
100. Claassen, R., 2018. AR and VR in Manufacturing: Being There.
https://www.smart-industry.net/ar-and-vr-in-manufacturing-being-there/
101. Nica, G., 2018. BMW Used Virtual Reality to Set Up Workstations for
3 Series Assembly. https://www.bmwblog.com/2018/11/20/video-bmw-
used-virtual-reality-to-set-up-workstations-for-3-series-assembly/
102. Winick, E., 2018. NASA is using HoloLens AR headsets to build its
new spacecraft faster.
https://www.technologyreview.com/s/612247/nasa-is-using-hololens-ar-
headsets-to-build-its-new-spacecraft-faster/
103. Castellanos, S., 2017. Find out how Rolls-Royce is utilising virtual
reality to train engineers. https://blogs.wsj.com/cio/2017/09/21/rolls-
royce-enlists-virtual-reality-to-help-assemble-jet-engine-
parts/?mod=djemlogistics
104. Leu, M.C., Tao, W., Ghazanfari, A. and Kolan, K., 2017. NX 12 for
Engineering Design. Missouri University of Science and Technology.
105. Tao, W., Lai, Z., Leu, M.C., Yin, Z. and Qin, R., 2019. A Self-Aware
and Active-Guiding Training System for Worker-Centered Intelligent
Manufacturing. Manufacturing Letters.