Unit 1
Prof. Priyadarshini R
Assistant Professor
[email protected]
Introduction To Augmented
Reality
Introduction - Definition
Augmented reality holds the promise of creating direct, automatic, and actionable
links between the physical world and electronic information.
AR bridges the gap between the virtual world and the real world, both spatially and
cognitively.
Components of AR
1. Tracking Component
2. Registration component
3. Visualization Component
4. Spatial model(database)
Scope Of Augmented Reality
1. AR is about using a portable device, such as a smartphone, to add a few extra details to
what we see. Examples include Google Glass and Pokémon GO. Currently the most
popular applications of AR are on Snapchat, where selfie lovers use smart filters to
decorate and animate photos on the fly.
2. Currently, augmented reality jobs offer the greatest opportunities for creative
professionals. This reflects the broader user base for AR technology, although virtual
reality jobs are also growing at a fast rate.
3. Most of the augmented reality jobs available today are best described as existing job
titles with an AR descriptor. Common positions include:
1. AR/VR content developer
2. AR/VR content strategist
3. AR/VR user experience designer
4. Designer, animator, or sound artist specializing in AR & VR
5. AR/VR community manager
6. AR/VR project manager
Examples
1. NFL AR
2. NASA hybrid synthesis AR
3. Microsoft HoloLens
4. Print media
5. Google Glass
Related Fields
Medical Training: From operating MRI equipment to performing complex surgeries, AR tech holds the potential to
boost the depth and effectiveness of medical training in many areas.
Retail: In today's physical retail environment, shoppers are using their smartphones more than ever to compare prices
or look up additional information on products they're browsing. World-famous motorcycle brand Harley-Davidson is
one great instance of a brand making the most of this trend, by developing an AR app that shoppers can use in-store.
Users can view a motorcycle they might be interested in buying in the showroom and customize it using the app to see
which colors and features they might like.
Architectural Design & Modeling: From interior design to architecture and construction, AR is helping professionals
visualize their final products during the creative process. Headsets enable architects, engineers, and design
professionals to step directly into their buildings and spaces to see how their designs might look, and even make virtual
on-the-spot changes. Urban planners can even model how entire city layouts might look using AR headset visualization.
Any design or modeling job that involves spatial relationships is a perfect use case for AR tech.
System Structure
Key Technology
Head-mounted display (HMD); Handheld device display
3D registration technology
3D registration technology enables virtual images to be superimposed accurately on the
real environment.
The main flow of 3D registration technology has two steps.
First, determine the relationship between the virtual image, the model and the
direction and position information of the camera or display device.
Second, the virtual rendered image and model are accurately projected into the real
environment, so the virtual image and model can be merged with the real environment.
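The two steps above can be sketched with a standard pinhole-camera model: the tracked pose (rotation R, translation t) relates world and camera frames, and the intrinsic matrix K projects into the image. The pose and intrinsic values below are illustrative assumptions, not taken from any particular device.

```python
import numpy as np

def project_virtual_point(point_world, R, t, K):
    """Project a virtual 3D point into camera image coordinates.

    Step 1 (registration): transform the point from world space
    into the camera frame using the tracked pose (R, t).
    Step 2 (projection): map the camera-space point onto the image
    plane with the intrinsic matrix K.
    """
    p_cam = R @ point_world + t          # world -> camera frame
    u, v, w = K @ p_cam                  # camera -> homogeneous image coords
    return np.array([u / w, v / w])      # perspective divide -> pixel coords

# Illustrative values: identity rotation, camera 5 units back, simple intrinsics.
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

pixel = project_virtual_point(np.array([0.0, 0.0, 0.0]), R, t, K)
# A virtual point at the world origin lands at the image center (320, 240),
# so it would be drawn over that spot in the real camera image.
```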
Intelligent interaction technology
Location interactions
General solution for calculating geometric & illumination consistency in
the augmented environment
Light estimation methods
One method estimates real illumination from shadows: the distribution of
illumination is estimated by analyzing the relationships between image brightness
and the occlusions of incoming light.
Another method performs real-time estimation of diffuse illumination from arbitrary
geometry captured by an RGB-D camera.
This method reconstructs the real geometry and the surrounding illumination, which
are then used to render the virtual content in AR with consistent illumination.
Rendering with natural illumination
Introduction To Virtual
Reality
VIRTUAL REALITY
The synthetic world is not static, but responds to the user's input. Hence the key
feature of virtual reality: real-time interactivity.
Real time means that the computer is able to detect a user's input and modify the
virtual world instantaneously. People like to see things change on the screen in
response to their commands and become captivated by the simulation.
1. Interactive
2. Immersive
3. Human Imagination
What are Virtual, Augmented and Mixed Realities
Augmented reality (AR)
Mixed Reality (MR)
1. Mixed Reality (MR) is the merging of real and virtual worlds to produce new
environments and visualizations where physical and digital objects co-exist and
interact in real time. It allows you to see and stay immersed in your surroundings
even while you are interacting with the digital objects embedded in them. It gives
you the ability to keep one foot in reality and the other in the digital world,
merging the two worlds together.
Components of VR System
Components or Types of VR
1. Broadly, VR can be classified based on its type of immersion and the type of device you intend to use.
2. 360 Degree Media: These are basically 360-degree camera-shot images or videos, or scenes and images
rendered in 3D. Camera-shot 360 media enable you to experience a real-life place or scenario shot
using a 360-degree camera, while a rendered 360 image or video lets people experience computer-rendered scenes.
A 360-degree panoramic image. A realistic 360 render of a house.
Computer Generated 3D VR
This is completely 3D immersive VR, where you build a 3D space for the user to
explore and interact with.
1. Immersion: as explained above, the trick of getting our brain to place itself in an environment
it is not currently in.
2. Teleportation: the ability to move across various environments without needing to leave your
premises. Virtual Reality allows you to change your surroundings without moving even an inch
from your position.
3. Interaction: when one is able to interact with the new environment one is looking at, the power
of the immersion amplifies, making the belief that this Virtual Reality is an actual Reality more
concrete.
4. Presence: the ability to feel that one is actually at the place one sees oneself in.
5. Sensory feedback: it is easy to break the illusion of Virtual Reality if our brain sees something
but our other senses reject that notion. When the other senses complement the visual feedback,
the Virtual Reality becomes convincing.
Present Development on VR:
Most of the sensors, like the gyroscopes and motion sensors used to track head
orientation and body position in a VR headset, were primarily developed for smartphones.
The small HD screens initially made as displays for smartphones are used as the displays
in a Virtual Reality headset.
A list of a few technical advances:
1. Haptics: Haptics is the use of touch as feedback
to the senses, confirming the belief that whatever
the user is seeing is actually there.
2. 3D Display: 3D or 3-dimensional display is the technology that helps build
this illusion of depth.
• This is used to produce a three-dimensional effect by projecting the same
scene into both eyes but depicted from slightly different perspectives.
The display mechanisms that help achieve 3D display are:
1. Stereoscopy: Stereoscopy (also called stereoscopics,
or stereo imaging) is a technique for creating or
enhancing the illusion of depth in an image by means of
stereopsis for binocular vision.
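The "slightly different perspectives" above come from rendering the scene twice, from two camera positions separated by the interpupillary distance (IPD). A minimal sketch, assuming an average IPD of 63 mm (an assumed typical value, not a spec of any headset):

```python
def stereo_eye_positions(head_pos, right_dir, ipd_m=0.063):
    """Offset the head position by half the IPD along the head's
    right vector to get the left and right eye (camera) positions."""
    half = ipd_m / 2.0
    left  = tuple(h - half * r for h, r in zip(head_pos, right_dir))
    right = tuple(h + half * r for h, r in zip(head_pos, right_dir))
    return left, right

# Head at standing height, right vector pointing along +X.
left_eye, right_eye = stereo_eye_positions((0.0, 1.7, 0.0), (1.0, 0.0, 0.0))
# Each eye then renders the same scene from its own position, producing
# the two slightly shifted images that the brain fuses into depth.
```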
4. 360 Degree View: The ability to construct displays that show a complete 360-degree
environment, either by placing an individual in an environment with displays
surrounding them in all directions, or by rendering the images on displays placed in front
of the eyes that move as quickly and rapidly as the head rotates.
5. Motion and Orientation: The ability to measure motion and direction in space and
translate them into a virtual environment is critical for creating the illusion of virtual
reality. This ability of HMDs to respond correctly to the user's actions in the
virtual environment is achieved with the help of these sensors:
• Gyroscope: A gyroscope is a device used to measure
orientation. The device consists of a wheel or disc
mounted so that it can spin rapidly about an axis which
itself is free to alter in any direction. The gyroscope
sensor is responsible for auto-rotating the view on the
screen whenever a phone is rotated.
6. Depth Sensing: As the name suggests, depth sensing is the ability of a computing
system to measure depth of the real environment. The main components that make it
possible are an IR (Infra-Red) projector and an IR Camera.
• An IR projector emits many dots into the surroundings in its line of sight; the IR
camera then observes these dots, and the processor calculates the position of
objects according to the shape, size, and density of the dots.
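In the simplest case, the projector-camera pair works like a stereo rig: the shift (disparity) of a projected dot encodes depth via triangulation. The focal length and baseline below are illustrative assumptions, not the specs of any particular sensor.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic stereo triangulation: the IR projector and IR camera form
    a pair with a fixed baseline, and the observed shift of a dot in the
    IR image (disparity, in pixels) determines how far away it is."""
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 580 px focal length, 7.5 cm projector-camera baseline.
z = depth_from_disparity(580.0, 0.075, 29.0)
# 580 * 0.075 / 29 = 1.5 -> this dot lies 1.5 m from the sensor.
```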
7. Light Field Camera: A light field camera, also known as a plenoptic camera, captures
information about the light field emanating from a scene; that is, the intensity of light in a
scene, and also the direction that the light rays are traveling in space. This contrasts with
a conventional camera, which records only light intensity.
• One type of light field camera uses an array of micro-lenses placed in front of an
otherwise conventional image sensor to sense intensity, color, and directional information.
Multi-camera arrays are another type of light field camera. Holograms are a type of
film-based light field image.
8. Computer Graphics: This is probably the most critical topic in Virtual Reality.
Although VR has existed for many decades, only recently, with increasing portable
computing power easily accessible, has much of the quality work in Computer
Graphics become possible that in turn enables the kind of VR we experience today.
Challenges in virtual reality
1. Realistic sense
2. No Nausea
3. Depth
4. Non-interfering Sensors
5. Ergonomics
6. Immersion
7. Presence
8. Teleportation
9. Movements
10. Interactions
Basic terminologies in VR industries
FOV: The Field of View is the extent of the observable world that is seen at any
given moment. The field of view is usually given as an angle for the horizontal or
vertical component of the FOV.
A larger angle indicates a larger field of view. For immersive VR, our entire FOV
needs to be the virtual world. As the device is brought closer to your eyes, the screen
takes up more of your FOV. Biconvex lenses magnify the screen further and make the
virtual world your entire FOV.
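The effect of bringing the screen closer can be sketched with basic trigonometry: the horizontal FOV is the angle the screen subtends at the eye. The screen width and distances below are illustrative assumptions.

```python
import math

def horizontal_fov_deg(screen_width_m, eye_distance_m):
    """Angle subtended at the eye by a flat screen of the given width:
    the closer the screen (or the stronger the lens magnification),
    the more of your field of view it fills."""
    return 2.0 * math.degrees(math.atan((screen_width_m / 2.0) / eye_distance_m))

# A 12 cm wide phone screen (assumed) at arm's length vs. inside a headset:
far  = horizontal_fov_deg(0.12, 0.40)   # held 40 cm away
near = horizontal_fov_deg(0.12, 0.05)   # 5 cm away, behind headset lenses
# Bringing the screen closer raises the FOV from roughly 17 to roughly 100 degrees.
```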
Field of View
Frames per second
• FPS: Frame rate, or frames per second, is the frequency (rate) at which consecutive images, called frames, appear on a
display. Displaying frames in quick succession creates the illusion of motion; the more frames, the smoother the
motion.
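The frame rate also fixes the time budget within which each frame must be rendered, as a quick sketch shows:

```python
def frame_time_ms(fps):
    """Time budget per frame: at higher frame rates each frame must be
    rendered in less time, which is what makes motion look smoother."""
    return 1000.0 / fps

budget_60 = frame_time_ms(60)   # ~16.7 ms per frame on a 60 Hz display
budget_90 = frame_time_ms(90)   # ~11.1 ms, a common VR target rate
```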
Transform
1. Transform is used to place bodies correctly in the world and calculate how they
should appear on displays. It consists of the Position (Translation) and Rotation
(Orientation) of the object with reference to the given coordinate system. It may
also include the scale of the object in the virtual world.
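A minimal sketch of applying such a transform (scale, then a yaw rotation, then translation) to a single point. The axis convention (Y up, yaw about Y) and the values are assumptions for illustration.

```python
import math

def apply_transform(point, position, yaw_deg, scale=1.0):
    """Place a local-space point in the world: scale it, rotate it about
    the vertical (yaw) axis, then translate it to the object's position."""
    x, y, z = (c * scale for c in point)
    a = math.radians(yaw_deg)
    # Rotation about the Y (up) axis.
    xr = x * math.cos(a) + z * math.sin(a)
    zr = -x * math.sin(a) + z * math.cos(a)
    px, py, pz = position
    return (xr + px, y + py, zr + pz)

# A point one unit in front of the object, with the object yawed 90 degrees
# and placed at (10, 0, 0): the rotation swings the point to the side, and
# the translation carries it along with the object.
world = apply_transform((0.0, 0.0, 1.0), (10.0, 0.0, 0.0), 90.0)
```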
Degree Of Freedom
1. Degrees of Freedom is the number of independently variable factors that can affect
the transform of an object. Ex: desktop mouse movement - 2DOF.
2. The Degrees of Freedom of a VR setup depend on the sensors used in the setup
(rotational tracking only, or positional tracking as well).
3. Head rotation (where I am looking) - 3DOF. Object movement in space (where I am) -
3DOF. Object movement + rotation in space - 6DOF.
4. Rotational Degrees of Freedom are identified by the amount of rotation about the
pitch, yaw, and roll axes.
Latency
1. Latency is a time interval between the stimulation and response, or, from a more general
point of view, a time delay between the cause and the effect of some physical change in the
system being observed. VR and neuroscience experts have found through user studies that a
latency greater than 20ms causes motion sickness and discomfort and have projected that it
may be necessary to reduce it to 15ms or even 7ms to fully eliminate them.
2. The direct perception of latency varies wildly among people. Even when it is not
perceptible, it has been one of the main contributors to VR sickness. Adaptation causes
great difficulty because people can adjust to a constant amount of latency through long
exposure; returning to the real world might be difficult in this case.
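Motion-to-photon latency is simply the sum of the pipeline stages between a head movement and the matching pixels appearing on the display. The stage names and times below are illustrative assumptions, checked against the 20 ms threshold cited above.

```python
def motion_to_photon_ms(stage_times_ms):
    """Sum the pipeline stages between a head movement and the
    corresponding change reaching the display (motion-to-photon latency)."""
    return sum(stage_times_ms)

# Assumed stage times: sensor read, pose fusion, rendering, display scan-out.
latency = motion_to_photon_ms([2.0, 1.0, 11.0, 5.0])
over_budget = latency > 20.0   # the discomfort threshold mentioned in the text
# Here the pipeline totals 19 ms, just inside the 20 ms budget.
```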
Foveated imaging
Asynchronous Time Warp
Asynchronous Space Warp (ASW)
Engines/Tools to build VR experiences
1. Unity 3D
2. Unreal Engine
3. Godot Engine
4. WebVR
5. Scapic
6. Lumberyard
7. CRYENGINE
3D Modeling Tools
1. Blender
2. SketchUp
3. SimLab Soft
IDEs for Native Development & Few SDKs
1. Android Studio
2. Visual Studio
3. Spark AR
4. Lens Studio
5. ARCore
6. ARKit
7. Vuforia
8. Cardboard SDK
Discussion (5 min)
THANK YOU