
School of Computer Science & Engineering

Introduction to Augmented Reality (AR)

UNIT: 1

Prof. Priyadarshini R
Assistant Professor
[email protected]
Introduction To Augmented Reality

Introduction - Definition

Augmented reality holds the promise of creating direct, automatic, and actionable links between the physical world and electronic information.

It provides a simple and immediate user interface to an electronically enhanced physical world.

Augmented reality can overlay computer-generated information on views of the real world, amplifying human perception and cognition in remarkable new ways.

AR bridges the gap between the virtual world and the real world, both spatially and cognitively.
Introduction - Definition

1. Augmented reality (AR) is an interactive experience of a real-world environment in which the objects that reside in the real world are enhanced by computer-generated perceptual information.
2. AR has three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects.
Components of AR

1. Tracking Component

2. Registration component

3. Visualization Component

4. Spatial model(database)

Scope Of Augmented Reality

1. AR is about using a portable device, such as a smartphone, to add a few extra details to what we see. Examples include Google Glass and Pokémon Go. Currently the most popular applications of AR are on Snapchat, where selfie lovers use smart filters to decorate and animate photos on the fly.
2. Currently, augmented reality jobs offer the greatest opportunities for creative professionals. This reflects the broader user base for AR technology, although virtual reality jobs are also growing at a fast rate.
Scope Of Augmented Reality

3. Most of the augmented reality jobs available today are best described as existing job
titles with an AR descriptor. Common positions include:
1. AR/VR content developer
2. AR/VR content strategist
3. AR/VR user experience designer
4. Designer, animator, or sound artist specializing in AR & VR
5. AR/VR community manager
6. AR/VR project manager

[Slide images: NFL AR, NASA hybrid synthesis AR, Microsoft HoloLens, print media, Google Glass]
Examples

1. IKEA Mobile App
2. Nintendo’s Pokémon Go App
3. Google Pixel’s Star Wars Stickers
4. Disney Coloring Book
5. L’Oréal Makeup App
6. Weather Channel Studio Effects
7. U.S. Army - Tactical Augmented Reality
Related Fields
 Medical Training: From operating MRI equipment to performing complex surgeries, AR tech holds the potential to boost the depth and effectiveness of medical training in many areas.
 Retail: In today's physical retail environment, shoppers are using their smartphones more than ever to compare prices or look up additional information on products they're browsing. World-famous motorcycle brand Harley-Davidson is one great instance of a brand making the most of this trend, by developing an AR app that shoppers can use in-store. Users can view a motorcycle they might be interested in buying in the showroom and customize it using the app to see which colors and features they might like.
 Architectural Design & Modeling: From interior design to architecture and construction, AR is helping professionals visualize their final products during the creative process. Headsets enable architects, engineers, and design professionals to step directly into their buildings and spaces to see how their designs might look, and even make virtual on-the-spot changes. Urban planners can even model how entire city layouts might look using AR headset visualization. Any design or modeling job that involves spatial relationships is a perfect use case for AR tech.
System Structure

[Slide diagrams: AR system structure]
Key Technology

1. Intelligent display technology
• The head-mounted display (HMD), born in 1968 and created by Professor Ivan Sutherland, makes it possible to superimpose simple computer-generated graphics on real scenes in real time.
• Handheld device displays are very light and small; with the popularity of smartphones in particular, augmented reality on handheld displays works through video see-through.
• Desktop display devices, such as PC monitors, match the real-world scene information captured by a camera to a three-dimensional virtual model generated by the computer; the combined result is ultimately shown on the desktop display.
[Slide images: helmet display (HMD) and handheld device display]
3D registration technology
3d registration technology enables virtual images to be superimposed accurately in the
real environment.
The main flow of 3D registration technology has two steps.
First, determine the relationship between the virtual image, the model and the
direction and position information of the camera or display device.
Second, the virtual rendered image and model are accurately projected into the real
environment, so the virtual image and model can be merged with the real environment.

3D registration technology based on hardware tracker, computer vision, wireless network


and the mixed registration technology

Intelligent interaction technology

 Intelligent interaction technology is closely related to intelligent display technology, 3D registration technology, ergonomics, cognitive psychology, and other disciplines.
 There are a variety of intelligent interactions, including hardware device interactions, location
interactions, tag-based or other information-based interactions.
 With the development of intelligent interaction technology, augmented reality not only superimposes
virtual information to real scenes, but also realizes the interaction between people and virtual objects in
real scenes.
 This interaction is based on the fact that people give specific instructions to the virtual object in the
scene, and the virtual object can make some feedback, thus enabling the audience of the augmented
reality application to achieve a better experience.

[Slide image: location interactions]
General solution for calculating geometric & illumination consistency in the augmented environment

Light probe methods
 Methods based on light probes use special hardware (usually a camera with a fisheye lens) to capture the illumination in high quality.
 Some approaches capture the illumination in real time and use it for rendering in AR. The advantage of these methods is the high visual fidelity of the rendered result.
 However, they need high computational power and are therefore not suitable for mobile devices.
Light estimation methods
 Real illumination can be estimated from shadows: the distribution of illumination is derived by analyzing the relationships between image brightness and the occlusions of incoming light.
 Another method performs real-time estimation of diffuse illumination from arbitrary geometry captured by an RGB-D camera.
 This method reconstructs the real geometry and surrounding illumination, which is then used for rendering the virtual content in AR with consistent illumination.
Rendering with natural illumination

 Once the real light is reconstructed, rendering with natural illumination plays an important role in achieving a consistent appearance of virtual and real objects.
 This method is well suited to calculating the diffuse illumination. A fast approximation of environment map convolution can be achieved by MIP-mapping.
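The MIP-mapping approximation mentioned above amounts to repeated 2×2 box-filter downsampling of the environment map: each level is a progressively blurrier, more diffuse version of the captured lighting. A toy grayscale sketch, assuming the map is a square list of lists whose side length is a power of two:

```python
def mip_level(env):
    """One MIP level: average each 2x2 block of a square grayscale environment map."""
    size = len(env) // 2
    return [
        [(env[2*r][2*c] + env[2*r][2*c+1] + env[2*r+1][2*c] + env[2*r+1][2*c+1]) / 4.0
         for c in range(size)]
        for r in range(size)
    ]

def diffuse_approximation(env, levels):
    """Approximate diffuse convolution by repeatedly downsampling (MIP-mapping)."""
    for _ in range(levels):
        env = mip_level(env)
    return env
```

A single bright pixel in a 2×2 map averages down to a quarter of its intensity at the next level, which is exactly the low-frequency response diffuse shading needs.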

Introduction To Virtual Reality
VIRTUAL REALITY

 It is a simulation in which computer graphics is used to create a realistic-looking world.

 The synthetic world is not static, but responds to the user's input. Hence the key feature of virtual reality: real-time interactivity.

 Real time means that the computer is able to detect a user's input and modify the virtual world instantaneously. People like to see things change on the screen in response to their commands and become captivated by the simulation.
Introduction - Definition

1. Virtual reality (VR) is the term used to describe a three-dimensional, computer-generated environment which can be explored and interacted with by a person. That person becomes part of this virtual world, is immersed within this environment, and while there is able to manipulate objects or perform a series of actions.

2. Virtual reality is a high-end user-computer interface that involves real-time simulation and interactions through multiple sensorial channels. These sensorial modalities are visual, auditory, tactile, smell, and taste.
1. Interactive

2. Immersive

3. Human Imagination
What are Virtual, Augmented and Mixed Realities

1. Virtual reality (VR) is a computer-generated scenario that simulates experience through senses and perception. The immersive environment can be similar to the real world, or it can be fantastical, creating an experience not possible in ordinary physical reality. Instead of viewing the environment on a screen in front of them, users get an immersive experience and are able to interact with the environment.
Augmented reality (AR)

1. Augmented reality (AR) is an interactive experience of a real-world environment whose elements are "augmented" by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory, and olfactory. The overlaid sensory information can be constructive (i.e., additive to the natural environment) or destructive (i.e., masking of the natural environment). Augmented reality is related to two largely synonymous terms: mixed reality and computer-mediated reality.
Mixed Reality (MR)

1. Mixed Reality (MR) is the merging of real and virtual worlds to produce new environments and visualizations where physical and digital objects co-exist and interact in real time. It allows you to see and stay immersed in your surroundings even while you are interacting with the digital objects embedded in them. It gives you the ability to keep one foot in reality and the other in the digital world, merging the two worlds together.
Components of VR System

[Slide diagrams: components of a VR system]
Components or Types of VR

1. Broadly, VR can be classified based on its type of immersion and the type of device you intend to use.

2. Based on type of immersion, VR can be categorized broadly as:

1. 360 Degree Media and

2. Computer Generated 3D VR (CG3D VR)

3. 360 Degree Media: These are basically 360-degree camera-shot images or videos, or rendered scenes or images in 3D. Camera-shot 360 media enable you to experience or see a real-life place or scenario shot using a 360-degree camera, while a rendered 360 image or video lets people experience images and scenes that were computer-generated using a 3D application.
[Slide images: a 360-degree panoramic image; a realistic 360 render of a house]
Computer Generated 3D VR

This is completely 3D immersive VR, where you build a 3D space for the user to explore and interact with.
Primary features or factors that help to create a complete virtual reality experience

1. Immersion: as explained above, immersion is the trick of getting our brain to visualize itself in an environment that it is not currently in.
2. Teleportation: the ability to move across various environments without having to leave your premises. Virtual reality allows you to change your physical surroundings without moving even an inch from your position.
3. Interaction: when one is able to interact with this new environment that one is looking at, the power of the immersion amplifies, making the belief that this virtual reality is an actual reality more concrete.
4. Presence: the ability to feel that one is actually at the place that one sees oneself in.
5. Sensory feedback: it is easy to break the illusion of virtual reality if our brain sees something but our other senses reject that notion and rebel against it. When our other senses instead complement the visual feedback the brain is receiving, the result is a convincing virtual reality.
Present Development on VR:

 Technological advances that helped make Virtual Reality possible

Most of the sensors, like the gyroscopes and motion sensors used to track head orientation and body position in a VR headset, were primarily developed for smartphones. Small HD screens initially made for smartphone displays are used as the displays in a virtual reality headset.

A list of a few technical advances:
1. Haptics: Haptics is the basic involvement of touch as feedback to the senses, confirming the belief that whatever one is seeing is actually there.
2. 3D Display: A 3D or 3-dimensional display is the technology that helps build the illusion of depth.
• It produces a three-dimensional effect by projecting the same scene into both eyes but depicted from slightly different perspectives.
The display mechanisms that help achieve 3D display are:
1. Stereoscopy: Stereoscopy (also called stereoscopics, or stereo imaging) is a technique for creating or enhancing the illusion of depth in an image by means of stereopsis for binocular vision.

2. Polarization: A polarized 3D system uses polarization glasses to create the illusion of three-dimensional images by restricting the light that reaches each eye.
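Stereopsis can be made concrete in code: each eye gets its own virtual camera, offset from the head position by half the interpupillary distance (IPD). A minimal sketch; the default IPD of 63 mm is an illustrative, roughly average value, not a standard:

```python
def eye_positions(head_position, ipd_m=0.063):
    """Left/right virtual camera positions for stereoscopic rendering:
    the head position offset by half the interpupillary distance along x."""
    x, y, z = head_position
    half_ipd = ipd_m / 2
    return (x - half_ipd, y, z), (x + half_ipd, y, z)
```

Rendering the scene once from each of these positions and showing each image only to the matching eye is what produces the depth illusion described above.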
3. Alternate Frame Rendering: Alternate Frame Rendering (AFR) is a graphics rendering technique on personal computers that combines the work of two or more graphics processing units (GPUs) driving a single monitor, in order to improve image quality or accelerate rendering performance. One GPU computes all the odd video frames while the other renders the even frames.
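The odd/even split described above is just round-robin assignment of frame indices to GPUs, which a toy sketch can show:

```python
def assign_frames(frame_count, gpu_count=2):
    """Round-robin frame assignment as in Alternate Frame Rendering:
    frame i is rendered by GPU number (i mod gpu_count)."""
    return [frame % gpu_count for frame in range(frame_count)]
```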

4. 360 Degree View: The ability to construct displays that show a complete 360-degree environment, either by placing an individual inside an environment with displays surrounding them in all directions, or by rendering the images on displays placed in front of the eyes which move as rapidly as the head rotates.
5. Motion and Orientation: The ability to measure motion and direction in space and translate it into a virtual environment is critical for creating the illusion of virtual reality. This ability of HMDs to respond correctly to the user's actions in the virtual environment is achieved with the help of these sensors:

• Accelerometer: An accelerometer is an instrument used to measure the acceleration of a moving or vibrating body and is therefore used in VR devices to measure acceleration along a particular axis.
• Gyroscope: A gyroscope is a device used to measure orientation. The device consists of a wheel or disc mounted so that it can spin rapidly about an axis which is itself free to alter in any direction. The gyroscope sensor is responsible for the autorotation of the screen, and of the view on the screen, whenever a phone is rotated.

• Magnetometer: A magnetometer is a device used to measure magnetic forces, usually Earth's magnetism, and thus tell the direction it is facing. A compass is a simple type of magnetometer, one that measures the direction of an ambient magnetic field.
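In practice these sensors are fused: the gyroscope integrates quickly but drifts over time, while the accelerometer (helped by the magnetometer) gives an absolute but noisy reference. A common minimal fusion is the complementary filter, sketched here for a single axis; the blend factor 0.98 is a typical but illustrative choice, not a fixed standard:

```python
def complementary_filter(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyroscope rate (deg/s, fast but drifting) with an
    accelerometer-derived angle (deg, absolute but noisy) for one axis."""
    return alpha * (prev_angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Called once per sensor sample, the filter trusts the gyroscope for short-term motion while the small accelerometer weight slowly pulls the estimate back toward the true orientation.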

6. Depth Sensing: As the name suggests, depth sensing is the ability of a computing system to measure the depth of the real environment. The main components that make it possible are an IR (infra-red) projector and an IR camera.

• The IR projector emits many dots into the surroundings in its line of sight; the IR camera then sees these dots, and the processor calibrates the position of each object according to the shape, size, and density of the dots.
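Such projector-camera pairs recover depth by triangulation: a dot observed with disparity d pixels, given a focal length of f pixels and a projector-camera baseline of b meters, lies at depth Z = f·b/d. A minimal sketch with hypothetical numbers:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulated depth (meters) from the observed shift of a projected dot."""
    return focal_px * baseline_m / disparity_px
```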

7. Light Field Camera: A light field camera, also known as a plenoptic camera, captures information about the light field emanating from a scene; that is, the intensity of light in a scene and also the direction that the light rays are traveling in space. This contrasts with a conventional camera, which records only light intensity.
• One type of light field camera uses an array of micro-lenses placed in front of an otherwise conventional image sensor to sense intensity, color, and directional information. Multi-camera arrays are another type of light field camera. Holograms are a type of film-based light field image.

8. Computer Graphics: This is probably the most critical topic in virtual reality. Although VR has existed for many decades, only recently, with increasing portable computing power becoming easily accessible, has a lot of quality work in computer graphics been made possible, which in turn enables the kind of VR that we experience today.
Challenges in virtual reality

1. Realistic sense
2. No Nausea
3. Depth
4. Non-interfering Sensors
5. Ergonomics
6. Immersion
7. Presence
8. Teleportation
9. Movements
10. Interactions
Basic terminologies in VR industries

1. HMDs: Head-mounted displays, also sometimes referred to as ‘virtual reality headsets’ or ‘VR glasses’, attach straight to your head and present visuals directly to your eyes. HMDs may have a small display optic in front of one eye (monocular HMD) or both eyes (binocular HMD). All Oculus devices are HMDs, since they are mounted on the head and have display optics for both eyes.
 FOV: The Field of View is the extent of the observable world that is seen at any
given moment. The field of view is usually given as an angle for the horizontal or
vertical component of the FOV.

A larger angle indicates a larger field of view. For immersive VR, our entire FOV
needs to be the virtual world. As the device is brought closer to your eyes, the screen
takes up more of your FOV. Biconvex lenses magnify the screen further and make the
virtual world your entire FOV.
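The screen-distance relationship above can be made concrete: a flat screen of width w at distance d from the eye subtends a horizontal angle of 2·atan(w / 2d). A small sketch (lens magnification is ignored here, so real headset FOVs will differ):

```python
import math

def horizontal_fov_degrees(screen_width_m, eye_distance_m):
    """Horizontal angle (degrees) subtended by a flat screen at a given eye distance."""
    return math.degrees(2 * math.atan(screen_width_m / (2 * eye_distance_m)))
```

For example, a 10 cm wide screen held 5 cm from the eye already subtends 90 degrees, which is why bringing the display close (and magnifying it with lenses) fills the FOV.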

[Slide image: field of view diagram]
Frames per second

• FPS: Frame rate, or frames per second, is the frequency (rate) at which consecutive images, called frames, appear on a display. Displaying frames in quick succession creates the illusion of motion; i.e., the more the frames, the smoother the motion.

             Minimum Requirement           Naked-eye judder-free acuity
Frame Rate   90 fps with low persistence   >300 fps
Transform
1. Transform is used to place bodies correctly in the world and calculate how they should appear on displays. It consists of the Position (Translation) & Rotation (Orientation) of the object with reference to the given coordinate system. It may also include the scale of the object in the virtual world.
Degree Of Freedom

1. Degrees of Freedom is the number of independently variable factors which can affect the transform of an object. Ex: desktop mouse movement - 2DOF.
2. The Degree of Freedom of a VR setup depends on the sensors used in the setup (rotational tracking only, or positional tracking as well).
3. Head rotation (where I am looking) - 3DOF. Object movement in space (where I am) - 3DOF. Object movement + rotation in space - 6DOF.
4. Rotational Degrees of Freedom are identified by the amount of rotation about the pitch, yaw & roll axes.
Latency
1. Latency is a time interval between the stimulation and response, or, from a more general
point of view, a time delay between the cause and the effect of some physical change in the
system being observed. VR and neuroscience experts have found through user studies that a
latency greater than 20ms causes motion sickness and discomfort and have projected that it
may be necessary to reduce it to 15ms or even 7ms to fully eliminate them.
2. The direct perception of latency varies wildly among people. Even when it is not
perceptible, it has been one of the main contributors to VR sickness. Adaptation causes
great difficulty because people can adjust to a constant amount of latency through long
exposure; returning to the real world might be difficult in this case.
Foveated imaging

1. Foveated imaging is a digital image processing technique in which the image resolution, or amount of detail, varies across the image according to one or more fixation points. A fixation point indicates the highest-resolution region of the image and corresponds to the center of the eye's retina, the fovea.

2. In VR, foveated rendering is a technique used for performance optimization. It is more effective with eye tracking sensors. In the absence of eye tracking, Fixed Foveated Rendering (FFR) is a technology that allows the edges of the eye texture to be rendered at a lower resolution than the center.
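A toy falloff function makes the idea concrete: full resolution inside a foveal radius, then a linear drop toward a floor at the periphery. All constants here are illustrative, not values from any shipping headset:

```python
def foveated_scale(eccentricity_deg, fovea_deg=5.0, min_scale=0.25):
    """Resolution scale at a given angular distance from the fixation point:
    1.0 inside the fovea, falling linearly to min_scale at the periphery."""
    if eccentricity_deg <= fovea_deg:
        return 1.0
    falloff = 1.0 - (eccentricity_deg - fovea_deg) / 60.0
    return max(min_scale, falloff)
```

With eye tracking, eccentricity is measured from the gaze point; with FFR, it is measured from the lens center, which is why FFR only degrades the fixed edges of the eye texture.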

Asynchronous Time Warp

1. Timewarp / time warping, also known as reprojection, is a technique in VR (long known as post-rendering image warp) that warps the rendered image before sending it to the display, to correct for head movement that occurred after rendering. Executing the timewarp operation asynchronously in a separate thread is called Asynchronous Time Warp (ATW). In VR, this technique is also used to generate intermediate frames in situations when the game can't maintain frame rate, helping to reduce judder.
2. Also, in case prediction is used for generating the frames, time warp serves as a last-moment adjustment to overcome prediction errors.
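For small head rotations, the warp is approximately a shift of the rendered image: a late yaw change of Δθ degrees on a display spanning fov degrees across w pixels moves the image by roughly Δθ/fov·w pixels. A rough sketch of that small-angle approximation (real implementations reproject per-pixel with the full rotation):

```python
def timewarp_shift_px(delta_yaw_deg, fov_deg, image_width_px):
    """Approximate horizontal reprojection shift for a small late yaw change."""
    return delta_yaw_deg / fov_deg * image_width_px
```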

Asynchronous Space Warp (ASW)

1. Asynchronous Space Warp (ASW) is a frame-rate smoothing technique that almost halves the CPU/GPU time required to produce nearly the same output from the same content. ATW is limited to rotational time warp, whereas ASW applies animation detection, camera translation, and head translation to previous frames in order to predict the next frame. As a result, motion is smoothed, and applications can run on lower-performance hardware.
Engines/Tools to build VR experiences

1. Unity 3D
2. Unreal Engine
3. Godot Engine
4. WebVR
5. Scapic
6. Lumberyard
7. CRYENGINE

3D Modeling Tools

1. Blender
2. SketchUp
3. SimLab Soft

IDEs for Native Development & Few SDKs

1. Android Studio
2. Visual Studio
3. Spark AR
4. Lens Studio
5. AR Core
6. AR Kit
7. Vuforia
8. Cardboard SDK

Discussion (5 min)

THANK YOU
