
Lesson 15

Virtual reality

Trinh Thanh Trung School of ICT, HUST


1.
Overview
Definition

■ Coates (1992):
□ Virtual Reality is electronic simulations of
environments experienced via head mounted eye
goggles and wired clothing enabling the end user to
interact in realistic three-dimensional situations.
■ Greenbaum (1992):
□ Virtual Reality is an alternate world filled with
computer-generated images that respond to human
movements. These simulated environments are
usually visited with the aid of an expensive data suit
which features stereophonic video goggles and
fiber-optic gloves.
Definition

■ Isdale (1998)
□ VR is a way for humans to visualize, manipulate and
interact with computers and extremely complex data

■ Other concepts
□ Virtual Worlds, Virtual Environments, Immersive VR,
Cyberspace ...
History
■ 1962: Sensorama (from the movie industry: Morton Heilig)
■ 1970s: visualisation of virtual worlds on the screen
■ 1970: First Head Mounted Display: Daniel Vickers
from the University of Utah (from an Ivan Sutherland/MIT idea)
■ 1982: Dataglove
■ 1980-85: First commercial VR products
■ 1987: Virtual Cockpit (British Aerospace): head and hand tracking, eye
tracking, 3D visuals, 3D audio, speech recognition, vibrotactile feedback
■ 1990-95: Popularisation of VR (films, books...)
□ The US Army spent close to $1000 million on VR research in 1998
VR – a new media

■ VR is often associated with gaming (only partly true)
■ VR is a new kind of media!
□ E.g. a story can be told using text, photos, music,
videos
□ … and using virtual reality.
■ Anything that can be achieved by other media may as
well be achievable using VR
□ We have movies, stories, immersive music
experiences, social networks, simulation, education,
online meetings…
□ And of course, games.
VR vs. other media

■ Vividness (accurately represent the environment)
□ breadth (visibility, audibility, touch, smell)
□ depth (quality, fidelity)
■ Interactivity (enable users to change the environment)
□ speed (update rates, time lag)
□ mapping (text, speech, gestures, gaze, complex behavior patterns)
Application

■ Archeology:
Reconstruction of
historic sites that
have been
destroyed or no
longer exist

Recreation of the Homolovi IV ruin


Application

■ Training and
coaching: Allows
learners to be
trained before
using the real
device, especially
when using the
device is difficult
or too expensive
Flight simulator of the US Air Force
Application

■ Fiction: Creating
non-existent scenes

Recreating the architectural scenes


Application

■ Entertainment: Build fictional scenes for


entertainment purposes such as games, movies...

Types of VR

■ Desktop VR
□ Non-immersive: 3D
mouse, window system
□ 3D world displayed on
screen; using mouse and
keyboard as interactive
devices
□ VRML, Games
Types of VR

■ Window on a World (WoW)
□ “One must look at a display screen as a window through
which one beholds a virtual world. The challenge to
computer graphics is to make the picture in the window
look real, sound real and the objects act real”
[Computer Graphics].
□ Some systems use a traditional monitor to display the
visual world: one views the display as a window through
which one can see the virtual world.
Types of VR

■ Video mapping
□ A variation of the WoW-style approach that combines a
video of the user with 2D graphics: the user sees an
interactive display of his own image composited with the
computer-generated scene.
Other types of VR (?)

■ Augmented reality (AR)


□ Graphics are combined with the
real world to form a single system for
viewing and evaluating the real world.
□ Combining telepresence and virtual reality
systems yields Mixed Reality or Seamless
Simulation systems
□ Computer-generated inputs are
combined with telepresence inputs
and/or the user's view of the real world.
Other types of VR (?)

■ Mixed reality (MR)


□ Environments in which real and virtual
subjects and objects interact in real time,
and in which you can interact with both
real and virtual components.
□ Require VR headsets equipped w/
cameras

■ eXtended reality (XR)


□ XR = VR + AR + MR
2.
Concepts in VR
Virtual environment’s
architecture
Architectural components

■ Objects, entities (car, house, sky…)


□ Geometry attributes - Shape, color, texture
□ Physical attributes - Weight, speed, position, etc…

■ User
□ Interact with the environment, view the environment
through camera
□ Has a representation in the virtual world (avatar)
□ Able to listen, speak (voice recognition)
□ Able to move (gesture recognition)
Architectural components

■ Multiuser
■ Network connection
■ Simulation
□ Control the application
□ Event-based control (collision detection)
□ Simulate object’s physical attributes
■ World
□ Combine other components into a virtual world that
the users interact with.
Perceive

sense          | stimulus                              | organ       | description
hearing        | sound                                 | ears        | The ability to hear; the auditory faculty; SYN. audition, auditory sense, sense of hearing, auditory modality.
sight          | image                                 | eyes        | The ability to see; the faculty of vision; SYN. vision, visual sense, visual modality.
touch          | surface / temperature                 | skin        | The faculty of touch; SYN. sense of touch, skin senses, touch modality, cutaneous senses.
smell          | odour                                 | nose        | The faculty of smell; SYN. sense of smell, olfaction, olfactory modality.
taste          | savour, flavour                       | tongue/nose | The faculty of taste; SYN. gustation, sense of taste, gustatory modality.
kinesthesia    | position, movement, muscular tensions | muscles     | The perception of body position, movement and muscular tension; SYN. kinaesthesia, feeling of movement.
proprioception | balance, acceleration, position, location, orientation, movement of the body | ear | The ability to sense the position, location, orientation and movement of the body and its parts.
Perceive

■ monoscopic cues
□ relative size
□ interposition and occlusion
□ perspective distortion
□ lighting and shadows
□ texture gradient
□ motion parallax
■ binocular (stereoscopic) cues
□ stereo disparity
□ convergence
Recreate sensation

1. monoscopic cues
□ realistic rendering / lighting simulation
2. stereoscopic cues → stereo disparity:
presentation of the appropriate view to each eye
□ time multiplexing of images
□ multiplexing with chromatic filters (anaglyph)
□ multiplexing with polarizer filters
□ providing two views simultaneously

(Figure: a color-encoded stereo image pair)
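The chromatic-filter (anaglyph) option above can be sketched in a few lines. This is a minimal, illustrative NumPy example, not any particular production pipeline: the red channel of the output comes from the left-eye view, and green/blue from the right-eye view, so red-cyan glasses route each view to the correct eye.

```python
import numpy as np

def make_anaglyph(left, right):
    """Combine a stereo pair into a red-cyan anaglyph.

    left, right: H x W x 3 uint8 RGB images of the same shape.
    Red comes from the left-eye view; green and blue from the
    right-eye view, so red/cyan filter glasses separate the views.
    """
    out = right.copy()
    out[..., 0] = left[..., 0]  # red channel taken from the left eye
    return out

# Tiny synthetic example: a horizontally shifted right view
# stands in for stereo disparity.
left = np.zeros((4, 6, 3), dtype=np.uint8)
left[:, 1, :] = 255               # white bar at column 1 in the left view
right = np.roll(left, 1, axis=1)  # same bar one column to the right
ana = make_anaglyph(left, right)
```

The bar appears red at the left-eye position and cyan at the right-eye position, which is exactly the color encoding the slide's stereo image pair relies on.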
Presence

■ "Virtual presence is experienced by a person when
sensory information generated only by and within a
computer compels a feeling of being present in an
environment other than the one the person is actually in"
(Sheridan, 1992, pg. 6)
■ Presence is a psychological perception that occurs in
a technology-based environment created by immersive
means.
■ In practice, a system operating this way does not
necessarily work for everyone, but only for some people
with certain roles in the community.
■ Immersion and presence are hard to achieve together
■ Applications: telepresence & teleoperation
Technical requirements for
presence
■ 6 degrees of freedom movement
□ rotational accuracy < ¼ degree
□ translational accuracy < 1 mm
□ rock-solid tracking
■ > 90 frames per second
□ low pixel persistence < 3 ms
■ < 20 ms latency (motion-to-photon time)
■ > 1k resolution per eye
■ > 110 degree field of view
■ Calibrated, quality optics
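The per-frame numbers above fit together as simple arithmetic. A small sketch, using only the values stated on this slide (they are targets, not measurements from any specific headset):

```python
# Rough motion-to-photon budget check for the numbers on this slide.
fps = 90
frame_time_ms = 1000 / fps    # ~11.1 ms between display updates
pixel_persistence_ms = 3      # low-persistence target from the slide
latency_budget_ms = 20        # motion-to-photon upper bound

# Tracking, rendering and scanout must all fit inside the latency
# budget; at 90 fps a single missed frame already costs one full
# frame_time, eating most of the remaining slack.
slack_ms = latency_budget_ms - frame_time_ms
print(round(frame_time_ms, 1), round(slack_ms, 1))  # 11.1 8.9
```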
Telepresence
■ Telepresence: a term that describes the use of different
technologies that have the effect of placing the user in a
different location.
■ Unlike worlds generated entirely in computers, telepresence
connects real-world remote sensors with the operator's senses.
These remote sensors can be mounted on a robot, like tools.
■ Firefighters use remote-controlled vehicles to handle dangerous
situations. Surgeons use extremely small devices on cables to
perform surgery without making a large hole in the patient's
body; these devices have a small video camera at the business
end.
■ Robots equipped with telepresence systems have
changed the way deep-sea or volcanic experiments are
performed. The technique is also applied in space studies.
Telepresence
Immersive
■ Immersion creates a sense of presence between the user and the
environment: the observed image is whatever the user can see,
including its space and direction.
■ The part of the image the user can see at any moment is only a
very small part of the display space.
■ Display technology usually plays the most important role, but
other senses are also addressed.
■ VR systems make the user a part of the simulated virtual
world, rather than the virtual environment merely being a part of
the real world presented to the user.
■ The first immersive VR systems were flight simulators, where
immersion is a sophisticated combination of real devices and
virtual images: a real cockpit with realistic equipment lets pilots
operate as in a normal flight, while the images shown are
pre-prepared virtual scenes.
3.
VR hardware
Common hardware model

(Diagram: INPUT — head position and hand position — feeds a visual model containing lights, illustration, objects, geometry, surface properties, dynamic properties, physical constraints and acoustic properties; physical simulation/animation and collision detection update the model; OUTPUT is rendered as visual, audio and haptic feedback.)
VR system model

(Diagram: a host computer running a real-time OS with the VE system, VE database and machine intelligence connects the input devices — head tracker receiver, glove and glove tracker receiver, microphone with voice recognition, video camera with face/gesture/motion analysis, and a 3D tracker system with tracker transmitter — to the output devices: head-mounted display (HMD), headphones driven by a 3D sound processor, and haptics, via the graphics processor.)
Display and interaction devices

■ Visual Displays (3D imagery)


□ Head Mounted Displays (HMD)
□ Projection Displays (CAVE, Virtual Plane)
■ Acoustic Displays (spatial sound)
□ Multi-Channel Sound Systems
□ Specialized Convolution Processors (e.g.
Convolvotron)
■ Haptic Displays (force feedback)
□ Robot Arms (e.g. Grope, Phantom)
□ Active Joysticks (e.g. Microsoft Sidewinder)
□ Vibrotactile Devices (e.g. Logitech Cyberman)
Head Mounted Display (HMD)

Oculus Rift
3-DOF vs. 6-DOF

■ DOF: Degree of freedom


■ 6-DOF is required for “real” VR app
□ to avoid motion sickness
3-DOF HMD

■ (almost) gone extinct


□ Google Cardboard
▫ Not actually a “product”, more like a “blueprint”
▫ Still available for a very affordable price
□ (discontinued) Samsung GearVR, Google Daydream
▫ Works with specific phones
▫ Include a 3-DOF controller to interact with the env
□ (discontinued) Oculus Go
▫ Stand-alone 3-DOF headset w/ controller

■ 360-degree videos are ALWAYS 3-DOF


6-DOF HMD

Oculus Quest 2        300-400 €     Standalone       1832x1920 per eye   90 Hz
Valve Index           1000-2000 €   Not standalone   1600x1440 per eye   120 Hz
HTC Vive Pro 2        800-1400 €    Not standalone   2440x2440 per eye   120 Hz
HTC Vive Cosmos       670 €         Not standalone   1700x1440 per eye   90 Hz
HTC Vive Focus 3      1500 €        Standalone       2448x2448 per eye   90 Hz
Sony PlayStation VR   300-900 €     Not standalone   1080x960 per eye    120 Hz
Pico 4                429 €         Standalone       2160x2160 per eye   90 Hz
Sony PSVR2            600 €         Not standalone   2000x2040 per eye   120 Hz


Inside-out vs Outside-in tracking

■ used for 6-DOF tracking
□ Inside-out: cameras on the headset observe the
environment (and IR LEDs on the controllers)
□ Outside-in: external sensors track IR diodes on the headset
■ Lost tracking?
□ Predict from received sensor data
CAVE Automatic Virtual
Environment

Christie’s CAVE
Haptic Displays

Tactus' floating keyboard allows the virtual


keyboard on the touch screen to float and
respond like a physical keyboard
Haptic devices

■ Determine the position and direction of the


hand
■ Force and torque feedback devices
■ Tactile devices
■ Devices that generate stimuli such as heat or
cold
Sensory devices
Kinetic interfaces
4.
VR-specific issues
Details matter

As everything can be viewed extremely close up, you
will want to have the highest quality renders from the
highest quality source assets possible.
Additionally, because the angular resolution of
contemporary HMDs is still rather low, the quality of
each pixel really matters!

“They pay much attention to everything. They want to
mess with things and they want to poke at things.

…We got to fill every nook and cranny with something in
a way you couldn't justify it before.”

- Adam Savage, Inside Valve: Making Half-Life: Alyx for Virtual Reality
Computer Graphics
issues in VR
General graphics issues in VR

● Has high resolution


● Has at least two viewpoints active at once due
to the stereoscopic nature
● Has high frame rate
● Focused on minimizing latency
● Good anti-aliasing and high image clarity are
highly preferred as pixels are large
General graphics issues in VR (cont.)

● Tends towards high field of view


● Tends towards more complex / detailed world
simulation
● Tends to use more graphical hacks but can
make them more visible
● Not always going for the most realistic/detailed
graphics though
● The most popular platform currently is
standalone and mobile / ARM based
Resolution

● The more the better!


● The angular resolution of human vision
is ~2 pixels per arcminute*
○ This would be 120 PPD (pixel per
degree), though display aliasing makes
this less clear
○ Quest 2 has ~19 PPD, Varjo VR-1 ~60
PPD
● Increase Resolution and Fill Rate to
avoid Screen Door Effect
● RGB Stripe displays (LCD) usually have
better fill rate than PenTile(OLED)
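The PPD figures above follow directly from panel resolution and field of view. A quick sketch using the Quest 2 numbers given in these slides (~1832 horizontal pixels over a ~97° rendered FoV):

```python
def pixels_per_degree(h_pixels, h_fov_deg):
    """Average angular resolution across the horizontal field of view."""
    return h_pixels / h_fov_deg

quest2 = pixels_per_degree(1832, 97)  # ~18.9 PPD, matching the slide's ~19
retina = 120                          # ~2 px/arcmin target from the slide

# Panel width needed to reach 120 PPD at the same field of view:
needed = retina * 97                  # 11640 pixels horizontally
```

This is an average: acuity is not uniform across the FoV, which is exactly what foveated rendering (later in these slides) exploits.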
Field of View (of a human)
● Straight ahead is 200° (horizontal FoV)
○ 120° of that is binocular
○ the eyes can rotate another 50°

● Vertical FOV is 135°

● The acuity of the human vision varies greatly


throughout the field of view!

● Quest 2 has
○ 97° horizontal, 93° vertical visible FoV
○ 104° horizontal, 98° vertical rendered FoV
● Valve Index
○ 108° horizontal, 104° vertical visible FoV
○ 108° horizontal, 109° vertical rendered FoV
Lens Distortion

(Figure: pincushion distortion and barrel distortion combining into the final image)

● Barrel distortion (from lenses)
● Pincushion distortion
● Mustache distortion (combination of the above)
● Many ways of correcting this, including the
Brown–Conrady model
Correcting lens distortion in real-time
There are at least 3 options:
● Doing it in the fragment shader
○ Slowest, but is very accurate
● Mesh based: calculating it in the vertex
shader
○ Using a separate pass utilizing a
tessellated grid
○ Can be combined with chromatic
aberration correction!
● Doing it using vertex displacement on the
whole scene
○ Doable but can be a hassle
● The squishy and moving nature of eyes
makes these kinds of corrections tricky
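As a concrete sketch of what the fragment-shader and vertex-grid approaches evaluate per pixel or per vertex, here is a Brown–Conrady style radial term applied to a UV coordinate. The k coefficients below are hypothetical; real values come from calibrating the specific headset's optics:

```python
def radial_distort(u, v, k1, k2, cx=0.5, cy=0.5):
    """Brown–Conrady style radial term applied to a UV coordinate.

    u, v: texture coordinates in [0, 1]; (cx, cy): lens centre.
    k1, k2: radial distortion coefficients (placeholder values in the
    example below). Applying a barrel pre-distortion of the rendered
    image cancels the pincushion distortion the HMD lens adds.
    """
    x, y = u - cx, v - cy
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return cx + x * scale, cy + y * scale

# The lens centre is a fixed point; points further out move more.
centre = radial_distort(0.5, 0.5, 0.2, 0.05)   # (0.5, 0.5)
u1, v1 = radial_distort(0.9, 0.5, 0.2, 0.05)   # pushed outward, u1 > 0.9
```

The mesh-based approach evaluates exactly this at the vertices of a tessellated grid and lets interpolation fill in the rest, which is why it is cheaper than the per-fragment version.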
Correcting chromatic
aberration
● The Index of Refraction of light
depends on wavelength
● Displays have Red, Green and
Blue subpixels, so you only need
to figure out the offsets for
those 3 wavelengths
○ Though yes the light they emit
might not be fully
monochromatic
● Achieving this with a vertex grid:
have 3 sets of UVs with different
distortion on them based on if
they affect the R, G or B buffers
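The three-UV-set idea can be sketched with a simple per-channel scale about the lens centre. The per-channel scale factors below are made-up placeholders, not measured values; a real headset profile would supply full distortion polynomials per channel:

```python
def scale_about_centre(u, v, s, cx=0.5, cy=0.5):
    """Scale a UV coordinate about the lens centre by factor s."""
    return cx + (u - cx) * s, cy + (v - cy) * s

# Hypothetical per-channel scales: red refracts least, blue most,
# so each channel samples the rendered image at a slightly
# different UV to pre-compensate the lens.
CHANNEL_SCALE = {"r": 1.000, "g": 1.007, "b": 1.015}

def chromatic_uvs(u, v):
    """The three UVs a vertex-grid approach would store as 3 UV sets."""
    return {c: scale_about_centre(u, v, s) for c, s in CHANNEL_SCALE.items()}

uvs = chromatic_uvs(0.8, 0.5)  # blue sampled further out than green, green further than red
```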
Stencil Mesh / HiddenAreaMask
Reduces the number of pixels that need to be shaded
● VR.HiddenAreaMask variable in UE
● Reduced from shading 457 million pixels/sec to 378
million pixels/sec (1512 × 1680 per eye render
resolution) on Original Vive
● PS! Nowadays doing 844 million display pixels/sec on
Quest 2
○ Uncompressed this would be (3 x 8 bit at 120FPS) 20Gb/s
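The Quest 2 figures above check out as straightforward arithmetic (using the per-eye resolution and 120 Hz mode stated earlier in these slides):

```python
# Sanity-check the slide's display bandwidth numbers for Quest 2.
w, h, eyes, fps = 1832, 1920, 2, 120
pixels_per_sec = w * h * eyes * fps      # ~844 million pixels/sec
bits_per_pixel = 3 * 8                   # RGB, 8 bits per channel
gbits_per_sec = pixels_per_sec * bits_per_pixel / 1e9  # ~20.3 Gb/s

# More than DisplayPort 1.2 (17.28 Gbps) can carry, which is one
# reason compression and foveation matter for tethered links.
```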

A bit more on data rates

● DisplayPort specification standards

DisplayPort Spec | Year | Typical Monitor Resolution       | Max. Data Rate (4 lanes) | Bit Rate Class
1                | 2006 | 1440p @ 60 Hz                    | 5.18/8.64 Gbps           | RBR/HBR
1.2              | 2009 | 4K @ 60 Hz                       | 17.28 Gbps               | HBR2
1.4              | 2016 | 4K @ 120 Hz; 8K @ 60 Hz (w/ DSC) | 25.92 Gbps               | HBR3
2                | 2019 | 8K @ 60 Hz                       | 77.36 Gbps               | UHBR

● HDR is coming to consumer VR with
PSVR2
○ One would expect it to use full
4:4:4 chroma (no subsampling)
○ Key announced specs: 2k × 2k per
eye at max 120 Hz, with eye
tracking
(Fixed) Foveated Rendering

■ Already implemented and used in Quest apps and
some PC apps as well
□ The resolution drop-off can vary in time
□ Tiled rendering on mobile platforms
■ Greatly helped by eye tracking, but can work
without it
□ Predicting larger adoption of dynamically tracked foveated
rendering with the release of PSVR2 and Meta “Quest Pro”
■ Can be achieved using Variable Rate Shading
■ Can also be used for supersampling the areas in focus:
Nvidia VRSS
□ Used in several PCVR games; can use Tobii’s eye tracking
□ Driver-level feature (but needs Nvidia’s approval)
■ Could be used for display signal compression
Motion to Photon Latency

■ The time from a physical motion of the HMD to the
corresponding update in the light emitted by the display
■ Or, in some cases, from input in general
■ Minimize this as much as possible!
□ Late-update (and predict) HMD and controller
transforms
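A minimal 1-D sketch of "late update and predict": extrapolate the head pose to the moment the frame will actually be displayed. Function name and numbers are illustrative; a real pipeline does this for the full 6-DOF pose, as late as possible before submitting the frame:

```python
def predict_yaw(yaw_deg, yaw_rate_deg_s, motion_to_photon_s):
    """Extrapolate head yaw to the moment the frame reaches the display.

    A hypothetical 1-D version of late update + prediction: render
    with the pose the head will have, not the pose it had when the
    frame started.
    """
    return yaw_deg + yaw_rate_deg_s * motion_to_photon_s

# Turning at 200 deg/s with 20 ms of motion-to-photon latency:
# without prediction the world would lag 4 degrees behind the head.
predicted = predict_yaw(30.0, 200.0, 0.020)  # ~34.0 degrees
```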
What the timings look like in UE 4.26
Also check out other graphics debugging / timing tools like Nvidia Nsight, Microsoft GPUView and Radeon Developer Tool Suite
Framerate

■ Hitting framerate is much more important in VR


than in flat screen games as not updating the display in a
timely and consistent manner can lead to motion sickness
■ Some headsets support multiple framerates
□ E.g. Quest 2 supports: 72Hz, 90Hz and 120 Hz
■ What is acceptable depends on application
□ How much viewport and scene motion there is
□ Colors of the scene
□ Latency sensitivity of controls
■ Low Persistence: the image is shown to the user for
only a fraction of the duration of the frame
Framerate drop mitigation
● Oculus
○ Timewarp
■ Reproject rendered frame using the latest headset
transform
○ Asynchronous TimeWarp
■ Transforms stereoscopic images based on the latest
head-tracking information to significantly reduce the
motion-to-photon delay, reducing latency and judder
in VR applications
○ Asynchronous SpaceWarp (& Spacewarp 2.0)
■ Is a frame-rate smoothing technique that almost
halves the CPU/GPU time required to produce nearly
the same output from the same content
○ Positional TimeWarp, Application Spacewarp…

● SteamVR
○ Interleaved reprojection
■ If frames are dropped, render at half rate and estimate
other frames
○ Motion Smoothing
■ Asynchronous SpaceWarp type technology (not depth
aware?)
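The core idea behind rotational timewarp can be sketched as a simple horizontal image shift. Real implementations reproject with the full head rotation, lens distortion and (for positional/spacewarp variants) depth and motion vectors; this 1-axis version is only illustrative:

```python
import numpy as np

def rotational_timewarp(frame, yaw_delta_deg, ppd):
    """Crude 1-axis timewarp: shift the already-rendered frame by the
    head-yaw change since render time, converted to pixels.

    frame: H x W image array; ppd: pixels per degree of the rendered
    field of view. A yaw to the right moves the image to the left.
    """
    shift = int(round(yaw_delta_deg * ppd))
    return np.roll(frame, -shift, axis=1)

# A bright pixel at column 4 moves to column 2 after the head turned
# 0.5 degrees right on a hypothetical 4-PPD frame.
frame = np.zeros((1, 8))
frame[0, 4] = 1.0
warped = rotational_timewarp(frame, yaw_delta_deg=0.5, ppd=4)
```

This is why timewarp is so cheap: it reuses the finished frame and only resamples it, which is far less work than rendering a new one.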
The two views rendered for
the eyes are relatively close
together. This can be used to
optimize the rendering
process!
Instanced Stereo Rendering

Instead of sending the draw calls for the two viewpoints separately to the
GPU, do it once for both eyes!
● This is a CPU optimization, but a GPU net negative. Choose based on
content!
● You should still be doing draw call batching!
● Called Multiview for mobile VR in UE
● Things are changing with UE5 and Nanite (*Nanite does not work with
stereo rendering yet)
Other stereoscopic view
optimizations
● Round Robin Occlusion: do the occlusion culling
calculation only for one eye per frame, use that data for
both eyes when rendering objects, and then switch the
eye that is doing the calculation for the next frame
● Monoscopic Far Field Rendering: render the far field
geometry from a central view as parallax from longer
distances becomes less noticeable (deprecated in UE)
Anti-aliasing
● MSAA is the standard
○ UE supports 4xMSAA, though other engines and apps
go for 8x
● TAA is used in some applications, but its inherent
softness is not desirable
● DLSS works relatively well
○ Though can have artifacting with small repeating
details, e.g. cloth with visible weaving
○ Waiting for a non-vendor-specific solution that works as
well
● SSAA is the desired result
○ Increasing resolution scale to 2x can give very perceptible
image quality improvements
■ The original resolution scale is likely already
~1.4x the pixel resolution of the display
Specular Aliasing from Normal Maps

● If naively filtered, mips of normal maps can be


averaged out to a smoother surface than
expected. This can lead to specular aliasing.
● Inversely, if normal map mips are left too sharp
then that can also cause aliasing due to
sub-pixel changes.
● This mostly affects PBR content
● PS! This also affects non VR games and might
not be handled by default for you!
Reducing Specular Aliasing
from Normal Maps
● To resolve this, the
roughness from the normal
maps can be encoded into
mips of the roughness
texture (or elsewhere if
need be)
○ In UE this is achieved using
the Composite Texture
functionality
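A simplified, Toksvig-style stand-in for what such baking does. The exact mapping engines use differs (and UE's Composite Texture has its own formulation); this only illustrates the principle of turning lost normal length into extra roughness:

```python
import numpy as np

def mip_roughness(normals, base_roughness):
    """Fold normal-map variance into roughness when building a mip.

    normals: N x 3 array of unit normals covered by one mip texel.
    Averaging normals shortens the mean vector; this Toksvig-style
    correction converts the lost length into extra roughness so the
    mip keeps roughly the same overall specular response instead of
    becoming an artificially smooth, aliasing-prone surface.
    """
    avg = normals.mean(axis=0)
    length = np.linalg.norm(avg)            # 1.0 only if all normals agree
    variance = max(0.0, (1.0 - length) / max(length, 1e-6))
    return min(1.0, base_roughness + variance)

flat = np.array([[0, 0, 1.0]] * 4)
bumpy = np.array([[0.6, 0, 0.8], [-0.6, 0, 0.8],
                  [0, 0.6, 0.8], [0, -0.6, 0.8]])
r_flat = mip_roughness(flat, 0.2)    # no variance: stays 0.2
r_bumpy = mip_roughness(bumpy, 0.2)  # variance folded in: ~0.45
```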
Reducing Specular Aliasing
from Geometry
● Additionally if geometry is too dense
to be resolved properly (think thin
smooth wires for example), this can
also cause specular aliasing
○ A trick to mitigate this is to increase
the roughness of pixels where
a lot of curvature change is
going on.
■ This is Normal Curvature to
Roughness feature in UE.
■ This helps but is not perfect (the
Valve talk has one additional
trick to help further)
Bent Normals
● A bent normal is a normal that points into the average
direction of light arriving at the surface point.
● This can be used to increase the quality of diffuse GI and
specular reflections
More tricks for VR
● Art direction!
● Memory is usually cheaper than compute cycles:
bake, bake and precompute!
● Pure normal maps hold up less than they do in flat
screen games, fine for smaller details, complement
them if possible
● Real geometry and Parallax Occlusion Mapping look
great for high-detail visuals, but you can likely
get away with Bump Offset mapping instead.
● Be mindful of sprites, smaller sprites work a bit
better,
○ remove HMD roll from their orientation,
○ for sprites that can be close or be large, use camera
direction facing.
● Try using stereoscopic layers for distant geometry
and maybe more.
● Check out The VR Book to dive much deeper, it
includes much more than graphics!
Game design in VR
Locomotion

■ Moving oneself from one location to another.
■ Continuous movement
Press a button (move
thumbstick) to
continuously move in
some direction (relative
to head or hand).
□ Can cause motion sickness: the eyes say “we are
moving!” while the body says “no we are not!”
→ Avoided by ‘vignetting’
Locomotion

■ No locomotion!
The game takes place
around the player in a
single location
(Only the player’s
physical movement
results in locomotion)

■ Fixed locations
Movement to these
locations can be
controlled by the
player or be event-
based.
Locomotion

■ Projectile moving

You shoot a projectile


and decide when to
teleport to its location.

■ Blink teleportation

Instantly teleport to
the desired location.
Locomotion

Other
■ Simulate running
Forward motion by
moving the controllers
up and down →
Simulates the
movement of hands as
if running.

■ Grappling hook
You shoot a grappling
hook and it pulls you
forward (or the
player pulls themselves
on it).
Locomotion
Standard door height: 2.1 m
Average human height: 1.6 - 1.7 m
My height: 1.88 m
■ Use real world dimensions: 1.7 m is ~81% (⅘) of 2.1 m
(Figure: annotated screenshots from the first twenty minutes of Half-Life 2 (2004), comparing door and character heights)
https://learnleveldesign.com/tutorials/first-twenty-minutes-of-half-life-2/
GUI

(Diagram: four GUI types: Diegetic, Non-Diegetic, Spatial and Meta; Diegetic and Spatial are good for VR)
GUI

■ Diegetic
□ Part of the environment

Tribocalypse VR (2017)
Dead Space (2008) Half-Life: Alyx (2020)
GUI

■ Non-Diegetic
□ Not a part of the environment in
any way (e.g. usual GUI)
Neverwinter (2013) Totem Games (2016)
GUI

■ Spatial
□ In the 3D environment

Neverwinter (2013) Tribocalypse VR (2017)


GUI

■ Meta
□ Not in the environment but the player should be
aware of
Doom Eternal (2020) Need for Speed 2 (1997)
Turning regular games
to VR
Regular games to VR

○ Warnings, rules, everything will be bad at first.


○ Will be very time-consuming

○ Need special attention


● Minimize sickness effects
● Aesthetics secondary
● Study human perception
● Reuse assets
● Focus on geometric detail
● Appearance of hands, arms, body
● Careful with zooming
Regular games to VR

Others
● Not only perception but also study being in a
new world
● Travel methods need changing
● Interactions are more important
● Movement methods need agreement
● Testing importance (including motion sickness)
● Not too hard rules
● Iterate quickly, fail fast, learn what works best
Tips and suggestions

● Let go of the idea of “experiencing everything


and a lot”
○ Avoid super detailed world
○ Less can be more with actions
○ Super cool fights with a lot of things happening
in it work differently
○ Need to learn how to move, assumptions may
not work
○ Too much intensity is uncomfortable
○ A lot of visible “buttons” can lead to
disappointment.
Tips and suggestions

● Find ways to tell stories that don’t need the camera
to tell them
○ Less movie, more game
○ Long scenes where camera shows everything
you see don’t work
○ Camera view is eyes at all times
○ Following a needed path needs guidance (a lot)
○ You never know where the player is looking. So
don’t cheat.
Tips and suggestions

● Consider that players take things slower in VR


than in a flat screen game
“People slow down so much [in Half-Life: Alyx].
That's in contrast to how fast your character
moves in Half-Life games traditionally. You're
very, very, very fast in those games, and at the
furthest end of the bell curve on the other
extreme is how slow people go [in VR].”

- Sean Vanaman, Valve’s designer and writer


Tips and suggestions

● If you make something look interactive, it has


to be interactive
“If anything looks interactable, it has to be
interactable. The player has to be able to touch
it, lift it, throw it or press it, whatever the
interaction may be. I think that's the most
important [rule] of all.”
- Odeldahl, Apex Construct creative director
● Interacting means different ways for different
people
Tips and suggestions

● Movement needs planning (a lot)


○ Yes, the motion sickness again
○ But also…
○ Teleportation can make you interact less instead
of more
○ Natural movement can be an advantage,
preferred for some
○ Feeling lost when many different movement
options
○ Moving in VR can take much more time if fully
experienced
Thanks!
Any questions?
Credits

Lecture notes provided by School of Information and


Communication Technology, Hanoi University of Science and
Technology.
Composed by Huynh Quyet Thang, Le Tan Hung, Trinh Thanh
Trung and others
Edited by Trinh Thanh Trung

□ Raimond Tunnel, “Virtual Reality Game Design”, Computer


Graphics Seminar
□ Ats Kurvet, “Computer Graphics in Virtual Reality”
□ Kertu Toompea, “Virtual Reality”

■ Presentation template by SlidesCarnival


■ Photographs by Death to the Stock Photo (license)
■ Diverse device hand photos by Facebook Design
Resources
