AR/VR Unit 2

The document outlines the computing architecture of Virtual Reality (VR) systems, detailing components such as VR headsets, PCs or consoles, GPUs, CPUs, and tracking systems. It discusses rendering principles, techniques, and user comfort considerations, emphasizing the importance of graphics and haptics rendering in creating immersive experiences. Additionally, it covers graphics architecture, accelerators, and benchmarks used to evaluate GPU performance.


UNIT II AR/VR COMPUTING ARCHITECTURE

Computing Architectures of VR – Rendering Principle – Graphics and Haptics Rendering – PC Graphics Architecture – Graphics Accelerators – Graphics Benchmarks – Workstation Based Architectures – SGI Infinite Reality Architecture – Distributed VR Architectures – Multi-pipeline Synchronization – Collocated Rendering Pipelines – Distributed Virtual Environments – AR Architecture.

2.1 Computing Architectures of VR


The high-level computing architecture of a typical VR (Virtual Reality) system can be described in terms of the following components.

VR Headset:

 The VR headset is the primary interface between the user and the virtual
environment.
 It includes sensors for tracking head movements (gyroscopes, accelerometers,
and sometimes external tracking sensors like cameras or Lighthouse base
stations).

PC or Console:

 The VR experience is often powered by a high-performance computer or gaming console.
 The PC or console processes the virtual environment, generates 3D graphics, and handles various calculations to provide a seamless VR experience.

Graphics Processing Unit (GPU):

 The GPU is responsible for rendering 3D graphics with high frame rates to
ensure a smooth VR experience.
 It processes geometry, textures, lighting, and shaders.

Central Processing Unit (CPU):

 The CPU handles various tasks, such as physics simulations, AI, and game logic.
 It manages the overall coordination of VR operations.

Memory (RAM):

 High-speed RAM stores data required for rendering and running the virtual
environment.
 It stores textures, models, and other assets.

VR Tracking System:

 This system tracks the user's movements and positions in real time.
 Components like cameras, infrared sensors, or external tracking devices work together to determine the user's location and orientation within the virtual space.

Motion Controllers:

 VR motion controllers allow users to interact with the virtual environment.
 These controllers have their own sensors and input buttons for tracking hand movements and gestures.

VR Software Platform:

 The VR software platform includes the operating system, VR runtime, and VR-specific APIs.
 It manages the VR hardware, handles input from sensors, and renders the
virtual world.

VR Content:

 VR content, such as games, applications, and simulations, is developed using 3D modelling software, game engines, and VR development kits.
 This content is loaded into the VR system for users to experience.

Audio System:

 VR systems provide spatial audio to enhance immersion.
 This involves audio processing, head-related transfer functions (HRTF), and positional audio to create a 3D soundscape.
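As a toy version of the positional audio idea (full HRTF processing is far more involved), the sketch below applies only an interaural level difference: a constant-power stereo pan driven by the sound source's azimuth. The function name and the [-90°, +90°] convention are illustrative, not from any audio API.

```python
import math

def stereo_gains(azimuth_deg: float):
    """Constant-power pan from source azimuth.

    azimuth_deg: 0 = straight ahead, +90 = hard right, -90 = hard left.
    Returns (left_gain, right_gain), each in [0, 1], with
    left^2 + right^2 = 1 so perceived loudness stays constant.
    """
    # Map [-90, +90] degrees onto a pan angle in [0, pi/2].
    pan = (max(-90.0, min(90.0, azimuth_deg)) + 90.0) / 180.0 * math.pi / 2
    return (math.cos(pan), math.sin(pan))

l, r = stereo_gains(0.0)          # centred source
print(round(l, 3), round(r, 3))   # 0.707 0.707
l, r = stereo_gains(90.0)         # source hard right
print(round(l, 3), round(r, 3))   # 0.0 1.0
```

A real spatializer would additionally filter each ear's signal with an HRTF and add an interaural time delay; this sketch shows only the level cue.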

Network Connectivity:

 In some VR applications, there may be a need for network connectivity to access online content, multiplayer interactions, or cloud-based processing.
 Interactions between these components happen in real-time to create an
immersive VR experience.
 The headset tracks the user's head and sometimes hand movements, which are
then processed by the CPU and GPU to update the VR world in real-time.
 The motion controllers enable user interaction within the virtual environment.
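The real-time flow described above — the headset's sensors feed the CPU, which updates the world and drives the GPU to redraw one image per eye — can be sketched as a simplified frame loop. All class and function names here are illustrative stand-ins, not taken from any real VR SDK.

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    """Orientation reported by the headset's tracking sensors."""
    yaw: float
    pitch: float

def read_head_pose(t: float) -> HeadPose:
    # Stand-in for sensor fusion of gyroscope/accelerometer data.
    return HeadPose(yaw=0.1 * t, pitch=0.0)

def update_simulation(dt: float) -> None:
    # CPU work: physics, AI, and game logic would run here.
    pass

def render_eye(pose: HeadPose, eye: str) -> str:
    # GPU work: one image per eye for the stereoscopic display.
    return f"frame[{eye}] yaw={pose.yaw:.2f}"

def run_frames(n_frames: int, dt: float = 1.0 / 90.0) -> list[str]:
    frames = []
    for i in range(n_frames):
        pose = read_head_pose(i * dt)     # 1. track the user's head
        update_simulation(dt)             # 2. CPU updates the world
        for eye in ("left", "right"):     # 3. GPU renders each eye
            frames.append(render_eye(pose, eye))
    return frames

frames = run_frames(2)
print(len(frames))  # 2 frames x 2 eyes = 4 images
```

The `dt = 1/90` default mirrors the 90 FPS target discussed later in this unit; a real runtime would also predict the pose forward to the display time.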

2.2 Rendering Principles


Rendering focuses on seamlessly integrating virtual content into the real
world, considering lighting, shadows, and other environmental factors to
make it look more natural.

 A key constraint in rendering VR content is the hardware capabilities and limitations of the target devices.
 VR devices vary in terms of resolution, refresh rate, field of view, tracking
system, battery life, and processing power.
 These factors affect the amount of graphical detail, complexity, and
interactivity that can be rendered in VR without compromising the frame
rate, latency, and battery life.
 Therefore, it is important to choose the appropriate VR device for your
learning objectives and audience, and to optimize your content accordingly.

Rendering techniques

 Stereoscopic rendering is a popular technique, but it doubles the rendering workload and requires more GPU resources.
 Foveated rendering reduces this cost by rendering the centre of the user's gaze in full detail and the periphery at reduced detail; it can be combined with single-pass rendering.
 Dynamic lighting: shadows are essential for creating realistic environments, but they also consume a lot of CPU and GPU resources.
 Occlusion culling and level-of-detail (LOD) techniques reduce the number of objects and polygons rendered in VR, but require careful planning and testing to avoid visual faults.
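The level-of-detail idea above can be made concrete with a toy mesh selector. The distance thresholds and detail names below are invented for the example, not taken from any engine.

```python
def select_lod(distance_m: float) -> str:
    """Pick a mesh level of detail from viewer distance.

    Thresholds are illustrative: nearby objects get the full mesh,
    distant ones a cheaper one, cutting the polygons the GPU must draw.
    """
    if distance_m < 5.0:
        return "high"    # full-polygon mesh
    elif distance_m < 20.0:
        return "medium"  # simplified mesh
    else:
        return "low"     # billboard / very coarse mesh

print(select_lod(2.0))   # high
print(select_lod(10.0))  # medium
print(select_lod(50.0))  # low
```

Real engines blend between levels (or use screen-space size rather than raw distance) to avoid visible "popping" — the careful planning and testing the text warns about.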

User comfort and accessibility

 When creating VR content for educational or training purposes, user comfort and accessibility should be taken into consideration.
 To prevent or minimize motion sickness, eye strain, or fatigue, maintain a stable and consistent frame rate of at least 60 frames per second (FPS), preferably 90 FPS or higher.
 Additionally, keep the latency between the user's head movement and the
corresponding image update on the screen below 20 milliseconds for
accurate and natural VR interactions.
 Furthermore, provide clear and intuitive navigation methods, such as
teleportation, locomotion, or hand gestures, for easy and comfortable VR
movement.
 Finally, offer options and settings so that users can adjust the VR content to their preferences and needs.
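The frame-rate and latency guidelines above translate directly into a per-frame time budget. A quick check of the arithmetic (the 60/90 FPS and 20 ms figures come from the text; the helper names are our own):

```python
def frame_budget_ms(fps: float) -> float:
    """Time available to simulate and render one frame, in milliseconds."""
    return 1000.0 / fps

def meets_latency_target(motion_to_photon_ms: float, limit_ms: float = 20.0) -> bool:
    """Check head-movement-to-display latency against the 20 ms guideline."""
    return motion_to_photon_ms <= limit_ms

print(round(frame_budget_ms(60), 2))   # 16.67 ms per frame at 60 FPS
print(round(frame_budget_ms(90), 2))   # 11.11 ms per frame at 90 FPS
print(meets_latency_target(15.0))      # True
```

At 90 FPS, everything — input, simulation, and stereo rendering — must fit in roughly 11 ms, which is why the rendering-cost reductions above matter so much in VR.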

Algorithmic view in rendering

 Algorithms that sort geometric information to obtain greater efficiency generally fall under computational geometry.
 Suppose that a point-sized light source is placed in the virtual world. Using
the trichromatic theory, its spectral power distribution is sufficiently
represented by R, G, and B values shown in fig.

Fig. RGB Light source

 All that matters is the angle that the surface makes with respect to the light source.
 Let n be the outward surface normal and let ℓ be a vector from the surface
intersection point to the light source.
 Assume both n and ℓ are unit vectors, and let θ denote the angle between
them.
 The dot product n · ℓ = cos θ yields the amount of reduction (between 0 and
1) due to the tilting of the surface relative to the light source.
 Think about how the effective area of the triangle is reduced due to its tilt. A
pixel under the Lambertian shading model is illuminated as

 R = d_R I_R max(0, n · ℓ)
 G = d_G I_G max(0, n · ℓ)
 B = d_B I_B max(0, n · ℓ),

 in which (d_R, d_G, d_B) represents the spectral reflectance property of the material (triangle) and (I_R, I_G, I_B) represents the spectral power distribution of the light source.
 Under the typical case of white light, I_R = I_G = I_B.
 For a white or gray material, we would also have d_R = d_G = d_B.
 Using vector notation, these equations can be compressed into
L = d I max(0, n · ℓ)
where L = (R, G, B), d = (d_R, d_G, d_B), and I = (I_R, I_G, I_B).

 Each triangle is assumed to be on the surface of an object, rather than the object itself.
 Therefore, if the light source is behind the triangle, then the triangle should not be illuminated, because it is facing away from the light (it cannot be lit from behind). The max function handles this case by clamping n · ℓ to zero when it is negative.
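The Lambertian shading model above maps directly to code. A minimal sketch in plain Python (no graphics library; the helper names are our own):

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambertian(d, I, n, l):
    """L = d * I * max(0, n · ℓ), applied per colour channel.

    d: (d_R, d_G, d_B) surface reflectance, I: (I_R, I_G, I_B) light power,
    n: outward unit surface normal, l: unit vector toward the light.
    The max() clamps back-facing triangles to black, as in the text.
    """
    shade = max(0.0, dot(n, l))
    return tuple(dc * Ic * shade for dc, Ic in zip(d, I))

n = (0.0, 0.0, 1.0)                     # surface facing +z
l = normalize((0.0, 1.0, 1.0))          # light 45 degrees off the normal
print(lambertian((0.5, 0.5, 0.5), (1.0, 1.0, 1.0), n, l))
# each channel = 0.5 * cos(45°) ≈ 0.354

# Light behind the triangle -> not illuminated:
print(lambertian((0.5, 0.5, 0.5), (1.0, 1.0, 1.0), n, (0.0, 0.0, -1.0)))
# (0.0, 0.0, 0.0)
```

Note the gray material (d_R = d_G = d_B) and white light (I_R = I_G = I_B) reproduce the special cases discussed above.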

2.3 Graphics and Haptics Rendering


2.3.1 Graphics Rendering

Rendering or image synthesis is the process of generating a photorealistic or non-photorealistic image from a 2D or 3D model by means of a computer program.

 The model is a description of three-dimensional (3D) objects in a strictly defined language or data structure.
 It includes data on geometry, viewpoint, texture, and lighting.
 The resulting picture is a raster-graphics (bitmap) image.
 The term is analogous to an "artist's rendering" of a scene.
 'Rendering' is also used in video editing to describe the process of computing effects to produce the final video output.

 Rendering is one of the major sub-topics of three-dimensional computer graphics, and in practice it is always connected to the others.
 It is the last major step in the graphics pipeline, giving models and animations their final appearance.
 With the increasing sophistication of computer graphics, it has become a distinct subject in its own right.

 Rendering is used in computer and video games, simulation software, visual effects for film and TV, and design visualization, each employing a different balance of features and techniques. A wide range of renderers is available, some offered as a service.
 Some are integrated into larger modelling and animation packages, some are stand-alone, and some are free open-source projects.

 Internally, a renderer is a carefully engineered program based on a selective mixture of disciplines: light physics, visual perception, mathematics, and software engineering.

 Rendering may be done slowly, as in pre-rendering, or in real time.
 Pre-rendering is a computationally intensive process typically used for film production, while real-time rendering is done for three-dimensional computer games, which rely on 3D hardware-accelerated graphics processors.

2.3.2 Haptics Rendering

Haptic rendering is the process of converting computer algorithms containing force information to a mechanical interface capable of displaying haptic information to a user.

 It is equivalent to visual rendering, which converts graphics from a file into visual form, as on a video display.
 Haptic rendering comprises a hardware and a software component.

 At the simplest level, this information is contained in the representation of the object's physical attributes: shape, elasticity, texture, mass, and so on.

 Architecture for haptic feedback: virtual reality (VR) applications strive to simulate real or imaginary scenes with which users can interact and perceive the effects of their actions in real time.
 Ideally the user would interact with the simulation via all five senses; however, today's typical VR applications rely on a smaller subset, typically vision, hearing, and more recently, touch.

 Figure 2 shows some example devices. One way to distinguish between haptic-interface devices is by their grounding locations. For dexterous inter-digit tasks, force-feedback gloves, such as the Hand Force Feedback (HFF) glove, are used.
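To make "converting force information to a mechanical interface" concrete, here is a penalty-based sketch — a standard haptic-rendering technique, though the text does not name it. The force sent to the device is proportional to how far the haptic probe has penetrated a virtual surface, using the object's elasticity attribute mentioned above; the function name and the numbers are illustrative.

```python
def contact_force(probe_depth_m: float, stiffness_n_per_m: float) -> float:
    """Penalty-based haptic force: F = k * penetration depth.

    probe_depth_m: how far the haptic probe tip is inside the virtual
    surface (<= 0 means no contact). stiffness_n_per_m (k) models the
    object's elasticity; higher k feels harder. Returns the force in
    newtons that pushes the probe back out of the surface.
    """
    if probe_depth_m <= 0.0:
        return 0.0                      # free space: no force displayed
    return stiffness_n_per_m * probe_depth_m

# A stiff virtual wall (k = 1000 N/m) with the probe 2 mm inside:
print(contact_force(0.002, 1000.0))     # ≈ 2.0 N pushed back to the user
print(contact_force(-0.01, 1000.0))     # 0.0 N, no contact
```

The software component of haptic rendering would run this force computation at a high rate (commonly cited as around 1 kHz) and hand the result to the hardware component for display.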

2.4 PC-Graphics Architecture

Fig 2.4.1 PC- Graphics architecture

CPU (Central Processing Unit):

 The CPU is the central brain of a computer. It handles general-purpose


computing tasks, including running the operating system, managing
applications, and executing instructions. While the CPU is crucial for overall
system performance, its impact on graphics-related tasks is more significant
in certain scenarios, such as physics simulations and AI processing.
GPU (Graphics Processing Unit):

 The GPU is dedicated to handling graphics-related tasks. It excels at parallel


processing and is specifically designed to render images, textures, and
perform complex calculations required for graphics-intensive applications.
Modern GPUs, such as those from NVIDIA and AMD, are essential for
gaming, video editing, 3D rendering, and other graphics-centric tasks. They
often come with their own VRAM (Video RAM).

System RAM (Random Access Memory):

 System RAM is general-purpose memory that the CPU uses to store data that
is actively being used or processed. While it's not directly responsible for
graphics rendering, having sufficient system RAM is crucial for overall system
performance, especially when running multiple applications simultaneously.
In graphics-intensive tasks, having enough RAM allows for smoother
operation and better multitasking.
Video RAM (VRAM):

 VRAM is a specific type of memory reserved for the GPU. It is used to store
textures, frame buffers, and other graphics-related data. The amount of
VRAM on a graphics card is a critical factor, particularly for gaming and other
graphics-intensive applications. Higher VRAM allows for larger textures and
more complex scenes, and it can impact the performance at higher
resolutions.
In summary:

 CPU is the general-purpose processor, essential for overall system


functionality.
 GPU is the dedicated graphics processor responsible for rendering images
and handling graphics tasks.
 System RAM is used for general data storage and is crucial for overall system
performance.
 VRAM is specific to the GPU and is used to store graphics-related data.
 For optimal graphics performance, it's important to have a balanced system
where the CPU, GPU, and RAM work together effectively.
 The specific requirements depend on the intended use of the computer, whether it's gaming, content creation, or other graphics-intensive applications.

2.5 Graphics Accelerators


Graphic accelerator cards usually come as add-on cards that plug into a PCI bus or AGP slot, or the circuitry is integrated into the motherboard and attached to one of these buses. A typical 2D/3D graphic accelerator card has the following major components:

 Graphic accelerator chipset or co-processor


 Expansion Bus interface
 Video memory
 RAMDAC
 Firmware in Flash BIOS
 Software driver

Graphic accelerator chipset or co-processor –

 It is the brain of the video card and determines what exactly the board can
and can’t do.
 The chipset is one of the core components; a better chipset provides more efficiency and more acceleration features.
 Better chipsets include extended capabilities, such as 3D acceleration.

Expansion Bus interface

 Since many graphics operations entail copying memory images or blocks of data from system memory to the display adapter's on-screen memory, the speed and bandwidth of the expansion bus interface play a major role in deciding video performance.
 Moreover, in cheaper graphic accelerator designs featuring minimal
accelerating hardware, most of the video processing tasks fall back to the
PC’s processor, which makes bus performance even more important.
 The graphic data movement between the PC motherboard and the adapter
takes place over the PC’s expansion bus. Hence it’s quite natural that a
faster and wider expansion bus interface is essential for better video
performance.

Video memory

 There are two important issues with the video memory of graphic accelerator cards: memory size and memory type. The video memory of traditional display adapters such as EGA, VGA, etc., was known as the "frame buffer", because the core purpose of the video memory was to store frame pixel bits.
 In sharp contrast, the video memory of a graphic accelerator does not merely store frame pixel bits; for example, double buffering enables frame rates that match the full refresh rate of the monitor.
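The "memory size" issue can be made concrete by computing how much video memory the frame buffer alone needs; double buffering doubles it. The function name is ours, and the resolutions are just examples.

```python
def framebuffer_bytes(width: int, height: int,
                      bits_per_pixel: int, buffers: int = 2) -> int:
    """Video memory needed for the frame buffer(s), in bytes.

    width x height pixels, bits_per_pixel colour depth, and `buffers`
    copies (2 = double buffering, so one frame displays while the
    next is drawn).
    """
    return width * height * (bits_per_pixel // 8) * buffers

# 1920x1080 at 32-bit colour, double buffered:
size = framebuffer_bytes(1920, 1080, 32)
print(size)                        # 16588800 bytes
print(round(size / 2**20, 1))      # ~15.8 MiB
```

Textures, geometry, and off-screen render targets come on top of this, which is why modern cards carry gigabytes of VRAM rather than the few megabytes a bare frame buffer requires.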

RAMDAC –
 In low cost and earlier VGA cards, the RAMDAC is integrated into the video
controller chip. But in high-performance graphic cards, the RAMDAC is
separate.
 The Digital-to-Analog Converter (DAC) part of the RAMDAC chip converts digital values of the three primary colors into analog video signals. The RAM in a RAMDAC holds palette information, not the actual image. Graphic accelerators need faster DACs to support the higher screen refresh rates essential for 3D graphics.
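The RAMDAC's two jobs — a palette lookup in its RAM, then digital-to-analog conversion of each channel — can be sketched in a few lines. The palette entries and the 0.7 V full-scale level are illustrative (analog video levels of roughly 0.7 V were typical, but exact values varied by standard).

```python
# A tiny illustrative palette: pixel index -> (R, G, B) 8-bit values.
# This table plays the role of the RAM in the RAMDAC.
PALETTE = {
    0: (0, 0, 0),        # black
    1: (255, 0, 0),      # red
    2: (255, 255, 255),  # white
}

def ramdac(pixel_index: int, full_scale_v: float = 0.7):
    """Look up the pixel's colour in palette RAM, then model the DAC
    step by scaling each channel to an analog voltage in [0, full_scale_v]."""
    r, g, b = PALETTE[pixel_index]
    return tuple(round(c / 255 * full_scale_v, 3) for c in (r, g, b))

print(ramdac(1))  # (0.7, 0.0, 0.0): full-scale red channel only
```

A faster DAC simply means this conversion can be repeated for more pixels per second, which is what permits higher resolutions and refresh rates.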

Firmware in Flash BIOS –


 All graphics accelerators require video BIOS and driver software. The video
BIOS is the firmware permanently recorded on an EPROM/Flash BIOS chip.
 The firmware contains the minimal software needed to support the graphics controller and provide the desired screen environment.
 The BIOS software also interfaces the graphics accelerator hardware to a standard set of DOS functions.

Software driver –
 Drivers are the software component that accompanies the board, and they play the controlling role in all graphic accelerators.
 In general, drivers are sophisticated pieces of code that enable graphic cards to talk to, and take orders from, the operating system and its applications. Without the graphic card driver software, a gaming application would have no way to generate the display, because Windows would have no idea that the card is there.

2.6 AR-VR Graphics Benchmarks


Graphics benchmarks are tests designed to evaluate the performance of graphics
processing units (GPUs) under various conditions. These benchmarks help users,
especially gamers and professionals, assess the capabilities of their graphics cards
and compare them with others on the market.

Here are some commonly used graphics benchmarks:

3DMark:

Description: 3DMark is one of the most popular and widely used benchmarks for
assessing overall graphics performance. It includes various tests such as Fire
Strike, Time Spy, and Port Royal, each focusing on different aspects of GPU
performance like gaming, DirectX 12 capabilities, and ray tracing.

Unigine Heaven and Valley:

Description: Unigine Heaven and Valley are benchmarks that stress the GPU by
rendering highly detailed environments. They are often used to test the stability
and performance of graphics cards, especially in scenarios involving large-scale
landscapes and intricate textures.

FurMark:

Description: FurMark is a GPU stress test that pushes graphics cards to their limits.
It renders a furry object using various algorithms and shaders, putting a heavy load
on the GPU. FurMark is commonly used to test the stability and cooling
performance of graphics cards.

Cinebench:

Description: While Cinebench is primarily known as a CPU benchmark, it also includes a graphics test called Cinebench GPU. This test evaluates the GPU's performance in rendering a complex 3D scene. Cinebench is often used in the professional space for assessing rendering capabilities.

Superposition Benchmark:

Description: Developed by Unigine, Superposition is a benchmark that focuses on testing GPU stability and performance in VR and heavy graphics workloads. It supports both DirectX and OpenGL and offers a variety of scenarios to evaluate different aspects of GPU capabilities.

SPECviewperf:

Description: SPECviewperf is a benchmark designed for professionals working with computer-aided design (CAD) and other 3D visualization software. It provides a set of standardized viewsets representing various applications to measure GPU performance in professional workflows.

Geekbench GPU Benchmark:

Description: Geekbench is a general-purpose benchmarking tool that includes a GPU benchmark. It measures the GPU's performance in tasks like image processing and computer vision, providing a score that can be compared across different GPUs.

When interpreting benchmark results, it's important to consider the specific workload being tested and how well it aligns with your intended use. For example, gaming benchmarks may not accurately represent the performance of a GPU in professional applications, and vice versa. Additionally, real-world performance in specific applications or games may differ from synthetic benchmarks, so it's often useful to consult a variety of sources for a comprehensive assessment.
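Under the hood, all of these benchmarks reduce to the same measurement: how long each frame (or work unit) takes. A minimal, illustrative harness — not any of the products above — looks like this:

```python
import time

def benchmark(workload, n_frames: int = 100) -> dict:
    """Time a per-frame workload and report average frame time and FPS."""
    start = time.perf_counter()
    for _ in range(n_frames):
        workload()
    elapsed = time.perf_counter() - start
    avg_ms = elapsed / n_frames * 1000.0
    return {"frames": n_frames,
            "avg_frame_ms": avg_ms,
            "fps": 1000.0 / avg_ms if avg_ms > 0 else float("inf")}

# Stand-in workload: a little arithmetic instead of real GPU rendering.
result = benchmark(lambda: sum(i * i for i in range(1000)), n_frames=50)
print(result["frames"], result["fps"] > 0)
```

Real GPU benchmarks additionally control the scene content, warm-up, and driver settings so that scores are comparable across cards; the caveat above about matching the workload to your intended use applies to this toy harness too.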

2.7 AR-VR Workstation Based Architectures

2.8 SGI Infinite Reality Architecture


The DG5 SGI Infinite Reality board can be combined with various optional daughter
boards to meet graphics needs. For example, the DG5-8 board has a VIO5H
daughterboard, which adds six monitor connectors, for a total of eight.

The Silicon Graphics Onyx2 rack system, SGI Onyx 3000 series, and SGI Onyx 300
graphics systems use a dedicated graphics enclosure that houses one or two
InfiniteReality graphics pipes. The first pipeline has up to four RM boards, and the
second pipeline has up to two RM boards. (Figure 2 shows a graphics module with
two InfiniteReality pipes: pipe 0 with four RM boards, and pipe 1 with two RM
boards.)

 In contrast to the previously mentioned systems, the Silicon Graphics Onyx2
deskside graphics system can be configured with a maximum of one pipeline
with two RM boards.
 The Silicon Graphics Onyx2 deskside graphics system includes the following
components:
• Geometry engine (GE) (one board per pipe)
• Raster manager (RM) (one or two boards per pipe)
• Display generator (DG5) (one board per pipe)

 Ordered new, the InfiniteReality4 graphics pipe consists of the GE16-4, at least a two-channel display generator (DG5-2), and the new raster manager (RM11).
 If you order it as an upgrade, you can enhance your current InfiniteReality
graphics system by upgrading, at the pipe-component level, to the GE16-4,
DG5, and RM11 boards available with InfiniteReality4.
 The upgrade requires that your system is running IRIX 6.5.17 release software, or later, for the enhanced capabilities to work.
 Compared to the InfiniteReality3 graphics system, the InfiniteReality4 graphics system provides the following increased capabilities:
• Texture memory is increased from 256 MB to 1024 MB per pipe.
• Raster memory (frame buffer) is increased from 80 MB to 2.5 GB per RM11
board.
• Raster Manager board speed is increased for higher pixel fill performance.

2.9 Distributed VR Architectures


Distributed VR is a relatively new area of research. Virtual reality systems are now largely software components, rather than requiring the dedicated head-mounted displays, input controllers, and rendering hardware of the past.
2.10 Multi-pipeline Synchronization

2.11 Collocated Rendering Pipelines

2.12 Distributed Virtual Environments

2.13 AR Architecture
Augmented Reality (AR) is a computer-graphics technology that combines the real-time environment with a digital one. In virtual reality, users experience an entirely new world, while in augmented reality digital information is displayed over the real environment. To experience augmented reality, the user needs an AR headset.

Augmented Reality Architecture

Augmented reality (AR) consists of six different components:

1. User
2. Interaction
3. Device
4. Virtual Content
5. Tracking
6. Real-life Entity

1. User: The most essential part of augmented reality is its user. The user can be a student, a doctor, an employee, and so on, and is the one who creates and experiences the AR models.
2. Interaction: The process between the device and the user. As the word suggests, an action performed by one entity results in a response or action by the other.
3. Device: This component is responsible for the creation, display, and interaction of 3D models. The device can be portable or static; examples include mobile phones, computers, and AR headsets.
4. Virtual Content: The 3D model created or generated by the system or AR application. Virtual content is the digital information that is integrated into the user's real-world environment; it can be 3D models, textures, text, images, and so on.
5. Tracking: The process that makes the placement of AR models possible. Tracking algorithms help the device determine where to place or integrate the 3D model in the real-world environment. Many types of tracking algorithms are available for use in AR application development.
6. Real-life Entity: The last component of the AR architecture is the real-world entity. These entities can be a tree, a book, fruit, a computer, or anything else visible on screen. The AR application does not change the position of real-life entities; it only integrates digital information with them.
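Tying the virtual-content, tracking, and real-life-entity components together: tracking gives the device a pose for a real-world entity (for example, a printed marker), and the virtual content is placed relative to that pose so it stays "glued" to the entity. A minimal 2D sketch with invented numbers (real AR tracking works with full 3D poses):

```python
import math

def place_content(marker_pos, marker_angle_deg, offset):
    """Place virtual content relative to a tracked real-world marker.

    marker_pos: (x, y) of the marker found by the tracking algorithm.
    marker_angle_deg: marker's orientation in the camera view.
    offset: where the 3D model should sit in the *marker's* frame.
    Returns the content position in the camera/world frame, so the
    model follows the entity as the device or marker moves.
    """
    a = math.radians(marker_angle_deg)
    ox, oy = offset
    # Rotate the offset by the marker's orientation, then translate.
    wx = marker_pos[0] + ox * math.cos(a) - oy * math.sin(a)
    wy = marker_pos[1] + ox * math.sin(a) + oy * math.cos(a)
    return (round(wx, 3), round(wy, 3))

# A marker (say, a book on a table) detected at (2, 1), rotated 90 degrees;
# the model should sit one unit along the marker's local x-axis.
print(place_content((2.0, 1.0), 90.0, (1.0, 0.0)))   # (2.0, 2.0)
```

Because the placement is computed from the tracked pose every frame, the real-life entity itself is never moved — only the digital overlay is, exactly as the component description above states.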

------ End -------


