
PC SDK

Version 1.11

Copyrights and Trademarks


© 2017 Oculus VR, LLC. All Rights Reserved.

OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC.
BLUETOOTH is a registered trademark of Bluetooth SIG, Inc. All other trademarks are the property of their
respective owners. Certain materials included in this publication are reprinted with the permission of the
copyright holder.


Contents

Introduction....................................................................................................5


Developer Release Guide.................................................................................... 7
Welcome to the Release Guide........................................................................................................................ 7
Changes in Version 1.11.x.................................................................................................................................7
Migrating to SDK 1.11...................................................................................................................................... 8
Oculus API Change Archive.............................................................................................................................. 8
Changes For Release 1.10.x........................................................................................................................ 8
Changes For Release 1.9.x.......................................................................................................................... 9
Changes For Release 1.8.x........................................................................................................................ 10
Changes For Release 1.7.x........................................................................................................................ 12
Changes For Release 1.6.x........................................................................................................................ 13
Changes For Release 1.5.x........................................................................................................................ 13
Changes For Release 1.4.x........................................................................................................................ 14
Changes For Release 1.3.x........................................................................................................................ 15
Changes For Release 0.8.0........................................................................................................................ 20
Changes For Release 0.7.0........................................................................................................................ 21
Changes For Release 0.6.0........................................................................................................................ 24
Changes For Release 0.5........................................................................................................................... 31
Changes For Release 0.4........................................................................................................................... 32
Changes For Release 0.4 Since Release 0.2.5...........................................................................................33

PC SDK Getting Started Guide......................................................................... 35


Getting Started with the SDK......................................................................................................................... 35
Recommended Specifications.................................................................................................................... 35
Oculus Rift Setup....................................................................................................................................... 35
Oculus Rift SDK Setup.....................................................................................................................................36
Installation...................................................................................................................................................36
Compiler Settings.......................................................................................................................................36
Build Solutions............................................................................................................................................36
Getting Started with the Demos..................................................................................................................... 36
Software Developers and Integration Engineers....................................................................................... 37
Artists and Game Designers...................................................................................................................... 38
OculusWorldDemo Demo.......................................................................................................................... 38

PC SDK Developer Guide................................................................................. 41


LibOVR Integration.......................................................................................................................................... 41
Overview of the SDK................................................................................................................................. 41
Initialization and Sensor Enumeration............................................................................................................. 42
Head Tracking and Sensors....................................................................................................................... 44
Health and Safety Warning........................................................................................................................ 48
Rendering to the Oculus Rift.......................................................................................................................... 48
Rendering to the Oculus Rift..................................................................................................................... 49
Rendering Setup Outline........................................................................................................................... 50
Advanced Rendering Configuration................................................................................................................ 63
Coping with Graphics API or Hardware Rendertarget Granularity............................................................ 64
Forcing a Symmetrical Field of View......................................................................................................... 65
Improving Performance by Decreasing Pixel Density................................................................................66
Improving Performance by Decreasing Field of View............................................................................... 67

Improving Performance by Rendering in Mono........................................................................................ 68


Protecting Content..................................................................................................................................... 69
VR Focus Management....................................................................................................................................69
Oculus Guardian System................................................................................................................................. 73
Rift Audio......................................................................................................................................................... 76
Oculus Touch Controllers................................................................................................................................ 79
Controller Data........................................................................................................................................... 81
Hand Tracking............................................................................................................................................ 81
Button State................................................................................................................................................81
Button Touch State.................................................................................................................................... 84
Haptic Feedback........................................................................................................................................ 85
SDK Samples and Gamepad Usage................................................................................................................86
Optimizing Your Application........................................................................................................................... 86
SDK Performance Statistics........................................................................................................................ 86
Oculus Debug Tool....................................................................................................................................89
Performance Head-Up Display...................................................................................................................92
Performance Indicator.............................................................................................................................. 100
Pairing the Oculus Touch Controllers........................................................................................................... 101
Asynchronous SpaceWarp............................................................................................................................. 101

Reference Content........................................................................................... 103


Developer Reference..................................................................................................................................... 103
Troubleshooting............................................................................................................................................. 103
PC SDK PDFs.................................................................................................................................................103

Introduction


Welcome to the Oculus PC SDK!

The Oculus PC SDK enables you to build VR experiences for the Oculus Rift in C++. To download the PC SDK,
go to the Downloads Page.

Supporting SDKs
Oculus provides several supporting SDKs that add functionality to the core SDK. These include:

Audio SDK

Properly spatialized audio dramatically improves immersion. To get the Audio SDK, go to the Downloads Page.
To learn more, review the Audio SDK Documentation.

Platform SDK

Identity, social, engagement, and revenue features can dramatically improve the games and experiences that
you create. To get the Platform SDK, go to the Downloads Page. To learn more, review the Platform SDK
Documentation.

Avatar SDK

Having a head and hands in VR improves social and Touch experiences. To get the Avatar SDK, go to the
Downloads Page. To learn more, review the Avatar SDK Documentation.

Getting Started
To get started with the PC SDK, go to Getting Started with the SDK on page 35.

Learn and Share


Have questions about using the PC SDK? Visit the PC SDK Forum to ask our community and experts.

Help, I’m Over My Head


If you are not a C++ developer, you might have more success getting started with a game engine such as Unity
or Unreal. For more information, review the Unity Documentation and the Unreal Engine Documentation.

Developer Release Guide


Welcome to the Developer Release Guide.

This guide describes major changes since the last release, provides instructions about how to migrate from the
last release, and provides a history of changes to the SDK.

Welcome to the Release Guide


This release introduces new changes and updates to the Oculus SDK.

Changes in Version 1.11.x


Overview of Release 1.11.x
This release enables you to specify a tracking origin, provides raw button values, and adds Asynchronous
SpaceWarp (ASW) statistics.

New Features for 1.11.x


• Added ovr_SpecifyTrackingOrigin(), which enables you to specify the tracking origin instead of setting the tracking origin based on the user's current location. For more information, see Position Tracking on page 46.
• Added IndexTriggerRaw, HandTriggerRaw, and ThumbstickRaw, which provide raw button values without a deadzone or filter (see the sketch after this list). For more information, see Button State on page 81.
• Added the following ASW Performance HUD statistics:

• ASW Status
• ASW Active-Toggle Count
• ASW Presented-Frame Count
• ASW Failed-Frame Count

• Added the following ASW SDK statistics:

• AswIsActive
• AswActivatedToggleCount
• AswPresentedFrameCount
• AswFailedFrameCount
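
The following is a minimal sketch of both additions, assuming an initialized ovrSession; the one-meter origin offset and the helper names are illustrative, not part of the SDK.

    #include <OVR_CAPI.h>

    // Sketch only: place the tracking origin one meter in front of the current
    // origin, facing the same direction, instead of recentering on the user.
    void ApplyCustomOrigin(ovrSession session)
    {
        ovrPosef origin = {};
        origin.Orientation.w = 1.0f;   // identity orientation
        origin.Position.z = -1.0f;     // illustrative 1 m forward offset
        ovr_SpecifyTrackingOrigin(session, origin);
    }

    // Sketch only: read the right index trigger with no deadzone or filtering.
    float ReadRawIndexTrigger(ovrSession session)
    {
        ovrInputState state;
        if (OVR_SUCCESS(ovr_GetInputState(session, ovrControllerType_Touch, &state)))
            return state.IndexTriggerRaw[ovrHand_Right];
        return 0.0f;
    }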

API Changes for 1.11.x


There are no breaking changes to version 1.11.x.

Known Issues
The following are known issues:

• Some older AMD CPUs are not currently compatible with Asynchronous SpaceWarp (ASW), which might
cause your system to repeatedly crash. The workaround is to disable ASW by setting its registry key to 0. For
more information, see Asynchronous SpaceWarp on page 101.
• If you encounter intermittent tracking issues, remove the batteries from any Engineering Sample Oculus
Remotes that you paired with your headset and contact Developer Relations for replacement remotes.
• If you bypass the shim and communicate with the DLL directly, without specifying a version to ovr_Initialize,
the DLL has no way of knowing the SDK version with which the application was built. This can result in
unpredictable or erratic behavior which might cause the application to crash.
• There are some USB chipsets that do not meet the USB 3.0 specification and are incompatible with the
Oculus Rift sensor. If you receive a notification in Oculus Home or the Oculus App, plug the sensor into a
different USB 3.0 port (blue). If none of the USB 3.0 ports work, plug the sensor into a USB 2.0 port (black).
• Antivirus software, such as McAfee, can cause installation issues. To work around the issue, make sure you
have the latest updates and disable real-time scanning.
• If you encounter installation issues, delete the Oculus folder and install the software again.
• If the Rift displays a message that instructs you to take off the headset, remove it and place it on a flat
surface for 10-15 seconds.
• The keyboard and mouse do not work in Oculus Home. To select an item, gaze at it and select it using the
Oculus Remote or Xbox controller.
• Bandwidth-intensive USB devices, such as web cams and high-end audio interfaces, might not work when
using the Rift. To work around this issue, install the device on another USB host controller or a separate
computer.
• For dual-boot systems using DK2 or CB1 HMDs, the OS selection screen might appear on the HMD instead
of the monitor. To work around this, try plugging the HMD into a different port or unplug the HMD while
booting.
• If you are running your application from the Unity Editor and you press the controller's home button to
return to Oculus Home, you will be prompted to close the application. If you select OK, Unity might remain
in a state where it is running, but will never get focus. To work around this, restart Unity.

Migrating to SDK 1.11


Migrating from SDK 1.10.x to SDK 1.11
There are no breaking SDK changes or migration requirements other than installing the new SDK.

Oculus API Change Archive


This section describes API changes for each version release.

Changes For Release 1.10.x


This section describes changes to the Oculus SDK, the Oculus App, Oculus Home, and the runtime.

Overview of Release 1.10.1


This release enables Asynchronous SpaceWarp (ASW) by default.

New Features for 1.10.1


• Enabled ASW by default. For more information, see Asynchronous SpaceWarp on page 101.
• Added Touch optimizations.
• Minor performance improvements, memory management fixes, and other updates.

API Changes for 1.10.1


There are no breaking changes to version 1.10.x.

Known Issues
The following are known issues:

• Some older AMD CPUs are not currently compatible with Asynchronous SpaceWarp (ASW), which might
cause your system to repeatedly crash. The workaround is to disable ASW by setting its registry key to 0. For
more information, see Asynchronous SpaceWarp on page 101.
• If you encounter intermittent tracking issues, remove the batteries from any Engineering Sample Oculus
Remotes that you paired with your headset and contact Developer Relations for replacement remotes.
• If you bypass the shim and communicate with the DLL directly, without specifying a version to ovr_Initialize,
the DLL has no way of knowing the SDK version with which the application was built. This can result in
unpredictable or erratic behavior which might cause the application to crash.
• There are some USB chipsets that do not meet the USB 3.0 specification and are incompatible with the
Oculus Rift sensor. If you receive a notification in Oculus Home or the Oculus App, plug the sensor into a
different USB 3.0 port (blue). If none of the USB 3.0 ports work, plug the sensor into a USB 2.0 port (black).
• Antivirus software, such as McAfee, can cause installation issues. To work around the issue, make sure you
have the latest updates and disable real-time scanning.
• If you encounter installation issues, delete the Oculus folder and install the software again.
• If the Rift displays a message that instructs you to take off the headset, remove it and place it on a flat
surface for 10-15 seconds.
• The keyboard and mouse do not work in Oculus Home. To select an item, gaze at it and select it using the
Oculus Remote or Xbox controller.
• Bandwidth-intensive USB devices, such as web cams and high-end audio interfaces, might not work when
using the Rift. To work around this issue, install the device on another USB host controller or a separate
computer.
• For dual-boot systems using DK2 or CB1 HMDs, the OS selection screen might appear on the HMD instead
of the monitor. To work around this, try plugging the HMD into a different port or unplug the HMD while
booting.
• If you are running your application from the Unity Editor and you press the controller's home button to
return to Oculus Home, you will be prompted to close the application. If you select OK, Unity might remain
in a state where it is running, but will never get focus. To work around this, restart Unity.

Migrating from SDK 1.9.x to SDK 1.10.1


There are no breaking SDK changes or migration requirements other than installing the new SDK.

Changes For Release 1.9.x


This section describes changes to the Oculus SDK, the Oculus App, Oculus Home, and the runtime.

Overview of Release 1.9.0


This release includes performance improvements, memory management fixes, and minor updates.

New Features for 1.9.0


• Improved activity detection for Touch.
• Improved Asynchronous SpaceWarp. For more information, see Asynchronous SpaceWarp on page 101.
• Improved the Oculus Guardian system.

API Changes for 1.9.0


There are no breaking changes to version 1.9.0.

Known Issues
The following are known issues:

• If you encounter intermittent tracking issues, remove the batteries from any Engineering Sample Oculus
Remotes that you paired with your headset and contact Developer Relations for replacement remotes.
• When Touch controllers wake up from idle, they might move erratically for a couple of seconds before
tracking is reestablished.
• If you bypass the shim and communicate with the DLL directly, without specifying a version to ovr_Initialize,
the DLL has no way of knowing the SDK version with which the application was built. This can result in
unpredictable or erratic behavior which might cause the application to crash.
• There are some USB chipsets that do not meet the USB 3.0 specification and are incompatible with the
Oculus Rift sensor. If you receive a notification in Oculus Home or the Oculus App, plug the sensor into a
different USB 3.0 port (blue). If none of the USB 3.0 ports work, plug the sensor into a USB 2.0 port (black).
• Antivirus software, such as McAfee, can cause installation issues. To work around the issue, make sure you
have the latest updates and disable real-time scanning.
• If you encounter installation issues, delete the Oculus folder and install the software again.
• If the Rift displays a message that instructs you to take off the headset, remove it and place it on a flat
surface for 10-15 seconds.
• The keyboard and mouse do not work in Oculus Home. To select an item, gaze at it and select it using the
Oculus Remote or Xbox controller.
• Bandwidth-intensive USB devices, such as web cams and high-end audio interfaces, might not work when
using the Rift. To work around this issue, install the device on another USB host controller or a separate
computer.
• For dual-boot systems using DK2 or CB1 HMDs, the OS selection screen might appear on the HMD instead
of the monitor. To work around this, try plugging the HMD into a different port or unplug the HMD while
booting.
• If you are running your application from the Unity Editor and you press the controller's home button to
return to Oculus Home, you will be prompted to close the application. If you select OK, Unity might remain
in a state where it is running, but will never get focus. To work around this, restart Unity.

Migrating from SDK 1.8.x to SDK 1.9.0


There are no breaking SDK changes or migration requirements other than installing the new SDK.

Changes For Release 1.8.x


This section describes changes to the Oculus SDK, the Oculus App, Oculus Home, and the runtime.

Overview of Release 1.8.0


This release includes performance improvements, memory management fixes, and minor updates.

New Features for 1.8.0


• Oculus now provides a boundary system. When a user gets too close to the edge of the boundary,
translucent boundary markers are displayed in a layer that is superimposed over the game or experience.
For more information, see Oculus Guardian System on page 73.
• Added AdaptiveGpuPerformanceScale. The value is calculated as desired_GPU_utilization / current_GPU_utilization. When it is 1.0, the GPU is performing optimally. When it drops below 1.0, consider reducing resolution or making other changes to improve frame rate (see the sketch after this list).
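
The following is a rough sketch of how an application might react to this value. It assumes the SDK performance-statistics call ovr_GetPerfStats is available in your SDK version; the 0.9 threshold and the rescaling policy are illustrative choices, not SDK recommendations.

    #include <OVR_CAPI.h>

    // Sketch only: shrink the requested pixel density when the compositor
    // reports the GPU is falling behind. The caller would then recreate its
    // eye-buffer texture swap chains at the reduced resolution.
    void AdjustPixelDensity(ovrSession session, float& pixelsPerDisplayPixel)
    {
        ovrPerfStats stats;
        if (OVR_SUCCESS(ovr_GetPerfStats(session, &stats)) &&
            stats.AdaptiveGpuPerformanceScale < 0.9f)   // illustrative threshold
        {
            pixelsPerDisplayPixel *= stats.AdaptiveGpuPerformanceScale;
        }
    }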

API Changes for 1.8.0


There are no breaking changes to version 1.8.0.

Known Issues
The following are known issues:

• If you encounter intermittent tracking issues, remove the batteries from any Engineering Sample Oculus
Remotes that you paired with your headset and contact Developer Relations for replacement remotes.
• When Touch controllers wake up from idle, they might move erratically for a couple of seconds before
tracking is reestablished.
• If you bypass the shim and communicate with the DLL directly, without specifying a version to ovr_Initialize,
the DLL has no way of knowing the SDK version with which the application was built. This can result in
unpredictable or erratic behavior which might cause the application to crash.
• There are some USB chipsets that do not meet the USB 3.0 specification and are incompatible with the
Oculus Rift sensor. If you receive a notification in Oculus Home or the Oculus App, plug the sensor into a
different USB 3.0 port (blue). If none of the USB 3.0 ports work, plug the sensor into a USB 2.0 port (black).
• Antivirus software, such as McAfee, can cause installation issues. To work around the issue, make sure you
have the latest updates and disable real-time scanning.
• If you encounter installation issues, delete the Oculus folder and install the software again.
• If the Rift displays a message that instructs you to take off the headset, remove it and place it on a flat
surface for 10-15 seconds.
• The keyboard and mouse do not work in Oculus Home. To select an item, gaze at it and select it using the
Oculus Remote or Xbox controller.
• Bandwidth-intensive USB devices, such as web cams and high-end audio interfaces, might not work when
using the Rift. To work around this issue, install the device on another USB host controller or a separate
computer.
• For dual-boot systems using DK2 or CB1 HMDs, the OS selection screen might appear on the HMD instead
of the monitor. To work around this, try plugging the HMD into a different port or unplug the HMD while
booting.
• If you are running your application from the Unity Editor and you press the controller's home button to
return to Oculus Home, you will be prompted to close the application. If you select OK, Unity might remain
in a state where it is running, but will never get focus. To work around this, restart Unity.

Migrating from SDK 1.7.x to SDK 1.8.0


There are no breaking SDK changes or migration requirements other than installing the new SDK.

Changes For Release 1.7.x


This section describes changes to the Oculus SDK, the Oculus App, Oculus Home, and the runtime.

Overview of Release 1.7.0


This release adds trigger, hand trigger, and thumbstick values without deadzones.

New Features for 1.7.0


• The values for IndexTrigger, HandTrigger, and Thumbstick include deadzones. Added IndexTriggerNoDeadzone, HandTriggerNoDeadzone, and ThumbstickNoDeadzone, which return values for the trigger, hand trigger, and thumbstick without deadzones (see the sketch after this list).
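
To illustrate why the NoDeadzone values are useful, the following sketch applies an application-defined radial deadzone to the left thumbstick; the 0.15 radius and the helper name are illustrative.

    #include <OVR_CAPI.h>
    #include <cmath>

    // Sketch only: rebuild a custom radial deadzone from the unfiltered values.
    ovrVector2f ReadThumbstickCustomDeadzone(ovrSession session)
    {
        ovrVector2f result = { 0.0f, 0.0f };
        ovrInputState state;
        if (OVR_SUCCESS(ovr_GetInputState(session, ovrControllerType_Touch, &state)))
        {
            ovrVector2f raw = state.ThumbstickNoDeadzone[ovrHand_Left];
            float mag = std::sqrt(raw.x * raw.x + raw.y * raw.y);
            const float deadzone = 0.15f;   // illustrative radius
            if (mag > deadzone)
            {
                // Ramp from 0 at the deadzone edge to 1 at full deflection.
                float scale = (mag - deadzone) / (1.0f - deadzone) / mag;
                result.x = raw.x * scale;
                result.y = raw.y * scale;
            }
        }
        return result;
    }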

API Changes for 1.7.0


There are no breaking changes to version 1.7.0.

Known Issues
The following are known issues:

• When Touch controllers wake up from idle, they might move erratically for up to 10 seconds before tracking
is reestablished.
• If you bypass the shim and communicate with the DLL directly, without specifying a version to ovr_Initialize,
the DLL has no way of knowing the SDK version with which the application was built. This can result in
unpredictable or erratic behavior which might cause the application to crash.
• There are some USB chipsets that do not meet the USB 3.0 specification and are incompatible with the
Oculus Rift sensor. If you receive a notification in Oculus Home or the Oculus App, plug the sensor into a
different USB 3.0 port (blue). If none of the USB 3.0 ports work, plug the sensor into a USB 2.0 port (black).
• Antivirus software, such as McAfee, can cause installation issues. To work around the issue, make sure you
have the latest updates and disable real-time scanning.
• If you encounter installation issues, delete the Oculus folder and install the software again.
• If the Rift displays a message that instructs you to take off the headset, remove it and place it on a flat
surface for 10-15 seconds.
• The keyboard and mouse do not work in Oculus Home. To select an item, gaze at it and select it using the
Oculus Remote or Xbox controller.
• Bandwidth-intensive USB devices, such as web cams and high-end audio interfaces, might not work when
using the Rift. To work around this issue, install the device on another USB host controller or a separate
computer.
• For dual-boot systems using DK2 or CB1 HMDs, the OS selection screen might appear on the HMD instead
of the monitor. To work around this, try plugging the HMD into a different port or unplug the HMD while
booting.
• If you are running your application from the Unity Editor and you press the controller's home button to
return to Oculus Home, you will be prompted to close the application. If you select OK, Unity might remain
in a state where it is running, but will never get focus. To work around this, restart Unity.

Migrating from SDK 1.6.x to SDK 1.7.0


There are no breaking SDK changes or migration requirements other than installing the new SDK.

Changes For Release 1.6.x


This section describes changes to the Oculus SDK, the Oculus App, Oculus Home, and the runtime.

Overview of Release 1.6.0


This release supports a new Touch controller button and introduces buffered haptics.

New Features for 1.6.0


• The latest version of the left Touch controller now has an Enter button. For more information, see Button
State on page 81.
• The SDK now supports buffered haptics for detailed effects (see the sketch after this list). For more information, see Haptic Feedback on page 85.
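
A minimal sketch of a buffered effect follows, assuming an initialized session; the 40-sample decaying ramp is an illustrative waveform. Samples are 8-bit amplitudes consumed at the controller's fixed haptics rate (nominally 320 Hz for Touch).

    #include <OVR_CAPI.h>
    #include <cstdint>

    // Sketch only: enqueue a short, decaying haptic click on the right Touch
    // controller using the buffered API.
    void PlayHapticClick(ovrSession session)
    {
        uint8_t samples[40];                 // ~125 ms at 320 Hz
        for (int i = 0; i < 40; ++i)
            samples[i] = static_cast<uint8_t>(255 - i * 6);   // quick decay

        ovrHapticsBuffer buffer;
        buffer.Samples      = samples;
        buffer.SamplesCount = 40;
        buffer.SubmitMode   = ovrHapticsBufferSubmit_Enqueue;
        ovr_SubmitControllerVibration(session, ovrControllerType_RTouch, &buffer);
    }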

API Changes for 1.6.0


There are no breaking changes to version 1.6.0.

Known Issues
The following are known issues:

• There are some USB chipsets that do not meet the USB 3.0 specification and are incompatible with the
Oculus Rift sensor. If you receive a notification in Oculus Home or the Oculus App, plug the sensor into a
different USB 3.0 port (blue). If none of the USB 3.0 ports work, plug the sensor into a USB 2.0 port (black).
• Antivirus software, such as McAfee, can cause installation issues. To work around the issue, make sure you
have the latest updates and disable real-time scanning.
• If you encounter installation issues, delete the Oculus folder and install the software again.
• If the Rift displays a message that instructs you to take off the headset, remove it and place it on a flat
surface for 10-15 seconds.
• The keyboard and mouse do not work in Oculus Home. To select an item, gaze at it and select it using the
Oculus Remote or Xbox controller.
• Bandwidth-intensive USB devices, such as web cams and high-end audio interfaces, might not work when
using the Rift. To work around this issue, install the device on another USB host controller or a separate
computer.
• For dual-boot systems using DK2 or CB1 HMDs, the OS selection screen might appear on the HMD instead
of the monitor. To work around this, try plugging the HMD into a different port or unplug the HMD while
booting.
• If you are running your application from the Unity Editor and you press the controller's home button to
return to Oculus Home, you will be prompted to close the application. If you select OK, Unity might remain
in a state where it is running, but will never get focus. To work around this, restart Unity.

Changes For Release 1.5.x


This section describes changes to the Oculus SDK, the Oculus App, Oculus Home, and the runtime.

Overview of Release 1.5.0


The 1.5.0 release focuses on performance improvements and minor bug fixes.

New Features for 1.5.0


• The SDK now supports protected content, which is designed to prevent any mirroring of the compositor. For
more information, see Protecting Content on page 69.

• The Touch controller pairing process has improved. For more information, see Pairing the Oculus Touch
Controllers on page 101.

API Changes for 1.5.0


There are no breaking changes to version 1.5.0.

Known Issues
The following are known issues:

• Version 16.5.2 of the AMD driver can cause flickering on your computer screen. If you encounter this issue,
use the 16.5.1 driver.
• There are some USB chipsets that do not meet the USB 3.0 specification and are incompatible with the
Oculus Rift sensor. If you receive a notification in Oculus Home or the Oculus App, plug the sensor into a
different USB 3.0 port (blue). If none of the USB 3.0 ports work, plug the sensor into a USB 2.0 port (black).
• Antivirus software, such as McAfee, can cause installation issues. To work around the issue, make sure you
have the latest updates and disable real-time scanning.
• If you encounter installation issues, delete the Oculus folder and install the software again.
• If the Rift displays a message that instructs you to take off the headset, remove it and place it on a flat
surface for 10-15 seconds.
• The keyboard and mouse do not work in Oculus Home. To select an item, gaze at it and select it using the
Oculus Remote or Xbox controller.
• Bandwidth-intensive USB devices, such as web cams and high-end audio interfaces, might not work when
using the Rift. To work around this issue, install the device on another USB host controller or a separate
computer.
• For dual-boot systems using DK2 or CB1 HMDs, the OS selection screen might appear on the HMD instead
of the monitor. To work around this, try plugging the HMD into a different port or unplug the HMD while
booting.
• If you are running your application from the Unity Editor and you press the controller's home button to
return to Oculus Home, you will be prompted to close the application. If you select OK, Unity might remain
in a state where it is running, but will never get focus. To work around this, restart Unity.

Migrating from SDK 1.4.x to SDK 1.5.0


There are no breaking SDK changes or migration requirements other than installing the new SDK.

Changes For Release 1.4.x


This section describes changes to the Oculus SDK, the Oculus App, Oculus Home, and the runtime.

Overview of Release 1.4.0


The 1.4.0 release focuses on performance improvements and minor bug fixes.

New Features for 1.4.0


• The SDK now supports protected content, which is designed to prevent any mirroring of the compositor. For
more information, see Protecting Content on page 69.
• The Touch controller pairing process has improved. For more information, see Pairing the Oculus Touch
Controllers on page 101.

API Changes for 1.4.0


There are no breaking changes to version 1.4.0.

Known Issues
The following are known issues:

• Version 16.5.2 of the AMD driver can cause flickering on your computer screen. If you encounter this issue,
use the 16.5.1 driver.
• There are some USB chipsets that do not meet the USB 3.0 specification and are incompatible with the
Oculus Rift sensor. If you receive a notification in Oculus Home or the Oculus App, plug the sensor into a
different USB 3.0 port (blue). If none of the USB 3.0 ports work, plug the sensor into a USB 2.0 port (black).
• Antivirus software, such as McAfee, can cause installation issues. To work around the issue, make sure you
have the latest updates and disable real-time scanning.
• If you encounter installation issues, delete the Oculus folder and install the software again.
• If the Rift displays a message that instructs you to take off the headset, remove it and place it on a flat
surface for 10-15 seconds.
• The keyboard and mouse do not work in Oculus Home. To select an item, gaze at it and select it using the
Oculus Remote or Xbox controller.
• Bandwidth-intensive USB devices, such as web cams and high-end audio interfaces, might not work when
using the Rift. To work around this issue, install the device on another USB host controller or a separate
computer.
• For dual-boot systems using DK2 or CB1 HMDs, the OS selection screen might appear on the HMD instead
of the monitor. To work around this, try plugging the HMD into a different port or unplug the HMD while
booting.
• If you are running your application from the Unity Editor and you press the controller's home button to
return to Oculus Home, you will be prompted to close the application. If you select OK, Unity might remain
in a state where it is running, but will never get focus. To work around this, restart Unity.

Migrating from SDK 1.3.x to SDK 1.4.0


There are no breaking SDK changes or migration requirements other than installing the new SDK.

Changes For Release 1.3.x


This section describes changes to the Oculus SDK, the Oculus App, Oculus Home, and the runtime.

Overview of Release 1.3.2


The 1.3.2 release focuses on performance improvements and minor bug fixes.

Overview of Release 1.3.0


Release 1.3.0 is the first public release since SDK Version 0.8 and represents significant changes.

The Oculus SDK now provides Asynchronous TimeWarp (ATW). With ATW, TimeWarp is automatically applied
to the last rendered frame by the Oculus Compositor at the same time as distortion. For more information, see
Asynchronous TimeWarp on page 62.

Oculus now provides guidance and APIs for VR focus management, which helps you smoothly transition users
between your game or experience and Oculus Home. For more information, see VR Focus Management on
page 69.

New Features for 1.3.2


The following are new features for the Oculus SDK and runtime:
• Added support for installing Oculus and VR applications to different drives.
• Added support for multiple payment options.

New Features for 1.3.0


The following are new features for the Oculus SDK and runtime:

• Added Asynchronous TimeWarp (ATW). For more information, see Asynchronous TimeWarp on page 62.
• Replaced the legacy runtime download with Oculus Setup. To download Oculus Setup, go to https://www.oculus.com/setup/.
• Added features for VR focus management, which helps you smoothly transition users between your game or
experience and Oculus Home. For more information, see VR Focus Management on page 69.
• Updated queue ahead to be adaptive. Queue ahead previously processed frames 2.8 milliseconds in
advance to improve CPU and GPU parallelism. Adaptive queue ahead works similarly, but automatically
adjusts the start time from 0 to -1 frame (depending on the current performance).
• Added the performance indicator, which displays when the application is slow or not maintaining frame rate. For more information, see Performance Indicator on page 100.
• Added the Oculus Compositor performance HUD (ovrPerfHud_CompRenderTiming) and renamed the application performance HUD from ovrPerfHud_RenderTiming to ovrPerfHud_AppRenderTiming.
• Added support for DirectX 12 (DX12). For more information, refer to the Oculus Room Tiny (DX12) sample.

Runtime Changes for 1.3.0


Changes include:

• Added Oculus Setup, which installs and configures the Oculus Rift, installs the Oculus App, and installs
Oculus Home.
• Added the Oculus App, which replaces the Oculus Configuration Utility. To open the Oculus App, double-
click the Oculus desktop icon.
• Added Oculus Home, the VR-based application for launching games and experiences. If the Oculus App is
open, Oculus Home automatically runs whenever you put on the headset.
• Added the universal menu to perform many common tasks, such as recentering and lens adjustment. To open the universal menu, press the Oculus button on the Oculus remote or the Xbox button on the Xbox Controller.
• Account, device, and privacy management tasks are now performed through the Oculus App. To make
changes, click the menu icon (three dots) in the upper right corner of the Oculus App and select Settings.
• Previously, when you locked your computer, any VR content continued to display in the headset. Now, when
you lock the computer, the headset displays a blank screen.

API Updates for 1.3.0


• Added ovrTrackerFlags, which returns whether the sensor is present and is in a valid position.
• Added ovrSessionStatus flags for focus management (see the sketch after this list). For more information, see VR Focus Management on page 69.
• Added ovrTrackerPose to get the position of a specific sensor.
• Added bit masks that provide button touch states (ovrTouch_RButtonMask and ovrTouch_LButtonMask),
button press states (ovrButton_RMask and ovrButton_LMask), and finger poses (ovrTouch_RPoseMask and
ovrTouch_LPoseMask) for the Oculus Touch controllers.

• Added ovrTrackingState::CalibratedOrigin, which is the initial origin configured by the user when they set up the headset. Every time a user recenters the headset, the updated location is relative to this value.
• Added the ovr_ClearShouldRecenterFlag function for applications that want to manually calculate a
recentered tracking pose instead of using the provided SDK function ovr_RecenterTrackingOrigin.
• Added the utility function ovrPosef_FlipHandedness to help applications easily flip any ovrPosef from
a left to right-handed coordinate system.
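
A minimal sketch of per-frame focus handling built on these additions follows; error handling and application-specific teardown are omitted, and the function name is illustrative.

    #include <OVR_CAPI.h>

    // Sketch only: poll session status once per frame and react to the
    // focus-management flags.
    bool UpdateSessionState(ovrSession session)
    {
        ovrSessionStatus status;
        ovr_GetSessionStatus(session, &status);

        if (status.ShouldQuit)
            return false;                         // Oculus Home asked us to exit

        if (status.ShouldRecenter)
            ovr_RecenterTrackingOrigin(session);  // or ovr_ClearShouldRecenterFlag
                                                  // when recentering manually

        if (!status.IsVisible)
        {
            // Another app has VR focus: skip rendering but keep polling.
        }
        return true;
    }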

API Changes for 1.3.0


This release represents a major revision of the API. Changes to the API include:

• The previous API returned a mutable struct which had to be modified in a specific way and passed back to the API. The Oculus SDK now returns an opaque handle that represents the texture swap chain.
• Removed the deprecated ovrHmdStruct synonym for ovrHmd.
• Removed the ovrStatusBits::ovrStatus_CameraPoseTracked and
ovrStatusBits::ovrStatus_PositionConnected flags.
• Moved CameraFrustumHFovInRadians, CameraFrustumVFovInRadians,
CameraFrustumNearZInMeters, and CameraFrustumFarZInMeters from ovrHmdDesc
to ovrTrackerDesc. Renamed them FrustumHFovInRadians, FrustumVFovInRadians,
FrustumNearZInMeters, and FrustumFarZInMeters.
• Removed CameraPose and LeveledCameraPose from ovrTrackingState and added them to
ovrTrackerPose (as Pose and LeveledPose).
• Renamed HmdToEyeViewOffset to HmdToEyeOffset.
• Renamed ovrSwapTextureSet to ovrTextureSwapChain.
• Removed the deprecated functions ovr_ResetBackOfHeadTracking and
ovr_ResetMulticameraTracking.
• Removed ovr_SetEnabledCaps.
• Removed ovr_ConfigureTracking.
• Removed the ovr_GetEnabledCaps function.
• Removed the ovr_GetTrackingCaps function.
• Removed the ovrLayerDirect layer type.
• Renamed the ovr_RecenterPose function to ovr_RecenterTrackingOrigin.
• Changed ovrMaxLayerCount from 32 to 16.
• Moved the bindFlags parameter in function ovr_CreateTextureSwapChainDX to be part of the
ovrTextureSwapChainDesc structure.
• Added the output parameter outSensorSampleTime to the utility function ovr_GetEyePoses.
• Modified ovrMatrix4f_Projection to be right-handed by default and changed ovrProjection_RightHanded to ovrProjection_LeftHanded.
• Added the ovrPosef_FlipHandedness utility function.
• Renamed ovrControllerType_SID to ovrControllerType_Remote.
• Renamed ovrMirrorTextureDesc::Flags to ovrMirrorTextureDesc::MiscFlags and
ovrTextureSwapChainDesc::Flags to ovrTextureSwapChainDesc::MiscFlags.
• Removed the OVR_Kernel.h header file.

Fixed Issues for 1.3.2


The following are fixed issues:

• Oculus App

  • Purchase totals now display with the correct local currency code.
  • The back button now retains scroll position.
  • Rift options are responsive if no internet connection is available.
  • The user profile now displays if no internet connection is available.
  • Fixed timing and caching issues related to displaying the status of friend requests.

• Oculus Home

  • Recently played apps only appear if the app is launchable.
  • Better notification for USB chipsets that do not meet the USB 3.0 specification.
  • Better processing and management of paid purchases.
  • Purchase totals now display with the correct local currency code.
  • Additional UI fixes and updates.

• Platform SDK

  • Improved UE4 linking.
  • Fixed message parsing resulting from ovr_Matchmaking_JoinRoom.

Known Issues
The following are known issues:

• There are some USB chipsets that do not meet the USB 3.0 specification and are incompatible with the
Oculus Rift sensor. If you receive a notification in Oculus Home or the Oculus App, plug the sensor into a
different USB 3.0 port (blue). If none of the USB 3.0 ports work, plug the sensor into a USB 2.0 port (black).
• Antivirus software, such as McAfee, can cause installation issues. To work around the issue, make sure you
have the latest updates and disable real-time scanning.
• If you encounter installation issues, delete the Oculus folder and install the software again.
• If the Rift displays a message that instructs you to take off the headset, remove it and place it on a flat
surface for 10-15 seconds.
• The keyboard and mouse do not work in Oculus Home. To select an item, gaze at it and select it using the
Oculus Remote or Xbox controller.
• Bandwidth-intensive USB devices, such as web cams and high-end audio interfaces, might not work when
using the Rift. To work around this issue, install the device on another USB host controller or a separate
computer.
• For dual-boot systems using DK2 or CB1 HMDs, the OS selection screen might appear on the HMD instead
of the monitor. To work around this, try plugging the HMD into a different port or unplug the HMD while
booting.
• If you are running your application from the Unity Editor and you press the controller's home button to
return to Oculus Home, you will be prompted to close the application. If you select OK, Unity might remain
in a state where it is running, but will never get focus. To work around this, restart Unity.

Migrating from SDK 1.3.0 to SDK 1.3.2


There are no breaking SDK changes or migration requirements other than installing the new SDK.

Migrating from SDK 1.2 to SDK 1.3.x


There are no breaking SDK changes or migration requirements other than installing the new SDK.

Migrating from SDK 0.8 to SDK 1.3.x


One of the most significant changes is that the SwapTextureSet internals are no longer exposed.

The previous API returned a mutable struct which had to be modified in a specific way and passed back to the API. The new API returns an opaque handle that represents the texture swap chain.

To modify your application loop (a consolidated sketch follows):

1. Replace calls that use the ovrSwapTextureSet structure to determine the count and textures with
ovr_GetTextureSwapChainLength and ovr_GetTextureSwapChainBufferDX/GL.
2. Instead of indexing into the texture (and/or render target) arrays with the CurrentIndex field, use
ovr_GetTextureSwapChainCurrentIndex to obtain the index.
3. Instead of manually modifying the index, use ovr_CommitTextureSwapChain to advance the state of the
chain. Note that this should occur after rendering into the texture and before referencing the texture swap
chain in a SubmitFrame call.
4. Instead of using ovrTexture and pointer typecasting to obtain the specific texture object, use
ovr_GetMirrorTextureBufferDX/GL.

For more information, see Rendering to the Oculus Rift on page 48.
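
The following consolidates steps 1 through 3 into a single D3D11 sketch; chain creation, render-target-view setup, and the ovr_SubmitFrame call are elided, and the function name is illustrative.

    #include <d3d11.h>
    #include <OVR_CAPI_D3D.h>

    // Sketch only: render into the current buffer of an existing
    // ovrTextureSwapChain, then commit it before SubmitFrame references it.
    // (At creation time, ovr_GetTextureSwapChainLength tells you how many
    // buffers to pre-create render target views for.)
    void RenderToSwapChain(ovrSession session, ovrTextureSwapChain chain)
    {
        int index = 0;
        ovr_GetTextureSwapChainCurrentIndex(session, chain, &index);

        ID3D11Texture2D* texture = nullptr;
        ovr_GetTextureSwapChainBufferDX(session, chain, index,
                                        IID_PPV_ARGS(&texture));

        // ... bind a render target view of 'texture' and draw the eye scene ...
        texture->Release();

        // Advance the chain after rendering, before the SubmitFrame call.
        ovr_CommitTextureSwapChain(session, chain);
    }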

You will also need to update your application to support VR focus management. For more information, see VR
Focus Management on page 69.

Make the following additional changes:

1. Update your code to use ovr_GetSessionStatus::DisplayLost instead of ovr_GetSessionStatus::HmdPresent. For more information, see VR Focus Management on page 69.
2. Update your code to call ovr_RecenterTrackingOrigin when
ovr_GetSessionStatus::ShouldRecenter is true.
3. When calling ovr_CreateTextureSwapChainDX, update your code to start passing BindFlags as part
of the ovrTextureSwapChainDesc structure.
4. When calling ovr_GetEyePoses, retrieve the outSensorSampleTime provided by the function instead of
manually querying it with ovr_GetTimeInSeconds.
5. When calling ovrMatrix4f_Projection, the default is now right-handed. If needed, flip the handedness
flag to ovrProjection_LeftHanded.

There are also several smaller changes:

1. Replace references to HmdToEyeViewOffset with HmdToEyeOffset.
2. Replace ovrTrackingState::CameraPose and ovrTrackingState::LeveledCameraPose with ovrTrackerPose::Pose and ovrTrackerPose::LeveledPose.
3. Replace ovrSessionStatus::HasVrFocus with ovrSessionStatus::IsVisible.
4. Remove calls to ovr_GetTrackingCaps, as it is no longer supported.
5. Remove calls to ovr_GetEnabledCaps, as it is no longer supported.
6. Remove usage of ovrLayerType_Direct, as it is no longer supported.
7. Rename usage of ovrTextureFlag_Typeless to ovrTextureMisc_DX_Typeless.
8. Rename calls to ovr_RecenterPose to ovr_RecenterTrackingOrigin.
9. Rename usage of ovrControllerType_SID to ovrControllerType_Remote.
10. Rename usage of ovrMirrorTextureDesc::Flags to ovrMirrorTextureDesc::MiscFlags and ovrTextureSwapChainDesc::Flags to ovrTextureSwapChainDesc::MiscFlags.
11. Use ovrMaxLayerCount as the max number of layers. ovrMaxLayerCount was 32 in previous SDK versions but has been reduced to 16.

Changes For Release 0.8.0


The Oculus SDK 0.8 release changes the SDK from an HMD-based model to a session-based model and adds
several new features.

New Features
The following are new features for the Oculus SDK and runtime:

• Improved support for Windows 10.


• Added ovr_GetSessionStatus, which returns whether the headset is present and whether it has VR
focus and can render to the headset.
• Added ovr_Detect to OVR_CAPI_Util.h, which enables you to detect the presence of a headset without initializing LibOVR. This can be useful when a game has VR and non-VR modes (see the sketch after this list).
• Added HandStatusFlags to ovrTrackingState, which specifies whether the Oculus Touch controllers are being tracked. Status includes orientation and position.
• Added SensorSampleTime to ovrLayerEyeFov, which specifies when the render pose was calculated.
This is useful for measuring application tracking latency.
• Added ovr_GetTrackingCaps to get the tracking capabilities of the device.
• Usage of ovr_ConfigureTracking is no longer needed unless you want to disable tracking features. By
default, ovr_Create enables the full tracking capabilities supported by any given device.
• Added ovrLayerHudMode, which enables the headset user to see information about a layer.
• Added ovrControllerType_None and ovrControllerType_XBox to ovrControllerType.
• The Oculus Debug Tool was added to simplify troubleshooting. For more information see Oculus Debug
Tool on page 89.
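
A minimal sketch of the VR/non-VR startup decision that ovr_Detect enables follows; the zero timeout is illustrative and means "do not wait", and the header path may vary by SDK layout.

    #include <Extras/OVR_CAPI_Util.h>

    // Sketch only: choose a startup path without initializing LibOVR.
    bool ShouldStartInVR()
    {
        ovrDetectResult result = ovr_Detect(0 /* timeoutMilliseconds */);
        return result.IsOculusServiceRunning && result.IsOculusHMDConnected;
    }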

Runtime Changes
Changes include:

• If you have an NVIDIA GPU, make sure to upgrade to the 358.70 driver or later. To get the driver, go to
https://developer.nvidia.com/gameworks-vr-driver-support.
• If you have an AMD GPU, we recommend the Catalyst 15.10 Beta or later. To get the driver, go to http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx.

API Changes
This release represents a major revision of the API. Changes to the API include:

• Applications no longer need to call ovr_ConfigureTracking. ovr_Create automatically enables the full
tracking capabilities supported by any given device.
• Replaced ovr_GetFrameTiming with ovr_GetPredictedDisplayTime.
• Added a latencyMarker parameter to ovr_GetTrackingState. When set to ovrTrue, it indicates that the returned tracking state will be used in the rendering loop, allowing the SDK to calculate latency.
• To emphasize the session model, renamed ovrHmd to ovrSession and hmd to session.
• ovrLayerType_QuadInWorld and ovrLayerType_QuadHeadLocked were renamed to
ovrLayerType_Quad and are now differentiated by the ovrLayerFlag_HeadLocked flag.
• Added ovrMaxLayerCount, which sets the maximum number of layers to 32.
• Removed ovrInit_ServerOptional. If you use this to detect whether the OVRService is available,
periodically call ovr_Initialize or poll ovr_Detect instead.
• Removed ovrTrackingCap_Idle from ovrTrackingCaps.

Known Issues
The following are known issues:
• The Oculus service might crash when gathering diagnostic logs from the Oculus Config Util. If this happens,
the service will automatically restart and the logs will be retained.
• The Oculus service and Config Util might hang when running the demo scene and another VR app at the
same time"

Migrating from SDK 0.7.x to SDK 0.8


To migrate:

1. Update calls to ovr_GetFrameTiming with ovr_GetPredictedDisplayTime using the new syntax.
2. Update instances of ovrHmd hmd to ovrSession session.
3. Remove any code that uses ovrInit_ServerOptional. If you use this to detect whether the OVRService is available, periodically call ovr_Initialize or poll ovr_Detect instead.
4. Remove calls to ovr_ConfigureTracking, as the SDK enables all existing tracking features by default.
5. For ovrLayerType_EyeFov layers, fill in the SensorSampleTime with a timestamp captured when ovr_GetTrackingState is called in the same frame, just before generating the view matrices used to render the two eye textures (a sketch follows this list).
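
A sketch of step 5 follows, assuming a typical frame loop; pose computation and rendering are elided, and the function name is illustrative.

    #include <OVR_CAPI.h>

    // Sketch only: capture the sample time next to the tracking query and
    // store it in the layer so the SDK can measure tracking latency.
    void FillSensorSampleTime(ovrSession session, long long frameIndex,
                              ovrLayerEyeFov& layer)
    {
        double displayTime      = ovr_GetPredictedDisplayTime(session, frameIndex);
        double sensorSampleTime = ovr_GetTimeInSeconds();
        ovrTrackingState ts     = ovr_GetTrackingState(session, displayTime, ovrTrue);

        // ... compute eye poses from ts.HeadPose and render both eye textures ...
        (void)ts;
        layer.SensorSampleTime = sensorSampleTime;
    }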

Changes For Release 0.7.0


A number of changes were made to the API since the 0.6 release.

Overview of Oculus SDK 0.7


The most significant change is the addition of the Direct Driver Mode and the removal of Extended Mode.

Direct Driver Mode uses features of NVIDIA Gameworks VR or AMD LiquidVR to render directly to the HMD. If
the installed GPU does not support NVIDIA Gameworks VR or AMD LiquidVR, it uses Oculus Direct Mode.

The removal of the legacy Extended Mode means that users can no longer manage the Oculus Rift as an
extended monitor, which will affect some games. Additionally, Standalone Mode (which uses the Oculus Rift as
the only display device) is no longer supported.

Runtime Changes
This release represents significant changes to the runtime. Changes include:

• The runtime now supports Direct Driver Mode. Direct Driver Mode requires the latest GPU drivers:
• NVIDIA Driver Version 355.83 or later
• AMD Catalyst Display Driver Version 15.200.1062.1005 or later
• The SDK now uses sRGB-aware rendering instead of making assumptions. Unless you update your code,
rendered scenes will appear brighter than before. For more information, see Migrating to SDK 1.11 on page
8.
• Applications built against SDKs prior to 0.6 will not work with the 0.7 runtime. Developers should recompile
their applications using the 0.7 SDK.
• Preliminary support for Windows 10, which requires Direct Driver Mode. If you are using Windows 10, make
sure to get the recommended drivers.
• Extended Mode is no longer supported. This means that users can no longer manage the Oculus Rift as an
extended monitor, which will affect some games built against SDKs prior to 0.6.
• Standalone Mode (which uses the Oculus Rift as the only display device) is no longer supported.

• The runtime no longer supports the 32-bit versions of Windows. Although you will need to use a 64-bit
version to operate the runtime, 32-bit applications will still work properly.

API Changes
This release represents a major revision of the API. Changes to the API include:

• The SDK now uses sRGB-aware rendering instead of making assumptions. Unless you update your code,
rendered scenes will appear brighter than before. For more information, see Migrating to SDK 1.11 on page
8.
• Converted ovrHmd_XXX function names to ovr_XXX. This is for improved internal consistency and
consistency with mobile.
• Changed ovrHmd from a struct pointer to an opaque pointer and left ovrHmdDesc as a separate struct.
• Removed ovrHmd_ResetFrameTiming from the public interface, as it is no longer relevant since SDK
v0.6.0 and does not appear to be in use.
• Removed ovrHmdDesc::EyeRenderOrder as it is no longer relevant.
• Changed ovrHmdDesc::ProductName and ovrHmdDesc::Manufacturer from pointers to arrays.
• Renamed ovrHmdDesc::HmdCaps to ovrHmdDesc::AvailableHmdCaps to provide available
capabilities and added DefaultHmdCaps to provide the default capabilities.
• Added ovrHmdDesc::DefaultHmdCaps to convey the default caps to the user. This enables applications
to support future HMDs correctly by default and allows applications to OR in caps as needed.
• Renamed ovrHmdDesc::TrackingCaps to ovrHmdDesc::AvailableTrackingCaps to provide
available tracking capabilities and added DefaultTrackingCaps to provide default tracking capabilities.
• Added ovrHmdDesc::DefaultTrackingCaps to convey the default caps to the user. This enables
applications to support future HMDs correctly by default and allows applications to OR in caps as needed.
• Added ovrHmdDesc::DisplayRefreshRate, which represents the nominal refresh rate of the newly
created HMD.
• Removed the index parameter of ovrHmd_Create (ovr_Create) as we currently support a single HMD.
• Added an output LUID parameter to ovrHmd_Create (ovr_Create), which is filled in alongside the ovrResult return value.
• Added the ovrError_DisplayLost (6000) error return value to ovr_SubmitFrame.
• Removed ovrRenderAPIType::ovrRenderAPI_D3D9_Obsolete and
ovrRenderAPIType::ovrRenderAPI_D3D10_Obsolete.
• Removed ovrHmdCaps::ovrHmdCap_LowPersistence and made it enabled by default. This fixes a bug
in which an application that didn't call ovrHmd_SetEnabledCaps inherited the settings of the previous
application instead of the default settings.
• Removed ovrHmdCaps::ovrHmdCap_DynamicPrediction and made it enabled by default on HMDs
that support it (DK2 and later).
• Removed ovrInitFlags::ovrInit_ForceNoDebug.
• Made ovrLogCallback take a userData parameter, so application-specific context can be conveyed by
the SDK user.
• Renamed ovrHmd_ResetOnlyBackOfHeadTrackingForConnectConf to
ovr_ResetBackOfHeadTracking.
• Added ovr_ResetMulticameraTracking to reset the location of the headset.
• Removed ovr_WaitTillTime, as it has been deprecated for a while and implements an undesirable spin
wait.
• ovrHmd_Detect was removed. You can now use ovr_GetHmdDesc(nullptr) instead.
• ovrHmd_CreateDebug was removed. To enable a virtual HMD when a physical one isn't present, use
RiftConfigUtil utility.
• ovr_CreateSwapTextureSetD3D11 now takes an additional flags parameter.

Known Issues
The following are known issues:
• Previously, a user could unplug the Oculus Rift and plug it back in without affecting the running app. With
0.7, the app won't continue to work unless the app recreates the shared textures.
• OpenGL-based applications can judder if the application is configured to sync to the monitor's refresh rate
instead of the refresh rate of the headset. To work around this, set the NVIDIA Vertical Sync control panel
option to Use the 3D Application Setting or set the AMD control panel Wait for Vertical Refresh option to
disabled.
• If you receive the error message “HMD powered off, check HDMI connection” in the Oculus Configuration
Utility with your headset and sensor correctly plugged in, make sure to update all of your system drivers
(graphics, USB, and so on).
• The mirror window might be blank and the headset might not work after installing the 0.7 runtime and the
NVIDIA 355.83 driver (or later). To fix this issue, restart your computer with the headset plugged in. You
should only have to do this once.

Migration: General Changes

1. Change ovrHmd_XXX function references to ovr_XXX.


2. ovr_Create now returns an out LUID. Use the LUID to select the IDXGIAdapter on which the ID3D11Device
is created (a minimal sketch follows this list).
3. Update ovr_Create (ovrHmd_Create) to no longer pass the index parameter as the SDK supports a single
HMD.
4. Update your code to handle the ovrError_DisplayLost (6000) error from ovr_SubmitFrame.
5. If needed, update your code to get the HMD specifications from ovrHmdDesc. The ovrSession struct pointer
was changed to an opaque pointer; ovrHmdDesc is now a separate struct.
6. To get information about the available capabilities of the HMD, read ovrHmdDesc::AvailableHmdCaps
instead of ovrHmdDesc::HmdCaps. Applications do not need to call ovr_SetEnabledCaps; the default caps
reported by ovrHmdDesc::DefaultHmdCaps work fine.
7. To get information about the tracking capabilities of the HMD, read ovrHmdDesc::AvailableTrackingCaps
instead of ovrHmdDesc::TrackingCaps. To get the default tracking capabilities, read
ovrHmdDesc::DefaultTrackingCaps.
8. Do not enable ovrHmdCaps::ovrHmdCap_LowPersistence and
ovrHmdCaps::ovrHmdCap_DynamicPrediction as they are now internal and enabled by default.
9. Remove any calls to ovr_WaitTillTime.
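
The following is a minimal sketch of step 2, using standard DXGI enumeration; the variable names and error handling are illustrative only, not part of the SDK:

// Sketch: create the D3D11 device on the adapter whose LUID matches the one
// returned by ovr_Create, so rendering happens on the GPU driving the HMD.
#include <dxgi.h>

ovrSession session;
ovrGraphicsLuid luid;
if (OVR_FAILURE(ovr_Create(&session, &luid)))
    return;

IDXGIFactory1* factory = nullptr;
CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory);

IDXGIAdapter1* adapter = nullptr;
for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
{
    DXGI_ADAPTER_DESC1 adapterDesc;
    adapter->GetDesc1(&adapterDesc);
    if (memcmp(&adapterDesc.AdapterLuid, &luid, sizeof(LUID)) == 0)
        break; // this is the adapter the compositor is using
    adapter->Release();
    adapter = nullptr;
}
// Pass 'adapter' to D3D11CreateDevice with D3D_DRIVER_TYPE_UNKNOWN.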

Migration: Compositor and sRGB/Gamma Correctness


Prior to Oculus SDK 0.7, the Oculus compositor treated all eye texture formats as sRGB color textures,
even when marked otherwise. As a result, when applications provided sRGB-correct textures (e.g.
DXGI_FORMAT_R8G8B8A8_UNORM_SRGB and GL_SRGB8_ALPHA8), the results would look wrong. The
compositor now requires all provided textures to be labelled with correct sRGB format. If an application uses an
eye texture format such as DXGI_FORMAT_R8G8B8A8_UNORM, this will cause the results in the HMD display
to look too bright even though the mirror texture visible on the desktop window might look normal.

There are a few ways to address this, but we will describe two of them. The first ensures that the application
and Oculus compositor correctly manage sRGB. The second is for existing applications that want to make the
fewest rendering changes.
Note: Oculus strongly recommends that you don't simply apply pow(2.2) in the shader that writes into
an 8-bit eye texture. While the results in the final HMD output might look right initially, there will be
significant luminance-banding issues that only show up under subtle visual situations.

The recommended method requires you to render in an sRGB-correct fashion, and then set the texture
format accordingly. For D3D11, most applications use DXGI_FORMAT_R8G8B8A8_UNORM_SRGB instead of
DXGI_FORMAT_R8G8B8A8_UNORM. For OpenGL, the correct format is GL_SRGB8_ALPHA8, but you have to
make sure that the application calls glEnable(GL_FRAMEBUFFER_SRGB).
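For example, a minimal sketch of the D3D11 path under the 0.7-era API (session, device, and size are assumed to exist) might look like this:

D3D11_TEXTURE2D_DESC dsDesc = {};
dsDesc.Width = size.w;
dsDesc.Height = size.h;
dsDesc.MipLevels = 1;
dsDesc.ArraySize = 1;
dsDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB; // sRGB-aware eye-buffer format
dsDesc.SampleDesc.Count = 1;
dsDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
ovrSwapTextureSet* textureSet = nullptr;
ovr_CreateSwapTextureSetD3D11(session, device, &dsDesc, 0, &textureSet);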
In some cases, making an existing application sRGB-correct might not be straightforward. There are some
implications, including sRGB-correct diffuse texture sampling, linear-space and energy-conserving shading,
and GPU-accelerated gamma read/write technology that are available in any modern GPU. We expect some
applications that are gamma-correct do not rely on GPU read/write conversions and have opted to handle the
conversions via shader math such as applying pow(2.2) and its inverse. This approach requires some finesse
which is explained in the second method.

Although not recommended, the least resistance method allows the application to keep its rendering as-is
while only modifying the calls to ovr_CreateSwapTextureSetD3D11() for D3D11 rendering or modifying the GL
context state for OpenGL rendering.

Since each render API requires different approaches, a detailed explanation is provided in the function
declarations of both ovr_CreateSwapTextureSetD3D11() and ovr_CreateSwapTextureSetGL() in
OVR_CAPI_D3D.h and OVR_CAPI_GL.h respectively. For this purpose and other potential uses, we
introduced a ovrSwapTextureSetD3D11_Typeless flag for D3D11 that allows the application to create DXGI-
specific typeless textures that can have a ShaderResourceView that does not have to be the same as the
RenderTargetView. Keep in mind that this also applies as a mirror texture creation flag. For OpenGL, the
application can simply drop the glEnable(GL_FRAMEBUFFER_SRGB) while still creating a swap texture set with
the format GL_SRGB8_ALPHA8.
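
As a minimal sketch of that OpenGL path (session and size are assumed):

// Keep the sRGB format so the compositor interprets the texture correctly...
ovrSwapTextureSet* textureSet = nullptr;
ovr_CreateSwapTextureSetGL(session, GL_SRGB8_ALPHA8, size.w, size.h, &textureSet);
// ...but intentionally do not call glEnable(GL_FRAMEBUFFER_SRGB), so a
// gamma-space renderer can write its values without GPU conversion.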

For more information, refer to the "Swap Texture Set Initialization" section and the code samples provided in
the SDK.

Changes For Release 0.6.0


A number of changes were made to the API since the 0.5 release.

Overview of Oculus SDK 0.6.0.1


This Oculus SDK 0.6.0.1 release introduces queue ahead. Queue ahead improves CPU and GPU parallelism and
increases the amount of time that the GPU has to process frames. For more information, see Adaptive Queue
Ahead on page 63.

Overview of Oculus SDK 0.6


The Oculus SDK 0.6 release introduces the compositor, a separate process for applying distortion and
displaying scenes and other major changes.

There are four major changes to Oculus SDK 0.6:


• The addition of the compositor service and texture sets.
• The addition of layer support.
• Removal of client-based rendering.
• Simplification of the API.

The compositor service moves distortion rendering from the application process to the OVRServer process
using texture sets that are shared between the two processes. A texture set is basically a swap chain, with
buffers rotated to allow game rendering to proceed while the current frame is distorted and displayed.

Layer support allows multiple independent application render targets to be independently sent to the HMD.
For example, you might render a heads-up display, background, and game space each in their own separate
render target. Each render target is a layer, and the layers are combined by the compositor (rather than the
application) right before distortion and display. Each layer may have a different size, resolution, and update
rate.
The API simplification is a move towards the final API, which primarily removes support for application-based
distortion rendering. For more information on each of these, see the Developer Guide for this SDK release. API
changes are discussed briefly below.

Note: Applications built with the 0.5 and 0.4 SDK are supported by the 0.6 runtime associated with this
SDK. However, these applications behave as they previously did and do not take advantage of the new
0.6 features.

New Features in 0.6.0.1


The following are major new features for the Oculus SDK and runtime:

• Added queue ahead. Queue ahead improves CPU and GPU parallelism and increases the amount of time
that the GPU has to process frames. For more information, see Adaptive Queue Ahead on page 63.
• Added the Debug HUD, which provides useful information while using the HMD. For more information, see
Performance Head-Up Display on page 92. To enable it in OculusWorldDemo, press F11.
• Added two samples:

• ORT (Direct Quad)—verifies and demonstrates direct quads.
• ORT (Performance HUD)—demonstrates the performance HUD.
• Added additional menu options to OculusWorldDemo.

New Features in 0.6


The following are major new features for the Oculus SDK and runtime:

• Added the compositor service, which improves compatibility and support for simultaneous applications.
• Added layer support, which increases flexibility and enables developers to tune settings based on the
characteristics and requirements of each layer.
• Significantly improved error handling and reporting.
• Added a suite of new sample projects which demonstrate techniques and the new SDK features.
• Removed application-side DirectX and OpenGL API shims, which results in improved runtime compatibility
and reliability.
• Simplified the API, as described below.
• Changed Extended mode to use the compositor process. Rendering setup is now identical for extended and
direct modes. The application no longer needs to know which mode is being used.
• Extended mode can now support mirroring, which was previously only supported by Direct mode.
• Simplified the timing interface and made it more robust by moving to a single function:
ovr_GetFrameTiming.
• Fixed a number of bugs and reliability problems.

The following are major new features for Unity:

• Disabled eye texture anti-aliasing when using deferred rendering. This fixes the blackscreen issue.
• Eliminated the need for the DirectToRift.exe in Unity 4.6.3p2 and later.
• Removed the hard dependency from the Oculus runtime. Apps now render in mono without tracking when
VR isn't present.

API Changes in 0.6


This release represents a major revision of the API. These changes significantly simplify the API while retaining
essential functionality. Changes to the API include:

• Removed support for application-based distortion rendering. Removed functions include
ovr_CreateDistortionMesh, ovr_GetRenderScaleAndOffset, and so on. If you feel that you require
application-based distortion rendering, please contact Oculus Developer Relations.
• Introduced ovrSwapTextureSets, which are textures shared between the OVRServer process and the
application process. Instead of using your own back buffers, applications must render VR scenes and layers
to ovrSwapTextureSet textures. Texture sets are created with ovr_CreateSwapTextureSetD3D11/OpenGL
and destroyed with ovr_DestroySwapTextureSet.
• ovr_BeginFrame was removed and ovr_EndFrame was replaced with ovr_SubmitFrame.
• Added a new layer API. A list of layer pointers is passed into ovr_SubmitFrame.
• Improved error reporting, including adding the ovrResult type. Some API functions were changed to return
ovrResult. ovr_GetLastError was replaced with ovr_GetLastErrorInfo.
• Removed ovr_InitializeRenderingShim, as it is no longer necessary with the service-based compositor.
• Removed some ovrHmdCaps flags, including ovrHmdCap_Present, ovrHmdCap_Available,
ovrHmdCap_Captured, ovrHmdCap_ExtendDesktop, ovrHmdCap_NoMirrorToWindow, and
ovrHmdCap_DisplayOff.
• Removed ovrDistortionCaps. Some of this functionality is present in ovrLayerFlags.
• ovrHmdDesc no longer contains display device information, as the service-based compositor now handles
the display device.
• Simplified ovrFrameTiming to only return the DisplayMidpointSeconds prediction timing value.
All other timing information is now available through the thread-safe ovr_GetFrameTiming. The
ovr_BeginFrameTiming and EndFrameTiming functions were removed.
• Removed the LatencyTest functions (e.g. ovr_GetLatencyTestResult).
• Removed the PerfLog functions (e.g. ovr_StartPerfLog), as these are effectively replaced by ovrLogCallback
(introduced in SDK 0.5).
• Removed the health-and-safety-warning related functions (e.g. ovr_GetHSWDisplayState). The HSW
functionality is now handled automatically.
• Removed support for automatic HMD mirroring. Applications can now create a mirror texture (e.g. with
ovr_CreateMirrorTextureD3D11 / ovr_DestroyMirrorTexture) and manually display it in a desktop window
instead. This gives developers flexibility to use the application window in a manner that best suits their
needs, and removes the OpenGL problem with previous SDKs in which the application back-buffer limited
the HMD render size.
• Added ovrInitParams::ConnectionTimeoutMS, which allows the specification of a timeout for ovr_Initialize to
successfully complete (a sketch follows this list).
• Removed ovr_GetHmdPosePerEye and added ovr_CalcEyePoses.
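
The following is a minimal sketch of the new timeout parameter; the one-second value is purely illustrative:

ovrInitParams params = {};
params.ConnectionTimeoutMS = 1000; // give up if the service cannot be
                                   // reached within one second
if (OVR_FAILURE(ovr_Initialize(&params)))
    return;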

Bugs Fixed
The following bugs were fixed since 0.5:

• HmdToEyeViewOffset provided the opposite of the expected result; it now properly returns a vector to each
eye's position from the center.
• If both the left and right views are rendered to the same texture, there is less "bleeding" between the two.
Apps still need to keep a buffer zone between the two regions to prevent texture filtering from picking
up data from the adjacent eye, but the buffer zone is much smaller than before. We recommend about 8
pixels, rather than the previously recommended 100 pixels. Because systems vary, feedback on this matter is
appreciated.
• Fixed a crash when switching between Direct and Extended Modes.
• Fixed performance and judder issues in Extended Mode.

Known Issues
The following are known issues:

• Pre-Kepler NVIDIA GPUs might return "No display attached to compositor" or "SubmitLayers failed" errors,
which can result in a black screen for some applications. NVIDIA GTX 600 GPUs and later use the Kepler or
Maxwell architectures.
• Some Intel GPUs might return "No display attached to compositor" or "SubmitLayers failed" errors, which
can result in a black screen for some applications.
• Standard RGB (sRGB) is not properly supported.
• Timeout Detection Recovery (TDR) is not properly supported.
• Windows 10 is not yet supported.
• Extended mode does not currently work with AMD GPUs due to issues in the AMD drivers.
• For NVIDIA GPUs, please use driver version 350.12. The latest NVIDIA driver is unstable with the runtime.
• Switching from Extended Mode to Direct Mode while running Oculus World Demo causes sideways
rendering.
• Judder with Oculus Room Tiny Open GL examples in Windows 7.
• The Oculus Configuration Utility can crash when the Demo Scene is repeatedly run.
• Application usage of CreateDXGIFactory can result in reduced performance; applications should use
CreateDXGIFactory1 instead. Support for CreateDXGIFactory is deprecated in this release and will be
removed in a future release.
• For Windows 7 in Extended Mode, any monitors connected to the computer go black when the headset is
on and return to normal operation when the headset is removed.
• For Windows 7 in Extended Mode, if the headset is placed above the monitor(s), all displays might go black.
The workaround is to place the headset to the right or left of the monitor(s).
• PC SDK applications will crash if the OVR service is not running.

Migration: Texture Sets and Layers


Prior to Oculus SDK 0.6, the Oculus SDK relied on the game engine to create system textures for eye
rendering. To use the SDK, developers stored the API-specific texture pointers into the ovrTexture structure
and passed them into ovr_EndFrame for distortion and display on the Rift. After EndFrame returned, a new
frame was rendered into the texture, repeating the process. Oculus SDK 0.6 changes this in two major ways.

The first is by introducing the concept of ovrSwapTextureSet, a collection of textures that are used in
round-robin fashion for rendering. A texture set is basically a swap chain for rendering to the Rift, with
buffers rotated to allow the game rendering to proceed while the current frame is distorted and displayed.
Unlike textures in earlier SDKs, ovrSwapTextureSet and its internal textures must be created by calling
ovr_CreateSwapTextureSetD3D11 or ovr_CreateSwapTextureSetGL. Implementing these functions in the SDK
allows us to support synchronization and properly share texture memory with the compositor process. For more
details on texture sets, we advise reading the “New Features” section on them.

The second is with the introduction of layers. Instead of a single pair of eye-buffers holding all the visual data in
the scene, the application can have multiple layers of different types overlaid on each other. Layers are a large
change to the API, and we advise reading the “New Features” section on them for more details. This part of
the guide gives only the bare minimum instructions to port an existing single-layer app to the new API.

With the introduction of texture sets and layers, you need to make several changes to how your application
handles eye buffer textures in the game engine.

Migration: Render Target Creation Code


Previously, the app would have used the API's standard texture creation calls to make render targets for the eye
buffers - either one render target for each eye, or a single shared render target with the eyes side-by-side on
it. Fundamentally the same process happens, but using the ovr_CreateSwapTextureSet function for your API
instead. So the code might have been similar to the following:

D3D11_TEXTURE2D_DESC dsDesc;
dsDesc.Width = size.w;
dsDesc.Height = size.h;
dsDesc.MipLevels = 1;
dsDesc.ArraySize = 1;
dsDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
dsDesc.SampleDesc.Count = 1;
dsDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
DIRECTX.Device->CreateTexture2D(&dsDesc, NULL, &(eye->Tex));
DIRECTX.Device->CreateShaderResourceView(eye->Tex, NULL, &(eye->TexSv));
DIRECTX.Device->CreateRenderTargetView(eye->Tex, NULL, &(eye->TexRtv));

Instead, the replacement code should be similar to the following:

D3D11_TEXTURE2D_DESC dsDesc;
dsDesc.Width = size.w;
dsDesc.Height = size.h;
dsDesc.MipLevels = 1;
dsDesc.ArraySize = 1;
dsDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
dsDesc.SampleDesc.Count = 1;
dsDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
ovr_CreateSwapTextureSetD3D11(session, DIRECTX.Device, &dsDesc, &(eyeBuf->TextureSet));
for (int i = 0; i < eyeBuf->TextureSet->TextureCount; ++i)
{
ovrD3D11Texture* tex = (ovrD3D11Texture*)&(eyeBuf->TextureSet->Textures[i]);
DIRECTX.Device->CreateRenderTargetView(tex->D3D11.pTexture, NULL, &(eyeBuf->TexRtv[i]));
}

Note:

The application must still create and track the RenderTargetViews on the textures inside the texture sets
- the SDK does not do this automatically (not all texture sets need to be render targets). The SDK does
create ShaderResourceViews for its own use.
Texture sets cannot be multisampled - this is an unfortunate restriction of the way the OS treats these
textures. If you wish to use MSAA eyebuffers, you must create the MSAA eyebuffers yourself as before,
then create matching non-MSAA texture sets, and have each frame resolve the MSAA eyebuffer target
into the respective texture set. See the OculusRoomTiny (MSAA) sample app for more information.

Before shutting down the HMD using ovr_Destroy() and ovr_Shutdown(), make sure to destroy the
texture sets using ovr_DestroySwapTextureSet.
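
A minimal teardown sketch (eyeBuf is the structure from the creation example above):

// Destroy swap texture sets before destroying the session.
ovr_DestroySwapTextureSet(session, eyeBuf->TextureSet);
ovr_Destroy(session);
ovr_Shutdown();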

Migration: Scene Rendering


Scene rendering would have previously just rendered to the eyebuffers created above. Now, a texture set is a
series of textures, effectively in a swap chain, so a little more work is required. Scene rendering now needs to:

• Increment the value of ovrSwapTextureSet::CurrentIndex, wrapping around to zero if it equals
ovrSwapTextureSet::TextureCount. This makes sure the application is rendering to a new texture, not one
that is currently being displayed.
• Select the right texture or RenderTargetView in the set with the new ovrSwapTextureSet::CurrentIndex.
• Bind that as a rendertarget and render the scene to it, just like existing code.

So previously, for each eye:

DIRECTX.SetAndClearRenderTarget(pEyeRenderTexture[eye]->TexRtv, pEyeDepthBuffer[eye]);
DIRECTX.SetViewport(Recti(eyeRenderViewport[eye]));

The new code looks more like:



ovrSwapTextureSet* sts = pEyeRenderTexture[eye]->TextureSet;
sts->CurrentIndex = (sts->CurrentIndex + 1) % sts->TextureCount;
int texIndex = sts->CurrentIndex;
DIRECTX.SetAndClearRenderTarget(pEyeRenderTexture[eye]->TexRtv[texIndex], pEyeDepthBuffer[eye]);
DIRECTX.SetViewport(Recti(eyeRenderViewport[eye]));

Note: The introduction of texture sets does not technically prevent the game from using its own texture
buffers for rendering; an application can use its own buffers and copy the data into the Oculus SDK
textures before submit. However, because this would incur the overhead of copying eye buffers every
frame, we recommend using the SDK-provided buffers whenever possible.

Migration: Frame Submission


The game then submits the frame by calling ovr_SubmitFrame and passing in the texture set inside a layer,
which replaces the older ovr_EndFrame function which took two raw ovr*Texture structures. The layer type
that matches the previous eye-buffer behavior is the “EyeFov” layer type - that is, an eyebuffer with a supplied
FOV, viewport, and pose. Additionally, ovr_SubmitFrame requires a few more pieces of information from the
app that are now explicit instead of being implicit. Making them explicit allows them to be dynamically adjusted and supplied
separately for each layer. The new state required is:

• The viewport on the eyebuffer used for rendering each eye. This used to be stored inside the ovrTexture but
is now passed in explicitly each frame.
• The field of view (FOV) used for rendering each eye. This used to be set/queried at device creation, but is
now passed in explicitly each frame. In this case we still use the default that the SDK recommended, which is
now returned in ovrHmdDesc::DefaultEyeFov[]

So previously the code read:

ovrD3D11Texture eyeTexture[2];
for (int eye = 0; eye < 2; eye++)
{
eyeTexture[eye].D3D11.Header.API = ovrRenderAPI_D3D11;
eyeTexture[eye].D3D11.Header.TextureSize = pEyeRenderTexture[eye]->Size;
eyeTexture[eye].D3D11.Header.RenderViewport = eyeRenderViewport[eye];
eyeTexture[eye].D3D11.pTexture = pEyeRenderTexture[eye]->Tex;
eyeTexture[eye].D3D11.pSRView = pEyeRenderTexture[eye]->TexSv;
}
ovr_EndFrame(HMD, EyeRenderPose, &eyeTexture[0].Texture);

This is replaced with the following.

ovrLayerEyeFov ld;
ld.Header.Type = ovrLayerType_EyeFov;
ld.Header.Flags = 0;
for (int eye = 0; eye < 2; eye++)
{
ld.ColorTexture[eye] = pEyeRenderTexture[eye]->TextureSet;
ld.Viewport[eye] = eyeRenderViewport[eye];
ld.Fov[eye] = HMD->DefaultEyeFov[eye];
ld.RenderPose[eye] = EyeRenderPose[eye];
}
ovrLayerHeader* layers = &ld.Header;
ovrResult result = ovr_SubmitFrame(HMD, 0, nullptr, &layers, 1);

The slightly odd-looking indirection through the variable “layers” is because this argument to ovr_SubmitFrame
would normally be an array of pointers to each of the visible layers. Since there is only one layer in this case, it's
not an array of pointers, just a pointer.
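
For example, a hypothetical two-layer submission (an eye-buffer layer plus a HUD layer, both assumed to be fully initialized layer structures) passes a real array:

ovrLayerHeader* layerList[2] = { &eyeLayer.Header, &hudLayer.Header };
ovrResult result = ovr_SubmitFrame(HMD, 0, nullptr, layerList, 2);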

Migration: Other SDK Changes


Before you begin migration, make sure to do the following:

• #include "OVR_CAPI_Util.h" and add OVR_CAPI_Util.cpp and OVR_StereoProjection.cpp to your project so
you can use ovr_CalcEyePoses(..).
• Allocate textures with ovr_CreateSwapTextureSetD3D11(..) instead of ID3D11Device::CreateTexture2D(..)
and create multiple textures as described above.
In this release, there are significant changes to the game loop. For example, the ovr_BeginFrame function is
removed and ovr_EndFrame is replaced by ovr_SubmitFrame. To update your game loop:

1. Replace calls to ovr_GetEyePoses(..) with ovr_CalcEyePoses(..):

ovrTrackingState state;
ovr_GetEyePoses(m_hmd, frameIndex, m_offsets, m_poses, &state);

becomes:

ovrFrameTiming timing = ovr_GetFrameTiming(m_hmd, frameIndex);
ovrTrackingState state = ovr_GetTrackingState(m_hmd, timing.DisplayMidpointSeconds);
ovr_CalcEyePoses(state.HeadPose.ThePose, m_offsets, poses);

2. Replace calls to ovr_ConfigureRendering(..) with ovr_GetRenderDesc(..) as described above:

ovrBool success = ovr_ConfigureRendering(m_hmd, &apiConfig, distortionCaps, m_fov, desc);

becomes:

for (int i = 0; i < ovrEye_Count; ++i)
    desc[i] = ovr_GetRenderDesc(m_hmd, (ovrEyeType)i, m_fov[i]);

3. Swap the target texture each frame. Instead of rendering to the same texture or pair of textures each frame,
you need to advance to the next texture in the ovrSwapTextureSet:

sts->CurrentIndex = (sts->CurrentIndex + 1) % sts->TextureCount;
camera->SetRenderTarget(((ovrD3D11Texture&)sts->Textures[sts->CurrentIndex]).D3D11.pTexture);

4. Remove calls to ovr_BeginFrame(..).


5. Replace calls to ovr_EndFrame(..) with ovr_SubmitFrame(..):

ovr_EndFrame(m_hmd, poses, textures);

becomes:

ovrViewScaleDesc viewScaleDesc;
viewScaleDesc.HmdSpaceToWorldScaleInMeters = 1.0f;

ovrLayerEyeFov ld;
ld.Header.Type = ovrLayerType_EyeFov;
ld.Header.Flags = 0;

for (int eye = 0; eye < 2; eye++)
{
viewScaleDesc.HmdToEyeViewOffset[eye] = m_offsets[eye];

ld.ColorTexture[eye] = m_texture[eye];

ld.Viewport[eye] = m_viewport[eye];
ld.Fov[eye] = m_fov[eye];
ld.RenderPose[eye] = m_poses[eye];
}

ovrLayerHeader* layers = &ld.Header;
ovr_SubmitFrame(m_hmd, frameIndex, &viewScaleDesc, &layers, 1);

Note:

Please refer to the OculusRoomTiny source code for an example of how ovrSwapTextureSets can be used to
submit frames in the updated game loop.

ovr_SubmitFrame can return a couple of different values on success. ovrSuccess means distortion
completed successfully and was displayed to the HMD. ovrSuccess_NotVisible means the frame
submission succeeded but the rendered frame was not visible on the HMD because another VR app
has focus. In this case, the application should skip rendering and resubmit the same frame until
ovr_SubmitFrame returns ovrSuccess rather than ovrSuccess_NotVisible.
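
A minimal sketch of that handling, using the variables from the submission example above:

ovrResult result = ovr_SubmitFrame(m_hmd, frameIndex, &viewScaleDesc, &layers, 1);
if (result == ovrSuccess_NotVisible)
{
    // Another VR app has focus: skip scene rendering and keep resubmitting
    // until ovrSuccess is returned.
}
else if (OVR_FAILURE(result))
{
    // Handle hard failures here.
}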

The 0.6 release simplifies the PC SDK, so you can remove calls to a number of functions that are no longer
needed. To remove functions:

1. Remove calls to ovr_AttachToWindow(..).
2. Remove calls to ovr_DismissHSWDisplay(..).
3. Remove calls to ovr_GetHSWDisplayState(..).
4. Remove all references to ovrTextureHeader::RenderViewport and use your own per-texture ovrRecti
viewports.

Now that you have finished updating your code, you are ready to test the results. To test the results:

1. With the HMD in a resting state, images produced by 0.6 should match ones produced by 0.5.
2. When wearing the HMD, head motion should feel just as responsive as before, even when you twitch your
head side-to-side and up-and-down.
3. Use the DK2 latency tester to confirm your render timing has not changed.

Changes For Release 0.5


A number of changes were made to the API since the 0.4 release.
The Oculus SDK 0.5 moves from static linking to a dynamic link library (DLL) model. Using a DLL offers several
advantages:

• As long as the arguments and return values are the same, experiences do not need to be recompiled to take
advantage of the updated library.
• Localization into new languages is easier because the functions remain consistent across languages.
• The DLL can be updated to take advantage of new features and headsets without affecting current games
and experiences.

In addition to moving to a DLL model, the following changes were made:

• SDK versions now use a product.major.minor.patch format. The product value is currently set to 0 as this is a
pre-release product. For example, 0.5.0.1 means Product 0, Major 5, Minor 0, Patch 1.
• Significant improvements were made to tracking behavior and performance.
• Improvements were made to the samples.
• The SDK now provides better reporting of display driver incompatibility.
• Support for DX10 was removed.

• DX9 support is deprecated and will be removed in a future version of the SDK.
• A bug was fixed where full persistence was inadvertently enabled due to device initialization races.
• Improvements were made to headset USB sleep management.
• Uncommon deadlocks were fixed in the runtime service.
• Diagnostics and configuration capture were improved.
• Monitor rotation is now supported in the legacy Extended mode.
• Default time warp scheduling is improved, which should reduce frame drops.

The following SDK changes were made:

• Moved and renamed LibOVR/Src/OVR_CAPI.h to LibOVR/Include/OVR_CAPI_0_5_0.h. Some additional
public headers such as OVR_Version.h have been moved to LibOVR/Include/. Any other previously public
headers are now private to LibOVR.
• Added enum ovrHmdCaps::ovrHmdCap_DebugDevice.
• Renamed enum ovrDistortionCaps::ovrDistortionCap_ProfileNoTimewarpSpinWaits to
ovrDistortionCap_ProfileNoSpinWaits.
• Removed enum ovrDistortionCaps::ovrDistortionCap_NoTimewarpJit.
• Added enum ovrDistortionCaps::ovrDistortionCap_TimewarpJitDelay.
• Removed ovrTrackingState::LastVisionProcessingTime.
• Removed ovrTrackingState::LastVisionFrameLatency.
• ovr_Initialize now takes a params argument. See the in-code documentation for details.
• ovr_Initialize now returns false for additional reasons.
• No API functions can be called after ovr_Shutdown except ovr_Initialize.
• The hmdToEyeViewOffset argument for ovr_GetEyePoses is now const.
• Added the ovrQuatf playerTorsoMotion argument to ovr_GetEyeTimewarpMatricesDebug.
• Added ovr_TraceMessage.

Changes For Release 0.4


A number of changes were made to the API since the 0.3.2 Preview release.

These are summarized as follows:

• Removed the method ovr_GetDesc. The ovrHmd handle is now a pointer to an ovrHmdDesc struct.
• The sensor interface has been simplified. Your application should now call ovr_ConfigureTracking at
initialization and ovr_GetTrackingState or ovr_GetEyePoses to get the head pose.
• ovr_BeginEyeRender and ovr_EndEyeRender have been removed. You should now use
ovr_GetEyePoses to determine predicted head pose when rendering each eye. Render poses and
ovrTexture info is now passed into ovr_EndFrame rather than ovr_EndEyeRender.
• ovrSensorState struct is now ovrTrackingState. The predicted pose Predicted is now named
HeadPose. CameraPose and LeveledCameraPose have been added. Raw sensor data can be obtained
through RawSensorData.
• ovrSensorDesc struct has been merged into ovrHmdDesc.
• Addition of ovr_AttachToWindow. This is a platform specific function to specify the application window
whose output will be displayed on the HMD. Only used if the ovrHmdCap_ExtendDesktop flag is false.
• Addition of ovr_GetVersionString. Returns a string representing the libOVR version.

There have also been a number of minor changes:

• Renamed ovrSensorCaps struct to ovrTrackingCaps.



• Addition of ovrHmdCaps::ovrHmdCap_Captured flag. Set to true if the application captured ownership
of the HMD.
• Addition of ovrHmdCaps::ovrHmdCap_ExtendDesktop flag. The display driver is in compatibility mode
(read only).
• Addition of ovrHmdCaps::ovrHmdCap_NoMirrorToWindow flag. Disables mirroring of HMD output to
the window. This may improve rendering performance slightly (only if 'ExtendDesktop' is off).
• Addition of ovrHmdCaps::ovrHmdCap_DisplayOff flag. Turns off HMD screen and output (only if
'ExtendDesktop' is off).
• Removed ovrHmdCaps::ovrHmdCap_LatencyTest flag. Was used to indicate support of pixel reading
for continuous latency testing.
• Addition of ovrDistortionCaps::ovrDistortionCap_Overdrive flag. Overdrives brightness
transitions to reduce artifacts on DK2 displays.
• Addition of ovrStatusBits::ovrStatus_CameraPoseTracked flag. Indicates that the camera pose is
successfully calibrated.

Changes For Release 0.4 Since Release 0.2.5


In addition, the 0.4 Oculus API has been significantly redesigned since the 0.2.5 release, with the goals of
improving ease of use and correctness, and of supporting a new driver model.

The following is the summary of changes in the API:

• All of the HMD and sensor interfaces have been organized into a C API. This makes it easy to bind from
other languages.
• The new Oculus API introduces two distinct approaches to rendering distortion: SDK Rendered and Client
Rendered. As before, the application is expected to render stereo scenes onto one or more render targets.
With the SDK rendered approach, the Oculus SDK then takes care of distortion rendering, frame present,
and timing within the SDK. This means that developers don’t need to setup pixel and vertex shaders or
worry about the details of distortion rendering, they simply provide the device and texture pointers to the
SDK. In client rendered mode, distortion rendering is handled by the application as with previous versions of
the SDK. SDK Rendering is the preferred approach for future versions of the SDK.
• The method of rendering distortion in client rendered mode is now mesh based. The SDK returns a mesh
which includes vertices and UV coordinates which are then used to warp the source render target image to
the final buffer. Mesh based distortion is more efficient and flexible than pixel shader approaches.
• The Oculus SDK now keeps track of game frame timing and uses this information to accurately predict
orientation and motion.
• A new technique called Timewarp is introduced to reduce motion-to-photon latency. This technique re-
projects the scene to a more recently measured orientation during the distortion rendering phase.

The following table briefly summarizes the differences between the 0.2.5 and 0.4 API versions.

Initialization
• 0.2 SDK APIs: OVR::System::Init, DeviceManager, HMDDevice, HMDInfo.
• 0.4 SDK C APIs: ovr_Initialize, ovr_Create, ovrHmd handle and ovrHmdDesc.

Sensor Interaction
• 0.2 SDK APIs: OVR::SensorFusion class, with GetOrientation returning Quatf. Prediction amounts are specified manually relative to the current time.
• 0.4 SDK C APIs: ovr_ConfigureTracking, ovr_GetTrackingState returning ovrTrackingState. ovr_GetEyePoses returns head pose based on correct timing.

Rendering Setup
• 0.2 SDK APIs: Util::Render::StereoConfig helper class creating StereoEyeParams, or manual setup based on members of HMDInfo.
• 0.4 SDK C APIs: ovr_ConfigureRendering populates ovrEyeRenderDesc based on the field of view. Alternatively, ovr_GetRenderDesc supports rendering setup for client distortion rendering.

Distortion Rendering
• 0.2 SDK APIs: App-provided pixel shader based on distortion coefficients.
• 0.4 SDK C APIs: Client rendered: based on the distortion mesh returned by ovr_CreateDistortionMesh. (or) SDK rendered: done automatically in ovr_EndFrame.

Frame Timing
• 0.2 SDK APIs: Manual timing with current-time relative prediction.
• 0.4 SDK C APIs: Frame timing is tied to vsync with absolute values reported by ovr_BeginFrame or ovr_BeginFrameTiming.

PC SDK Getting Started Guide


Welcome to the PC SDK Getting Started Guide.

To get started with the SDK:

• Verify your system meets the recommended specification. See Recommended Specifications on page
35.
• Install the Oculus App and Oculus Home. See Oculus Rift Setup on page 35.
• Install the SDK. See Oculus Rift SDK Setup on page 36.
• Get started with the demos. See Getting Started with the Demos on page 36.

Getting Started with the SDK


This guide describes how to install the SDK and try the demos.

Recommended Specifications
Presence is the first level of magic for great VR experiences: the unmistakable feeling that you’ve been
teleported somewhere new. Comfortable, sustained presence requires a combination of the proper VR
hardware, the right content, and an appropriate system.

For the full Rift experience, we recommend the following system:

• Graphics Card: GTX 1060 / AMD Radeon RX 480 or greater


• Alternative Graphics Card: NVIDIA GTX 970 / AMD Radeon R9 290 or greater
• CPU: Intel i5-4590 equivalent or greater
• Memory: 8GB+ RAM
• Video Output: Compatible HDMI 1.3 video output
• USB Ports: 3x USB 3.0 ports plus 1x USB 2.0 port
• OS: Windows 7 SP1 64 bit or newer, plus the DirectX platform update

Additionally, make sure you also have the latest GPU drivers:

• NVIDIA Driver Version 355.83 or later


• AMD Catalyst Display Driver Version 15.200.1062.1005 or later

The goal is for all Rift games and applications to deliver a great experience on this configuration. Ultimately,
we believe this will be fundamental to VR’s success, as you can optimize and tune your experiences for a known
specification, consistently achieving presence and simplifying development.

Oculus Rift Setup


Before installing the SDK, Oculus recommends making sure the firmware, runtime, and hardware are correctly
configured and tested.

To install the runtime and hardware, follow the steps in Oculus Setup. If you are not running the latest firmware
while using Oculus Setup, it will automatically install the latest firmware and drivers.

For more information about setting up the Oculus Rift, see https://www.oculus.com/setup/.

Note: Because you are doing development work, make sure to enable unknown sources. To enable
Oculus to run software from unknown sources, click the gear icon and select Settings. Then, click
General and enable Unknown Sources.

Oculus Rift SDK Setup


This section describes how to set up the SDK.

Installation
The latest version of the Oculus SDK is always available from the Oculus Developer Center.

To download the latest package, go to http://developer.oculus.com.

SDK versions use a major.minor.patch format. For example, 5.0.1 means Major 5, Minor 0, Patch 1.

Note: The instructions in this section assume you have installed the Oculus Rift and software through
Oculus Setup.

Compiler Settings
The LibOVR libraries do not require exception handling or RTTI support.

Your game or application can disable these features for efficiency.

Build Solutions
Developers can rebuild the samples and LibOVR using the projects and solutions in the Samples and LibOVR/
Projects directories.

Windows
Solutions and project files for Visual Studio 2010, 2012, 2013, and 2015 are provided with the SDK. Samples/
Projects/Windows/VSxxxx/Samples.sln, or the 2012/2013/2015 equivalent, is the main solution that
allows you to build and run the samples, and LibOVR itself.
Note: The SDK does not support Universal Windows Platform (UWP).

Getting Started with the Demos


Now that the Rift is plugged in, the drivers are installed, and the SDK is installed, you are ready to begin using
the SDK.
Note: If you haven’t already, take a moment to adjust the Rift headset so that it’s comfortable for your
head and eyes.

Software Developers and Integration Engineers


If you’re integrating the Oculus SDK into your game engine, Oculus recommends starting with the sample
projects.

Open the following projects, build them, and experiment with the provided sample code:

• Samples/Projects/Windows/VSxxxx/Samples.sln

OculusRoomTiny
This is a good place to start, because its source code compactly combines all critical features of the
Oculus SDK. It contains logic necessary to initialize LibOVR core, access Oculus devices, use the player’s
profile, implement head-tracking, sensor fusion, stereoscopic 3D rendering, and distortion processing.
OculusRoomTiny comes with Direct3D 11, OpenGL and Direct3D 12 variants each with their own separate
projects and source files.

Note: The Oculus Room Tiny (DX12) sample projects for each of VS2010 through VS2015 requires
that an appropriate Windows 10 SDK be set for the build. If you have a different Windows 10 SDK than
what is expected, then you will likely get compile errors about 'missing dx12.h' or similar. To fix this for
VS2010 - VS2013, you need to edit the Samples\OculusRoomTiny\OculusRoomTiny (DX12)\Projects
\Windows\Windows10SDKPaths.props file with a text editor and change the numbers to refer to your
Windows 10 SDK, typically installed with headers at C:\Program Files (x86)\Windows Kits\10\Include.
To fix this for VS2015 you need to edit the Project Properties → General → Target Platform Version
drop down box.

Figure 1: OculusRoomTiny

OculusWorldDemo
This is a more complex sample. It is intended to be portable and support many more features. These include
windowed/full-screen mode switching, XML 3D model and texture loading, movement collision detection,
adjustable view size and quality controls, 2D UI text overlays, and so on.

This is a good application to experiment with after you are familiar with the Oculus SDK basics. It also includes
an overlay menu with options and toggles that customize many aspects of rendering including FOV, render
target use, timewarp and display settings. Experimenting with these options may provide developers with
insight into what the related numbers mean and how they affect things behind the scenes.
When running OculusWorldDemo in Windows, it uses Direct3D 11 by default. However, you can choose the
OpenGL rendering path by appending the command-line argument "-r GL" to the executable.
Beyond experimenting with the provided sample code, Oculus recommends reading the rest of this guide. It
covers LibOVR initialization, head-tracking, rendering for the Rift, and minimizing latency.

Artists and Game Designers


If you’re an artist or game designer unfamiliar with C++, we recommend downloading Unity along with the
corresponding Oculus integration. You can use our out-of-the-box integrations to begin building Oculus-based
content immediately.

We also recommend reading through the Oculus Best Practices Guide, which has tips, suggestions, and
research oriented around developing great VR experiences. Topics include control schemes, user interfaces,
cut-scenes, camera features, and gameplay. The Best Practices Guide should be a go-to reference when
designing your Oculus-ready games.

Aside from that, the next step is to start building your own Oculus-ready game or application. Thousands of
other developers are out building the future of virtual reality gaming. To see what they are talking about, go to
forums.oculus.com.

OculusWorldDemo Demo
Oculus recommends running the pre-built OculusWorldDemo to explore the SDK. You can find a link to the
executable file in the root of the Oculus SDK installation.

The following is a screenshot of the OculusWorldDemo application:

Figure 2: OculusWorldDemo Application



OculusWorldDemo Controls
The OculusWorldDemo uses a mix of standard and specialized controls.

The following table describes keys and devices that you use for movement:

Table 1: Movement

Key or Input Movement


W, S Move forward, back
A, D Strafe left, right
Mouse Look left, right
Left gamepad stick Move
Right gamepad stick Turn

The following table describes keys that you use for functions:

Table 2: Functions

Key(s) Function
F4 Multisampling toggle
F5 sRGB toggle
F7 Mono/stereo view mode toggle
F9 Hardware full-screen (low latency)
F11 Performance HUD toggle
E Motion relative to head/body
R Reset sensor orientation
Esc Cancel full-screen
-, + Adjust eye height
L Adjust fourth view value
Tab Options Menu
Spacebar Toggle debug info overlay
T Reset player position
Ctrl+Q Quit
G Cycle grid overlay mode
U, J Adjust second view value
I, K Adjust third view value
; Cycle rendered scenes
+Shift Adjust values quickly
O Toggle Time-Warp
C Toggle FreezeEyeUpdate
V Toggle Vsync

OculusWorldDemo Usage
Once you’ve launched OculusWorldDemo, you should see a window on your PC monitor similar to the previous
screenshot.

When the image is correctly displayed inside the Rift, take a moment to look around in VR and double-check
that all of the hardware is working properly. You should be able to see that physical head translation is now
also recreated in the virtual world, as well as rotation.

Important: If you need to move the sensor for any reason after initial calibration, be sure to minimize the
movement of the HMD for a few seconds while holding it within the tracking frustum. This will give the system
a chance to recalibrate the sensor pose.

If you would like to explore positional tracking in more detail, you can press the semicolon (;) key to bring up
the “sea of cubes” field that we use for debugging. In this mode, cubes are displayed that allow you to easily
observe positional tracking behavior. Cubes are displayed in green when head position is being tracked and in
red when sensor fusion falls back onto the head model.

There are a number of interesting things to take note of the first time you experience OculusWorldDemo. First,
the level is designed to scale. Thus, everything appears to be roughly the same height as it would be in the
real world. The sizes for everything, including the chairs, tables, doors, and ceiling, are based on measurements
from real world objects. All of the units are measured in meters.

Depending on your actual height, you may feel shorter or taller than normal. The default eye height of the
player in OculusWorldDemo is 1.61 meters (approximately the average adult eye height), but this can be
adjusted using the ‘+’ and ‘-’ keys.

PC SDK Developer Guide


Welcome to the PC SDK Developer Guide.

This guide describes how to use the PC SDK and covers the following topics:

• Sensor initialization
• Rendering and advanced rendering
• VR focus management
• Spatialized audio
• Oculus Touch

Additionally, it contains information on the SDK samples, the Oculus Debug Tool, and the Performance HUD.

LibOVR Integration
The Oculus SDK is designed to be as easy to integrate as possible. This guide outlines a basic Oculus
integration with a C/C++ game engine or application.

We’ll discuss initializing the LibOVR, HMD device enumeration, head tracking, frame timing, and rendering for
the Rift.

Many of the code samples below are taken directly from the OculusRoomTiny demo source code (available in
Oculus/LibOVR/Samples/OculusRoomTiny). OculusRoomTiny and OculusWorldDemo are great places to
view sample integration code when in doubt about a particular system or feature.

Overview of the SDK


There are three major phases when using the SDK: setup, the game loop, and shutdown.

To add Oculus support to a new application, do the following:

1. Initialize LibOVR through ovr_Initialize.


2. Call ovr_Create and check the return value to see if it succeeded. You can periodically poll for the
presence of an HMD with ovr_GetHmdDesc(nullptr).
3. Integrate head-tracking into your application’s view and movement code. This involves:

a. Obtaining predicted headset orientation for the frame through a combination of the
ovr_GetPredictedDisplayTime and ovr_GetTrackingState calls.
b. Applying Rift orientation and position to the camera view, while combining it with other application
controls.
c. Modifying movement and game play to consider head orientation.
4. Initialize rendering for the HMD.

a. Select rendering parameters such as resolution and field of view based on HMD capabilities.

• See: ovr_GetFovTextureSize and ovr_GetRenderDesc.


b. Configure rendering by creating D3D/OpenGL-specific swap texture sets to present data to the headset.

• See: ovr_CreateTextureSwapChainDX and ovr_CreateTextureSwapChainGL.


5. Modify application frame rendering to integrate HMD support and proper frame timing:

a. Make sure your engine supports rendering stereo views.



b. Add frame timing logic into the render loop to obtain correctly predicted eye render poses.
c. Render each eye’s view to intermediate render targets.
d. Submit the rendered frame to the headset by calling ovr_SubmitFrame.
6. Customize UI screens to work well inside of the headset.
7. Destroy the created resources during shutdown.

• See: ovr_DestroyTextureSwapChain, ovr_Destroy, and ovr_Shutdown.

A more complete summary of rendering details is provided in the Rendering Setup Outline on page 50
section.

Initialization and Sensor Enumeration


This example initializes LibOVR and requests information about the available HMD.

Review the following code:

// Include the OculusVR SDK


#include <OVR_CAPI.h>
void Application()
{
ovrResult result = ovr_Initialize(nullptr);
if (OVR_FAILURE(result))
return;

ovrSession session;
ovrGraphicsLuid luid;
result = ovr_Create(&session, &luid);
if (OVR_FAILURE(result))
{
ovr_Shutdown();
return;
}

ovrHmdDesc desc = ovr_GetHmdDesc(session);


ovrSizei resolution = desc.Resolution;

ovr_Destroy(session);
ovr_Shutdown();
}

As you can see, ovr_Initialize is called before any other API functions and ovr_Shutdown is called to
shut down the library before you exit the program. In between these function calls, you are free to create HMD
objects, access tracking state, and perform application rendering.

In this example, ovr_Create(&session, &luid) creates the HMD. Use the LUID returned by ovr_Create() to
select the IDXGIAdapter on which your ID3D11Device or ID3D12Device is created. Finally, ovr_Destroy must
be called to clear the HMD before shutting down the library.

You can use ovr_GetHmdDesc() to get a description of the HMD.

If no Rift is plugged in, ovr_Create(&session, &luid) returns a failed ovrResult unless a virtual HMD is enabled
through RiftConfigUtil. Although the virtual HMD will not provide any sensor input, it can be useful for
debugging Rift-compatible rendering code and for general development without a physical device.

An HMD description (ovrHmdDesc) can be retrieved by calling ovr_GetHmdDesc(session).


The following table describes the fields:

Field Type Description


Type ovrHmdType Type of the HMD, such as
ovrHmd_DK2 or ovrHmd_CV1.
ProductName char[] Name of the product as a string.
Manufacturer char[] Name of the manufacturer.
VendorId short Vendor ID reported by the headset
USB device.
ProductId short Product ID reported by the headset
USB device.
SerialNumber char[] Serial number string reported by
the headset USB device.
FirmwareMajor short The major version of the sensor
firmware.
FirmwareMinor short The minor version of the sensor
firmware.
AvailableHmdCaps unsigned int Capability bits described by
ovrHmdCaps which the HMD
currently supports.
DefaultHmdCaps unsigned int Default capability bits described by
ovrHmdCaps for the current HMD.
AvailableTrackingCaps unsigned int Capability bits described by
ovrTrackingCaps which the
HMD currently supports.
DefaultTrackingCaps unsigned int Default capability bits described by
ovrTrackingCaps for the current
HMD.
DefaultEyeFov ovrFovPort[] Recommended optical field of view
for each eye.
MaxEyeFov ovrFovPort[] Maximum optical field of view that
can be practically rendered for
each eye.
Resolution ovrSizei Resolution of the full HMD screen
(both eyes) in pixels.
DisplayRefreshRate float Nominal refresh rate of the HMD
in cycles per second at the time of
HMD creation.

The description of a sensor (ovrTrackerDesc) can be retrieved by calling
ovr_GetTrackerDesc(session, trackerDescIndex). The following table describes the fields:

Field Type Description


FrustumHFovInRadians float The horizontal FOV of the position
sensor frustum.
FrustumVFovInRadians float The vertical FOV of the position
sensor frustum.
FrustumNearZInMeters float The distance from the position
sensor to the near frustum bounds.
FrustumFarZInMeters float The distance from the position
sensor to the far frustum bounds.

Head Tracking and Sensors


The Oculus Rift hardware contains a number of micro-electrical-mechanical (MEMS) sensors including a
gyroscope, accelerometer, and magnetometer.

There is also a sensor to track headset position. The information from each of these sensors is combined
through the sensor fusion process to determine the motion of the user’s head in the real world and synchronize
the user’s view in real-time.

Once the ovrSession is created, you can poll sensor fusion for head position and orientation by calling
ovr_GetTrackingState. These calls are demonstrated by the following code:

// Query the HMD for its current tracking state.
ovrTrackingState ts = ovr_GetTrackingState(session, ovr_GetTimeInSeconds(), ovrTrue);

if (ts.StatusFlags & (ovrStatus_OrientationTracked | ovrStatus_PositionTracked))
{
    ovrPosef pose = ts.HeadPose.ThePose;
    ...
}

This example initializes the sensors with orientation, yaw correction, and position tracking capabilities. If the
sensor is not available during the time of the call, but is plugged in later, the sensor is automatically enabled by
the SDK.

After the sensors are initialized, the sensor state is obtained by calling ovr_GetTrackingState. This state
includes the predicted head pose and the current tracking state of the HMD as described by StatusFlags.
This state can change at runtime based on the available devices and user behavior. For example, the
ovrStatus_PositionTracked flag is only reported when HeadPose includes the absolute positional
tracking data from the sensor.

The reported ovrPoseStatef includes full six degrees of freedom (6DoF) head tracking data including
orientation, position, and their first and second derivatives. The pose value is reported for a specified absolute
point in time using prediction, typically corresponding to the time in the future that this frame’s image will be
displayed on screen. To facilitate prediction, ovr_GetTrackingState takes absolute time, in seconds, as a
second argument. The current value of absolute time can be obtained by calling ovr_GetTimeInSeconds. If
the time passed into ovr_GetTrackingState is the current time or earlier, the tracking state returned will be
based on the latest sensor readings with no prediction. In a production application, however, you should use
the real-time computed value returned by ovr_GetPredictedDisplayTime. Prediction is covered in more detail
in the section on Frame Timing.
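
A minimal sketch of that production pattern (frameIndex is assumed to be the index that will be passed to ovr_SubmitFrame):

double predictedTime = ovr_GetPredictedDisplayTime(session, frameIndex);
ovrTrackingState ts = ovr_GetTrackingState(session, predictedTime, ovrTrue);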

As already discussed, the reported pose includes a 3D position vector and an orientation quaternion. The
orientation is reported as a rotation in a right-handed coordinate system, as illustrated in the following figure.

Figure 3: Rift Coordinate System

The x-z plane is aligned with the ground regardless of camera orientation.
As seen from the diagram, the coordinate system uses the following axis definitions:

• Y is positive in the up direction.


• X is positive to the right.
• Z is positive heading backwards.

Rotation is maintained as a unit quaternion, but can also be reported in yaw-pitch-roll form. Positive rotation is
counter-clockwise (CCW, direction of the rotation arrows in the diagram) when looking in the negative direction
of each axis, and the component rotations are:

• Pitch is rotation around X, positive when pitching up.


• Yaw is rotation around Y, positive when turning left.
• Roll is rotation around Z, positive when tilting to the left in the XY plane.

The simplest way to extract yaw-pitch-roll from ovrPose is to use the C++ OVR Math helper classes that are
included with the library. The following example uses direct conversion to assign ovrPosef to the equivalent
C++ Posef class. You can then use Quatf::GetEulerAngles<> to extract the Euler angles in the desired
axis rotation order.
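
A minimal sketch of that conversion (assuming the OVR_Math.h helpers from the SDK's Extras folder are included):

#include "Extras/OVR_Math.h"
using namespace OVR;

ovrTrackingState ts = ovr_GetTrackingState(session, ovr_GetTimeInSeconds(), ovrTrue);
Posef pose = ts.HeadPose.ThePose;   // direct conversion from ovrPosef to OVR::Posef

float yaw, eyePitch, eyeRoll;
pose.Rotation.GetEulerAngles<Axis_Y, Axis_X, Axis_Z>(&yaw, &eyePitch, &eyeRoll);
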
All simple C math types provided by OVR such as ovrVector3f and ovrQuatf have corresponding C++
types that provide constructors and operators for convenience. These types can be used interchangeably.

If an application uses a left-handed coordinate system, it can use the ovrPosef_FlipHandedness function
to switch any right-handed ovrPosef provided by ovr_GetTrackingState, ovr_GetEyePoses,
or ovr_CalcEyePoses functions to be left-handed. Be aware that the RenderPose and
QuadPoseCenter requested for the ovrLayers must still use the right-handed coordinate system.
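
For example (a sketch; the converted pose can then drive a left-handed engine camera):

ovrPosef rhPose = ts.HeadPose.ThePose;  // right-handed pose from ovr_GetTrackingState
ovrPosef lhPose;
ovrPosef_FlipHandedness(&rhPose, &lhPose);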

Position Tracking
The frustum is defined by the horizontal and vertical FOV, and the distance to the front and back frustum
planes.
Approximate values for these parameters can be accessed through the ovrTrackerDesc struct as follows:

ovrSession session;
ovrGraphicsLuid luid;
if (OVR_SUCCESS(ovr_Create(&session, &luid)))
{
    // Extract tracking frustum parameters of the first sensor.
    ovrTrackerDesc trackerDesc = ovr_GetTrackerDesc(session, 0);
    float frustumHorizontalFOV = trackerDesc.FrustumHFovInRadians;
    ...

The following figure shows the tracking sensor and a representation of the resulting tracking frustum.

Figure 4: Tracking Sensor and Tracking Frustum

The relevant parameters and typical values are listed below:

Field Type Typical Value

FrustumHFovInRadians float 1.292 radians (74 degrees)
FrustumVFovInRadians float 0.942 radians (54 degrees)
FrustumNearZInMeters float 0.4 m
FrustumFarZInMeters float 2.5 m

These parameters provide application developers with a visual representation of the tracking frustum. The
previous figure also shows the default tracking origin and associated coordinate system.
Note: Although the sensor axis (and hence the tracking frustum) is shown tilted downwards slightly,
the tracking coordinate system is always oriented horizontally such that the axes are parallel to the
ground.
By default, the tracking origin is located one meter away from the sensor in the direction of the optical axis but
at the same height as the sensor. The default origin orientation is level with the ground with the negative z-axis
pointing towards the sensor. In other words, a headset yaw angle of zero corresponds to the user looking
towards the sensor.

This can be modified using the API call ovr_RecenterTrackingOrigin, which resets the tracking origin to
the headset’s current location and sets the yaw origin to the current headset yaw value. Additionally, it can be
manually specified to any location using the API call ovr_SpecifyTrackingOrigin.

There are two types of tracking origins: floor-level and eye-level. Floor-level is recommended for room scale,
when the user stands, and when the user is interacting with objects on the floor (although telekinesis/force
grab/gaze grab is a better option for picking up objects). For most other experiences, especially when the user
is seated, eye-level is preferred. To get the current origin, use ovr_GetTrackingOriginType. To set the
origin, use ovr_SetTrackingOriginType.
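
For example, a standing experience might request a floor-level origin (a minimal sketch):

// Request a floor-level tracking origin for a standing experience.
if (OVR_SUCCESS(ovr_SetTrackingOriginType(session, ovrTrackingOrigin_FloorLevel)))
{
    // Confirm the origin now in effect.
    ovrTrackingOrigin origin = ovr_GetTrackingOriginType(session);
}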

Note: The tracking origin is set on a per application basis; switching focus between different VR apps
also switches the tracking origin.

The head pose is returned by calling ovr_GetTrackingState. The returned ovrTrackingState struct
contains several items relevant to position tracking:

• HeadPose—includes both head position and orientation.
• CalibratedOrigin—the pose of the origin previously calibrated by the user and stored in the profile,
reported in the new recentered tracking origin space. This value can change when the application calls
ovr_RecenterTrackingOrigin, though it refers to the same location in real-world space. Otherwise it
will remain as an identity pose. Different tracking origin types will report different CalibratedOrigin poses, as
the calibration origin refers to a fixed position in real-world space but the two tracking origin types refer to
different y levels.

The StatusFlags variable contains the following status bits relating to position tracking:

• ovrStatus_PositionTracked—flag that is set only when the headset is being actively tracked.
There are several conditions that may cause position tracking to be interrupted and for the flag to become
zero:

• The headset moved wholly or partially outside the tracking frustum.
• The headset adopts an orientation that is not easily trackable with the current hardware (for example,
facing directly away from the sensor).
• The exterior of the headset is partially or fully occluded from the sensor’s point of view (for example, by
hair or hands).
• The velocity of the headset exceeds the expected range.

Following an interruption, assuming the conditions above are no longer present, tracking normally resumes
quickly and the ovrStatus_PositionTracked flag is set.

If you want to get the pose and leveled pose of a sensor, call ovr_GetTrackerPose. The returned
ovrTrackerPose struct contains the following:

• Pose—the pose of the sensor relative to the tracking origin.
• LeveledPose—the pose of the sensor relative to the tracking origin but with roll and pitch zeroed out. You
can use this as a reference point to render real-world objects in the correct place.
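
For example (a sketch; index 0 selects the first connected sensor):

ovrTrackerPose trackerPose = ovr_GetTrackerPose(session, 0);
ovrPosef sensorPose = trackerPose.Pose;          // full sensor pose
ovrPosef leveledPose = trackerPose.LeveledPose;  // roll and pitch zeroed out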

User Input Integration


To provide the most comfortable, intuitive, and usable interface for the player, head tracking should be
integrated with an existing control scheme for most applications.

For example, in a first person shooter (FPS) game, the player generally moves forward, backward, left, and
right using the left joystick, and looks left, right, up, and down using the right joystick. When using the Rift, the
player can now look left, right, up, and down, using their head. However, players should not be required to
frequently turn their heads 180 degrees since this creates a bad user experience. Generally, they need a way to
reorient themselves so that they are always comfortable (the same way in which we turn our bodies if we want
to look behind ourselves for more than a brief glance).

To summarize, developers should carefully consider their control schemes and how to integrate head-tracking
when designing applications for VR. The OculusRoomTiny application provides a source code sample that
shows how to integrate Oculus head tracking with the aforementioned standard FPS control scheme.

For more information about good and bad practices, refer to the Oculus Best Practices Guide.

Health and Safety Warning


All applications that use the Oculus Rift periodically display a health and safety warning.

This warning appears for a short amount of time when the user wears the Rift; it can be dismissed by pressing
a key or gazing at the acknowledgement. After the screen is dismissed, it shouldn't display for at least 30
minutes.

Rendering to the Oculus Rift


The Oculus Rift requires split-screen stereo with distortion correction for each eye to cancel lens-related
distortion.

Figure 5: OculusWorldDemo Stereo Rendering



Correcting for distortion can be challenging, with distortion parameters varying for different lens types and
individual eye relief. To make development easier, the Oculus SDK handles distortion correction automatically
within the Oculus Compositor process; it also takes care of latency-reducing timewarp and presents frames to
the headset.

With the Oculus SDK doing a lot of the work, the main job of the application is to perform simulation and render
the stereo world based on the tracking pose. Stereo views can be rendered into either one or two individual
textures and are submitted to the compositor by calling ovr_SubmitFrame. We cover this process in detail in
this section.

Rendering to the Oculus Rift


The Oculus Rift requires the scene to be rendered in split-screen stereo with half of the screen used for each
eye.

When using the Rift, the left eye sees the left half of the screen, and the right eye sees the right half.
Although varying from person-to-person, human eye pupils are approximately 65 mm apart. This is known as
interpupillary distance (IPD). The in-application cameras should be configured with the same separation.

Note:

This is a translation of the camera, not a rotation, and it is this translation (and the parallax effect that
goes with it) that causes the stereoscopic effect. This means that your application will need to render the
entire scene twice, once with the left virtual camera, and once with the right.
The reprojection stereo rendering technique, which relies on left and right views being generated from
a single fully rendered view, is usually not viable with an HMD because of significant artifacts at object
edges.

The lenses in the Rift magnify the image to provide a very wide field of view (FOV) that enhances immersion.
However, this process distorts the image significantly. If the engine were to display the original images on the
Rift, then the user would observe them with pincushion distortion.

Figure 6: Pincushion and Barrel Distortion

To counteract this distortion, the SDK applies post-processing to the rendered views with an equal and
opposite barrel distortion so that the two cancel each other out, resulting in an undistorted view for each eye.
Furthermore, the SDK also corrects chromatic aberration, which is a color separation effect at the edges caused
by the lens. Although the exact distortion parameters depend on the lens characteristics and eye position
relative to the lens, the Oculus SDK takes care of all necessary calculations when generating the distortion
mesh.

When rendering for the Rift, projection axes should be parallel to each other as illustrated in the following
figure, and the left and right views are completely independent of one another. This means that camera setup

is very similar to that used for normal non-stereo rendering, except that the cameras are shifted sideways to
adjust for each eye location.

Figure 7: HMD Eye View Cones

In practice, the projections in the Rift are often slightly off-center because our noses get in the way! But the
point remains, the left and right eye views in the Rift are entirely separate from each other, unlike stereo views
generated by a television or a cinema screen. This means you should be very careful if trying to use methods
developed for those media because they do not usually apply in VR.

The two virtual cameras in the scene should be positioned so that they are pointing in the same direction
(determined by the orientation of the HMD in the real world), and such that the distance between them is the
same as the distance between the eyes, or interpupillary distance (IPD). This is typically done by adding the
ovrEyeRenderDesc::HmdToEyeOffset translation vector to the translation component of the view matrix.

Although the Rift’s lenses are approximately the right distance apart for most users, they may not exactly match
the user’s IPD. However, because of the way the optics are designed, each eye will still see the correct view. It
is important that the software makes the distance between the virtual cameras match the user’s IPD as found in
their profile (set in the configuration utility), and not the distance between the Rift’s lenses.

Rendering Setup Outline


The Oculus SDK makes use of a compositor process to present frames and handle distortion.

To target the Rift, you render the scene into one or two render textures, passing these textures into the API.
The Oculus runtime handles distortion rendering, GPU synchronization, frame timing, and frame presentation to
the HMD.

The following are the steps for SDK rendering:

1. Initialize:

a. Initialize Oculus SDK and create an ovrSession object for the headset as was described earlier.
b. Compute the desired FOV and texture sizes based on ovrHmdDesc data.
c. Allocate ovrTextureSwapChain objects, used to represent eye buffers, in an API-
specific way: call ovr_CreateTextureSwapChainDX for either Direct3D 11 or 12 or
ovr_CreateTextureSwapChainGL for OpenGL.
2. Set up frame handling:

a. Use ovr_GetTrackingState and ovr_CalcEyePoses to compute eye poses needed for view
rendering based on frame timing information.
b. Perform rendering for each eye in an engine-specific way, rendering into the current texture within the
texture set. Current texture is retrieved using ovr_GetTextureSwapChainCurrentIndex() and
ovr_GetTextureSwapChainBufferDX() or ovr_GetTextureSwapChainBufferGL(). After
rendering to the texture is complete, the application must call ovr_CommitTextureSwapChain.
c. Call ovr_SubmitFrame, passing swap texture set(s) from the previous step within a ovrLayerEyeFov
structure. Although a single layer is required to submit a frame, you can use multiple layers and layer
types for advanced rendering. ovr_SubmitFrame passes layer textures to the compositor which
handles distortion, timewarp, and GPU synchronization before presenting it to the headset.
3. Shutdown:

a. Call ovr_DestroyTextureSwapChain to destroy swap texture buffers. Call
ovr_DestroyMirrorTexture to destroy a mirror texture. To destroy the ovrSession object, call
ovr_Destroy.
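
Putting the shutdown calls together (a sketch; mirrorTexture is assumed to have been created earlier):

ovr_DestroyTextureSwapChain(session, textureSwapChain);
ovr_DestroyMirrorTexture(session, mirrorTexture);  // only if a mirror texture was created
ovr_Destroy(session);
ovr_Shutdown();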

Texture Swap Chain Initialization


This section describes rendering initialization, including creation of texture swap chains.

Initially, you determine the rendering FOV and allocate the required ovrTextureSwapChain. The following code
shows how the required texture size can be computed:

// Configure Stereo settings.
ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);
Sizei recommendedTex0Size = ovr_GetFovTextureSize(session, ovrEye_Left,
                                                  hmdDesc.DefaultEyeFov[0], 1.0f);
Sizei recommendedTex1Size = ovr_GetFovTextureSize(session, ovrEye_Right,
                                                  hmdDesc.DefaultEyeFov[1], 1.0f);
Sizei bufferSize;
bufferSize.w = recommendedTex0Size.w + recommendedTex1Size.w;
bufferSize.h = max(recommendedTex0Size.h, recommendedTex1Size.h);

Render texture size is determined based on the FOV and the desired pixel density at the center of the
eye. Although both the FOV and pixel density values can be modified to improve performance, this
example uses the recommended FOV (obtained from hmdDesc.DefaultEyeFov). The function
ovr_GetFovTextureSize computes the desired texture size for each eye based on these parameters.

The Oculus API allows the application to use either one shared texture or two separate textures for eye
rendering. This example uses a single shared texture for simplicity, making it large enough to fit both eye
renderings. Once texture size is known, the application can call ovr_CreateTextureSwapChainGL or
ovr_CreateTextureSwapChainDX to allocate the texture swap chains in an API-specific way. Here's how a
texture swap chain can be created and accessed under OpenGL:

ovrTextureSwapChain textureSwapChain = 0;

ovrTextureSwapChainDesc desc = {};
desc.Type = ovrTexture_2D;
desc.ArraySize = 1;
desc.Format = OVR_FORMAT_R8G8B8A8_UNORM_SRGB;
desc.Width = bufferSize.w;
desc.Height = bufferSize.h;
desc.MipLevels = 1;
desc.SampleCount = 1;
desc.StaticImage = ovrFalse;

if (ovr_CreateTextureSwapChainGL(session, &desc, &textureSwapChain) == ovrSuccess)
{
    // Sample texture access:
    int texId;
    ovr_GetTextureSwapChainBufferGL(session, textureSwapChain, 0, &texId);
    glBindTexture(GL_TEXTURE_2D, texId);
    ...
}

Here's a similar example of texture swap chain creation and access using Direct3D 11:

ovrTextureSwapChain textureSwapChain = 0;
std::vector<ID3D11RenderTargetView*> texRtv;

ovrTextureSwapChainDesc desc = {};
desc.Type = ovrTexture_2D;
desc.Format = OVR_FORMAT_R8G8B8A8_UNORM_SRGB;
desc.ArraySize = 1;
desc.Width = bufferSize.w;
desc.Height = bufferSize.h;
desc.MipLevels = 1;
desc.SampleCount = 1;
desc.StaticImage = ovrFalse;
desc.MiscFlags = ovrTextureMisc_None;
desc.BindFlags = ovrTextureBind_DX_RenderTarget;

if (ovr_CreateTextureSwapChainDX(session, DIRECTX.Device, &desc, &textureSwapChain) == ovrSuccess)
{
    int count = 0;
    ovr_GetTextureSwapChainLength(session, textureSwapChain, &count);
    texRtv.resize(count);
    for (int i = 0; i < count; ++i)
    {
        ID3D11Texture2D* texture = nullptr;
        ovr_GetTextureSwapChainBufferDX(session, textureSwapChain, i, IID_PPV_ARGS(&texture));
        DIRECTX.Device->CreateRenderTargetView(texture, nullptr, &texRtv[i]);
        texture->Release();
    }
}

Here's sample code from the provided OculusRoomTiny sample running in Direct3D 12:

ovrTextureSwapChain TexChain;
std::vector<D3D12_CPU_DESCRIPTOR_HANDLE> texRtv;
std::vector<ID3D12Resource*> TexResource;

ovrTextureSwapChainDesc desc = {};
desc.Type = ovrTexture_2D;
desc.ArraySize = 1;
desc.Format = OVR_FORMAT_R8G8B8A8_UNORM_SRGB;
desc.Width = sizeW;
desc.Height = sizeH;
desc.MipLevels = 1;
desc.SampleCount = 1;
desc.MiscFlags = ovrTextureMisc_DX_Typeless;
desc.StaticImage = ovrFalse;
desc.BindFlags = ovrTextureBind_DX_RenderTarget;

// DIRECTX.CommandQueue is the ID3D12CommandQueue used to render the eye textures by the app
ovrResult result = ovr_CreateTextureSwapChainDX(session, DIRECTX.CommandQueue, &desc, &TexChain);
if (!OVR_SUCCESS(result))
return false;

int textureCount = 0;
ovr_GetTextureSwapChainLength(session, TexChain, &textureCount);
texRtv.resize(textureCount);
TexResource.resize(textureCount);
for (int i = 0; i < textureCount; ++i)
{
    result = ovr_GetTextureSwapChainBufferDX(session, TexChain, i, IID_PPV_ARGS(&TexResource[i]));
    if (!OVR_SUCCESS(result))
        return false;

    D3D12_RENDER_TARGET_VIEW_DESC rtvd = {};
    rtvd.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
rtvd.ViewDimension = D3D12_RTV_DIMENSION_TEXTURE2D;
texRtv[i] = DIRECTX.RtvHandleProvider.AllocCpuHandle(); // Gives new D3D12_CPU_DESCRIPTOR_HANDLE
DIRECTX.Device->CreateRenderTargetView(TexResource[i], &rtvd, texRtv[i]);
}

Note: For Direct3D 12, when calling ovr_CreateTextureSwapChainDX, the caller provides a
ID3D12CommandQueue instead of a ID3D12Device to the SDK. It is the caller's responsibility to make
sure that this ID3D12CommandQueue instance is where all VR eye-texture rendering is executed. Or, it
can be used as a "join-node" fence to wait for the command lists executed by other command queues
rendering the VR eye textures.
Once these textures and render targets are successfully created, you can use them to perform eye-texture
rendering. The Frame Rendering section describes viewport setup in more detail.

The Oculus compositor provides sRGB-correct rendering, which results in more photorealistic visuals, better
MSAA, and energy-conserving texture sampling, which are very important for VR applications. As shown
above, applications are expected to create sRGB texture swap chains. Proper treatment of sRGB rendering is a
complex subject and, although this section provides an overview, extensive information is outside the scope of
this document.
There are several steps to ensuring a real-time rendered application achieves sRGB-correct shading and
different ways to achieve it. For example, most GPUs provide hardware acceleration to improve gamma-correct
shading for sRGB-specific input and output surfaces, while some applications use GPU shader math for more
customized control. For the Oculus SDK, when an application passes in sRGB-space texture swap chains, the
compositor relies on the GPU's sampler to do the sRGB-to-linear conversion.

All color textures fed into a GPU shader should be marked appropriately with the sRGB-correct format, such
as OVR_FORMAT_R8G8B8A8_UNORM_SRGB. This is also recommended for applications that provide static
textures as quad-layer textures to the Oculus compositor. Failure to do so will cause the texture to look much
brighter than expected.

For D3D 11 and 12, the texture format provided in desc for ovr_CreateTextureSwapChainDX is
used by the distortion compositor for the ShaderResourceView when reading the contents of the texture.
As a result, the application should request texture swap chain formats that are in sRGB-space (e.g.
OVR_FORMAT_R8G8B8A8_UNORM_SRGB).

If your application is configured to render into a linear-format texture (e.g. OVR_FORMAT_R8G8B8A8_UNORM)
and handles the linear-to-gamma conversion using HLSL code, or does not care about any gamma-correction,
then:

• Request an sRGB format (e.g. OVR_FORMAT_R8G8B8A8_UNORM_SRGB) texture swap chain.
• Specify the ovrTextureMisc_DX_Typeless flag in the desc.
• Create a linear-format RenderTargetView (e.g. DXGI_FORMAT_R8G8B8A8_UNORM).

Note: The ovrTextureMisc_DX_Typeless flag for depth buffer formats (e.g. OVR_FORMAT_D32) is
ignored as they are always converted to be typeless.

For OpenGL, the format parameter of ovr_CreateTextureSwapChainGL is used by the distortion
compositor when reading the contents of the texture. As a result, the application should request texture swap
chain formats preferably in sRGB-space (e.g. OVR_FORMAT_R8G8B8A8_UNORM_SRGB). Furthermore, your
application should call glEnable(GL_FRAMEBUFFER_SRGB); before rendering into these textures.
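
A sketch of this recommended OpenGL path, assuming fbo is an application-created framebuffer object:

int curIndex = 0;
GLuint texId = 0;
ovr_GetTextureSwapChainCurrentIndex(session, textureSwapChain, &curIndex);
ovr_GetTextureSwapChainBufferGL(session, textureSwapChain, curIndex, &texId);

glBindFramebuffer(GL_FRAMEBUFFER, fbo);  // fbo: assumed application FBO
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texId, 0);
glEnable(GL_FRAMEBUFFER_SRGB);           // GPU converts linear shader output to sRGB on write
// ... render the scene ...
ovr_CommitTextureSwapChain(session, textureSwapChain);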

Even though it is not recommended, if your application is configured to treat the texture as a linear format (e.g.
GL_RGBA) and performs linear-to-gamma conversion in GLSL or does not care about gamma-correction, then:

• Request an sRGB format (e.g. OVR_FORMAT_R8G8B8A8_UNORM_SRGB) texture swap chain.
• Do not call glEnable(GL_FRAMEBUFFER_SRGB); when rendering into the texture.
The following code sample demonstrates how to use the ovrTextureMisc_DX_Typeless flag in
D3D11:

ovrTextureSwapChainDesc desc = {};
desc.Type = ovrTexture_2D;
desc.ArraySize = 1;
desc.Format = OVR_FORMAT_R8G8B8A8_UNORM_SRGB;
desc.Width = sizeW;
desc.Height = sizeH;
desc.MipLevels = 1;
desc.SampleCount = 1;
desc.MiscFlags = ovrTextureMisc_DX_Typeless;
desc.BindFlags = ovrTextureBind_DX_RenderTarget;
desc.StaticImage = ovrFalse;

ovrResult result = ovr_CreateTextureSwapChainDX(session, DIRECTX.Device, &desc,
                                                &textureSwapChain);
if (!OVR_SUCCESS(result))
    return;

int count = 0;
ovr_GetTextureSwapChainLength(session, textureSwapChain, &count);
for (int i = 0; i < count; ++i)
{
    ID3D11Texture2D* texture = nullptr;
    ovr_GetTextureSwapChainBufferDX(session, textureSwapChain, i, IID_PPV_ARGS(&texture));
    D3D11_RENDER_TARGET_VIEW_DESC rtvd = {};
    rtvd.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    rtvd.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D;
    DIRECTX.Device->CreateRenderTargetView(texture, &rtvd, &texRtv[i]);
    texture->Release();
}

In addition to sRGB, these concepts also apply to the mirror texture creation. For more information, refer to the
function documentation provided for ovr_CreateMirrorTextureDX and ovr_CreateMirrorTextureGL
for D3D and OpenGL, respectively.
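
As a sketch, an sRGB mirror texture for the desktop window might be created like this under D3D11 (windowWidth and windowHeight are assumed application values):

ovrMirrorTextureDesc mirrorDesc = {};
mirrorDesc.Format = OVR_FORMAT_R8G8B8A8_UNORM_SRGB;  // sRGB, per the guidance above
mirrorDesc.Width  = windowWidth;                     // assumed window dimensions
mirrorDesc.Height = windowHeight;

ovrMirrorTexture mirrorTexture = nullptr;
ovrResult result = ovr_CreateMirrorTextureDX(session, DIRECTX.Device, &mirrorDesc, &mirrorTexture);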

Frame Rendering
Frame rendering typically involves several steps: obtaining predicted eye poses based on the headset
tracking pose, rendering the view for each eye and, finally, submitting eye textures to the compositor through
ovr_SubmitFrame. After the frame is submitted, the Oculus compositor handles distortion and presents it on
the Rift.

Before rendering frames it is helpful to initialize some data structures that can be shared across frames. As an
example, we query eye descriptors and initialize the layer structure outside of the rendering loop:

// Initialize VR structures, filling out description.


ovrEyeRenderDesc eyeRenderDesc[2];
ovrVector3f hmdToEyeViewOffset[2];
ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);
eyeRenderDesc[0] = ovr_GetRenderDesc(session, ovrEye_Left, hmdDesc.DefaultEyeFov[0]);
eyeRenderDesc[1] = ovr_GetRenderDesc(session, ovrEye_Right, hmdDesc.DefaultEyeFov[1]);
hmdToEyeViewOffset[0] = eyeRenderDesc[0].HmdToEyeOffset;
hmdToEyeViewOffset[1] = eyeRenderDesc[1].HmdToEyeOffset;

// Initialize our single full screen Fov layer.


ovrLayerEyeFov layer;
layer.Header.Type = ovrLayerType_EyeFov;
layer.Header.Flags = 0;
layer.ColorTexture[0] = textureSwapChain;
layer.ColorTexture[1] = textureSwapChain;
layer.Fov[0] = eyeRenderDesc[0].Fov;
layer.Fov[1] = eyeRenderDesc[1].Fov;
layer.Viewport[0] = Recti(0, 0, bufferSize.w / 2, bufferSize.h);
layer.Viewport[1] = Recti(bufferSize.w / 2, 0, bufferSize.w / 2, bufferSize.h);
// layer.RenderPose and layer.SensorSampleTime are updated later per frame.

This code example first gets rendering descriptors for each eye, given the chosen FOV. The returned
ovrEyeRenderDesc structure contains useful values for rendering, including the HmdToEyeOffset for each
eye. Eye view offsets are used later to adjust for eye separation.

The code also initializes the ovrLayerEyeFov structure for a full screen layer. Starting with Oculus SDK
0.6, frame submission uses layers to composite multiple view images or texture quads on top of each other.
This example uses a single layer to present a VR scene. For this purpose, we use ovrLayerEyeFov, which
describes a dual-eye layer that covers the entire eye field of view. Since we are using the same texture set for
both eyes, we initialize both eye color textures to textureSwapChain and configure viewports to draw to the left
and right sides of this shared texture, respectively.
Note: Although it is often enough to initialize viewports once in the beginning, specifying them as a
part of the layer structure that is submitted every frame allows applications to change render target size
dynamically, if desired. This is useful for optimizing rendering performance.

After setup completes, the application can run the rendering loop. First, we need to get the eye poses to
render the left and right views.

// Get both eye poses simultaneously, with IPD offset already included.
double displayMidpointSeconds = ovr_GetPredictedDisplayTime(session, 0);
ovrTrackingState hmdState = ovr_GetTrackingState(session, displayMidpointSeconds, ovrTrue);
ovr_CalcEyePoses(hmdState.HeadPose.ThePose, hmdToEyeViewOffset, layer.RenderPose);

In VR, rendered eye views depend on the headset position and orientation in the physical space, tracked with
the help of internal IMU and external sensors. Prediction is used to compensate for the latency in the system,
giving the best estimate for where the headset will be when the frame is displayed on the headset. In the
Oculus SDK, this tracked, predicted pose is reported by ovr_GetTrackingState.

To do accurate prediction, ovr_GetTrackingState needs to know when the current frame will actually be
displayed. The code above calls ovr_GetPredictedDisplayTime to obtain displayMidpointSeconds for
the current frame, using it to compute the best predicted tracking state. The head pose from the tracking state
is then passed to ovr_CalcEyePoses to calculate correct view poses for each eye. These poses are stored
directly into the layer.RenderPose[2] array. With eye poses ready, we can proceed to the actual frame
rendering.

if (isVisible)
{
    // Get next available index of the texture swap chain
    int currentIndex = 0;
    ovr_GetTextureSwapChainCurrentIndex(session, textureSwapChain, &currentIndex);

    // Clear and set up render-target.
    DIRECTX.SetAndClearRenderTarget(pTexRtv[currentIndex], pEyeDepthBuffer);

    // Render Scene to Eye Buffers
    for (int eye = 0; eye < 2; eye++)
    {
        // Get view and projection matrices for the Rift camera
        Vector3f pos = originPos + originRot.Transform(layer.RenderPose[eye].Position);
        Matrix4f rot = originRot * Matrix4f(layer.RenderPose[eye].Orientation);

        Vector3f finalUp = rot.Transform(Vector3f(0, 1, 0));
        Vector3f finalForward = rot.Transform(Vector3f(0, 0, -1));
        Matrix4f view = Matrix4f::LookAtRH(pos, pos + finalForward, finalUp);

        Matrix4f proj = ovrMatrix4f_Projection(layer.Fov[eye], 0.2f, 1000.0f, 0);

        // Render the scene for this eye.
        DIRECTX.SetViewport(layer.Viewport[eye]);
        roomScene.Render(proj * view, 1, 1, 1, 1, true);
    }

    // Commit the changes to the texture swap chain
    ovr_CommitTextureSwapChain(session, textureSwapChain);
}

// Submit frame with one layer we have.
ovrLayerHeader* layers = &layer.Header;
ovrResult result = ovr_SubmitFrame(session, 0, nullptr, &layers, 1);
isVisible = (result == ovrSuccess);

This code takes a number of steps to render the scene:

• It applies the texture as a render target and clears it for rendering. In this case, the same texture is used for
both eyes.
• The code then computes view and projection matrices and sets the viewport for scene rendering for each eye.
In this example, view calculation combines the original pose (originPos and originRot values) with
the new pose computed based on the tracking state and stored in the layer. These original values can be
modified by input to move the player within the 3D world.
• After texture rendering is complete, we call ovr_SubmitFrame to pass frame data to the compositor. From
this point, the compositor takes over by accessing texture data through shared memory, distorting it, and
presenting it on the Rift.

ovr_SubmitFrame returns once the submitted frame is queued up and the runtime is available to accept a
new frame. When successful, its return value is either ovrSuccess or ovrSuccess_NotVisible.
ovrSuccess_NotVisible is returned if the frame wasn't actually displayed, which can happen when VR
application loses focus. Our sample code handles this case by updating the isVisible flag, checked by the
rendering logic. While frames are not visible, rendering should be paused to eliminate unnecessary GPU load.

If you receive ovrError_DisplayLost, the device was removed and the session is invalid. Release the
shared resources (ovr_DestroyTextureSwapChain), destroy the session (ovr_Destroy), recreate it (ovr_Create),
and create new resources (ovr_CreateTextureSwapChainXXX). The application's existing private graphics
resources do not need to be recreated unless the new ovr_Create call returns a different GraphicsLuid.
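
A sketch of that recovery flow (the resource re-creation details are application-specific):

ovrResult result = ovr_SubmitFrame(session, frameIndex, nullptr, &layers, 1);
if (result == ovrError_DisplayLost)
{
    ovr_DestroyTextureSwapChain(session, textureSwapChain);
    ovr_Destroy(session);

    ovrGraphicsLuid newLuid;
    if (OVR_SUCCESS(ovr_Create(&session, &newLuid)))
    {
        // Recreate texture swap chains here; recreate private graphics
        // resources only if newLuid differs from the original LUID.
    }
}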

Frame Timing
The Oculus SDK reports frame timing information through the ovr_GetPredictedDisplayTime function,
relying on the application-provided frame index to ensure correct timing is reported across different threads.

Accurate frame and sensor timing are required for accurate head motion prediction, which is essential for a
good VR experience. Prediction requires knowing exactly when in the future the current frame will appear
on the screen. If we know both sensor and display scanout times, we can predict the future head pose and
improve image stability. Computing these values incorrectly can lead to under or over-prediction, degrading
perceived latency, and potentially causing overshoot “wobbles”.
To ensure accurate timing, the Oculus SDK uses absolute system time, stored as a double, to represent sensor
and frame timing values. The current absolute time is returned by ovr_GetTimeInSeconds. Current time
should rarely be used, however, since simulation and motion prediction will produce better results when relying
on the timing values returned by ovr_GetPredictedDisplayTime. This function has the following signature:

double ovr_GetPredictedDisplayTime(ovrSession session, long long frameIndex);

The frameIndex argument specifies which application frame we are rendering. Applications that make use
of multi-threaded rendering must keep an internal frame index and manually increment it, passing it across
threads along with frame data to ensure correct timing and prediction. The same frameIndex value must be
passed to ovr_SubmitFrame as was used to obtain timing for the frame. The details of multi-threaded timing
are covered in the next section, Rendering on Different Threads on page 56.

A special frameIndex value of 0 can be used in both functions to request that the SDK keep track of frame
indices automatically. However, this only works when all frame timing requests and render submission is done
on the same thread.

Rendering on Different Threads


In some engines, render processing is distributed across more than one thread.

For example, one thread may perform culling and render setup for each object in the scene (we'll call this the
“main” thread), while a second thread makes the actual D3D or OpenGL API calls (we'll call this the “render”
thread). Both of these threads may need accurate estimates of frame display time, so as to compute best
possible predictions of head pose.

The asynchronous nature of this approach makes this challenging: while the render thread is rendering a frame,
the main thread might be processing the next frame. This parallel frame processing may be out of sync by
exactly one frame or a fraction of a frame, depending on game engine design. If we used the default global
state to access frame timing, the result of ovr_GetPredictedDisplayTime could either be off by one frame
depending which thread the function is called from, or worse, could be randomly incorrect depending on how
threads are scheduled. To address this issue, the previous section introduced the concept of a frameIndex that
is tracked by the application and passed across threads along with frame data.

For the multi-threaded rendering result to be correct, the following must be true: (a) pose prediction, computed
based on frame timing, must be consistent for the same frame regardless of which thread it is accessed from;
and (b) eye poses that were actually used for rendering must be passed into ovr_SubmitFrame, along with
the frame index.
Here is a summary of steps you can take to ensure this is the case:

1. The main thread needs to assign a frame index to the current frame being processed for rendering. It would
increment this index each frame and pass it to ovr_GetPredictedDisplayTime to obtain the correct timing
for pose prediction.
2. The main thread should call the thread safe function ovr_GetTrackingState with the predicted time
value. It can also call ovr_CalcEyePoses if necessary for rendering setup.
3. The main thread needs to pass the current frame index and eye poses to the render thread, along with any
rendering commands or frame data it needs.
4. When the rendering commands are executed on the render thread, developers need to make sure these
things hold:

a. The actual poses used for frame rendering are stored into the RenderPose for the layer.
b. The same value of frameIndex as was used on the main thread is passed into ovr_SubmitFrame.

The following code illustrates this in more detail:

void MainThreadProcessing()
{
    frameIndex++;

    // Ask the API for the times when this frame is expected to be displayed.
    double frameTiming = ovr_GetPredictedDisplayTime(session, frameIndex);

    // Get the corresponding predicted pose state.
    ovrTrackingState state = ovr_GetTrackingState(session, frameTiming, ovrTrue);
    ovrPosef eyePoses[2];
    ovr_CalcEyePoses(state.HeadPose.ThePose, hmdToEyeViewOffset, eyePoses);

    SetFrameHMDData(frameIndex, eyePoses);

    // Do render pre-processing for this frame.
    ...
}

void RenderThreadProcessing()
{
    int frameIndex;
    ovrPosef eyePoses[2];

    GetFrameHMDData(&frameIndex, eyePoses);
    layer.RenderPose[0] = eyePoses[0];
    layer.RenderPose[1] = eyePoses[1];

    // Execute actual rendering to eye textures.
    ...

    // Submit frame with one layer we have.
    ovrLayerHeader* layers = &layer.Header;
    ovrResult result = ovr_SubmitFrame(session, frameIndex, nullptr, &layers, 1);
}

The Oculus SDK also supports Direct3D 12, which allows submitting rendering work to the GPU from multiple
CPU threads. When the application calls ovr_CreateTextureSwapChainDX, the Oculus SDK caches off the
ID3D12CommandQueue provided by the caller for future usage. As the application calls ovr_SubmitFrame,
the SDK drops a fence on the cached ID3D12CommandQueue to know exactly when a given set of eye-
textures are ready for the SDK compositor.

For a given application, using a single ID3D12CommandQueue on a single thread is the easiest. But, it might
also split the CPU rendering workload for each eye-texture pair or push non-eye-texture rendering work, such
as shadows, reflection maps, and so on, onto different command queues. If the application populates and
executes command lists from multiple threads, it will also have to make sure that the ID3D12CommandQueue

provided to the SDK is the single join-node for the eye-texture rendering work executed through different
command queues.

Layers
Similar to the way a monitor view can be composed of multiple windows, the display on the headset can be
composed of multiple layers. Typically at least one of these layers will be a view rendered from the user's virtual
eyeballs, but other layers may be HUD layers, information panels, text labels attached to items in the world,
aiming reticles, and so on.

Each layer can have a different resolution, can use a different texture format, can use a different field of view or
size, and might be in mono or stereo. The application can also be configured to not update a layer's texture if
the information in it has not changed. For example, it might not update if the text in an information panel has
not changed since last frame or if the layer is a picture-in-picture view of a video stream with a low framerate.
Applications can supply mipmapped textures to a layer and, together with a high-quality distortion mode, this
is very effective at improving the readability of text panels.
Every frame, all active layers are composited from back to front using pre-multiplied alpha blending. Layer 0 is
the furthest layer, layer 1 is on top of it, and so on; there is no depth-buffer intersection testing of layers, even if
a depth-buffer is supplied.

A powerful feature of layers is that each can be a different resolution. This allows an application to scale to
lower performance systems by dropping resolution on the main eye-buffer render that shows the virtual world,
but keeping essential information, such as text or a map, in a different layer at a higher resolution.

There are several layer types available:

EyeFov The standard "eye buffer" familiar from previous SDKs, which is typically a stereo
view of a virtual scene rendered from the position of the user's eyes. Although
eye buffers can be mono, this can cause discomfort. Previous SDKs had an
implicit field of view (FOV) and viewport; these are now supplied explicitly and the
application can change them every frame, if desired.
Quad A monoscopic image that is displayed as a rectangle at a given pose and
size in the virtual world. This is useful for heads-up-displays, text information,
object labels and so on. By default the pose is specified relative to the
user's real-world space and the quad will remain fixed in space rather than
moving with the user's head or body motion. For head-locked quads, use the
ovrLayerFlag_HeadLocked flag as described below.
EyeMatrix The EyeMatrix layer type is similar to the EyeFov layer type and is provided to
assist compatibility with Gear VR applications. For more information, refer to the
Mobile SDK documentation.
Disabled Ignored by the compositor, disabled layers do not cost performance. We
recommend that applications perform basic frustum-culling and disable layers that
are out of view. However, there is no need for the application to repack the list of
active layers tightly together when turning one layer off; disabling it and leaving it
in the list is sufficient. Equivalently, the pointer to the layer in the list can be set to
null.
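
For example, a sketch of disabling an off-screen quad layer for a frame (QuadVisible stands in for an application-side frustum test):

if (QuadVisible(hudLayer))                         // assumed app-side visibility test
    hudLayer.Header.Type = ovrLayerType_Quad;      // re-enable when visible again
else
    hudLayer.Header.Type = ovrLayerType_Disabled;  // compositor skips it at no cost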

Each layer style has a corresponding member of the ovrLayerType enum, and an associated
structure holding the data required to display that layer. For example, the EyeFov layer is type number
ovrLayerType_EyeFov and is described by the data in the structure ovrLayerEyeFov. These structures
share a similar set of parameters, though not all layer types require all parameters:

Parameter Type Description

Header.Type enum ovrLayerType Must be set by all layers to specify what type they are.
Header.Flags A bitfield of ovrLayerFlags. See below for more information.
ColorTexture TextureSwapChain Provides color and translucency data for the layer. Layers are
blended over one another using premultiplied alpha. This
allows them to express either lerp-style blending, additive
blending, or a combination of the two. Layer textures must
be RGBA or BGRA formats and might have mipmaps, but
cannot be arrays, cubes, or have MSAA. If the application
desires to do MSAA rendering, then it must resolve the
intermediate MSAA color texture into the layer's non-MSAA
ColorTexture.
Viewport ovrRecti The rectangle of the texture that is actually used, specified
in 0-1 texture "UV" coordinate space (not pixels). In theory,
texture data outside this region is not visible in the layer.
However, the usual caveats about texture sampling apply,
especially with mipmapped textures. It is good practice to
leave a border of RGBA(0,0,0,0) pixels around the displayed
region to avoid "bleeding," especially between two eye
buffers packed side by side into the same texture. The size
of the border depends on the exact usage case, but around
8 pixels seems to work well in most cases.
Fov ovrFovPort The field of view used to render the scene in an Eye layer
type. Note this does not control the HMD's display, it
simply tells the compositor what FOV was used to render
the texture data in the layer - the compositor will then
adjust appropriately to whatever the actual user's FOV
is. Applications may change FOV dynamically for special
effects. Reducing FOV may also help with performance on
slower machines, though typically it is more effective to
reduce resolution before reducing FOV.
RenderPose ovrPosef The camera pose the application used to render the scene
in an Eye layer type. This is typically predicted by the SDK
and application using the ovr_GetTrackingState and
ovr_CalcEyePoses functions. The difference between this
pose and the actual pose of the eye at display time is used
by the compositor to apply timewarp to the layer.
SensorSampleTime double The absolute time when the application sampled the
tracking state. The typical way to acquire this value is to
have an ovr_GetTimeInSeconds call right next to the
ovr_GetTrackingState call. The SDK uses this value
to report the application's motion-to-photon latency in the
Performance HUD. If the application has more than one
ovrLayerType_EyeFov layer submitted at any given
frame, the SDK scrubs through those layers and selects
the timing with the lowest latency. In a given frame, if no
ovrLayerType_EyeFov layers are submitted, the SDK
will use the point in time when ovr_GetTrackingState
was called with the latencyMarker set to ovrTrue as the
substitute application motion-to-photon latency time.
QuadPoseCenter ovrPosef Specifies the orientation and position of the center
point of a Quad layer type. The supplied direction is
the vector perpendicular to the quad. The position is
in real-world meters (not the application's virtual world,
the actual world the user is in) and is relative to the
"zero" position set by ovr_RecenterTrackingOrigin
or ovr_SpecifyTrackingOrigin unless the
ovrLayerFlag_HeadLocked flag is used.
QuadSize ovrVector2f Specifies the width and height of a Quad layer type. As with
position, this is in real-world meters.

Layers that take stereo information (all those except Quad layer types) take two sets of most parameters, and
these can be used in three different ways:

• Stereo data, separate textures—the app supplies a different ovrTextureSwapChain for the left and right
eyes, and a viewport for each.
• Stereo data, shared texture—the app supplies the same ovrTextureSwapChain for both left and right
eyes, but a different viewport for each. This allows the application to render both left and right views to
the same texture buffer. Remember to add a small buffer between the two views to prevent "bleeding", as
discussed above.
• Mono data—the app supplies the same ovrTextureSwapChain for both left and right eyes, and the same
viewport for each.

Texture and viewport sizes may be different for the left and right eyes, and each can even have different fields
of view. However beware of causing stereo disparity and discomfort in your users.

The Header.Flags field available for all layers is a logical-or of the following:

• ovrLayerFlag_HighQuality—enables 4x anisotropic sampling in the compositor for this layer. This
can provide a significant increase in legibility, especially when used with a texture containing mipmaps;
this is recommended for high-frequency images such as text or diagrams and when used with the Quad
layer types. For Eye layer types, it will also increase visual fidelity towards the periphery, or when feeding in
textures that have more than the 1:1 recommended pixel density. For best results, when creating mipmaps
for the textures associated to the particular layer, make sure the texture sizes are a power of 2. However, the
application does not need to render to the whole texture; a viewport that renders to the recommended size
in the texture will provide the best performance-to-quality ratios.
• ovrLayerFlag_TextureOriginAtBottomLeft—the origin of a layer's texture is assumed to be at the
top-left corner. However, some engines (particularly those using OpenGL) prefer to use the bottom-left
corner as the origin, and they should use this flag.
• ovrLayerFlag_HeadLocked—Most layer types have their pose orientation and position specified relative
to the "zero position" defined by calling ovr_RecenterTrackingOrigin. However the app may wish to
specify a layer's pose relative to the user's face. When the user moves their head, the layer follows. This is
useful for reticles used in gaze-based aiming or selection. This flag may be used for all layer types, though it
has no effect when used on the Direct type.
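
As a sketch, a head-locked aiming reticle might be set up as follows (reticleSwapChain is an assumed swap chain holding the reticle image):

ovrLayerQuad reticleLayer;
reticleLayer.Header.Type = ovrLayerType_Quad;
reticleLayer.Header.Flags = ovrLayerFlag_HeadLocked | ovrLayerFlag_HighQuality;
reticleLayer.ColorTexture = reticleSwapChain;              // assumed swap chain
// 40cm straight ahead in view space, following all head motion.
reticleLayer.QuadPoseCenter.Orientation = { 0, 0, 0, 1 };  // identity (x, y, z, w)
reticleLayer.QuadPoseCenter.Position = { 0.0f, 0.0f, -0.4f };
reticleLayer.QuadSize = { 0.05f, 0.05f };                  // 5cm square
// Set reticleLayer.Viewport to cover the texture, as in the HUD example below.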

At the end of each frame, after rendering to whichever ovrTextureSwapChain the application wants
to update and calling ovr_CommitTextureSwapChain, the data for each layer is put into the relevant
ovrLayerEyeFov / ovrLayerQuad / ovrLayerDirect structure. The application then creates a list of
pointers to those layer structures, specifically to the Header field which is guaranteed to be the first member of
each structure. Then the application builds a ovrViewScaleDesc struct with the required data, and calls the
ovr_SubmitFrame function.

// Create eye layer.


ovrLayerEyeFov eyeLayer;
eyeLayer.Header.Type = ovrLayerType_EyeFov;
eyeLayer.Header.Flags = 0;
for ( int eye = 0; eye < 2; eye++ )
{
eyeLayer.ColorTexture[eye] = EyeBufferSet[eye];
eyeLayer.Viewport[eye] = EyeViewport[eye];
eyeLayer.Fov[eye] = EyeFov[eye];
eyeLayer.RenderPose[eye] = EyePose[eye];
}

// Create HUD layer, fixed to the player's torso


ovrLayerQuad hudLayer;
hudLayer.Header.Type = ovrLayerType_Quad;
hudLayer.Header.Flags = ovrLayerFlag_HighQuality;
hudLayer.ColorTexture = TheHudTextureSwapChain;
// 50cm in front and 20cm down from the player's nose,
// fixed relative to their torso.
hudLayer.QuadPoseCenter.Position.x = 0.00f;
hudLayer.QuadPoseCenter.Position.y = -0.20f;
hudLayer.QuadPoseCenter.Position.z = -0.50f;
hudLayer.QuadPoseCenter.Orientation.x = 0;
hudLayer.QuadPoseCenter.Orientation.y = 0;
hudLayer.QuadPoseCenter.Orientation.z = 0;
hudLayer.QuadPoseCenter.Orientation.w = 1;
// HUD is 50cm wide, 30cm tall.
hudLayer.QuadSize.x = 0.50f;
hudLayer.QuadSize.y = 0.30f;
// Display all of the HUD texture.
hudLayer.Viewport.Pos.x = 0.0f;
hudLayer.Viewport.Pos.y = 0.0f;
hudLayer.Viewport.Size.w = 1.0f;
hudLayer.Viewport.Size.h = 1.0f;

// The list of layers.


ovrLayerHeader *layerList[2];
layerList[0] = &eyeLayer.Header;
layerList[1] = &hudLayer.Header;

// Set up positional data.


ovrViewScaleDesc viewScaleDesc;
viewScaleDesc.HmdSpaceToWorldScaleInMeters = 1.0f;
viewScaleDesc.HmdToEyeOffset[0] = HmdToEyeOffset[0];
viewScaleDesc.HmdToEyeOffset[1] = HmdToEyeOffset[1];

ovrResult result = ovr_SubmitFrame(session, 0, &viewScaleDesc, layerList, 2);

The compositor performs timewarp, distortion, and chromatic aberration correction on each layer separately
before blending them together. The traditional method of rendering a quad to the eye buffer involves two
filtering steps (once to the eye buffer, then once during distortion). Using layers, there is only a single filtering
step between the layer image and the final framebuffer. This can provide a substantial improvement in text
quality, especially when combined with mipmaps and the ovrLayerFlag_HighQuality flag.

One current disadvantage of layers is that no post-processing can be performed on the final composited
image, such as soft-focus effects, light-bloom effects, or the Z intersection of layer data. Some of these effects
can be performed on the contents of the layer with similar visual results.

Calling ovr_SubmitFrame queues the layers for display, and transfers control of the committed textures
inside the ovrTextureSwapChains to the compositor. It is important to understand that these textures
are being shared (rather than copied) between the application and the compositor threads, and that
composition does not necessarily happen at the time ovr_SubmitFrame is called, so care must be taken. To
continue rendering into a texture swap chain the application should always get the next available index with
ovr_GetTextureSwapChainCurrentIndex before rendering into it. For example:

// Create two TextureSwapChains to illustrate.


ovrTextureSwapChain eyeTextureSwapChain;
ovr_CreateTextureSwapChainDX ( ... &eyeTextureSwapChain );
ovrTextureSwapChain hudTextureSwapChain;
ovr_CreateTextureSwapChainDX ( ... &hudTextureSwapChain );

// Set up two layers.


ovrLayerEyeFov eyeLayer;
ovrLayerEyeFov hudLayer;
eyeLayer.Header.Type = ovrLayerType_EyeFov;
eyeLayer...etc... // set up the rest of the data.
hudLayer.Header.Type = ovrLayerType_Quad;
hudLayer...etc... // set up the rest of the data.
// the list of layers


ovrLayerHeader *layerList[2];
layerList[0] = &eyeLayer.Header;
layerList[1] = &hudLayer.Header;

// Each frame...
int currentIndex = 0;
ovr_GetTextureSwapChainCurrentIndex(... eyeTextureSwapChain, &currentIndex);
// Render into it. It is recommended the app use ovr_GetTextureSwapChainBufferDX for each index
// on texture chain creation to cache textures or create matching render target views. Each frame,
// the currentIndex value returned can be used to index directly into that cache.
ovr_CommitTextureSwapChain(... eyeTextureSwapChain);

ovr_GetTextureSwapChainCurrentIndex(... hudTextureSwapChain, &currentIndex);
// Render into it. It is recommended the app use ovr_GetTextureSwapChainBufferDX for each index
// on texture chain creation to cache textures or create matching render target views. Each frame,
// the currentIndex value returned can be used to index directly into that cache.
ovr_CommitTextureSwapChain(... hudTextureSwapChain);

eyeLayer.ColorTexture[0] = eyeTextureSwapChain;
eyeLayer.ColorTexture[1] = eyeTextureSwapChain;
hudLayer.ColorTexture = hudTextureSwapChain;
ovr_SubmitFrame(session, 0, nullptr, layerList, 2);

Asynchronous TimeWarp
Asynchronous TimeWarp (ATW) is a technique for reducing latency and judder in VR applications and
experiences.

In a basic VR game loop, the following occurs:

1. The software requests your head position.


2. The CPU processes the scene for each eye.
3. The GPU renders the scenes.
4. The Oculus Compositor applies distortion and displays the scenes on the headset.

The following shows a basic example of a game loop:

Figure 8: Basic Game Loop

When frame rate is maintained, the experience feels real and is enjoyable. When a frame is not rendered in
time, the previous frame is shown again, which can be disorienting. The following graphic shows an example of
judder during the basic game loop:

Figure 9: Basic Game Loop with Judder

When you move your head and the world doesn’t keep up, this can be jarring and break immersion.

ATW is a technique that shifts the rendered image slightly to adjust for changes in head movement. Although
the image is modified, your head does not move much, so the change is slight.

Additionally, to smooth issues with the user’s computer, game design or the operating system, ATW can help
fix “potholes” or moments when the frame rate unexpectedly drops.

The following graphic shows an example of frame drops when ATW is applied:

Figure 10: Game Loop with ATW

At the refresh interval, the Compositor applies TimeWarp to the last rendered frame. As a result, a TimeWarped
frame will always be shown to the user, regardless of frame rate. If the frame rate is very bad, flicker will be
noticeable at the periphery of the display. But, the image will still be stable.

ATW is automatically applied by the Oculus Compositor; you do not need to enable or tune it. However,
although ATW reduces latency, make sure that your application or experience maintains frame rate.

Adaptive Queue Ahead


To improve CPU and GPU parallelism and increase the amount of time that the GPU has to process a frame, the
SDK automatically applies queue ahead up to 1 frame.

Without queue ahead, the CPU begins processing the next frame immediately after the previous frame
displays. After the CPU finishes, the GPU processes the frame, the compositor applies distortion, and the frame
is displayed to the user. The following graphic shows CPU and GPU utilization without queue ahead:

Figure 11: CPU and GPU Utilization without Queue Ahead

If the GPU cannot process the frame in time for display, the previous frame displays. This results in judder.

With queue ahead, the CPU can start earlier; this provides the GPU more time to process the frame. The
following graphic shows CPU and GPU utilization with queue ahead:

Figure 12: CPU and GPU Utilization with Queue Ahead

Advanced Rendering Configuration


By default, the SDK generates configuration values that optimize for rendering quality.

It also provides a degree of flexibility. For example, you can make changes when creating render target
textures.

This section discusses changes you can make when choosing between rendering quality and performance, or if
the engine you are using imposes constraints.

Coping with Graphics API or Hardware Rendertarget Granularity


The SDK is designed with the assumption that you want to use your video memory as carefully as possible and
that you can create exactly the right render target size for your needs.

However, real video cards and real graphics APIs have size limitations (all have a maximum size; some also have
a minimum size). They might also have granularity restrictions, for example, only being able to create render
targets that are a multiple of 32 pixels in size or having a limit on possible aspect ratios. As an application
developer, you can also impose extra restrictions to avoid using too much graphics memory.

In addition to the above, the size of the actual render target surface in memory might not necessarily be the
same size as the portion that is rendered to. The latter may be slightly smaller. However, since it is specified as
a viewport, it typically does not have any granularity restrictions. When you bind the render target as a texture,
however, it is the full surface that is used, and so the UV coordinates must be corrected for the difference
between the size of the rendering and the size of the surface it is on. The API will do this for you, but you need
to tell it the relevant information.

The following code shows a two-stage approach for setting render target resolution. The code first calls
ovr_GetFovTextureSize to compute the ideal size of the render target. Next, the graphics library is called
to create a render target of the desired resolution. In general, due to idiosyncrasies of the platform and
hardware, the resulting texture size might be different from that requested.

// Get recommended left and right eye render target sizes.
ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);
Sizei recommendedTex0Size = ovr_GetFovTextureSize(session, ovrEye_Left,
                                                  hmdDesc.DefaultEyeFov[0], pixelsPerDisplayPixel);
Sizei recommendedTex1Size = ovr_GetFovTextureSize(session, ovrEye_Right,
                                                  hmdDesc.DefaultEyeFov[1], pixelsPerDisplayPixel);

// Determine dimensions to fit into a single render target.
Sizei renderTargetSize;
renderTargetSize.w = recommendedTex0Size.w + recommendedTex1Size.w;
renderTargetSize.h = max(recommendedTex0Size.h, recommendedTex1Size.h);

// Create texture.
pRendertargetTexture = pRender->CreateTexture(renderTargetSize.w, renderTargetSize.h);

// The actual RT size may be different due to HW limits.
renderTargetSize.w = pRendertargetTexture->GetWidth();
renderTargetSize.h = pRendertargetTexture->GetHeight();

// Initialize eye rendering information.
// The viewport sizes are re-computed in case RenderTargetSize changed due to HW limitations.
ovrFovPort eyeFov[2] = { hmdDesc.DefaultEyeFov[0], hmdDesc.DefaultEyeFov[1] };

EyeRenderViewport[0].Pos = Vector2i(0,0);
EyeRenderViewport[0].Size = Sizei(renderTargetSize.w / 2, renderTargetSize.h);
EyeRenderViewport[1].Pos = Vector2i((renderTargetSize.w + 1) / 2, 0);
EyeRenderViewport[1].Size = EyeRenderViewport[0].Size;

This data is passed into ovr_SubmitFrame as part of the layer description.

You are free to choose the render target texture size and left and right eye viewports as you like, provided
that you specify these values in the layer description when calling ovr_SubmitFrame. However, using
ovr_GetFovTextureSize will ensure that you allocate the optimum size for the particular HMD in use. The
following sections describe how to modify the default configurations to make quality and performance trade-
offs. You should also note that the API supports using different render targets for each eye if that is required
by your engine (although using a single render target is likely to perform better since it will reduce context
switches). OculusWorldDemo allows you to toggle between using a single combined render target versus
separate ones for each eye, by navigating to the settings menu (press the Tab key) and selecting the Share
RenderTarget option.

Forcing a Symmetrical Field of View


Typically the API will return an FOV for each eye that is not symmetrical, meaning the left edge is not the same
distance from the center as the right edge.

This is because humans, as well as the Rift, have a wider FOV when looking outwards. When you look inwards,
your nose is in the way. We are also better at looking down than we are at looking up. For similar reasons, the
Rift’s view is not symmetrical. It is controlled by the shape of the lens, various bits of plastic, and the edges of
the screen. The exact details depend on the shape of your face, your IPD, and where precisely you place the
Rift on your face; all of this is set up in the configuration tool and stored in the user profile. All of this means
that almost nobody has all four edges of their FOV set to the same angle, so the frustum produced will be off-
center. In addition, most people will not have the same fields of view for both their eyes. They will be close, but
rarely identical.

As an example, on our first generation DK1 headset, the author’s left eye has the following FOV:

• 53.6 degrees up
• 58.9 degrees down
• 50.3 degrees inwards (towards the nose)
• 58.7 degrees outwards (away from the nose)

In the code and documentation, these are referred to as ‘half angles’ because traditionally a FOV is expressed
as the total edge-to-edge angle. In this example, the total horizontal FOV is 50.3+58.7 = 109.0 degrees, and
the total vertical FOV is 53.6+58.9 = 112.5 degrees.

The recommended and maximum fields of view can be accessed from the HMD as shown below:

ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);

ovrFovPort defaultLeftFOV = hmdDesc.DefaultEyeFov[ovrEye_Left];
ovrFovPort maxLeftFOV = hmdDesc.MaxEyeFov[ovrEye_Left];

DefaultEyeFov refers to the recommended FOV values based on the current user’s profile settings (IPD, eye
relief etc). MaxEyeFov refers to the maximum FOV that the headset can possibly display, regardless of profile
settings.

The default values provide a good user experience with no unnecessary additional GPU load. If your application
does not consume significant GPU resources, you might want to use the maximum FOV settings to reduce
reliance on the accuracy of the profile settings. You might provide a slider in the application control panel
that enables users to choose interpolated FOV settings between the default and the maximum. But, if your
application is heavy on GPU usage, you might want to reduce the FOV below the default values as described in
Improving Performance by Decreasing Field of View on page 67.

The FOV angles for up, down, left, and right (expressed as the tangents of the half-angles) are the most
convenient form for setting up culling or portal boundaries in your graphics engine. The FOV values are also used
to determine the projection matrix used during left and right eye scene rendering. We provide an API utility
function ovrMatrix4f_Projection for this purpose:

ovrFovPort fov;

// Determine fov.
...

ovrMatrix4f projMatrix = ovrMatrix4f_Projection(fov, znear, zfar, 0);

It is common for the top and bottom edges of the FOV to not be the same as the left and right edges when
viewing a PC monitor. This is commonly called the ‘aspect ratio’ of the display, and very few displays are
square. However, some graphics engines do not support off-center frustums. To be compatible with these
engines, you will need to modify the FOV values reported by the ovrHmdDesc struct. In general, it is better to
grow the edges than to shrink them. This will put a little more strain on the graphics engine, but will give the
user the full immersive experience, even if they won’t be able to see some of the pixels being rendered.

Some graphics engines require that you express symmetrical horizontal and vertical fields of view, and some
need an even less direct method such as a horizontal FOV and an aspect ratio. Some also object to having
frequent changes of FOV, and may insist that both eyes be set to the same. The following is an example of
code for handling this restrictive case:

ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);
ovrFovPort fovLeft = hmdDesc.DefaultEyeFov[ovrEye_Left];
ovrFovPort fovRight = hmdDesc.DefaultEyeFov[ovrEye_Right];

ovrFovPort fovMax = FovPort::Max(fovLeft, fovRight);

float combinedTanHalfFovHorizontal = max ( fovMax.LeftTan, fovMax.RightTan );
float combinedTanHalfFovVertical = max ( fovMax.UpTan, fovMax.DownTan );

ovrFovPort fovBoth;
fovBoth.LeftTan = fovBoth.RightTan = combinedTanHalfFovHorizontal;
fovBoth.UpTan = fovBoth.DownTan = combinedTanHalfFovVertical;

// Create render target.
Sizei recommendedTex0Size = ovr_GetFovTextureSize(session, ovrEye_Left,
                                                  fovBoth, pixelsPerDisplayPixel);
Sizei recommendedTex1Size = ovr_GetFovTextureSize(session, ovrEye_Right,
                                                  fovBoth, pixelsPerDisplayPixel);

...

// Initialize rendering info.


ovrFovPort eyeFov[2];
eyeFov[0] = fovBoth;
eyeFov[1] = fovBoth;

...

// Compute the parameters to feed to the rendering engine.


// In this case we are assuming it wants a horizontal FOV and an aspect ratio.
float horizontalFullFovInRadians = 2.0f * atanf ( combinedTanHalfFovHorizontal );
float aspectRatio = combinedTanHalfFovHorizontal / combinedTanHalfFovVertical;

GraphicsEngineSetFovAndAspect ( horizontalFullFovInRadians, aspectRatio );


...

Note: You will need to determine FOV before creating the render targets, since FOV affects the size of
the recommended render target required for a given quality.

Improving Performance by Decreasing Pixel Density


The DK1 has a resolution of 1280x800 pixels, split between the two eyes. However, because of the wide FOV of
the Rift and the way perspective projection works, the size of the intermediate render target required to match
the native resolution in the center of the display is significantly higher.

For example, to achieve a 1:1 pixel mapping in the center of the screen for the author’s field-of-view settings
on a DK1 requires a much larger render target that is 2000x1056 pixels in size.

Even if modern graphics cards can render this resolution at the required 60Hz, future HMDs might have
significantly higher resolutions. For virtual reality, dropping below 60Hz provides a terrible user experience; it is
always better to decrease the resolution to maintain framerate. This is similar to a user having a high resolution
2560x1600 monitor. Very few 3D applications can run at this native resolution at full speed, so most allow the
user to select a lower resolution which the monitor upscales to fill the screen.

You can use the same strategy on the HMD. That is, run it at a lower video resolution and let the hardware
upscale for you. However, this introduces two steps of filtering: one by the distortion processing and one by the
video upscaler. Unfortunately, this double filtering introduces significant artifacts. It is usually more effective to
leave the video mode at the native resolution, but limit the size of the intermediate render target. This gives a
similar increase in performance, but preserves more detail.
One way to resolve this is to allow the user to adjust the resolution through a resolution selector. However, the
actual resolution of the render target depends on the user’s configuration, rather than a standard hardware
setting. This means that the ‘native’ resolution is different for different people. Additionally, presenting
resolutions higher than the physical hardware resolution might confuse some users. They might not understand
that selecting 1280x800 is a significant drop in quality, even though this is the resolution reported by the
hardware.

A better option is to modify the pixelsPerDisplayPixel value that is passed to the
ovr_GetFovTextureSize function. This could also be based on a slider presented in the application’s render
settings. This determines the relative size of render target pixels as they map to pixels at the center of the
display surface. For example, a value of 0.5 would reduce the render target size from 2000x1056 to 1000x528
pixels, which might allow mid-range PC graphics cards to maintain 60Hz.

float pixelsPerDisplayPixel = GetPixelsPerDisplayFromApplicationSettings();

Sizei recommendedTexSize = ovr_GetFovTextureSize(session, ovrEye_Left, fovLeft,
                                                 pixelsPerDisplayPixel);

Although you can set the parameter to a value larger than 1.0 to produce a higher-resolution intermediate
render target, Oculus hasn't observed any useful increase in quality and it has a high performance cost.

OculusWorldDemo allows you to experiment with changing the render target pixel density. Navigate to the
settings menu (press the Tab key) and select Pixel Density. Press the up and down arrow keys to adjust the pixel
density at the center of the eye projection. A value of 1.0 sets the render target pixel density to match the
display surface 1:1 at this point on the display. A value of 0.5 sets the density of the render target pixels to half
of the display surface. Additionally, you can select Dynamic Res Scaling, which causes the pixel density to
automatically adjust between 0 and 1.

Improving Performance by Decreasing Field of View


In addition to reducing the number of pixels in the intermediate render target, you can increase performance
by decreasing the FOV that the pixels are stretched across.

Depending on the reduction, this can result in tunnel vision which decreases the sense of immersion.
Nevertheless, reducing the FOV increases performance in two ways. The most obvious is fillrate. For a fixed
pixel density on the retina, a lower FOV has fewer pixels. Because of the properties of projective math, the
outermost edges of the FOV are the most expensive in terms of numbers of pixels. The second reason is that
there are fewer objects visible in each frame which implies less animation, fewer state changes, and fewer draw
calls.

Reducing the FOV set by the player is a very painful choice to make. One of the key experiences of virtual
reality is being immersed in the simulated world, and a large part of that is the wide FOV. Losing that aspect is
not a thing we would ever recommend happily. However, if you have already sacrificed as much resolution as
you can, and the application is still not running at 60Hz on the user’s machine, this is an option of last resort.

We recommend giving players a Maximum FOV slider that defines the four edges of each eye’s FOV.

ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);
ovrFovPort defaultFovLeft = hmdDesc.DefaultEyeFov[ovrEye_Left];
ovrFovPort defaultFovRight = hmdDesc.DefaultEyeFov[ovrEye_Right];

float maxFovAngle = ...get value from game settings panel...;
float maxTanHalfFovAngle = tanf ( DegreeToRad ( 0.5f * maxFovAngle ) );

ovrFovPort newFovLeft = FovPort::Min(defaultFovLeft, FovPort(maxTanHalfFovAngle));


ovrFovPort newFovRight = FovPort::Min(defaultFovRight, FovPort(maxTanHalfFovAngle));

// Create render target.
Sizei recommendedTex0Size = ovr_GetFovTextureSize(session, ovrEye_Left, newFovLeft,
                                                  pixelsPerDisplayPixel);
Sizei recommendedTex1Size = ovr_GetFovTextureSize(session, ovrEye_Right, newFovRight,
                                                  pixelsPerDisplayPixel);

...

// Initialize rendering info.


ovrFovPort eyeFov[2];
eyeFov[0] = newFovLeft;
eyeFov[1] = newFovRight;

...

// Determine projection matrices.


ovrMatrix4f projMatrixLeft = ovrMatrix4f_Projection(newFovLeft, znear, zfar, 0);
ovrMatrix4f projMatrixRight = ovrMatrix4f_Projection(newFovRight, znear, zfar, 0);

It might be interesting to experiment with non-square fields of view. For example, clamping the up and down
ranges significantly (e.g. 70 degrees FOV) while retaining the full horizontal FOV for a ‘Cinemascope’ feel.
OculusWorldDemo allows you to experiment with reducing the FOV below the defaults. Navigate to the
settings menu (press the Tab key) and select the “Max FOV” value. Press the up and down arrows to change
the maximum angle in degrees.

Improving Performance by Rendering in Mono


A significant cost of stereo rendering is rendering two views, one for each eye.

For some applications, the stereoscopic aspect may not be particularly important and a monocular view might
be acceptable in return for some performance. In other cases, some users may get eye strain from a stereo view
and wish to switch to a monocular one. However, they still wish to wear the HMD as it gives them a high FOV
and head-tracking.

OculusWorldDemo allows the user to toggle mono render mode by pressing the F7 key.

To render in mono, your code should have the following changes:


• Set the FOV to the maximum symmetrical FOV based on both eyes.
• Call ovr_GetFovTextureSize with this FOV to determine the recommended render target size.
• Configure both eyes to use the same render target and the same viewport when calling
ovr_SubmitFrame.
• Render the scene once to the shared render target.
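
The following is a minimal sketch of these steps, assuming fovBoth has been computed as the maximum
symmetrical FOV (as in Forcing a Symmetrical Field of View above) and that monoTextureChain is a texture
swap chain your application created:

Sizei monoSize = ovr_GetFovTextureSize(session, ovrEye_Left, fovBoth, 1.0f);

ovrLayerEyeFov layer;
layer.Header.Type = ovrLayerType_EyeFov;
layer.Header.Flags = 0;
for (int eye = 0; eye < 2; ++eye)
{
    layer.ColorTexture[eye] = monoTextureChain; // both eyes share one render target
    layer.Viewport[eye].Pos = Vector2i(0, 0);   // and the same full-size viewport
    layer.Viewport[eye].Size = monoSize;
    layer.Fov[eye] = fovBoth;
}
// Render the scene once into monoTextureChain, fill in layer.RenderPose[] and
// layer.SensorSampleTime, then submit the layer with ovr_SubmitFrame as usual.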

This merges the FOV of the left and right eyes into a single intermediate render. This render is still distorted
twice, once per eye, because the lenses are not exactly in front of the user’s eyes. However, this is still a
significant performance increase.

Setting a virtual IPD to zero means that everything will seem gigantic and infinitely far away, and of course the
user will lose much of the sense of depth in the scene.

Note: It is important to scale virtual IPD and virtual head motion together so, if the virtual IPD is set
to zero, all virtual head motion due to neck movement is also eliminated. Sadly, this loses much of
the depth cues due to parallax. But, if the head motion and IPD do not agree, it can cause significant
disorientation and discomfort. Experiment with caution!

Protecting Content
There are some cases where you only want the content to display on the headset. The protected content
feature is designed to prevent any mirroring of the compositor.

To use the protected content feature, configure your application to create one or more ovrTextureSwapChain
objects with the ovrTextureMisc_ProtectedContent flag specified. Any submission which references a protected
swap chain is considered a protected frame; any protected frames are composited and displayed in the HMD,
but are not replicated to mirrors or available to the capture buffer API.

If the HMD is not HDCP compliant, the texture swap chain creation API will fail with
ovrError_ContentProtectionNotAvailable. If the textures can be created (HDCP-compliant HMD), but the
link is broken later, the next ovr_SubmitFrame call that references protected texture swap chains will fail with
the ovrError_ContentProtectionNotAvailable error. Configure your application to respond according to your
requirements. For example, you might submit the next frame without protected swap chains, but at a lower
quality that doesn’t require protection. Or, you might stop playback and display an error or warning to the user.

Note: Because the mirror window is in the control of the application, if your application does not use
the mirror texture, it is up to your application to render something to the preview window. Make sure
your application does not display protected content. Since the surfaces are not known to be protected
by the OS, they will be displayed normally inside the application that created them.
To enable protected content, specify the ovrTextureMisc_ProtectedContent flag similarly to the following:

ovrTextureSwapChainDesc desc = {};


desc.Type = ovrTexture_2D;
desc.ArraySize = 1;
desc.Format = OVR_FORMAT_R8G8B8A8_UNORM_SRGB;
desc.Width = sizeW;
desc.Height = sizeH;
desc.MipLevels = 1;
desc.SampleCount = 1;
desc.MiscFlags = ovrTextureMisc_DX_Typeless | ovrTextureMisc_ProtectedContent;
desc.BindFlags = ovrTextureBind_DX_RenderTarget;
desc.StaticImage = ovrFalse;

ovrResult result = ovr_CreateTextureSwapChainDX(session, Device, &desc, &TextureChain);
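
As a brief follow-up sketch, a submit-time failure can be detected and handled like this (layerList and
layerCount stand in for your application's own layer setup, and the fallback shown is only one option):

ovrResult result = ovr_SubmitFrame(session, frameIndex, nullptr, layerList, layerCount);
if (result == ovrError_ContentProtectionNotAvailable)
{
    // The HDCP link was lost: fall back to unprotected, lower-quality content,
    // or pause playback and warn the user.
}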

VR Focus Management
When you submit your application to Oculus, you provide the application and metadata necessary to list it in
the Oculus Store and launch it from Oculus Home.

Once launched from Oculus Home, you need to write a loop that polls for session status. ovr_GetSessionStatus
returns a struct with the following booleans:

• ShouldQuit—True if the application should initiate shutdown.
• HmdPresent—True if an HMD is present.
• DisplayLost—True if the HMD was unplugged or the display driver was manually disabled or
encountered a TDR.
• HmdMounted—True if the HMD is on the user's head.
• IsVisible—True if the game or experience has VR focus and is visible in the HMD.
• ShouldRecenter—True if the application should call ovr_RecenterTrackingOrigin. This is triggered
when the user initiates recentering through the Universal Menu.
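
The following is a minimal polling sketch, assuming an active session (a fuller loop appears in the Code
Sample below):

ovrSessionStatus status;
if (OVR_SUCCESS(ovr_GetSessionStatus(session, &status)))
{
    if (status.ShouldQuit)     { /* save state and initiate shutdown */ }
    if (status.ShouldRecenter) { ovr_RecenterTrackingOrigin(session); }
    if (!status.IsVisible)     { /* pause rendering and mute audio */ }
}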

Managing When a User Quits


If ShouldQuit is true, save the application state and shut down, or shut down without saving the application
state. The user will automatically return to Oculus Home.

Depending on the type of application, you can prompt the user to start where he or she left off the next time it
is opened (e.g., a multi-level game) or you can just start from the beginning of the experience (e.g., a passive
video). If this is a multiplayer game, you might want to quit locally without ending the game.

Managing When a User Requests Recentering


If ShouldRecenter is true, the application should call ovr_RecenterTrackingOrigin or
ovr_SpecifyTrackingOrigin and be prepared for future tracking positions to be based on a different
origin.

Some applications may have reason to ignore the request or to implement it via an internal
mechanism other than via ovr_RecenterTrackingOrigin. In such cases the application can call
ovr_ClearShouldRecenterFlag to cause the recenter request to be cleared.

Managing an Unplugged Headset


If DisplayLost is true:

1. Pause the application, including audio.


2. Display a prompt on the monitor that says that the headset was unplugged.
3. Destroy any TextureSwapChains or mirror textures.
4. Call ovr_Destroy.
5. Poll ovrSessionStatus::HmdPresent until true.
6. Call ovr_Create to recreate the session.
7. Recreate any TextureSwapChains or mirror textures.
8. Resume the application.

If HmdPresent isn’t returned as true after a specified amount of time, act as though ShouldQuit
returned true. If the user takes no action after a specified amount of time, choose a default action (save the
session or close without saving) and close the application.

Note: For multiplayer games, you might want to follow the same process without pausing the game.

Managing an Unavailable Headset


When a user removes the headset or if your application does not have VR focus, HmdMounted or IsVisible
returns false. Pause the application until they return true.

When your application loses VR focus, it automatically stops receiving input. If your application does not use
the Oculus input API, it will need to ignore any received input.
Note: For multiplayer games, you might want the game to continue without pausing.

Managing Loss of Windows Focus


When your application loses Windows focus, the Oculus Remote, Xbox controller, and Touch controllers will
continue to work normally. However, the application will lose control of the mouse and keyboard.

If your application loses Windows focus and maintains VR focus (IsVisible), continue to process input and
allow the application to run normally. If the keyboard or mouse is needed to continue, prompt the user to
remove the headset and use Alt-Tab to regain Windows focus.

Code Sample

bool shouldQuit = false;

void RunApplication()
{
ovrResult result = ovr_Initialize(nullptr);

if (OVR_SUCCESS(result))
{
ovrSession session;
ovrGraphicsLuid luid;
result = ovr_Create(&session, &luid);

if (OVR_SUCCESS(result))
{
ovrSessionStatus ss;

<create graphics device with luid>


<create render target via ovr_CreateTextureSwapChain>

while (!shouldQuit)
{
<get next frame pose, e.g. via ovr_GetEyePoses>
<render frame>

result = ovr_SubmitFrame(...);

if (result == ovrSuccess_NotVisible)
{
<turn off audio output>
do { // Wait until we regain visibility or should quit
<sleep>
result = ovr_GetSessionStatus(session, &ss);
if (ss.ShouldQuit)
shouldQuit = true;
} while (OVR_SUCCESS(result) && !ss.IsVisible && !shouldQuit);
<possibly re-enable audio>
}
else if (result == ovrError_DisplayLost)
{
// We can either immediately quit or do the following:
<destroy render target and graphics device>
ovr_Destroy(session);

do { // Spin while trying to recreate session.


result = ovr_Create(&session, &luid);
} while (OVR_FAILURE(result) && !shouldQuit);

if (OVR_SUCCESS(result))
{
<recreate graphics device with luid>
<recreate render target via ovr_CreateTextureSwapChain>
}
}
else if (OVR_FAILURE(result))
{
shouldQuit = true;
}

ovr_GetSessionStatus(session, &ss);
if (ss.ShouldQuit)
shouldQuit = true;
if (ss.ShouldRecenter)
{
    ovr_RecenterTrackingOrigin(session); // or ovr_ClearShouldRecenterFlag(session) to ignore the request
    <do anything else needed to handle this>
}
}

<destroy render target via ovr_DestroyTextureSwapChain>


<destroy graphics device>

ovr_Destroy(session);
}

ovr_Shutdown();
}
}

Oculus Guardian System


The Oculus Guardian System is designed to display in-application wall and floor markers when users get near
boundaries they defined. When the user gets too close to the edge of a boundary, translucent boundary
markers are displayed in a layer that is superimposed over the game or experience.

The following image shows the Guardian System activated. Note the floor boundaries (lines) and the wall
boundaries (crosses):

Figure 13: Oculus Boundary System

Setting Up the Guardian System on Your Rift


After the user sets up the boundaries, they will show up in a layer over any application whenever the user gets
too close.
Note: Users can set the boundaries to Walls and Floor, set them to Floor Only, or disable them entirely.

To set up the boundaries:

1. Open the Oculus App.


2. Select Settings -> Devices -> Run Full Setup.
3. Select Rift and Touch.
4. Follow the on-screen instructions to confirm sensor tracking.
5. Continue until the Mark Your Boundaries page appears.
6. Follow the on-screen instructions, using the INDEX trigger button to draw the outer bounds of your play
area. Currently, there is no minimum width and depth. When you are finished, click Next. Your boundaries
are saved.
7. If you need to disable the Guardian System, toggle Guardian System Enabled/Disabled on the Universal
Menu.

Game Configuration
During initialization, your application can make an API request to get the outer boundary and play area. The
outer boundary is the space that the user defined during configuration. The play area is a rectangular space
within the outer boundary. With this information, your application can set up a virtual world with "barriers" that
align with the real world. For example, you can adjust the size of a cockpit based on the user-defined play area.

The following functions return information about the outer boundary and play area:

• ovr_GetBoundaryDimensions: Returns the width, height, and depth of the play area or outer boundary in
meters.
• ovr_GetBoundaryGeometry: Returns the points that define the play area or outer boundary. For the play
area, it returns the four points that define the rectangular area. For the outer boundary, it returns all of the
points that define the outer boundary.

During runtime, your application can request whether the boundaries are visible using
ovr_GetBoundaryVisible. When visible, you can choose how the application will respond. For example,
you might choose to pause the application, slow the application, or simply display a message.

The boundary status information is returned in a struct which contains the following:

• IsTriggering (ovrBool): Returns whether the boundaries are currently visible.
• ClosestDistance (float): Distance to the closest play area or outer boundary surface.
• ClosestPoint (ovrVector3f): Closest point on the boundary surface.
• ClosestPointNormal (ovrVector3f): Unit surface normal of the closest boundary surface.
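
The following is a minimal sketch, assuming an active session, that queries the play-area dimensions and
tests the HMD against the outer boundary (the half-meter threshold and the reaction are assumptions):

ovrVector3f dims;
if (OVR_SUCCESS(ovr_GetBoundaryDimensions(session, ovrBoundary_PlayArea, &dims)))
{
    // dims holds width (x), height (y), and depth (z) in meters.
}

ovrBoundaryTestResult test;
if (OVR_SUCCESS(ovr_TestBoundary(session, ovrTrackedDevice_HMD, ovrBoundary_Outer, &test)))
{
    if (test.ClosestDistance < 0.5f)
    {
        // The headset is within half a meter of the outer boundary;
        // the application might slow the simulation or display a message here.
    }
}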

Additionally, you can set the bounds to be visible to orient the user or explain how you will use the space by
passing ovrTrue to ovr_RequestBoundaryVisible(). When you are finished, simply pass ovrFalse.
Note: You can't force the boundaries off if they were triggered by the user.

The default boundary color is cyan. To change the color, use ovr_SetBoundaryLookAndFeel().

Code Sample
To help you get started, we provide a code sample at Samples/GuardianSystemDemo that shows usage of
the following APIs:

• ovr_TestBoundary
• ovr_TestBoundaryPoint
• ovr_SetBoundaryLookAndFeel
• ovr_RequestBoundaryVisible
• ovr_ResetBoundaryLookAndFeel

Boxes collide with the boundary data using the test API, the boundary visibility and color changes every
second, and the simulation slows (and then stops) when the HMD or Touch controllers get too close to the
boundary.

Rift Audio
When setting up audio for the Rift, you need to determine whether the Rift headphones are active and pause
the audio when your app doesn’t have focus.
The user can enable the Rift headphones and microphone in the Oculus App or use the default Windows audio
devices. Configuration is made in the Oculus app in Settings -> Devices -> Headset. The following screenshot
shows the Rift headphones disabled and the microphone enabled:

Figure 14: Audio Configuration

The headphone setting is handled automatically by the function ovr_GetAudioDeviceOutGuid (located
in OVR_CAPI_Audio.h), which returns the GUID for the device to target when playing audio. Similarly, use
ovr_GetAudioDeviceInGuid to identify the microphone device used for input.
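
As a brief sketch, the output GUID can be queried as follows and handed to your audio engine's
device-selection API (the error handling shown is an assumption):

GUID outGuid;
if (OVR_SUCCESS(ovr_GetAudioDeviceOutGuid(&outGuid)))
{
    // Pass outGuid to the audio engine, as in the FMOD and Wwise examples below.
}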

FMOD
If you detect that the Rift headphones are enabled, use code similar to the following for FMOD:

ERRCHECK(FMOD::System_Create(&sys));
GUID guid;
ovr_GetAudioDeviceOutGuid(&guid);

int driverCount = 0;
sys->getNumDrivers(&driverCount);

int driver = 0;
while (driver < driverCount)
{
    char name[256] = {0};
    FMOD_GUID fmodGuid = {0};
    sys->getDriverInfo(driver, name, 256, &fmodGuid, nullptr, nullptr, nullptr);

    if (guid.Data1 == fmodGuid.Data1 &&
        guid.Data2 == fmodGuid.Data2 &&
        guid.Data3 == fmodGuid.Data3 &&
        memcmp(guid.Data4, fmodGuid.Data4, 8) == 0)
    {
        break;
    }

    ++driver;
}

if (driver < driverCount)
{
    sys->setDriver(driver);
}
else
{
    // error: Rift not connected
}

Wwise
If you detect that the Rift headphones are enabled, use code similar to the following for Wwise:

AkInitSettings initSettings;
AkPlatformInitSettings platformInitSettings;
AK::SoundEngine::GetDefaultInitSettings( initSettings );
AK::SoundEngine::GetDefaultPlatformInitSettings( platformInitSettings );

// Configure initSettings and platformInitSettings...

WCHAR outStr[128];
if (OVR_SUCCESS(ovr_GetAudioDeviceOutGuidStr(outStr)))
{
initSettings.eMainOutputType = AkAudioAPI::AkAPI_Wasapi;
platformInitSettings.idAudioDevice = AK::GetDeviceIDFromName(outStr);
}

Unity 5
Audio input and output automatically use the Rift microphone and headphones unless configured to use the
Windows default audio device by the user in the Oculus app. Events OVRManager.AudioOutChanged and
AudioInChanged occur when audio devices change, making audio playback impossible without a restart.

Unity 5 and Wwise


To configure Wwise to use the configured audio device in Unity 5, pass the user-configured audio device name/
GUID (set in the Oculus app) into the function AkSoundEngine.GetDeviceIDFromName(), located in
AkInitializer.cs.
To get the audio device GUID from libOVR, you must include the Oculus Utilities unitypackage, which exposes
that string through the class OVRManager.

The following function should be called before AkSoundEngine.Init(...):

void SetRiftAudioDevice(AkPlatformInitSettings settings)


{
string audioDevice = OVRManager.audioOutId;
uint audioOutId = AkSoundEngine.GetDeviceIDFromName (audioDevice);
settings.idAudioDevice = audioOutId;
}

Pass AkPlatformInitSettings into the function above and use it to initialize the Ak sound engine.
Note: OVRManager.audioOutId will be deprecated in the future. This minor change should not
impact the integration.

VR Audio Output in Oculus Store > Settings > Devices > Rift Headset may be used to configure which input
mic to use within the Ak sound engine. The GUID for this is exposed through OVRManager.audioInId.

Unity 4
Audio input and output automatically use the Rift microphone and headphones, unless configured to use the
Windows default audio device by the user in the Oculus app.

The Rift’s microphone is not used by default when calling Microphone.Start(null,..). Find the entry in
Microphone.devices that contains the word Rift and use it.

UE4
When Unreal PC applications are launched, if the OculusRift plugin is enabled and the Oculus VR Runtime
Service is installed, then the application will automatically override the default Windows graphics and audio
devices and target the Rift. The Oculus VR Runtime Service is installed with the Oculus App.

Unless your application is intended to run in VR, do not enable the OculusRift plugin. Otherwise, it is possible
that audio and/or video will be incorrectly targeted to the Oculus Rift when the application is run.

Alternatively, users can disable loading all HMD plugins by specifying "-nohmd" on the command line.

UE4 and Wwise


Use of Wwise with Unreal requires the WwiseUE4Plugin from AudioKinetic’s ‘master’ branch on GitHub. For
general information on how to install and use the Wwise Unreal Integration, see Unreal’s documentation on the
Installation GitHub site: https://github.com/audiokinetic/WwiseUE4Integration

Note: To access this documentation, you will need to associate your GitHub account with Epic Games.
For instructions on how to do so, see: https://www.unrealengine.com/ue4-on-github

UE4 and FMOD


For an illustration of how to target the Oculus Rift headphones using FMOD in UE4, see the FMOD section
above.

VR Sound Level Best Practices


Audio is an important part of the virtual reality experience, and it is important that all VR apps provide a
comfortable listening level for users. Developers should target a reasonable sound level that is consistent
between different experiences. To achieve this, Oculus recommends the following best practices.
First, target -18 LUFS during final mix, using tools such as Avid's Pro Limiter plugin, Nugen's VisLM,
Klangfreund LUFS Meter, Audacity VuMeter, or similar loudness measurement tools.

Second, measure your experience against the audio levels in published Oculus experiences, especially the
ambient audio in Home and the Dreamdeck experience. Set overall system volume so that Dreamdeck and
Home sound comfortable, and then adjust your experience's mix to this volume.

Finally, mix your application using the Rift headphones. This ensures that the sounds you're creating and mixing
have a frequency content appropriate for the headphones most Oculus users will use.

By adhering to these guidelines, we can guarantee that our Oculus VR users will have a pleasant audio
experience.

Oculus Touch Controllers


This section describes Oculus Touch best practices gathered from developing and reviewing large numbers of
games and experiences. These are not requirements and we expect them to evolve over time.

Note: To view application requirements, go to https://developer.oculus.com/distribute/latest/concepts/


publish-rift-app-submission/

Input, Hands, and Controller Basics


When developing applications that use Oculus Touch, consider the following:

• If your application supports multiple input methods (such as Oculus Touch, Xbox 360 Controller, and Oculus
Remote), use ovrControllerType_Active to determine which one is in use. You can also use this to
determine if one or both Touch controllers are in use. Some applications render the Touch controllers
differently (e.g. as hands vs. controllers) depending on their in-use state. For more information, see this
section and https://developer.oculus.com/doc/1.11-libovr/_o_v_r___c_a_p_i_8h.html.
• Unless you have an uncommon use case, use the Avatar SDK to represent high quality hands and/or
controllers in your app.
• Map the grip button to grab actions. Although there are some exceptions to the rule (especially involving
throwing), the new user experience and most applications condition users to use it this way. If you break
expectations, make sure to educate your users.
• Prefer hands over controller models, especially for any application that involves social interaction or picking
things up.
• In-application hands and controllers should line up with the user’s real-world counterparts in position and
orientation as closely as possible. We call this “registration” and recommend the Avatar SDK as an example.
When testing your application, one technique is to hold your hands in front of your face and raise your
headset slightly so you can compare your real-world hands and your virtual hands.
• Fully animate all hands and controllers in the game, to reflect what the user is doing. Users will expect to
be able to grab, point, and thumbs-up with their hands. They will expect controller joysticks, buttons, and
triggers to animate on use. We have demos with full code and blueprints for doing this in UE4 and Unity.
• To support left- and right-handed players, we recommend designing objects for use with either hand. For
example, if the player is going to use a double rotating-wheel can opener to open a can in VR, make sure
the cutting wheel handle faces the other hand.
• When controllers are mirrored, make controls identical. When controllers have different functions, make sure
to establish a dominant and non-dominant hand (instead of favoring the right hand). For example, in the Toy
80 | PC SDK Developer Guide | Oculus Rift

Box demo, players usually use their non-dominant to hold the slingshot and their dominant hand to pull the
ammunition holder.
• Unless important to the experience, do not render additional body parts besides the hands in single player
games. Although inverse kinematics (i.e., extrapolating body part positions based on how the body can
and cannot move) are a tempting solution to represent more of the player's body, they become less accurate
as you try to simulate more of the body. The mismatch can end up being distracting and ironically less
immersive.

For multiplayer games, you can render the hands of the first-person player and up to the full bodies of the
other players.

Tracking
The following are basic tracking tips:

• For front-facing apps, keep the action in front of the user. Don’t encourage users to do things that will break
the line of sight between the controllers and sensors, which will lead to poor tracking.
• Don’t require the user to interact with VR elements near the floor or far above their heads. Remove those
objects, or allow a distance grab. If you decide to allow interactions at a distance, make sure to indicate that
the object is available through a highlight, glow, shake, haptics, or another mechanism.
• Avoid interactions that encourage users to get too close or too far from one or more position trackers.

Large Play Areas


For applications that use a large amount of the tracking space, either horizontally or vertically, you’ll want to put
a little more effort into making the best use of the available space:

• Our user-facing description of “Standing” apps is “may require a step in any direction.” Even if your
application uses a lot of space, if it is going to be submitted as a standing app, it needs to meet that
description. Additionally, Standing apps must be fully usable in a 2m x 1.5m tracking area.
• If your application requires lots of space, use the user’s Guardian boundaries to make sure you place game
objects within reachable areas. For more information, see Oculus Guardian System on page 73.
• Applications can modify recenter behavior in a way that makes sense for the app. For example, an app
might clamp the recenter origin inside of the play area by some amount of padding, so that all gameplay
elements are reachable. Or it might choose to ignore the yaw component, to maintain an axis-aligned play
area. If an app substantially violates user expectations for recenter, it should inform the user about what’s
happening. For more information, see VR Focus Management on page 69.
• Applications can also use the above recenter functionality to set an optimal center position (overriding the
user’s home position) on launch. You can use the default recenter behavior on subsequent user-initiated
recenters, or continue to use modified recenter logic.
• If the user’s play space is undefined, create your own play space around the user’s recenter/home point.
This assumed play space should not exceed 2m x 1.5m (the recommended play space size). If your
application can use a smaller space, default to that.
• If your app requires high or low tracking, keep the shape of the sensor frustums in mind. For example, a
front-facing app that involves shooting basketballs will want to keep the user towards the rear of the play
area, where the tracking frustums are taller. For more information on sensor field of view and tracking
ranges, see: Initialization and Sensor Enumeration on page 42.
• Some users might not have the full 2m x 1.5m recommended play space, but will play your game despite
warnings. If your application requires this full amount of space, choose the best fallback. Possible solutions
include:

• It is generally better to overflow away from the sensors, not towards, since tracking is better further back
(and closer might put you past the sensor position).
• Users can attempt to use recenter to reach playable areas they would otherwise be unable to reach.

Controller Data
The Oculus SDK provides APIs that return the position and state for each Oculus Touch controller.

This data is exposed through two locations:

• ovrTrackingState::HandPoses[2]—returns the pose and tracking state for each Oculus Touch controller.
• ovrInputState—structure returned by ovr_GetInputState that contains the Oculus Touch button, joystick,
trigger, and capacitive touch sensor state.

The controller hand pose data is separated from the input state because it comes from a different system and is
reported at separate points in time. Controller poses are returned by the constellation tracking system and are
predicted simultaneously with the headset, based on the absolute time passed into GetTrackingState. Having
both hand and headset data reported together provides a consistent snapshot of the system state.

Hand Tracking
The constellation sensor used to track the head position of the Oculus Rift also tracks the hand poses of the
Oculus Touch controllers.

For installations that have the Oculus Rift and Oculus Touch controllers, there will be at least two constellation
sensors to improve tracking accuracy and help with occlusion issues.
The SDK uses the same ovrPoseStatef struct as the headset, which includes six degrees of freedom (6DoF)
and tracking data (orientation, position, and their first and second derivatives).

Here’s an example of how to get tracking input:

double displayMidpointSeconds = ovr_GetPredictedDisplayTime(session, frameIndex);
ovrTrackingState trackState = ovr_GetTrackingState(session, displayMidpointSeconds, ovrTrue);

ovrPosef handPoses[2];
ovrInputState inputState;

In this code sample, we call ovr_GetTrackingState to get predicted poses. Hand controller poses are reported
in the same coordinate frame as the headset and can be used for rendering hands or objects in the 3D world.
An example of this is provided in the Oculus World Demo.

Button State
The input button state is reported based on the HID interrupts arriving to the computer and can be polled by
calling ovr_GetInputState.

The following example shows how input can be used in addition to hand poses:

double displayMidpointSeconds = ovr_GetPredictedDisplayTime(session, frameIndex);


ovrTrackingState trackState = ovr_GetTrackingState(session, displayMidpointSeconds, ovrTrue);
ovrPosef handPoses[2];
ovrInputState inputState;

// Grab hand poses useful for rendering hand or controller representation


handPoses[ovrHand_Left] = trackState.HandPoses[ovrHand_Left].ThePose;
handPoses[ovrHand_Right] = trackState.HandPoses[ovrHand_Right].ThePose;

if (OVR_SUCCESS(ovr_GetInputState(session, ovrControllerType_Touch, &inputState)))
{
    if (inputState.Buttons & ovrButton_A)
    {
        // Handle A button being pressed
    }
    if (inputState.HandTrigger[ovrHand_Left] > 0.5f)
    {
        // Handle hand grip...
    }
}

The ovrInputState struct includes the following fields:

• TimeInSeconds (double): System time when the controller state was last updated.
• ControllerType (unsigned int): Described by ovrControllerType. Indicates which controller types are
present; you can check the ovrControllerType_LTouch bit, for example, to verify that the left touch
controller is connected. Options include:
  • ovrControllerType_None (0x0000)
  • ovrControllerType_LTouch (0x0001)
  • ovrControllerType_RTouch (0x0002)
  • ovrControllerType_Touch (0x0003)
  • ovrControllerType_Remote (0x0004)
  • ovrControllerType_XBox (0x0010)
• Buttons (unsigned int): Button state described by ovrButtons. A corresponding bit is set if the button is
pressed.
• Touches (unsigned int): Touch values for buttons and sensors as indexed by ovrTouch. A corresponding bit
is set if the user's finger is touching the button or is in a gesture state detectable by the controller.
• IndexTrigger[2] (float): Left and right finger trigger values (ovrHand_Left and ovrHand_Right), in the range
0.0 to 1.0f. A value of 1.0 means that the trigger is fully pressed.
• HandTrigger[2] (float): Left and right grip button values (ovrHand_Left and ovrHand_Right), in the range
0.0 to 1.0f. The hand trigger is often used to grab items. A value of 1.0 means that the trigger is fully
pressed.
• Thumbstick[2] (ovrVector2f): Horizontal and vertical thumbstick axis values (ovrHand_Left and
ovrHand_Right), in the range -1.0f to 1.0f. The API automatically applies the dead zone, so developers
don't need to handle it explicitly.
• IndexTriggerNoDeadzone[2] (float): Left and right finger trigger values (ovrHand_Left and
ovrHand_Right), in the range 0.0 to 1.0f, without a deadzone. A value of 1.0 means that the trigger is fully
pressed.
• HandTriggerNoDeadzone[2] (float): Left and right grip button values (ovrHand_Left and ovrHand_Right),
in the range 0.0 to 1.0f, without a deadzone. The grip button, formerly known as the hand trigger, is often
used to grab items. A value of 1.0 means that the button is fully pressed.
• ThumbstickNoDeadzone[2] (ovrVector2f): Horizontal and vertical thumbstick axis values (ovrHand_Left
and ovrHand_Right), in the range -1.0f to 1.0f, without a deadzone.
• IndexTriggerRaw[2] (float): Raw left and right finger trigger values (ovrHand_Left and ovrHand_Right), in
the range 0.0 to 1.0f, without a deadzone or filter. A value of 1.0 means that the trigger is fully pressed.
• HandTriggerRaw[2] (float): Raw left and right grip button values (ovrHand_Left and ovrHand_Right), in
the range 0.0 to 1.0f, without a deadzone or filter. The grip button, formerly known as the hand trigger, is
often used to grab items. A value of 1.0 means that the button is fully pressed.
• ThumbstickRaw[2] (ovrVector2f): Raw horizontal and vertical thumbstick axis values (ovrHand_Left and
ovrHand_Right), in the range -1.0f to 1.0f, without a deadzone or filter.

The ovrInputState structure includes the current state of buttons, thumb sticks, triggers, and touch sensors on
the controller. You can check whether a button is pressed by checking against one of the button constants, as
was done for ovrButton_A in the above example. The following is a list of binary buttons available on Touch
controllers:

• ovrButton_A: A button on the right Touch controller.
• ovrButton_B: B button on the right Touch controller.
• ovrButton_RThumb: Thumb stick button on the right Touch controller.
• ovrButton_X: X button on the left Touch controller.
• ovrButton_Y: Y button on the left Touch controller.
• ovrButton_LThumb: Thumb stick button on the left Touch controller.
• ovrButton_Enter: Enter button on the left Touch controller. This is equivalent to the Start button on the
Xbox controller.

Button Touch State


In addition to buttons, Touch controllers can detect whether user fingers are touching some buttons or are in
certain positions.

These states are reported as bits in the Touches field, and can be checked through one of the following
constants:

• ovrTouch_A: User is touching the A button on the right controller.
• ovrTouch_B: User is touching the B button on the right controller.
• ovrTouch_RThumb: User has a finger on the thumb stick of the right controller.
• ovrTouch_RThumbRest: User has a finger on the textured thumb rest of the right controller.
• ovrTouch_RIndexTrigger: User is touching the index finger trigger on the right controller.
• ovrTouch_X: User is touching the X button on the left controller.
• ovrTouch_Y: User is touching the Y button on the left controller.
• ovrTouch_LThumb: User has a finger on the thumb stick of the left controller.
• ovrTouch_LThumbRest: User has a finger on the textured thumb rest of the left controller.
• ovrTouch_LIndexTrigger: User is touching the index finger trigger on the left controller.
• ovrTouch_RIndexPointing: User's right index finger is pointing forward past the trigger.
• ovrTouch_RThumbUp: User's right thumb is up and away from buttons on the controller, a gesture that can
be interpreted as a right thumbs up.
• ovrTouch_LIndexPointing: User's left index finger is pointing forward past the trigger.
• ovrTouch_LThumbUp: User's left thumb is up and away from buttons on the controller, a gesture that can
be interpreted as a left thumbs up.
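
As a brief example, gesture bits can be checked the same way as buttons (a sketch, assuming inputState was
filled by ovr_GetInputState as in the earlier example):

if (inputState.Touches & ovrTouch_RIndexPointing)
{
    // Animate the avatar's right index finger to point forward.
}
if (inputState.Touches & ovrTouch_RThumbUp)
{
    // Animate a right-handed thumbs-up gesture.
}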

Haptic Feedback
In addition to reporting input state, Oculus Touch controllers can provide haptic feedback through vibration.

The SDK supports two types of haptics: buffered and non-buffered. Buffered haptics are designed to change
rapidly (every 3.125ms) and work well for subtle effects. Non-buffered haptics are designed for simple effects
that don't change often (every 33ms).
Note: The buffered and non-buffered functions should not be used together, as they will result in
unpredictable haptic feedback.

Buffer-Based Haptics
Running at 320Hz, each sample is 3.125 milliseconds. Because these samples are added to a buffer that holds
256 samples, the buffer can hold up to 800 milliseconds of samples.

To check the status of the buffer, call ovr_GetControllerVibrationState:

ovr_GetControllerVibrationState(ovrSession session, ovrControllerType controllerType,
                                ovrHapticsPlaybackState* outState);

To submit to the buffer, call ovr_SubmitControllerVibration:

ovr_SubmitControllerVibration(ovrSession session, ovrControllerType controllerType,
                              const ovrHapticsBuffer* buffer);

The following code sample shows basic haptic submission as part of a game loop:

uint8_t amplitude = (uint8_t)round(handTrigger[t] * 255);

result = ovr_GetControllerVibrationState(Session, touchController[t], &state);
if (result != ovrSuccess || state.SamplesQueued >= kLowLatencyBufferSizeInSamples)
{
    DefaultChannel.LogWarningF("%s Haptics skipped. Queue size %d", kTouchStr[t],
                               state.SamplesQueued);
    continue;
}

for (int32_t i = 0; i < kLowLatencyBufferSizeInSamples; ++i)
    samples.push_back(amplitude);

if (samples.size() > 0)
{
ovrHapticsBuffer buffer;
buffer.SubmitMode = ovrHapticsBufferSubmit_Enqueue;
buffer.SamplesCount = (uint32_t)samples.size();
buffer.Samples = samples.data();
result = ovr_SubmitControllerVibration(Session, touchController[t], &buffer);
if (result != ovrSuccess)
{
// Something bad happened
DefaultChannel.LogErrorF("%s: Haptics submit failed %d", kTouchStr[t], result);
}
}

Non-Buffered Haptics
Vibration can be enabled by calling ovr_SetControllerVibration:

ovr_SetControllerVibration(session, ovrControllerType_LTouch, freq, amplitude);

Vibration is enabled by specifying the frequency and amplitude. Specifying a frequency of 0.0f will vibrate at
160Hz; specifying 1.0f will vibrate at 320Hz.
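
For example, a sketch that pulses the left controller at 320Hz and half amplitude, then disables vibration by
passing zero amplitude (the timing shown is up to the application):

// Start vibrating at 320Hz, 50% amplitude.
ovr_SetControllerVibration(session, ovrControllerType_LTouch, 1.0f, 0.5f);
// ... after roughly 33ms or more ...
// Stop vibrating.
ovr_SetControllerVibration(session, ovrControllerType_LTouch, 0.0f, 0.0f);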

SDK Samples and Gamepad Usage


Some of the Oculus SDK samples use gamepad controllers to enable movement around the virtual world.
This section describes the devices that are currently supported and setup instructions.

Xbox 360 Wired Controller for Windows


To set up the controller:

• Plug the device into a USB port. Windows should recognize the controller and install any necessary drivers
automatically.

Logitech F710 Wireless Gamepad


To set up the gamepad for Windows:

1. Put the controller into ‘XInput’ mode by moving the switch on the front of the controller to the ‘X’ position.
2. Press a button on the controller so that the green LED next to the ‘Mode’ button begins to flash.
3. Plug the USB receiver into the PC while the LED is flashing.
4. Windows should recognize the controller and install any necessary drivers automatically.

Optimizing Your Application


To provide the best user experience, your application must meet or exceed the minimum requirements to be
considered for publication on the Oculus Store.

For more information about publishing requirements, see our Publishing documentation.

This section describes how to use tools we provide to optimize the performance of your application.

SDK Performance Statistics


The SDK performance statistics provide information about application and compositor performance on the
system.
Stats are populated after each call to ovr_SubmitFrame. To get performance stats, use
ovr_GetPerfStats().

To reset the stats, use ovr_ResetPerfStats.
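
The following is a minimal sketch, assuming an active session, that polls the stats once per frame and uses
AdaptiveGpuPerformanceScale to drive a resolution-scaling helper (AdjustRenderTargetScale is a hypothetical
application function):

ovrPerfStats perfStats;
if (OVR_SUCCESS(ovr_GetPerfStats(session, &perfStats)))
{
    if (perfStats.AdaptiveGpuPerformanceScale < 1.0f)
    {
        // Take the square root before scaling resolution, as noted below.
        float scale = sqrtf(perfStats.AdaptiveGpuPerformanceScale);
        AdjustRenderTargetScale(scale); // hypothetical application helper
    }
}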


The following table describes performance statistics:

Table 3: Statistics

• ovrPerfStatsPerCompositorFrame: The per-compositor frame statistics in the following tables.
• AnyFrameStatsDropped: If the app calls ovr_SubmitFrame at a rate less than 18 fps, then when calling
ovr_GetPerfStats, expect AnyFrameStatsDropped to become ovrTrue while FrameStatsCount is equal to
ovrMaxProvidedFrameStats.
• AdaptiveGpuPerformanceScale: An edge-filtered value that you can use to adjust the graphics quality
of the application to keep the GPU utilization in check. The value is calculated as: (desired_GPU_utilization /
current_GPU_utilization). When this value is 1.0, the GPU is doing the right amount of work for the app.
Lower values mean the application needs to reduce the GPU utilization.
  Note: If the app directly drives render-target resolution using this value, make sure to take the
  square-root of the value before scaling the resolution with it. Changing the render target resolution is
  only one of the many things your application can do to increase or decrease the amount of GPU
  utilization. Since AdaptiveGpuPerformanceScale is edge-filtered and does not change rapidly (i.e., it
  reports non-1.0 values once every couple of seconds), your application can make the necessary
  adjustments and continue watching the value to see if it has been satisfied.
• AswIsAvailable: Returns true if ASW is available for this system, based on the user's GPU, operating
system, and debug override settings.

The following table describes statistics specific to your application's performance:

Table 4: Application Statistics

• AppFrameIndex: Index that increments with each ovr_SubmitFrame call.
• AppDroppedFrameCount: Increments each time the application fails to submit a new set of layers using
ovr_SubmitFrame() before the compositor is executed before each V-Sync (Vertical Synchronization).
• AppMotionToPhotonLatency: Latency from when the last predicted tracking information was queried by
the application using ovr_GetTrackingState() to when the middle scanline of the target frame is illuminated
on the HMD display. This is the same information provided by the Latency Timing HUD.
• AppQueueAheadTime: To improve CPU and GPU parallelism and increase the amount of time that the
GPU has to process a frame, the SDK automatically applies queue ahead up to 1 frame. This value displays
the amount of queue ahead time being applied (in milliseconds). For more information about adaptive
queue ahead, see Adaptive Queue Ahead on page 63.
• AppCpuElapsedTime: The time difference from when the application continued execution on CPU after
ovr_SubmitFrame() returned to the subsequent call to ovr_SubmitFrame(). This will show "N/A" if the
latency tester is not functioning as expected (e.g., HMD display is sleeping due to prolonged inactivity).
This includes the IPC call overhead to the compositor after ovr_SubmitFrame() is called by the client
application.
• AppGpuElapsedTime: The total GPU time spent on rendering by the client application. This includes the
work done by the application after returning from ovr_SubmitFrame(), using the mirror texture if applicable.
It can also include GPU command-buffer "bubbles" if the application's CPU thread doesn't push data to
the GPU fast enough to keep it occupied. Similarly, if the app pushes the GPU close to full-utilization, the
work on the next frame (N+1) might be preempted by the compositor's render work on the current frame
(N). Because of how the application GPU timing query operates, this can lead to artificially inflated
application GPU times as they will start to include the compositor GPU usage times.

The compositor operates asynchronously, and its statistics increment for each V-Sync, regardless of whether the
application calls ovr_SubmitFrame.

The following table describes compositor statistics:

Table 5: Compositor Statistics

• CompositorFrameIndex: Index that increments each time the SDK compositor completes a distortion/
TimeWarp pass.
• CompositorDroppedFrameCount: Increments each time the compositor fails to present a new rendered
frame at V-Sync (Vertical Synchronization).
• CompositorLatency: Specifies the TimeWarp latency, which corrects app latency and dropped frames.
• CompositorCpuElapsedTime: The amount of time in seconds spent on the CPU by the SDK compositor.
Any time spent on the compositor takes available GPU time away from the application.
• CompositorGpuElapsedTime: The amount of time the GPU spends executing the compositor renderer.
This includes TimeWarp and distortion of all layers submitted by the application. The number of active
layers, their resolutions, and the requested sampling quality can all affect the GPU times.
• CompositorCpuStartToGpuEndElapsedTime: The amount of time from when the CPU kicks off the
compositor to when the compositor completes distortion and TimeWarp on the GPU. If the time is not
available, it returns -1.0f.
• CompositorGpuEndToVsyncElapsedTime: The amount of time between when the GPU completes the
compositor rendering to the point in time when V-Sync is hit and that buffer starts scanning out on the
HMD.

The Asynchronous SpaceWarp (ASW) HUD displays activity and tracking statistics for ASW, which runs as part
of the Oculus Runtime Compositor. ASW automatically activates when an application fails to meet the required
native frame rate for the connected HMD. Once active, ASW forces the application to run at half the normal
frame rate while extrapolating every other frame. This gives the application more time to complete its work.
The following table describes Asynchronous SpaceWarp (ASW) statistics:

Table 6: ASW Statistics

AswIsActive: Shows the availability and current status of ASW. "Not Available" can be due to the OS and/or GPU type used on the PC. "Available - Not Active" means the application is successfully maintaining the required native refresh rate, so ASW is not activated.

AswActivatedToggleCount: Tracks the number of times ASW has been activated for the lifetime of the HMD.

AswPresentedFrameCount: Tracks the number of frames extrapolated by ASW that were displayed. When ASW is active, since the app is forced to run at half-rate, expect this value to increase by 45 frames per second on a 90 Hz refresh rate HMD.

AswFailedFrameCount: Tracks the number of extrapolated frames ASW needed to display, but failed to prepare in time. This can occur for different reasons, but might be caused by contention for OS resources or when the capabilities of the system are exceeded.
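
These statistics are also available to the application at runtime through ovr_GetPerfStats(). The following is a minimal sketch, assuming a valid ovrSession; PrintPerfStats is an illustrative helper name, and the field names follow the tables above:

    #include <cstdio>
    #include <OVR_CAPI.h>

    // Minimal sketch: poll the SDK performance stats for an existing session.
    // Assumes ovr_Initialize() and ovr_Create() have already succeeded.
    void PrintPerfStats(ovrSession session)
    {
        ovrPerfStats perfStats = {};
        if (OVR_FAILURE(ovr_GetPerfStats(session, &perfStats)))
            return;

        // FrameStats[0] is the most recent compositor frame; up to
        // FrameStatsCount entries are valid per call.
        for (int i = 0; i < perfStats.FrameStatsCount; ++i)
        {
            const ovrPerfStatsPerCompositorFrame& fs = perfStats.FrameStats[i];
            printf("frame %d: app dropped=%d, motion-to-photon=%.2f ms, "
                   "app GPU=%.2f ms, compositor GPU=%.2f ms\n",
                   fs.CompositorFrameIndex,
                   fs.AppDroppedFrameCount,
                   fs.AppMotionToPhotonLatency * 1000.0f,
                   fs.AppGpuElapsedTime * 1000.0f,
                   fs.CompositorGpuElapsedTime * 1000.0f);
        }

        // Set when the app polled too infrequently to capture every
        // compositor frame since the previous call.
        if (perfStats.AnyFrameStatsDropped)
            printf("warning: some compositor frame stats were missed\n");
    }

ovr_ResetPerfStats() can be called to reset the accumulated counters, for example when entering a new scene.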

Oculus Debug Tool


The Oculus Debug Tool enables you to view performance or debugging information within your game or
experience.

To use the tool:

1. Go to the Tools directory of the Oculus SDK.
2. Double-click OculusDebugTool.exe. The Oculus Debug Tool opens.

3. Select the Visible HUD to view. Options include: None (no HUD is displayed), Performance HUD, Stereo
Debug HUD, or Layer HUD.
4. If you selected Performance HUD, select which Performance HUD you want to view. Options include:
Latency Timing, Render Timing, Performance Headroom, and Version Information. For more information,
see Performance Head-Up Display on page 92.
The following is an example of the Performance HUD:

5. If you selected Stereo Debug HUD, configure the mode, size, position, and color from the Stereo Debug
HUD options.

The following is an example of the Stereo Debug HUD:



6. If you selected Layer HUD, select the layer for which to show information or select the Show All check box.

The following is an example of the Layer HUD:



7. Put on the headset and view the results.

Performance Head-Up Display


The Performance Head-Up Display (HUD) enables you or your users to view performance information for
applications built with the SDK.
The Performance HUD screens are rendered by the compositor, which enables them to be displayed with a
single SDK call. In the Oculus Debug Tool or OculusWorldDemo, you can toggle through the Performance
HUD screens by pressing F11.
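
From application code, the same screens can be selected with a single property write. The following is a minimal sketch, assuming a valid ovrSession; ShowPerfHud is an illustrative helper name:

    #include <OVR_CAPI.h>

    // Selects which Performance HUD screen the compositor renders.
    // Pass ovrPerfHud_Off to hide the HUD again.
    void ShowPerfHud(ovrSession session, ovrPerfHudMode mode)
    {
        // OVR_PERF_HUD_MODE is the SDK-defined property name ("PerfHudMode").
        // Modes include ovrPerfHud_PerfSummary, ovrPerfHud_LatencyTiming,
        // ovrPerfHud_AppRenderTiming, ovrPerfHud_CompRenderTiming,
        // ovrPerfHud_AswStats, and ovrPerfHud_VersionInfo.
        ovr_SetInt(session, OVR_PERF_HUD_MODE, (int)mode);
    }

For example, ShowPerfHud(session, ovrPerfHud_PerfSummary) displays the Performance Summary screen described below.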

Performance Summary
The Performance Summary HUD displays the frame rate of the application and the unused hardware
performance available. You or the user can use this HUD to tune an application's simulation and
graphics fidelity. Because the user cannot disable V-Sync, a frame rate counter alone cannot reveal headroom;
this HUD can be used to gauge performance instead. It is also useful for troubleshooting whether issues are
related to the application or the hardware setup.

The following screenshot shows the Performance Summary HUD:

Figure 15: Performance Summary HUD

The following table describes each metric:

App Motion-to-Photon Latency: Latency from when the last predicted tracking information was queried by the application using ovr_GetTrackingState() to when the middle scanline of the target frame is illuminated on the HMD display. This is the same information provided by the Latency Timing HUD.

Unused performance: Designed to help the user verify that the PC is powerful enough to avoid dropping frames, this displays the percentage of available PC performance not used by the application and compositor. This is calculated using the CPU and GPU time tracked by the Application Render Timing HUD divided by the native frame time (inverse of refresh rate) of the HMD; see the worked example after this table.

Note: As GPU utilization approaches 100%, adaptive queue ahead will choose an earlier render start point. If this start point overlaps with the compositor process in the previous frame, the performance will appear spiky. If you start to lower utilization, the graph will show an initial drop before becoming more linear.

Application Frames Dropped: Increments each time the application fails to submit a new set of layers using ovr_SubmitFrame() before the compositor is executed at each V-Sync (Vertical Synchronization). This is identical to App Missed Submit Count in the Application Render Timing HUD.

Compositor Frames Dropped: Increments each time the compositor fails to present a new rendered frame at V-Sync (Vertical Synchronization). This is identical to Compositor Missed V-Sync Count in the Compositor Render Timing HUD.
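
To illustrate the calculation described above (as a rough sketch; the exact combination of CPU and GPU time is internal to the SDK): on a 90 Hz HMD, the native frame time is 1/90 of a second, roughly 11.1 ms. If the tracked application and compositor render times total about 8 ms per frame, the HUD would report unused performance of approximately 1 - (8 / 11.1), or about 28%.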

Latency Timing
The Latency Timing HUD displays the App Tracking to Mid-Photon, Timewarp to Mid-Photon, and Flip to
Photon - Start graphs.

The following screenshot shows the Latency Timing HUD:

Figure 16: Latency Timing

The following table describes each metric:

Table 7: Latency Timing HUD

App Tracking to Mid-Photon: Latency from when the app called ovr_GetTrackingState() to when the target frame was eventually shown (i.e., illuminated) on the HMD display, averaged to mid-point illumination.

Timewarp to Mid-Photon: Latency from when the last predicted tracking info is fed to the GPU for TimeWarp execution to the point when the middle scanline of the target frame is illuminated on the HMD display.

Flip to Photon - Start: Time difference from the point the back buffer is presented to the HMD to the point the target frame's first scanline is illuminated on the HMD display.

Application Render Timing


The Application Render Timing HUD displays application-specific render timing information.

The following screenshot shows the Application Render Timing HUD:

Figure 17: Application Render Timing

The following table describes each metric:

Table 8: Application Render Timing HUD

App Missed Submit Count: Increments each time the application fails to submit a new set of layers using ovr_SubmitFrame() before the compositor is executed at each V-Sync (Vertical Synchronization).

App Frame-rate: The rate at which application rendering calls ovr_SubmitFrame(). It will never exceed the native refresh rate of the HMD, as the call to ovr_SubmitFrame() throttles the application's CPU execution as needed.

App Render GPU Time: The total GPU time spent on rendering by the client application. This includes the work done by the application after returning from ovr_SubmitFrame(), using the mirror texture if applicable. It can also include GPU command-buffer "bubbles" if the application's CPU thread doesn't push data to the GPU fast enough to keep it occupied. Similarly, if the app pushes the GPU close to full utilization, the work on the next frame (N+1) might be preempted by the compositor's render work on the current frame (N). Because of how the application GPU timing query operates, this can lead to artificially inflated application GPU times, as they will start to include the compositor GPU usage times.

App Render CPU Time: The time difference from when the application continued execution on the CPU after ovr_SubmitFrame() returned to the subsequent call to ovr_SubmitFrame(). This will show "N/A" if the latency tester is not functioning as expected (e.g., the HMD display is sleeping due to prolonged inactivity). This includes the IPC call overhead to the compositor after ovr_SubmitFrame() is called by the client application.

App Queue Ahead Time: To improve CPU and GPU parallelism and increase the amount of time that the GPU has to process a frame, the SDK automatically applies queue ahead up to 1 frame. This value displays the amount of queue ahead time being applied (in milliseconds). For more information about adaptive queue ahead, see Adaptive Queue Ahead on page 63.

Compositor Render Timing


The Compositor Render Timing HUD displays render timing information for the Oculus Runtime Compositor.
The Oculus Compositor applies distortion and TimeWarp to the layered eye textures provided by the VR
application.

The following screenshot shows the Compositor Render Timing HUD:

Figure 18: Render Timing

The following table describes each metric:

Table 9: Compositor Render Timing HUD

Compositor Missed V-Sync Count: Increments each time the compositor fails to present a new rendered frame at V-Sync (Vertical Synchronization).

Compositor Frame-rate: The rate of the final composition; this is independent of the client application rendering rate. Because the compositor is always locked to V-Sync, this value will never exceed the native HMD refresh rate. But, if the compositor fails to finish new frames on time, it can drop below the native refresh rate.

Compositor GPU Time: The amount of time the GPU spends executing the compositor renderer. This includes TimeWarp and distortion of all layers submitted by the application. The number of active layers, their resolutions, and the requested sampling quality can all affect the GPU times.

Comp Gpu-End to V-Sync: The amount of time from when the GPU completes the compositor rendering to when V-Sync occurs and that buffer starts scanning out on the HMD.

Asynchronous SpaceWarp Stats


The Asynchronous SpaceWarp (ASW) HUD displays activity and tracking statistics for ASW, which runs as part
of the Oculus Runtime Compositor. ASW automatically activates when an application fails to meet the required
native frame rate for the connected HMD. Once active, ASW forces the application to run at half the normal
frame rate while extrapolating every other frame. This gives the application more time to complete its work.

The following screenshot shows the ASW HUD:

Figure 19: ASW Stats

The following table describes each metric:

Table 10: ASW Stats HUD

ASW Status: Shows the availability and current status of ASW. "Not Available" can be due to the OS and/or GPU type used on the PC. "Available - Not Active" means the application is successfully maintaining the required native refresh rate, so ASW is not activated.

ASW Active-Toggle Count: Tracks the number of times ASW has been activated for the lifetime of the HMD.

ASW Presented-Frame Count: Tracks the number of frames extrapolated by ASW that were displayed. When ASW is active, since the app is forced to run at half-rate, expect this value to increase by 45 frames per second on a 90 Hz refresh rate HMD.

ASW Failed-Frame Count: Tracks the number of extrapolated frames ASW needed to display, but failed to prepare in time. This can occur for different reasons, but might be caused by contention for OS resources or when the capabilities of the system are exceeded.

Version Information
The Version Information HUD displays information about the HMD and the version of the SDK used to create
the app.

The following screenshot shows the Version Information HUD:

Figure 20: Version Info HUD

The following table describes each piece of information:

OVR SDK Runtime Ver: Version of the currently installed runtime. Every VR application that uses the OVR SDK since 0.5.0 uses this runtime.

OVR SDK Client DLL Ver: The SDK version that the client app was compiled against.

HMD Type: The type of HMD.

HMD Serial: The serial number of the HMD.

HMD Firmware: The version of the installed HMD firmware.

Sensor Serial: The serial number of the positional sensor.

Sensor Firmware: The version of the installed positional sensor firmware.
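
The two SDK version values in this table can also be queried in code. The following is a minimal sketch, assuming the LibOVR headers from this SDK and that ovr_Initialize() has already succeeded; PrintVersionInfo is an illustrative helper name:

    #include <cstdio>
    #include <OVR_CAPI.h>
    #include <OVR_Version.h>  // OVR_CAPI.h normally pulls this in already

    void PrintVersionInfo()
    {
        // Compile-time constant: the SDK version the client app was built against.
        printf("OVR SDK Client DLL Ver: %s\n", OVR_VERSION_STRING);

        // Reported by the installed runtime (libOVRRT).
        printf("OVR SDK Runtime Ver: %s\n", ovr_GetVersionString());
    }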

Performance Indicator
Asynchronous TimeWarp (ATW) can mask latency and judder issues that would normally be apparent. To help
you identify when your application or experience isn't performing as expected and to test your game or
experience before submitting it, Oculus provides performance indicators.

When enabled, a letter code appears in the upper right of the headset whenever the application is
experiencing a performance issue. The following figure shows an example of a performance indicator with L
and F displayed:

Figure 21: Performance Indicator

To enable the performance indicator, set the following registry key:

HKLM\SOFTWARE\Oculus VR, LLC\LibOVR


FrameDropHUDEnabled (DWORD)

Enable = 1
Disable = 0

The performance indicator can return the following codes:

• L—a latency issue is occurring; more than one frame of queue ahead is being applied.
• F—the application is not maintaining frame rate.
• C—the compositor is not maintaining frame rate. Possible causes include:

• Programs, such as anti-virus software, are overloading the CPU.


• The CPU cannot handle the number of threads.
• The CPU or GPU does not meet the recommended specification.
• Certain patterns of GPU usage, such as heavy usage of shaders and tessellation, are affecting frame rate.
• There is an issue with the GPU driver.
• There is an unknown hardware issue.
• U—an unknown error occurred.

Each warning lasts one frame. So, if L stays visible, the application is having continuous latency issues.

Pairing the Oculus Touch Controllers


After you receive your Touch Controllers, you need to pair them with the headset.

To pair your Touch controllers:

1. Make sure your headset is connected.


2. Launch the Oculus app.
3. Click the menu icon in the upper right to open the dropdown menu.
4. Click Settings.
5. Click Devices.
6. Select Add Left Touch from Configure Rift and follow the on-screen instructions to pair the controller and
update the firmware. When the process is finished, the Left Touch Paired screen displays.
7. Select Add Right Touch from Configure Rift and follow the on-screen instructions to pair the controller and
update the firmware. When the process is finished, the Right Touch Paired screen displays.

Note: If the Add Left Touch and Add Right Touch options do not appear, please contact Developer
Relations.

Asynchronous SpaceWarp
Asynchronous SpaceWarp (ASW) enables users to run the Oculus Rift on lower specification hardware than our
current recommended specification.

Overview
ASW applies animation detection, camera translation, and head translation to previous frames in order to
predict the next frame. As a result, motion is smoothed and applications can run on lower performance
hardware.

The Rift operates at 90Hz. When an application fails to submit frames at 90Hz, the Rift runtime drops the
application down to 45Hz with ASW providing each intermediate frame.

By default, ASW is enabled for all supported Rifts.

ASW tends to predict linear motion better than non-linear motion. If your application is dropping frames, you
can either adjust the resolution or simply allow ASW to take over.

Requirements
ASW requires the following:

• Oculus Runtime 1.9 or later


• Windows 8 or later
• For Nvidia, driver 373.06 or later
• For AMD, driver 16.40.2311 or later

Until the minimum specification is released, we recommend the following GPUs for ASW testing:

Manufacturer Series Minimum RAM Minimum Model
Nvidia Pascal 3GB 1060
Nvidia Maxwell 4GB 960
AMD Polaris 4GB 470

Testing ASW
To enable ASW testing:

1. Open your registry editor.


2. Navigate to HKLM\Software\Oculus VR, LLC\LibOVR.
3. Create the DWORD AswEnabled key and set it to 1.

While testing your application with ASW, you can switch between rendering modes:

• Control-Numpad1: Disables ASW and returns to the standard rendering mode.


• Control-Numpad2: Forces apps to 45Hz with ASW disabled. Depending on the application, you are likely to
experience judder.
• Control-Numpad3: Forces apps to 45Hz with ASW enabled. Enabling and disabling ASW will help you see
the effects of ASW.
• Control-Numpad4: Enables ASW. ASW automatically turns on and off, depending on whether the app
maintains a 90Hz frame rate. This is the default runtime rendering mode.
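
Beyond the keyboard shortcuts, an application can observe ASW state at runtime through the performance-stats API described earlier in this guide. The following is a minimal sketch, assuming a valid ovrSession and an SDK version (1.10 or later) in which the ASW fields are present in ovrPerfStatsPerCompositorFrame; IsAswActive is an illustrative helper name:

    #include <OVR_CAPI.h>

    // Minimal sketch: detect whether ASW is currently active so the app can,
    // for example, reduce render resolution instead of relying on extrapolation.
    bool IsAswActive(ovrSession session)
    {
        ovrPerfStats perfStats = {};
        if (OVR_FAILURE(ovr_GetPerfStats(session, &perfStats)) ||
            perfStats.FrameStatsCount < 1)
            return false;

        // FrameStats[0] describes the most recent compositor frame.
        return perfStats.FrameStats[0].AswIsActive == ovrTrue;
    }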

Reference Content
This section contains reference material, including information about the data structures and files within the PC
SDK, links to PDFs, and changes to each version of the PC SDK.

Developer Reference
The PC SDK Developer Reference contains detailed information about the data structures and files within the
PC SDK.

To view the content, see Oculus SDK LibOVR Reference Manual.

Troubleshooting
Oculus Room Tiny Compile Errors
For Visual Studio 2010 through 2015, the Oculus Room Tiny (DX12) sample projects require you to specify an
appropriate Windows 10 SDK for the build.

Projects that use Windows SDK v10.0.10240.0 or later should compile without issue. Other SDKs might result in
compile errors, such as a missing dx12.h.

For Visual Studio 2010 through 2013, edit the Samples\OculusRoomTiny\OculusRoomTiny (DX12)\Projects\Windows\Windows10SDKPaths.props file with a text editor and update it to your
Windows 10 SDK, typically installed with headers at C:\Program Files (x86)\Windows Kits\10\Include.

For Visual Studio 2015, select Project Properties -> General -> Target Platform Version. Then, select the
platform version from the list box.

Installation Repeatedly Fails


If installation repeatedly fails, check for a similar message in the Windows Event Viewer:

The OVRLibraryService service was unable to log on as NT SERVICE\OVRLibraryService with the currently
configured password due to the following error:
Logon failure: the user has not been granted the requested logon type at this computer.

If you encounter this message, make sure the NT SERVICE\ALL SERVICES account has Log on as a service
rights through Local Security Settings (Secpol.msc).

If you previously set this right and the right was removed, contact your administrator to check if a Group Policy
Object (GPO) is removing the right.

PC SDK PDFs
This section provides a link to downloadable PDFs for this release.

Select from the following:



• PC SDK Release Guide


• PC SDK Getting Started Guide
• PC SDK Developer Guide
