Unity Oculus SDK
Version 1.25.0
OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC. (C) Oculus VR, LLC. All rights reserved.
BLUETOOTH is a registered trademark of Bluetooth SIG, Inc. All other trademarks are the property of their
respective owners. Certain materials included in this publication are reprinted with the permission of the
copyright holder.
Free and Professional versions of Unity Editor 5.1 or later support Rift, Oculus Go, and Samsung Gear VR
development out of the box. VR support is enabled by simply checking a checkbox in Player Settings.
• Learn More
• Download the Latest Utilities Version
Samples
The Oculus Unity Sample Framework includes sample scenes and scripts illustrating common VR features such
as locomotion, in-app media players, crosshairs, UI, interaction with Game Objects with Oculus Touch, and
more.
• Learn More
• Download the Sample Framework
Getting Started
If you are just getting started as a Unity developer, we recommend spending time learning the basics with
Unity’s excellent Documentation and Tutorials.
Our Oculus Unity Getting Started Guide on page 5 runs through environment setup, orientation,
frequently asked questions, and basic steps necessary for creating Oculus applications in Unity.
Work through our simple Tutorial: Build Your First VR App on page 22 for a quick hands-on introduction.
Most information in this guide applies equally to Rift and Mobile development. Exceptions are clearly indicated
where they occur. Unless otherwise noted, all instructions assume a Windows development environment.
For a complete reference for the C# scripts included in Oculus Utilities for Unity, see Unity Scripting Reference
on page 127.
Additional Resources
Oculus offers additional tools to assist Unity development, including a mobile performance analysis client,
Oculus Platform support for security and social features, an audio spatialization plugin, and more.
• Learn More
• Download
Getting Help
Visit our developer support forums at https://developer.oculus.com.
We currently recommend the following Unity versions:
• 5.6.5p4
• 2017.1.3p3
• 2017.2.2p3
• 2017.3.2f1
• 2017.4.1f1
• 2018.1.0f1
Our Release Notes describe known issues with any specific version.
All Unity versions 5.1 and later ship with the Oculus OVRPlugin, providing built-in support for Rift, Oculus Go,
and Samsung Gear VR.
The optional Oculus Utilities for Unity package offers additional developer resources, and includes the latest
version of OVRPlugin. When you import Utilities for Unity into a project, if the OVRPlugin version included with
the Utilities is later than the version built into your editor, a pop-up dialog will give you the option to update it
in your project. We always recommend using the latest available OVRPlugin version. For more information, see
OVRPlugin on page 33.
Legacy support is available for Unity 4 - see our Unity 4.x Legacy Integration Developer Guide on page 181
for more information.
For complete details on Oculus SDK and Integration version compatibility with Unity, see Unity-SDK Version
Compatibility.
OS Compatibility
• Windows: Windows 7, 8, 10
• Mac: OS X Yosemite, El Capitan, Sierra
OS X development requires the Oculus Rift Runtime for OS X, available from our Downloads page. Note that
runtime support for OS X is legacy only. It does not support consumer versions of Rift.
Controllers
You may wish to have a controller for development or to use with the supplied demo applications. Available
controllers include the Oculus Touch or Xbox 360 controller for Rift, and the Gear VR Controller for mobile
development.
To enable VR support in the Unity Editor, check the Virtual Reality Supported checkbox in Player
Settings. Applications targeting the PC, Mac & Linux platform in Build Settings will then run on the Rift.
Unity automatically applies position and orientation tracking, stereoscopic rendering, and distortion correction
to your main camera when VR support is enabled. For more details, see Unity VR Support.
If you have already set up your Oculus Rift for regular use, you are ready to begin Unity development.
Unity development for Rift requires the Oculus app (the PC application normally installed during Rift setup).
If you have not already installed it, download it from the Setup page and install it. When Unity requires the
Oculus runtime, it will automatically launch the Oculus app.
You may develop Rift apps on PCs that do not meet our minimum specifications or that do not have a Rift connected.
We strongly recommend having a Rift available for development to preview your apps as you go, but it is not
required as long as the Oculus app is installed.
Advanced developers may find it useful to review our Oculus SDK documentation for more insight into the
rendering pipeline and underlying logic. If that interests you, we recommend Intro to VR and the PC Developer
Guide for a deeper dive into core Rift development concepts.
If you are interested in submitting an application to the Oculus Store, please see our Distribute Guide. We
recommend doing so before beginning development in earnest so you have a realistic sense of our guidelines
and requirements.
Note: If you are developing with a Unity Professional license, you will need an Android Pro Unity license
to build Android applications with Unity. The Free license includes basic Android support. For more
information, see License Comparisons in Unity’s documentation.
We recommend reviewing Unity’s Getting started with Android development for general information on
Android development, but the essential setup steps are described below.
Once you have set up the Unity Editor for Android development, VR support is enabled by checking the Virtual
Reality Supported checkbox in Player Settings. Applications targeting the Android Platform will then run on
Gear VR.
Unity automatically applies orientation tracking, stereoscopic rendering, and distortion correction to your main
camera when VR support is enabled. For more details, see Unity VR Support.
If you are already prepared for Unity Android development, you are nearly ready to begin mobile development.
Android SDK
The Android SDK is required for mobile development with Unity. For setup instructions, see Android Development
Software Setup in our Mobile SDK Developer Guide. Most Unity developers do not need to install Android
Studio or NDK.
Once you have installed the Standalone Android SDK tools, you may continue with this guide.
Once you have installed the Android SDK, you may wish to familiarize yourself with adb (Android Debug
Bridge), a useful tool used for communicating with your Android phone. For more information, see Adb in our
Mobile Developer Guide.
Please see our osig self-service portal for more information and instructions on how to request an osig for
development: https://dashboard.oculus.com/tools/osig-generator/
Once you have downloaded an osig, be sure to keep a copy in a convenient location for reuse. You will only
need one osig per device for any number of applications.
Copy the osig file to the following directory in your Unity project:
<project>/Assets/Plugins/Android/assets/
If that directory does not exist, create it and copy the osig file to it. Note that the folder names are case-sensitive and must be exactly as stated above.
If you attempt to run an Oculus mobile APK that has not been correctly signed with an osig, you will get the
error message “thread priority security exception make sure the apk is signed”.
We recommend removing your osig before building a final APK for submission to the Oculus Store. When your
application is approved, Oculus will modify the APK so that it can be used on all devices. See Building Mobile
Apps for Submission to the Oculus Store on page 22 for more information.
For more information and instructions, see Getting Started and Checking Entitlements in our Platform guide.
We strongly recommend carefully reviewing Mobile Development on page 78 and Best Practices for Rift
and Mobile on page 99 in our Developer Guide to be sure you understand our performance targets and
recommendations for mobile development. These sections contain important information that can help you
avoid mistakes that we see frequently.
If you want to generate builds for both PC and mobile from a single project, you must follow the more stringent
mobile development best practices, as well as meet the 90 FPS required by the Rift. This approach is not often
taken in practice.
For information on core VR development concepts, see the Intro to VR in our PC Developer Guide. It is written
from the perspective of Rift development, but much of its content applies equally well to mobile design.
Most Unity developers do not need to install the Oculus Mobile SDK. However, advanced developers may find
it useful to review our Mobile SDK Developer Guide for insight into the underlying logic. Developers interested
in the Android lifecycle and rendering path of Oculus mobile applications should review our documentation on
VrApi. Mobile Best Practices and General Recommendations may also be of interest.
If you are interested in submitting an application to the Oculus Store, please see our Distribute Guide. We
recommend doing so before beginning development in earnest so you have a realistic sense of our guidelines
and requirements.
For more details, see Oculus Utilities for Unity on page 33.
1. If you have previously imported an earlier Utilities version into your project, you must delete its content
before importing a new version.
2. Open the project you would like to import the Utilities into, or create a new project.
3. Import the Utilities Unity Package.
4. Update OVRPlugin (optional)
To delete a previously imported Utilities version (step 1), be sure to close the Unity Editor, navigate to your Unity project folder, and delete the following:
• Assets/Plugins/: Oculus.*, OVR.*
• Assets/Plugins/Android/: *Oculus*, AndroidManifest.xml, *vrapi*, *vrlib*, *vrplatlib*
• Assets/Plugins/x86/: Oculus.*, OVR.*
• Assets/Plugins/x86_64/: Oculus.*, OVR.*
Alternatively, you can locate the .unitypackage file in your file system and double-click it to launch the import.
When the Importing package dialog box opens, leave all of the boxes checked and select Import. The import
process may take a few minutes to complete.
We also include the latest OVRPlugin version in the Utilities for Unity package, and if the Utilities version is later than the
detected version in the Editor, you will be given the option to automatically update your project to the latest
version. We always recommend using the latest available version.
Build Settings
Click on File > Build Settings... and select one of the following:
For Windows, set Target Platform to Windows and set Architecture to either x86 or x86_64.
We recommend unchecking Development Build for your final build, as it may impact performance.
Note: Be sure to add any scenes you wish to include in your build to Scenes In Build.
Player Settings
Within the Build Settings pop-up, click Player Settings. In the Other Settings frame, verify that Virtual
Reality Supported is checked. All additional required settings are enforced automatically.
Quality Settings
Navigate to Edit > Project Settings > Quality. We recommend the following settings:
The Anti Aliasing setting is particularly important. It must be increased to compensate for stereo rendering,
which reduces the effective horizontal resolution by 50%. An anti-aliasing value of 2x is ideal; 4x may be used if
you have performance to spare. We do not recommend 8x.
For more information on our recommended settings, see Best Practices for Rift and Mobile on page 99.
To run your application, you must allow apps that have not been reviewed by Oculus to run on your Rift: in the Oculus app, go to Settings > General and enable Unknown Sources.
You may wish to disable the Unknown Sources option when you are not doing development work.
Note: If you have run an application from an unknown source at least once, it will then appear in the
Library section of Home and the Oculus app, and may be launched normally, as long as Unknown
Sources is enabled.
To run your application, navigate to the target folder of your build and launch the executable.
Android Manifest
During the build process, Unity projects with VR support enabled are packaged with an automatically-
generated manifest which is configured to meet our requirements (landscape orientation, vr_only, et cetera). All
other values, such as Entitlement Check settings, are not modified. Do not add the noHistory attribute to your
manifest.
To build an application for submission to the Oculus Store, you must build a custom manifest using the Oculus
Utilities for Unity. See Building Mobile Apps for Submission to the Oculus Store on page 22 for details.
Build Settings
From the File menu, select Build Settings…. Select Android as the platform. Set Texture Compression to
ASTC.
We recommend unchecking Development Build for your final build, as it may impact performance.
Player Settings
1. Click the Player Settings… button and select the Android tab.
All required settings are enforced automatically, but you may wish to make additional settings as appropriate,
such as enabling Multithreaded Rendering. For more information on our recommended settings, see the Best
Practices for Rift and Mobile on page 99 section.
Quality Settings
Navigate to Edit > Project Settings > Quality. We recommend the following settings:
The Anti-aliasing setting is particularly important. It must be increased to compensate for stereo rendering,
which reduces the effective horizontal resolution by 50%. An anti-aliasing value of 2x is ideal; 4x may be used if
you have performance to spare. We do not recommend 8x.
For more information on our recommended settings, see Best Practices for Rift and Mobile on page 99.
1. Save the project before continuing. If you are not already connected to your phone via USB, connect now.
Unlock the phone lock screen.
2. On some Samsung models, you must set the default USB connection from Connected for charging or
similar to Software installation or similar in the Samsung pulldown menu.
3. From the File menu in the Unity Editor, select Build Settings…. While in the Build Settings menu, add your
scenes to Scenes in Build if necessary.
4. Verify that Android is selected as your Target Platform and select Build and Run. If asked, specify a name
and location for the APK.
5. The APK will be installed and launched on your Android device.
To run your application later, remove your phone from the Gear VR headset and launch the app from the
phone desktop or Apps folder. Then insert the device into the Gear VR when prompted to do so.
Note that you will not see your application listed in your Oculus Home Library - only applications approved
and published by Oculus are visible there.
Once you have built an APK on your local system, you may copy it to your phone by following the instructions
in Using adb to Install Applications in our Mobile SDK Developer Guide.
VR applications that are run in Developer Mode play with distortion and stereoscopic rendering applied, but with
limited orientation tracking using the phone's sensors.
For instructions on setting your device to Developer Mode, see Developer Mode: Running Apps Outside of the
Gear VR Headset in our Mobile SDK Developer Guide.
For more information on the submission process, see our Publishing Guide.
Once you have done so, in the Unity Editor, select Tools > Oculus > Create store-compatible
AndroidManifest.xml. Then build your project normally.
When you finish, you will have a working VR application that you can play on your Rift or Gear VR device, to the
amazement of your friends and loved ones.
Requirements
• Unity 5 (see Compatibility and Version Requirements on page 7 for version recommendations)
• Rift or Gear VR
• Compatible gamepad: optional for Rift, but a Bluetooth gamepad is required to control the player on Gear
VR.
Preparation
Before beginning, you will need to set up your development environment.
If you are building for Rift, please follow the instructions in Preparing for Rift Development on page 8. Be
sure to configure the Oculus app to run apps from unknown sources, as described in that section.
If you are building for mobile, please follow the instructions in Preparing for Mobile Development on page
9. You should be able to communicate with your Samsung phone via USB before beginning. To verify this,
retrieve the device ID from your phone by connecting via USB and sending the command adb devices from
a command prompt. The phone should return its device ID. Make a note of it - you will need it later to request
an Oculus Signature File (osig).
In this part of the tutorial, we’ll build a simple play area consisting of a floor and four walls, and add a sphere as
a player.
1. Launch the Unity Editor and, in the initial launch dialog, create a new project by clicking New. Give it
a creative name like VRProject, and select a location for the files to live in. Verify that the 3D option is
selected, and click Create Project.
2. Save the Scene.
a. Select the File pulldown menu, and click Save Scene as….
b. Give the scene a creative name like VRScene.
3. Let’s create a floor.
c. Find Plane in the Hierarchy View and right-click it. Select Rename in the context menu, and rename it
Floor.
4. Now we’ll create the first wall.
a. Right-click Wall1 in the Hierarchy View and select Duplicate in the context menu. You will see a wall
named Wall1 (1).
b. Right-click Wall1 (1) in the Hierarchy View and select Rename in the context menu, and rename it Wall3.
c. Select Wall3 in the Hierarchy View or in the Scene View.
d. In the Inspector, set the Y value of Rotation to 90 under Transform.
e. Select Wall3 in the Hierarchy View or in the Scene View.
f. In the Inspector, set the X value to 4.5 and the Z value to 0 in Position, under Transform.
8. Make a fourth wall and move it into place.
a. Right-click Wall3 in the Hierarchy View and select Duplicate in the context menu. You will see a wall
named Wall3 (1).
b. Right-click Wall3 (1) in the Hierarchy View and select Rename in the context menu. Name it Wall4.
c. Select Wall4 in the Hierarchy View or in the Scene View.
d. In the Inspector, set the X value of Position to -4.5 under Transform.
9. Now we have a play area with a floor surrounded by walls. Let’s add a sphere player.
In this part of the tutorial, we will prepare the Player so we can control its movement programmatically, based
on user input from keyboard or gamepad.
1. Add a RigidBody component to the Player. This will allow us to move it (for more details, see RigidBody in
Unity’s manual).
2. Create a new C# script, attach it to the Player, and open it for editing. It will contain a default template similar to the following (the class name matches whatever name you give the script; Player is used here for illustration):

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class Player : MonoBehaviour {

    void Start () {
    }
}
3. Add a public speed variable and a new function to move your Player, as sketched in the example below.
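A minimal sketch of one possible result, assuming the Player class from step 2; the speed field and the use of FixedUpdate with Input.GetAxis are illustrative rather than the tutorial's exact code:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class Player : MonoBehaviour {

    // Movement force multiplier, exposed in the Inspector (set in the next step).
    public float speed;

    void Start () {
    }

    void FixedUpdate () {
        // Read the horizontal/vertical input axes (arrow keys, W-A-S-D, or a gamepad stick)
        // and push the Rigidbody in that direction.
        float moveHorizontal = Input.GetAxis ("Horizontal");
        float moveVertical = Input.GetAxis ("Vertical");
        Vector3 movement = new Vector3 (moveHorizontal, 0.0f, moveVertical);
        GetComponent<Rigidbody> ().AddForce (movement * speed);
    }
}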
Finally, select the Player in the Hierarchy View and, in the Inspector, set Speed to 3.
At this point if you preview your game in the Game View by pressing the Play button, you’ll find you can
control the Player with the arrow keys or W-A-S-D on your keyboard. If you have a Unity-compatible gamepad
controller you can control it with a joystick. Try it out!
2. In the Inspector window, locate the platform selection tabs. If developing for the PC, select the PC platform
tab. It looks like a download icon. If developing for Android, select the Android platform tab. It looks like an
Android icon.
3. In Other Settings > Rendering, select the Virtual Reality Supported check box.
That’s it! That’s all you need to do to make your game into a VR application.
If you have a Rift, go ahead and try it out. Press the Play button to launch the application in the Play View. If the
Oculus app is not already running, it will launch automatically, as it must be running for a VR application to play.
Go ahead and put on the Rift and accept the Health and Safety Warnings. The game will be viewable in the
Rift, and the Unity Game View will show the undistorted left eye buffer image.
Play
If you are developing for Rift, follow the instructions in Building Rift Applications on page 12 to build an executable file.
If you are developing for mobile, follow the Preparing for Mobile Development on page 9 instructions to
build an APK and load it onto your phone, then launch your game. Be sure to copy your osig to the specified
folder as described in that section, or you will not be able to run your application.
Once you have run your application, it will then be available in your Oculus Library, and you may re-launch it
from there.
All materials are available for download from our Developer site.
The Oculus Integration, available from the Unity Asset Store here, provides several unityPackages in a single
download, including our Utilities for Unity, Oculus Platform SDK Unity plugin, Oculus Avatar SDK Unity Plugin,
and the Oculus Native Spatializer Plugin. The Unity Sample Framework is also available from the Asset Store
here.
Samples
• Oculus Sample Framework for Unity - several sample scenes and guidelines for common and important
features. See Unity Sample Framework on page 115 for more information.
Mobile Resources
• Oculus Remote Monitor debugging client for mobile development. See Oculus Remote Monitor (Mobile) on
page 106 for more information.
Platform SDK
• The Oculus Platform supports features related to security, community, revenue, and engagement such as
matchmaking, in-app purchase, entitlement checking, VoIP, and cloudsaves. For more information on the
Platform Unity plugin, see our Platform Developer Guide.
Avatar SDK
• The Oculus Avatar SDK includes a Unity package to assist developers with implementing first-person hand
presence for the Rift and Touch controllers. It includes avatar hand and body assets viewable by other users
in social applications for Rift and mobile. The first-person hand models and third-person hand and body
models supported by the Avatar SDK automatically pull the avatar configuration choices the user has made
in Oculus Home to provide a consistent sense of identity across applications. The SDK includes a Unity
package with scripts, prefabs, art assets, and sample scenes. For more information, see our Avatar SDK
Developer Guide.
• The Avatar SDK includes a Social Scene sample, which is a Unity project illustrating basic use of Oculus
Avatars and Platform features such as VoIP. See the sample documentation for more information.
Audio Resources
• Oculus Native Spatializer Plugin (ONSP) for Unity provides easy-to-use high-quality audio spatialization. The
ONSP is included with the Audio SDK Plugins. See Oculus Native Spatializer for Unity for more information.
The Unity Editor includes a built-in basic version of the ONSP with a subset of features. See Unity Audio on
page 82 for more information.
• Oculus OVRLipSync: a plugin for animating avatar mouths to match speech sounds (Unity 5.1 or later).
• Oculus OVRVoiceMod: a plugin used to modify audio signals, making a person’s voice sound like a robot,
for example, or changing their voice from male to female, or vice-versa (Unity 5 or later).
Additional Resources
• The Facebook 360 Capture SDK allows game and virtual reality developers to easily and quickly integrate
360 photo/video capture capability into their applications. It is available for use with Unreal VR applications,
and may be downloaded from the Facebook GitHub repository.
Question: I am new to Unity and VR development. How do I get started?
Answer: Browse through this FAQ, and check out our Oculus Unity Getting Started Guide on page 5.
Read through Unity’s excellent documentation and try out some of their introductory tutorials to get acquainted
with Unity development.
When you're ready to get into the trenches, find out which version of Unity we recommend at our
Compatibility and Requirements page, then download and install it. Next, build your own simple VR game by
following the instructions in our Tutorial: Build Your First VR App on page 22.
Question: What are the system requirements for Unity development for Oculus? What operating systems are
supported for Unity development?
Answer: For the most up-to-date information, see Unity Compatibility and Requirements. We currently support
Windows and OS X for development. The Oculus Rift requires Windows 7, 8 or 10.
Question: Which version of Unity should I use?
Answer: Our latest version recommendations may be found in our Unity Compatibility and Requirements
document. Be sure to check back regularly, as we update it frequently as new SDKs and Unity versions are
released. You can find an archive of information in our Unity-SDK Version Compatibility list.
Question: What other tools and resources do you provide to Unity developers?
Answer: To find the latest tools we provide, check out our Other Oculus Resources for Unity Developers on
page 30.
Question: What do I need to run Rift applications that I build with Unity?
Answer: You will need a compatible Windows PC, a Rift, and the Oculus software. For more details, see
Preparing for Rift Development on page 8.
Question: I want to focus on mobile development for the Oculus Go or Samsung Gear VR. What do I need to
do to get started? Do I need to download the Oculus Mobile SDK?
Answer: The Android SDK is required for mobile development with Unity. However, most Unity developers
do not need to download the Oculus Mobile SDK, or to install Android Studio or NDK. For more details, see
Preparing for Mobile Development on page 9.
Question: Can I develop a single application that works on Oculus Go, Samsung Gear VR, and the Oculus Rift?
Answer: Yes, but when developing for both Rift and mobile platforms, keep in mind that the requirements for
PC and mobile VR applications differ substantially. If you would like to generate builds for both PC and mobile
from a single project, it is important to follow the more stringent mobile development best practices, as well as
to meet the 90 fps required by the Rift.
Question: What is the difference between the Oculus Unity 4 Legacy Integration and the Oculus Utilities for
Unity? How do the differences between Unity 4 and 5 affect development?
Answer: All developers should use Unity 5 or later. The Unity 4 Integration is maintained for legacy purposes
only.
In Unity 4, you must import our Legacy Integration for Unity 4 unitypackage and add the supplied VR camera
and player controller to your application to add Rift or mobile support.
Question: Where can I get help or support?
Answer: Visit our developer support forums at https://developer.oculus.com. Our Support Center can be
accessed at https://support.oculus.com.
This guide describes development using Unity's first-party Oculus support, and the contents and features of the
Oculus Utilities for Unity package.
Unity VR Support
All Unity versions 5.1 and later ship with a bundled version of the Oculus OVRPlugin that provides built-in
support for Rift, Oculus Go, and Samsung Gear VR. Oculus support is enabled by checking Virtual
Reality Supported in the Other Settings > Configuration tab of Player Settings.
When Unity virtual reality support is enabled, any camera with no render texture is automatically rendered in
stereo to your device. Positional and head tracking are automatically applied to your camera, overriding your
camera’s transform.
Unity applies head tracking to the VR camera within the reference frame of the camera's local pose when
the application starts. If you are using OVRCameraRig, that reference frame is defined by the TrackingSpace
GameObject, which is the parent of the CenterEyeAnchor GameObject that has the Camera component.
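For illustration, a minimal sketch of reading the tracked head pose through the CenterEyeAnchor; the script and field names are illustrative, and the rig reference is assumed to be assigned in the Inspector:

using UnityEngine;

// Sketch: reads the tracked head pose through OVRCameraRig's centerEyeAnchor,
// which sits under the TrackingSpace GameObject described above.
public class HeadPoseReader : MonoBehaviour
{
    public OVRCameraRig cameraRig;   // assign the scene's OVRCameraRig in the Inspector

    void Update()
    {
        if (cameraRig == null) return;

        Vector3 headPosition = cameraRig.centerEyeAnchor.position;
        Quaternion headRotation = cameraRig.centerEyeAnchor.rotation;
        Debug.Log("Head pose: " + headPosition + " / " + headRotation.eulerAngles);
    }
}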
The Unity Game View does not apply lens distortion. The image corresponds to the left eye buffer and uses
simple pan-and-scan logic to correct the aspect ratio.
You may update the Oculus OVRPlugin version of your Unity Editor at any time by installing the most recent
Utilities for Unity package for access to the latest features. For more information, see OVRPlugin on page
33.
For more information and instructions for using Unity’s VR support, see the Virtual Reality section of the Unity
Manual.
Note: Unity’s VR support is not compatible with Oculus Legacy Integrations for Unity 4.
The Utilities package is available from our Unity Downloads page, and as part of the Oculus Integration
available from the Unity Asset Store here.
OVRPlugin
OVRPlugin provides Rift and mobile support to the Unity Editor.
All Unity versions 5.1 and later ship with a bundled version of the Oculus OVRPlugin that provides built-in
support for Rift, Oculus Go, and Samsung Gear VR.
Utilities versions 1.14 and later include the latest version of OVRPlugin. When you import Utilities for Unity
into a Unity project, if the OVRPlugin version included with the Utilities package is later than the version built
into your Editor, a pop-up dialog will give you the option to automatically update it. Note that your project is
updated, not your Editor - you may work with different projects using different versions of OVRPlugin with the
same Editor. However, we always recommend using the latest available OVRPlugin version.
If you decline to update OVRPlugin during the import process, you may update it later by selecting Tools >
Oculus > Update OVR Utilities Plugin.
If you update OVRPlugin using the Utilities package and later wish to roll back to the version included with the
Editor for any reason, you may easily do so by selecting Tools > Oculus > Disable OVR Utilities Plugin.
The update feature is currently not supported on OS X/macOS.
Note: The latest OVRPlugin version number may be a version or two behind the Utilities version
number.
OVR
The contents of the OVR folder in OculusUtilities.unitypackage are uniquely named and should be safe to
import into an existing project.
Editor: Scripts that add functionality to the Unity Editor and enhance several C# component scripts.
Materials: Materials used for graphical components within the Utilities package, such as the main GUI display.
Prefabs: The main Unity prefabs used to provide the VR support for a Unity scene: OVRCameraRig and OVRPlayerController.
Scripts: C# files used to tie the VR framework and Unity components together. Many of these scripts work together within the various Prefabs.
Note: We strongly recommend that developers not directly modify the included OVR scripts.
Plugins
The Plugins folder contains the OVRGamepad.dll, which enables scripts to communicate with the Xbox
gamepad on Windows (both 32 and 64-bit versions).
This folder also contains the plugin for Mac OS: OVRGamepad.bundle.
Prefabs
This section gives a general overview of the Prefabs provided by the Utilities package including
OVRCameraRig, which provides an interface to OVRManager, and OVRPlayerController.
• OVRCameraRig
• OVRPlayerController
• OVRCubemapCaptureProbe
To use, simply drag and drop one of the prefabs into your scene.
OVRCameraRig
OVRCameraRig is a custom VR camera that may be used to replace the regular Unity Camera in a scene. Drag
an OVRCameraRig into your scene and you will be able to start viewing the scene.
The primary benefit to using OVRCameraRig is that it provides access to OVRManager, which provides the
main interface to the VR hardware. If you do not need such access, a standard Unity camera may be easily
configured to add basic VR support; see Unity VR Support for more information.
Note: Make sure to turn off any other Camera in the scene to ensure that OVRCameraRig is the only
one being used.
OVRCameraRig contains one Unity camera, the pose of which is controlled by head tracking; two “anchor”
Game Objects for the left and right eyes; and one “tracking space” Game Object that allows you to fine-
tune the relationship between the head tracking reference frame and your world. The rig is meant to be
attached to a moving object, such as a character walking around, a car, a gun turret, et cetera. This replaces the
conventional Camera.
The following scripts (components) are attached to the OVRCameraRig prefab:
• OVRCameraRig.cs
• OVRManager.cs
Learn more about the OVRCameraRig and OVRManager components in Unity Components on page 39.
OVRPlayerController
The OVRPlayerController is the easiest way to start navigating a virtual environment. It is basically an
OVRCameraRig prefab attached to a simple character controller. It includes a physics capsule, a movement
system, a simple menu system with stereo rendering of text fields, and a cross-hair component.
To use, drag the player controller into an environment and begin moving around using a gamepad, or a
keyboard and mouse.
• OVRPlayerController.cs
Figure 6: OVRPlayerController
OVRCubemapCaptureProbe
This prefab allows you to capture a static 360 screenshot of your application while it is running,
either at a specified time after launch, when a specified key is pressed, or when the static function
OVRCubemapCapture.TriggerCubemapCapture is called. For more information on this function, see our Unity
Developer Reference.
OVRCubemapCaptureProbe is based on OVR Screenshot (see Cubemap Screenshots on page 96 for more
information).
Screenshots are taken from the perspective of your scene camera. They are written to a specified directory and
may be either JPEG or PNG. File type is specified by the file extension entered in the Path Name field; default
is PNG. Resolution is configurable.
Basic Use
Drag OVRCubemapCaptureProbe into the scene and set the parameters as desired in the Inspector view.
Parameters
Auto Trigger After Launch: Select to enable capture after a delay specified in Auto Trigger Delay. Otherwise, capture is triggered by the keypress specified in Triggered By Key.
Auto Trigger Delay: Specifies the delay after application launch before the cubemap is taken (requires Auto Trigger After Launch to be selected).
Triggered By Key: Specifies the key that triggers image capture (requires Auto Trigger After Launch to be deselected).
Path Name: Specifies the directory, file name, and file type (JPEG or PNG) for the screen capture.
Windows default path: C:\Users\$username\AppData\LocalLow\Sample\$yourAppName\OVR_ScreenShot360\
Android default path: /storage/emulated/0/Android/data/com.unity3d.$yourAppName/files/OVR_ScreenShot360/
Cubemap Size: Specifies the capture size (2048 x 2048 is the default, and is the resolution required for preview cubemaps submitted to the Oculus Store).
Unity Components
This section gives a general overview of the Components provided by the Utilities package.
OVRCameraRig
OVRCameraRig is a Component that controls stereo rendering and head tracking. It maintains three child
"anchor" Transforms at the poses of the left and right eyes, as well as a virtual "center" eye that is halfway
between them.
This Component is the main interface between Unity and the cameras. It is attached to a prefab that makes it
easy to add comfortable VR support to a scene.
Important: All camera control should be done through this component. You should understand this script when
implementing your own camera control mechanism.
Updated Anchors: Allows clients to filter the poses set by tracking. Used to modify or ignore positional tracking.
TrackingSpace: A Game Object that defines the reference frame used by tracking. You can move this relative to the OVRCameraRig for use cases in which the rig needs to respond to tracker input. For example, OVRPlayerController changes the position and rotation of TrackingSpace to make the character controller follow the yaw of the current head pose.
OVRManager
OVRManager is the main interface to the VR hardware. It is a singleton that exposes the Oculus SDK to Unity,
and includes helper functions that use the stored Oculus variables to help configure camera behavior.
This component is added to the OVRCameraRig prefab. It can be part of any application object. However, it
should only be declared once, because it includes public members that allow for changing certain values in the
Unity Inspector.
Queue Ahead (Deprecated): When enabled, distortion rendering work is submitted a quarter-frame early to avoid pipeline stalls and increase CPU-GPU parallelism.
Use Recommended MSAA Level: When enabled, Unity will use the optimal antialiasing level for quality/performance on the current hardware.
Enable Adaptive Resolution (Rift only): Enable to configure app resolution to scale down as GPU utilization exceeds 85%, and to scale up as it falls below 85% (range 0.5 - 2.0; 1 = normal density). Requires Unity 5.4 or later.
Min Render Scale (Rift only): Sets the minimum bound for Adaptive Resolution (default = 0.7).
Max Render Scale (Rift only): Sets the maximum bound for Adaptive Resolution (default = 1.0).
Enable Mixed Reality (Rift only): Enables mixed reality capture. See Unity Mixed Reality Capture on page 86 for more information.
Use Direct Composition (Rift only): Opens the mixed reality capture view in direct composition mode. Deselect to set to external composition mode.
Green Screen Color Tolerance A (Rift only): Sets how heavily to weigh non-green values in a pixel for mixed reality capture (direct composition only).
Green Screen Color Tolerance B (Rift only): Sets how heavily to weigh green values in a pixel for mixed reality capture (direct composition only).
Green Screen Color Alpha Cutoff (Rift only): Alpha cutoff is evaluated after the chroma-key evaluation and before the bleed test to take pixels with a low alpha value and fully discard them (for mixed reality capture, direct composition only).
Green Screen Color Shadows (Rift only): Shadow threshold reduces dark pixels to mitigate shadow casting issues (for mixed reality capture, direct composition only).
Tracking Origin Type: Set to Eye Level to track the position and orientation y-axis relative to the HMD’s position. Set to Floor Level to track position and orientation relative to the floor, based on the user’s standing height as specified in the Oculus Configuration Utility. Default is Eye Level.
Use Position Tracking: Disables the IR tracker and causes head position to be inferred from the current rotation using the head model.
Use IPD in Position Tracking: If enabled, the distance between the user's eyes will affect the position of each OVRCameraRig's cameras.
Reset Tracker On Load: When disabled, subsequent scene loads will not reset the tracker. This will keep the tracker orientation the same from scene to scene, as well as keep magnetometer settings intact.
8-bit sRGB framebuffers work well for non-VR apps. However, due to the light-locked nature of VR, when sRGB
buffers are used in VR apps, the user’s eyes can adjust to the dark, allowing them to detect subtle differences in
dark areas of the scene, including color banding artifacts.
You may use OVRManager to submit floating-point format eye buffers to the Oculus runtime, which helps
eliminate color banding in dark areas that might have been visible with 8-bit sRGB eye buffers.
• R11G11B10_FP: this format has the same bandwidth cost as a regular 8-bit sRGB framebuffer, so there
should be no extra performance cost for regular rendering. This format should be good enough for most
applications to remove color banding. However, as the name indicates, there is no alpha channel in this
format, so if your application requires destination alpha blending or needs to sample the frame buffer alpha
channel, you might prefer R16G16B16A16_FP.
• R16G16B16A16_FP: this format has a full alpha channel with 16-bit floating point precision, which should be
compatible with your application even if it requires a framebuffer alpha channel. The bandwidth cost is 2x
that of the 8-bit sRGB format.
Note: The new eye texture format only works under Linear color space. If you need alpha channel in
your frame buffer, you must use OVRManager.eyeTextureFormat = R16G16B16A16_FP.
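As a sketch, assuming the Utilities expose the eye buffer format through OVRManager.eyeTextureFormat and an OVRManager.EyeTextureFormat enumeration, the format could be set from a startup script:

using UnityEngine;

// Sketch: switch the eye buffers to a floating-point format to reduce color
// banding in dark scenes. Requires the Linear color space, as noted above.
public class EyeBufferFormatSetter : MonoBehaviour
{
    void Awake()
    {
        // R16G16B16A16_FP keeps a full alpha channel; use R11G11B10_FP if
        // destination alpha is not needed and bandwidth is a concern.
        OVRManager.eyeTextureFormat = OVRManager.EyeTextureFormat.R16G16B16A16_FP;
    }
}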
Helper Classes
In addition to the above components, your scripts can always access the HMD state via static members of
OVRManager. For detailed information, see our Unity Scripting Reference on page 127.
OVRTracker: Provides the pose, frustum, and tracking status of the infrared tracking sensor.
Rift Recentering
OVRManager.display.RecenterPose() recenters the head pose and the tracked controller pose, if
present (see OVRInput on page 44 for more information on tracking controllers).
Recenter requests are passed to the Oculus C API. For a more detailed description of what happens
subsequently, please see VR Focus Management in our PC SDK Developer Guide.
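As an example, a minimal sketch that triggers recentering from a button press (the button choice is illustrative):

using UnityEngine;

// Sketch: recenter the head and tracked-controller poses when the user
// presses the primary button.
public class RecenterOnButtonPress : MonoBehaviour
{
    void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.One))
        {
            OVRManager.display.RecenterPose();
        }
    }
}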
Utilities
OVRPlayerController contains a few variables attached to sliders that change the physics
properties of the controller. This includes Acceleration (how fast the player will increase
speed), Dampening (how fast a player will decrease speed when movement input is not
activated), Back and Side Dampen (how much to reduce side and back Acceleration),
Rotation Amount (the amount in degrees per frame to rotate the user in the Y axis)
and Gravity Modifier (how fast to accelerate player down when in the air). When HMD
Rotates Y is set, the actual Y rotation of the cameras will set the Y rotation value of the
parent transform that it is attached to.
OVRGridCube: A helper class that shows a grid of cubes when activated. Its main purpose is to be used as a way to know where the ideal center of location is for the user's eye position. This is especially useful when positional tracking is activated. The cubes will change color to red when positional data is available, and will remain blue if position tracking is not available, or change back to blue if vision is lost.
ForwardDirection: An empty Game Object attached to the OVRPlayerController prefab containing the matrix upon which motor control bases its direction. This Game Object should also house the body geometry which will be seen by the player.
A handful of rudimentary sample scenes are provided in Assets/OVR/Scenes. They illustrate simple
implementations of basic components and may be useful for verifying that VR functionality is working properly.
For much more detailed samples including scripts and assets that you may re-use in your own applications, see
Unity Sample Framework on page 115.
Trivial: An empty scene with one cube and a plain Unity camera. If this scene fails to render normally, Unity’s VR support is not working.
Room: A cubic room formed from six cubes enclosing an OVRPlayerController. Includes the scripts OVRGrabber and OVRGrabbable, enabling users to pick up cubes with Touch controllers.
Scripts for assisting with mobile development are located in Assets/OVR/Scripts/. Scripts included in the folder
that are not intended for developers to use are omitted from this list.
OVRBoundary: Exposes an API for interacting with the Oculus Guardian System. For more information, see OVRBoundary Guardian System API on page 63.
OVRGrabber: Allows grabbing and throwing of objects with the OVRGrabbable component on them using Oculus Touch. Requires OVRGrabbable to use.
OVRGrabbable: Attach to objects to allow them to be grabbed and thrown with the Oculus Touch. Requires OVRGrabber to use.
OVRHaptics: Programmatically controls haptic feedback to Touch controllers. For more information, see OVRHaptics for Oculus Touch on page 64.
OVRHapticsClip: Programmatically controls haptic feedback to Touch controllers. For more information, see OVRHaptics for Oculus Touch on page 64.
OVRInput: Exposes a unified input API for multiple controller types. For more information, see OVRInput on page 44.
OVROverlay.cs: Add to a Game Object to render it as a VR Compositor Layer instead of drawing it into the eye buffer. For more information, see VR Compositor Layers on page 69.
OVRPlatformMenu.cs: Helper component for detecting Back Key long-press to bring up the Universal Menu and Back Key short-press to bring up the Confirm-Quit to Home Menu. Additionally implements a Wait Timer for displaying Long Press Time. For more information on interface guidelines and requirements, please review Interface Guidelines and Universal Menu in the Mobile SDK documentation.
Simple scripts for assisting with mobile development are located in Assets/OVR/Scripts/Util. Scripts included in
the folder that are not intended for developers to use are omitted from this list.
OVRChromaticAberration.cs: Drop-in component for toggling chromatic aberration correction on and off for Android.
OVRDebugGraph.cs: Drop-in component for toggling the TimeWarp debug graph on and off. Information regarding the TimeWarp Debug Graph may be found in the TimeWarp technical note in the Mobile SDK documentation.
OVRModeParms.cs: Example code for de-clocking your application to reduce power and thermal load, as well as how to query the current power level state.
OVRMonoscopic.cs: Drop-in component for toggling Monoscopic rendering on and off for Android.
See our Oculus Utilities for Unity Reference Manual for a more detailed look at these and other C# scripts.
Undocumented scripts may be considered internal, and should generally never be modified.
Input
This guide describes Unity input features supported by the Oculus integration.
Oculus Touch
We have provided several resources and samples to help get you started using Touch with Rift.
OVRInput, our unified input API for interacting with Touch, is included with our Utilities for Unity package. See
OVRInput on page 44 for more information.
The Oculus Avatar SDK includes a Unity package to assist developers with implementing first-person hand
presence for the Rift and Touch controllers. It includes avatar hand and body assets viewable by other users in
social applications for Rift and Gear VR. The first-person hand models and third-person hand and body models
supported by the Avatar SDK automatically pull the avatar configuration choices the user has made in Oculus
Home to provide a consistent sense of identity across applications. The SDK includes a Unity package with
scripts, prefabs, art assets, and sample scenes. For more information, see our Avatar SDK Developer Guide.
Our Unity Sample Framework includes samples demonstrating important Touch functionality. For example,
the AvatarWithGrab sample uses the Avatar SDK and the scripts OVRGrabber and OVRGrabbable to add
the ability to pick up and throw objects in the scene to the Avatar hand assets. The DistanceGrab sample
demonstrates a method for interacting with and grasping distant objects in a scene. See Unity Sample
Framework on page 115 for more information.
Oculus Utilities for Unity, Avatar SDK, and Unity Sample Framework are available with our Oculus Integration on
the Unity Asset Store, or from our Downloads page.
Touch may be used to emulate Microsoft XInput API gamepads without any code changes. However, you must
account for the missing logical and ergonomic equivalences between the two types of controllers. For more
information, see Emulating Gamepad Input with Touch in our PC SDK Developer Guide.
For more useful recommendations, have a look at the Oculus Developer Blog for several relevant posts.
OVRInput
OVRInput exposes a unified input API for multiple controller types.
It may be used to query virtual or raw controller state, such as buttons, thumbsticks, triggers, and capacitive
touch data. It currently supports the Oculus Touch, Microsoft Xbox controllers, and the Oculus remote on
desktop platforms. For mobile development, it supports the Gear VR Controller and Oculus Go Controller as
well as the touchpad and back button on the Gear VR headset. Gamepads must be Android compatible and
support Bluetooth 3.0.
For keyboard and mouse control, we recommend using the UnityEngine.Input scripting API (see Unity’s Input
scripting reference for more information).
Mobile input bindings are automatically added to InputManager.asset if they do not already exist.
For more information, see OVRInput in the Unity Scripting Reference on page 127. Unity’s input system and
Input Manager are documented here: http://docs.unity3d.com/Manual/Input.html and
http://docs.unity3d.com/ScriptReference/Input.html.
Requirements
To use OVRInput, you must either include an instance of OVRManager anywhere in your scene, or call OVRInput.Update() and OVRInput.FixedUpdate() once per frame at the beginning of any component’s Update() and FixedUpdate() methods, respectively.
Controller poses are returned by the constellation tracking system and are predicted simultaneously with
the headset. These poses are reported in the same coordinate frame as the headset, relative to the initial
center eye pose, and may be used for rendering hands or objects in the 3D world. They are also reset by
OVRManager.display.RecenterPose(), similar to the head and eye poses.
The controller is positioned relative to the user by using a body model to estimate the controller’s position.
Whether the controller is visualized on the left or right side of the body is determined by left-handedness versus
right-handedness, which is specified by users during controller pairing.
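For illustration, a minimal sketch that places a GameObject at the right Touch controller's tracked pose using OVRInput's local-controller queries; parenting the object under the rig's TrackingSpace is assumed so the tracking-space pose lines up with the world:

using UnityEngine;

// Sketch: make this GameObject follow the right Touch controller's tracked pose.
// Poses are reported relative to the tracking space, so parent this object
// under the OVRCameraRig's TrackingSpace for correct placement.
public class FollowRightController : MonoBehaviour
{
    void Update()
    {
        transform.localPosition = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);
        transform.localRotation = OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch);
    }
}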
For example, use OVRInput.Get() to query controller touchpad input. You may query the input position with Axis2D:
OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad, OVRInput.Controller.RTrackedRemote);
A touchpad touch occurs when the user’s finger makes contact with the touchpad without actively
clicking it. Touches may be queried with OVRInput.Get(OVRInput.Touch.PrimaryTouchpad).
Touchpad clicks are aliased to virtual button One clicks, and may be queried with
OVRInput.Get(OVRInput.Button.PrimaryTouchpad).
The volume and home buttons are reserved on the Gear VR Controller. The Oculus button is reserved on the
Go Controller.
OVRInput Usage
The primary usage of OVRInput is to access controller input state through Get(), GetDown(), and GetUp().
The input controls are exposed through two sets of enumerations:
• OVRInput.Button, OVRInput.Touch, OVRInput.NearTouch, OVRInput.Axis1D, OVRInput.Axis2D
• OVRInput.RawButton, OVRInput.RawTouch, OVRInput.RawNearTouch, OVRInput.RawAxis1D, OVRInput.RawAxis2D
The first set of enumerations provides a virtualized input mapping that is intended to assist developers with
creating control schemes that work across different types of controllers. The second set of enumerations
provides raw unmodified access to the underlying state of the controllers. We recommend using the first set of
enumerations, since the virtual mapping provides useful functionality, as demonstrated below.
The Gear VR Controller touchpad may be queried for both touch status and click status, where “touch” refers
to the user’s finger making contact with the touchpad without actively clicking it.
Example Usage
// returns true if the primary button (typically “A”) was pressed this frame.
OVRInput.GetDown(OVRInput.Button.One);
// returns a Vector2 of the primary (typically the Left) thumbstick’s current state.
// (X/Y range of -1.0f to 1.0f)
OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
// returns true if the primary thumbstick has been moved upwards more than halfway.
// (Up/Down/Left/Right - Interpret the thumbstick as a D-pad).
OVRInput.Get(OVRInput.Button.PrimaryThumbstickUp);
// returns a float of the secondary (typically the Right) index finger trigger’s current state.
// (range of 0.0f to 1.0f)
OVRInput.Get(OVRInput.Axis1D.SecondaryIndexTrigger);
// returns true if the left index finger trigger has been pressed more than halfway.
// (Interpret the trigger as a button).
OVRInput.Get(OVRInput.RawButton.LIndexTrigger);
// returns true if the secondary gamepad button, typically “B”, is currently touched by the user.
OVRInput.Get(OVRInput.Touch.Two);
// returns true on the frame when a user’s finger pulled off Gear VR touchpad controller on a swipe down
OVRInput.GetDown(OVRInput.Button.DpadDown);
// returns true the frame AFTER user’s finger pulled off Gear VR touchpad controller on a swipe right
OVRInput.GetUp(OVRInput.RawButton.DpadRight);
// If no controller is specified, queries the touchpad position of the active Gear VR Controller
OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);
// recenters the active Gear VR Controller. Has no effect for other controller types.
OVRInput.RecenterController();
// returns true on the frame when a user’s finger pulled off Gear VR Controller back button
OVRInput.GetDown(OVRInput.Button.Back);
In addition to specifying a control, Get() also takes an optional controller parameter. The list of supported
controllers is defined by the OVRInput.Controller enumeration (for details, refer to OVRInput in the Unity
Scripting Reference on page 127).
Specifying a controller can be used if a particular control scheme is intended only for a certain controller type.
If no controller parameter is provided to Get(), the default is to use the Active controller, which corresponds
to the controller that most recently reported user input. For example, a user may use a pair of Oculus Touch
controllers, set them down, and pick up an Xbox controller, in which case the Active controller will switch to
the Xbox controller once the user provides input with it. The current Active controller can be queried with
OVRInput.GetActiveController() and a bitmask of all the connected Controllers can be queried with
OVRInput.GetConnectedControllers().
Example Usage:
// returns a float of the Hand Trigger’s current state on the Left Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.Touch);
// returns a float of the Hand Trigger’s current state on the Right Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.SecondaryHandTrigger, OVRInput.Controller.Touch);
Querying the controller type can also be useful for distinguishing between equivalent buttons on different
controllers. For example, if you want code to execute on input from a gamepad or Touch controller, but not on
a Gear VR Touchpad, you could implement it as follows:
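A minimal sketch of one way to do this, using the active controller reported by OVRInput.GetActiveController() (the exact check is illustrative):

// Sketch: only react when the input comes from a gamepad or Touch controller,
// not from the Gear VR headset touchpad.
OVRInput.Controller activeController = OVRInput.GetActiveController();
bool isGamepadOrTouch =
    activeController == OVRInput.Controller.Gamepad ||
    activeController == OVRInput.Controller.Touch ||
    activeController == OVRInput.Controller.LTouch ||
    activeController == OVRInput.Controller.RTouch;

if (isGamepadOrTouch && OVRInput.GetDown(OVRInput.Button.One))
{
    // Handle the button press here.
}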
Note that the Oculus Touch controllers may be specified either as the combined pair (with
OVRInput.Controller.Touch), or individually (with OVRInput.Controller.LTouch and RTouch). This
is significant because specifying LTouch or RTouch uses a different set of virtual input mappings that allow
more convenient development of hand-agnostic input code. See the virtual mapping diagrams in Touch Input
Mapping for an illustration.
Example Usage:
// returns a float of the Hand Trigger’s current state on the Left Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.LTouch);
// returns a float of the Hand Trigger’s current state on the Right Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.RTouch);
This can be taken a step further to allow the same code to be used for either hand by specifying the controller
in a variable that is set externally, such as on a public variable in the Unity Editor.
Example Usage:
// public variable that can be set to LTouch or RTouch in the Unity Inspector
public Controller controller;
…
Unity | Unity Developer Guide | 49
// returns a float of the Hand Trigger’s current state on the Oculus Touch controller
// specified by the controller variable.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);
// returns true if the primary button (“A” or “X”) is pressed on the Oculus Touch controller
// specified by the controller variable.
OVRInput.Get(OVRInput.Button.One, controller);
This is convenient since it avoids the common pattern of if/else checks for Left/Right hand input mappings.
OVRInput Haptics
Use SetControllerVibration() provided in OVRInput to start and stop haptics for the controller.
Expected values for amplitude and frequency range from 0 to 1. The greater the value, the stronger or more
frequent the vibration in the controller. To end the vibration, set both amplitude and frequency to 0. Controller
vibration automatically ends 2 seconds after the last input. See the OVRInput reference in the Unity Scripting
Reference on page 127 for more information.
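For illustration, a brief sketch, assuming a SetControllerVibration(frequency, amplitude, controller) parameter order:

// Sketch: pulse the right Touch controller at half frequency and full amplitude.
OVRInput.SetControllerVibration(0.5f, 1.0f, OVRInput.Controller.RTouch);

// Stop the vibration explicitly by setting both frequency and amplitude to 0.
OVRInput.SetControllerVibration(0.0f, 0.0f, OVRInput.Controller.RTouch);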
Note: If you're designing your app for Touch, consider using the updated OVRHaptics for Oculus Touch
instead.
Raw Mapping
The raw mapping directly exposes the Touch controllers. The layout of the Touch controllers closely matches
the layout of a typical gamepad split across the Left and Right hands.
Virtual Mapping
This diagram shows a common implementation of Xbox controller input bindings using
OVRInput.Controller.Gamepad.
Raw Mapping
The raw mapping directly exposes the Xbox controller.
Virtual Mapping
This diagram shows a common implementation of Gear VR Controller input bindings using
OVRInput.Controller.RTrackedRemote.
Raw Mapping
The raw mapping directly exposes the Gear VR Controller. Note that this assumes a right-handed controller.
Virtual Mapping
This diagram shows a common implementation of Go Controller input bindings using
OVRInput.Controller.RTrackedRemote.
Raw Mapping
The raw mapping directly exposes the Go Controller. Note that this assumes a right-handed controller.
Note that a back-button long-press is reserved and is automatically handled by the Gear VR VrApi. For more
information, see Universal Menu and Volume in our Mobile SDK Developer Guide.
Virtual Mapping
Raw Mapping
The Oculus Guardian System is an in-VR visualization of Play Area bounds for Touch users. The boundary
visualization is handled automatically by Oculus software, but developers may interact with the Guardian
System in various ways using the OVRBoundary API. Possible use cases include pausing the game if the
user leaves the Play Area, or placing geometry in the world based on boundary points to create a “natural”
integrated barrier with in-scene objects.
For a sample illustrating how to use OVRBoundary, see the Guardian Boundary Sample in the Unity Sample
Framework on page 115.
During Touch setup, users define an interaction area by drawing a perimeter called the Outer Boundary in
space with the controller. An axis-aligned bounding box called the Play Area is calculated from this perimeter.
When tracked devices approach the Outer Boundary, the Oculus runtime automatically provides visual cues to
the user demarcating the Outer Boundary. This behavior may not be disabled or superseded by applications,
though the Guardian System visualization may be disabled via user configuration in the Oculus app.
Note: The Guardian System visualization is not visible in the Play View of the Editor, but otherwise behaves normally.
Basic Use
The two boundary types are BoundaryType.OuterBoundary and BoundaryType.PlayArea.
Applications may query the location of nodes relative to the Outer Boundary or Play Area by using
OVRBoundary.BoundaryTestResult TestNode(), which takes the node and boundary type as
arguments.
Applications may also query arbitrary points relative to the Play Area or Outer Boundary using
OVRBoundary.BoundaryTestResult TestPoint(), which takes the point coordinates in the tracking
space as a Vector3 and boundary type as arguments.
Results are returned as a struct called OVRBoundary.BoundaryTestResult, which includes the following
members:
IsTriggering (bool): Returns true if the node or point triggers the queried boundary type.
ClosestDistance (float): Distance between the node or point and the closest point of the test area.
ClosestPoint (Vector3): Describes the location in tracking space of the closest boundary point to the queried node or point.
ClosestPointNormal (Vector3): Describes the normal of the boundary point that is closest to the queried node or point.
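As a minimal sketch of the pause-on-exit use case mentioned above (assuming the OVRBoundary instance is reached through OVRManager.boundary and that Node.Head is a valid tracked node):
// Pause the game while the user's head is triggering the Outer Boundary.
OVRBoundary.BoundaryTestResult result = OVRManager.boundary.TestNode(
    OVRBoundary.Node.Head, OVRBoundary.BoundaryType.OuterBoundary);
if (result.IsTriggering)
{
    Time.timeScale = 0f;
}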
Applications may query the current state of the boundary system using OVRBoundary.GetVisible().
Additional Features
You may set the boundary color of the automated Guardian System visualization using
OVRBoundary.SetLookAndFeel(). Alpha is unaffected. Use ResetLookAndFeel() to reset.
OVRBoundary.GetGeometry() returns an array of up to 256 points that define the Boundary Area or Play
Area in clockwise order at floor level. You may query the dimensions of a Boundary Area or Play Area using
OVRBoundary.GetDimensions(), which returns a Vector3 containing the width, height, and depth in
tracking space units, with height always returning 0.
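A short sketch of these calls (the BoundaryLookAndFeel struct and its Color field are assumptions about the API shape):
// Tint the Guardian visualization, then read the Play Area dimensions.
OVRManager.boundary.SetLookAndFeel(new OVRBoundary.BoundaryLookAndFeel { Color = Color.cyan });
Vector3 playAreaSize = OVRManager.boundary.GetDimensions(OVRBoundary.BoundaryType.PlayArea);
Debug.Log("Play Area (width x height x depth): " + playAreaSize); // height is always 0
OVRManager.boundary.ResetLookAndFeel(); // restore the user's configured color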
Haptics Clips
Haptics clips specify the data used to control haptic vibrations in Touch controllers.
Vibrations are specified by an array of bytes or “samples,” which specify vibration strength from 0-255. This
data can be sent to the left and right touch controllers independently, which process the amplitudes at a
sample rate of 320 Hz. The duration of vibration is determined by the number of bytes sent to the devices.
Haptics clips may be created in different ways, depending on your needs. For example, you may manually
create a clip with a pre-allocated fixed size buffer, and then write in bytes procedurally. This allows you to
generate vibrations on a frame-by-frame basis.
The OVRHaptics class is used to produce the actual vibrations. It defines a LeftChannel and a RightChannel.
You can also access these channels through the aliased Channels property, where Channels[0] maps to
LeftChannel, and Channels[1] maps to RightChannel. This alias is useful when using a variable for the channel
index in a script that can be associated with either hand.
Once you have selected a haptics channel, you may perform four operations with the following
OVRHapticsChannel member functions:
See our Developer Reference for API documentation and details on the relevant classes and members.
OVRHapticsClip reads in an audio clip, downsamples the audio data to a sequence of bytes with the expected
sample rate and amplitude range, and feeds that data into the clip’s internal amplitude buffer.
We generally recommend AudioClip-generated haptics clips for static sound effects such as gunshots or
music that do not vary at runtime. However, you may wish to write your own code to pipe the audio output
of an audio source in real time to an OVRHapticsClip, allowing near-realtime conversion of audio into corresponding haptics data.
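A minimal sketch of both approaches (the constructors and the Preempt/Queue channel methods used here are assumptions about the OVRHaptics API; gunshotAudioClip is a hypothetical AudioClip reference):
// 1) Build a clip from an AudioClip and play it immediately on the right hand.
OVRHapticsClip gunshotClip = new OVRHapticsClip(gunshotAudioClip);
OVRHaptics.RightChannel.Preempt(gunshotClip);
// 2) Build a procedural clip by writing samples (0-255) into a pre-allocated buffer.
OVRHapticsClip buzz = new OVRHapticsClip(40); // capacity in samples
for (int i = 0; i < 40; i++)
{
    buzz.WriteSample((byte)128); // constant medium-strength vibration
}
OVRHaptics.Channels[1].Queue(buzz); // Channels[1] maps to RightChannel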
Best Practices
The Rift must be worn in order for haptics to work, as the Oculus runtime only allows the currently-focused VR
app to receive Touch haptics.
It is important to keep your sample pipeline at around the right size. Assuming a haptics frequency of 320 Hz and an application frame rate of 90 Hz, we recommend targeting a buffer of around 10 samples per frame. Because the controllers consume roughly 3-4 samples per frame, this preserves a buffer zone to account for any asynchronous
interruptions. The more bytes you queue, the safer you are from interruptions, but you add additional latency
before newly queued vibrations will be played.
Note: For use with Oculus Touch only.
Unlike some other foveated rendering technologies, Oculus Go's Fixed Foveated Rendering system is not based on eye
tracking. The high-resolution pixels are fixed in the center of the eye texture.
Fixed Foveated Rendering is only available on the Oculus Go. A detailed look at the benefits of using FFR can
be found in our Optimizing Oculus Go for Performance blog post.
OVRManager.tiledMultiResLevel = OVRManager.TiledMultiResLevel.{Off/LMSLow/LMSMedium/LMSHigh};
These values set the degree of foveation. The images below demonstrate the degree to which the resolution
will be affected.
In the images above, the resolution in the white areas at the center of the FOV is native: every pixel of the texture will be computed independently by the GPU. However, in the red areas, only 1/2 of the pixels will
be calculated, 1/4 for the green areas, 1/8 for the blue areas, and 1/16 for the magenta tiles. The missing
pixels will be interpolated from the calculated pixels at resolve time, when the GPU stores the result of its
computation in general memory.
You may choose to change the degree of foveation based on the scene elements. Apps or scenes with high
pixel shader costs will see the most benefit from using FFR. Apps with very simple shaders may see a net
performance loss from the overhead of using FFR. Proper implementation of FFR requires testing and tuning to
balance visual quality and GPU performance.
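A minimal sketch of per-scene tuning, assuming the OVRManager property and enum values shown above (the scene name used here is hypothetical):
using UnityEngine;
using UnityEngine.SceneManagement;

public class FoveationTuner : MonoBehaviour
{
    void OnEnable() { SceneManager.sceneLoaded += OnSceneLoaded; }
    void OnDisable() { SceneManager.sceneLoaded -= OnSceneLoaded; }

    void OnSceneLoaded(Scene scene, LoadSceneMode mode)
    {
        // Heavier foveation for a shader-heavy scene, none for a simple menu scene.
        OVRManager.tiledMultiResLevel = (scene.name == "HeavyScene")
            ? OVRManager.TiledMultiResLevel.LMSHigh
            : OVRManager.TiledMultiResLevel.Off;
    }
}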
72 Hz Mode
The Oculus Go can optionally render your application at 72 frames per second rather than the normal 60
frames. The resulting output has lower latency and less flicker, which improves comfort and reduces eye strain.
To change the refresh rate, update displayFrequency in OVRDisplay. For example, to set the refresh rate to
72 Hz on a Go app:
OVRManager.display.displayFrequency = 72.0f;
An app rendering at 72 Hz may require additional performance optimizations to maintain the same framerate
as an app rendering at 60 Hz. Our Optimizing Oculus Go for Performance blog post contains recommendations
for when and how to use 72 Hz mode.
VR Compositor Layers
OVROverlay is a script in OVR/Scripts that allows you to render Game Objects as VR Compositor Layers instead
of drawing them to the eye buffer.
OVROverlay
Game Objects rendered as VR compositor layers render at the frame rate of the compositor instead of
rendering at the application frame rate. They are less prone to judder, and they are raytraced through the
lenses, improving the clarity of textures displayed on them. This is useful for displaying easily-readable text.
Quadrilateral, cubemap, and cylinder compositor layers are currently supported by Rift and mobile.
Equirectangular and offset cubemap compositor layers are currently available in mobile only.
Overlay sample: A sample illustrating the use of quad and cylinder VR Compositor Layers for a UI is included
with the rendering samples of our Unity Sample Framework. See Unity Sample Framework on page 115 for
more information.
All layer types support both stereoscopic and monoscopic rendering, though stereoscopic rendering only
makes sense for cubemaps in most cases. Stereoscopically-rendered overlays require two textures, specified by
setting Size to 2 in the Textures field of OVROverlay in the Inspector.
Gaze cursors and UIs are good candidates for rendering as quadrilateral compositor layers. Cylinders may be
useful for smooth-curve UI interfaces. Cubemaps may be used for startup scenes or skyboxes.
We recommend using a cubemap compositor layer for your loading scene, so it will always display at a steady
minimum frame rate, even if the application performs no updates whatsoever.
Applications may add up to three compositor layers to a scene. You may use no more than one cylinder and one
cubemap compositor layer per scene.
Note that if a compositor layer fails to render (e.g., you attempt to render more than three compositor layers),
only quads will currently fall back and be rendered as scene geometry. Cubemaps and cylinders will not display
at all, but similar results can be achieved with scene geometry such as Unity’s Skybox component or Cylinder
MeshFilter.
You may use OVRRTOverlayConnector to render textures to a compositor layer. See OVRRTOverlayConnector
below for more information.
Overlays are world-locked by default. To make a head-locked overlay, make the Quad a child of the
OVRCameraRig's center eye anchor.
By default, VR compositor layers are displayed as overlays in front of the eye buffer. To place them behind the
eye buffer, set Current Overlay Type to Underlay in the Inspector. Note that underlay compositor layers are
more bandwidth-intensive, as the compositor must “punch a hole” in the eye buffer with an alpha mask so
that underlays are visible. Texture bandwidth is often a VR bottleneck, so use them with caution and be sure to
assess their impact on your application.
Underlays depend on the alpha channel of the render target. If a scene object that should occlude an underlay
is opaque, set its alpha to 1. If the occluder is transparent, you must use the OVRUnderlayTransparentOccluder
shader provided in the Utilities in Assets/OVR/Shaders. Overlays do not require any special handling for
transparency.
Compositor layers are depth ordered by the sequence in which they are enabled in the scene, but the order is
reversed for overlays and underlays. Underlays should be enabled in the scene in the sequence in which you
want them to appear, enabling the underlays in front first and the layers in the back last. Overlays should be
enabled in the opposite order.
Basic usage
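As a minimal sketch (the OVROverlay field names used here, currentOverlayType, currentOverlayShape, and textures, are assumptions based on the Inspector fields described in this section), a quad layer can also be configured from script:
using UnityEngine;

public class QuadLayerSetup : MonoBehaviour
{
    public Texture uiTexture; // hypothetical texture to display on the layer

    void Start()
    {
        OVROverlay overlay = gameObject.AddComponent<OVROverlay>();
        overlay.currentOverlayShape = OVROverlay.OverlayShape.Quad;
        overlay.currentOverlayType = OVROverlay.OverlayType.Overlay; // Underlay draws behind the eye buffer
        overlay.textures = new Texture[] { uiTexture };              // assign two textures (Size 2) for stereo
    }
}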
Example
In this example, most of the scene geometry is rendered to the eye buffer. The application adds a gaze cursor
as a quadrilateral monoscopic overlay and a skybox as a monoscopic cubemap underlay behind the scene.
Note the dotted sections of the eye buffer, indicating where OVROverlay has “punched a hole” to make the
skybox visible behind scene geometry.
In this scene, the quad would be set to Current Overlay Type: Overlay and the cubemap would be set to
Current Overlay Type: Underlay. Both would be disabled, then the quad overlay enabled, then the skybox
enabled.
Note that if the cubemap in our scene were transparent, we would need to use the
OVRUnderlayTransparentOccluder, which is required for any underlay with alpha less than 1. If it were
stereoscopic, we would need to specify two textures and set Size to 2.
To use a cylinder overlay, your camera must be placed inside the inscribed sphere of the cylinder. The overlay
will fade out automatically when the camera approaches the inscribed sphere's surface.
Only half of the cylinder may be displayed, so the arc angle must be smaller than 180 degrees.
Offset cubemap compositor layers are useful for increasing resolution for areas of interest/visible areas by
offsetting the cubemap sampling coordinate.
They are used in the same way as standard cubemap compositor layers. Attach the OVROverlay script to an
Empty Game Object, and specify the texture coordinate offset in the Position Transform. For more information,
see OVROverlay in our Unity Scripting Reference on page 127.
OVRRTOverlayConnector
OVRRTOverlayConnector is a helper class in OVR/Scripts/Util used to link a Render Texture to an OVROverlay
Game Object. Attach this script to your camera object, and specify your overlay owner object in Ovr Overlay
Obj.
The overlay camera must use Render Texture, and must be rendered before the Main Camera (e.g., using
camera depth), so the Render Texture will be available before being used.
OVRRTOverlayConnector triple-buffers the render results before sending them to the overlay, which is a
requirement for time warping a render target. It also clears the render Texture's border to alpha = 0 to avoid
artifacts on mobile.
For more information, see "OVRRTOverlayConnector" in our Unity Scripting Reference on page 127.
In typical OpenGL stereo rendering, each eye buffer must be rendered in sequence, doubling application and
driver overhead. When Single Pass is enabled, objects are rendered once to the left eye buffer, then duplicated
to the right buffer automatically with appropriate modifications for vertex position and view-dependent
variables such as reflection.
While Single Pass rendering primarily reduces CPU usage, GPU usage may be affected. On Exynos devices,
Single Pass rendering slightly reduces the GPU load as well as the CPU load. Unfortunately, on Qualcomm
devices, Single Pass currently causes a slight increase in GPU load of a few percent. Qualcomm is looking into
optimizing this to reduce the increase in GPU load.
For additional technical information, you may wish to review Multi-View in our Mobile SDK documentation,
which discusses the underlying framework that makes Single Pass rendering possible in our Unity integration.
For additional information, see Single-Pass Stereo Rendering and Single-Pass Stereo Rendering for Android in
Unity's documentation.
Requirements
Single Pass rendering is currently supported by Note5, S6, S7, S7 Edge, S8 and S8+ phones using ARM Exynos
processors and running Android M or N. It is also supported on S7 and S7 Edge phones using Qualcomm
processors and running Android N.
Although it can substantially reduce CPU overhead, keep in mind that applications submitted to the Oculus
Store must maintain minimum frame rate per our requirements, even on devices that do not support multi-view.
A fix is being deployed, but it will take some time for users to get it from OTA.
Issues
1. When “Standard Shader Quality” is low in your graphics config, the standard shader may appear black.
2. If you create tree objects, using the default “Optimized Bark Material” may cause the tree to disappear.
1. Download the built-in shader package for 5.6.0p2 from the Unity website.
2. Copy the following files from the shader package to your project folder:
a. All files under Assets\Shaders\CGIncludes. Not every file is necessary, but we recommend simply
copying all of them, as the dependencies can be complicated.
b. \DefaultResourcesExtra\Standard.shader
3. In your project, modify the file UnityStandardCoreForwardSimple.cginc in Assets\Shaders\CGIncludes\ by
adding the following code to the end of FragmentSetupSimple() before return s;
#if defined(UNITY_STEREO_MULTIVIEW_ENABLED)
s.smoothness = saturate(s.smoothness);
#endif
Shader "Standard"
to
Shader "StandardS8"
.
6. After you have finished modifying the shaders, you need to apply them. Change the shader for any affected material
from Standard to StandardS8.
If you have a lot of affected materials, it may be easier to use an editor script to apply these changes, such as
this:
[MenuItem("Tools/Oculus/ApplyS8Workaround")]
static void ApplyS8Workaround()
{
// Find every Renderer in the open scene(s) and swap the Standard shader for the renamed copy.
Renderer[] _renderers = Component.FindObjectsOfType<Renderer>();
foreach (Renderer _renderer in _renderers)
{
Material[] _materials = _renderer.sharedMaterials;
foreach (Material _material in _materials)
{
if (_material.shader.name.Equals("Standard"))
{
_material.shader = Shader.Find("StandardS8");
}
}
}
}
1. Download the built-in shader package for 5.6.0p2 from the Unity website.
2. Copy the following files from the shader package to your project folder:
a. \DefaultResourcesExtra\Standard.shader
b. \DefaultResourcesExtra\Nature\TreeCreator\TreeCreatorBarkOptimized.shader
3. In your project, modify the file TerrainEngine.cginc in Assets\Shaders\CGIncludes\ by adding the following
code right after the line float2 vWavesIn = _Time.yy + float2(fVtxPhase, fBranchPhase):
#if defined(UNITY_STEREO_MULTIVIEW_ENABLED)
vWavesIn.x += saturate(fVtxPhase) * 0.00000001f;
#endif
Then change the shader name declared at the top of your copy of TreeCreatorBarkOptimized.shader so that it ends in "S8" (mirroring the Standard shader rename above).
6. After you have finished modifying the shaders, you will need to apply them. Change the shader for any affected
material from Tree Creator Bark Optimized to Tree Creator Bark Optimized S8.
If you have a lot of affected materials, you may wish to write an Editor script to do this conversion similar to the
example given in the standard shader issue workaround above.
Compiling Issues
When Single Pass is enabled in Unity 5.6.0p2, building mobile projects will fail if either of the following cases is true:
1. You are using Standard Shader and have enabled both specular highlights and normal mapping; or
2. You are using the old mobile bumped diffuse detail shader.
You will see a shader error referring to bumped detail that says “Duplicated input semantics can't change type,
size, or layout ('TEXCOORD7').”
Workaround
The actual details will differ depending on which shader you have problems with - these instructions use
Mobile-Bumped.shader as an example.
1. Download the built-in shader package for 5.6.0p2 from the Unity website.
2. Copy any shaders reported by the compiler error from the shader package to your project folder (for example, Mobile-Bumped.shader).
3. In the file \Assets\Shaders\CGIncludes\UnityInstancing.cginc, replace
with:
Shader "Mobile-Bumped"
to
Shader "Mobile-BumpedS8"
6. After you have finished modifying the shaders, you need to apply them. Change the shader for any affected material
from Mobile-Bumped to Mobile-BumpedS8.
If you have a lot of affected materials, you may wish to write an Editor script to do this conversion similar to the
example given in the standard shader issue workaround above.
Mobile Development
This section provides guidelines to help your Unity app perform well with Samsung Gear VR.
Good performance is critical for all VR applications, but the limitations inherent to mobile development warrant
special consideration.
Other Resources
Related resources:
Rendering Optimization
This section describes recommended targets and settings for mobile projects.
Unity provides several built-in features to help reduce draw calls such as batching and culling.
Static batching is used for objects that will not move, rotate or scale, and must be set explicitly per object. To
mark an object static, select the Static checkbox in the object Inspector.
Dynamic batching is used for moving objects and is applied automatically when objects meet certain criteria,
such as sharing the same material, not using real-time shadows, or not using multipass shaders. More
information on dynamic batching criteria may be found here: https://docs.unity3d.com/Documentation/Manual/
DrawCallBatching.html
Culling
Unity offers the ability to set manual per-layer culling distances on the camera via Per-Layer Cull Distance.
This may be useful for culling small objects that do not contribute to the scene when viewed from a given
distance. More information about how to set up culling distances may be found here: https://docs.unity3d.com/
Documentation/ScriptReference/Camera-layerCullDistances.html.
Unity also has an integrated Occlusion Culling system. The advice to early VR titles is to favor modest “scenes”
instead of “open worlds,” and Occlusion Culling may be overkill for modest scenes. More information about
the Occlusion Culling system can be found here: https://docs.unity3d.com/Manual/OcclusionCulling.html.
Verify your vertex shaders are mobile friendly. And, when using built-in shaders, favor the Mobile or Unlit
version of the shader.
Bake as much detail into the textures as possible to reduce the computation per vertex: https://docs.unity3d.com/430/Documentation/Manual/iphone-PracticalRenderingOptimizations.html
Be mindful of Game Object counts when constructing your scenes. The more Game Objects and Renderers in
the scene, the more memory consumed and the longer it will take Unity to cull and render your scene.
Verify your fragment shaders are mobile friendly. And, when using built-in shaders, favor the Mobile or Unlit
version of the shader.
Overdraw: Objects in the Unity opaque queue are rendered in front to back order using depth-testing to
minimize overdraw. However, objects in the transparent queue are rendered in a back to front order without
depth testing and are subject to overdraw.
Avoid overlapping alpha-blended geometry (e.g., dense particle effects) and full-screen post processing
effects.
To set your clock levels in Unity apps, set OVRManager.cpuLevel and OVRManager.gpuLevel.
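For example (the values shown are illustrative; see the Power Management note under Best Practices below for safe ranges):
// Request mid-range CPU and GPU clock levels for a mobile VR app.
OVRManager.cpuLevel = 2;
OVRManager.gpuLevel = 2;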
Best Practices
This section describes best practices for mobile projects.
• Be batch friendly. Share materials and use a texture atlas when possible.
• Prefer lightmapped, static geometry.
• Prefer lightprobes instead of dynamic lighting for characters and moving objects.
• Always use baked lightmaps rather than precomputed GI.
• Use Non-Directional Lightmapping.
• Bake as much detail into the textures as possible. E.g., specular reflections, ambient occlusion.
• Only render one view per eye. No shadow buffers, reflections, multi-camera setups, et cetera.
• Keep the number of rendering passes to a minimum. No dynamic lighting, no post effects, don't resolve
buffers, don’t use grabpass in a shader, et cetera.
• Avoid alpha tested / pixel discard transparency. Alpha-testing incurs a high performance overhead. Replace
with alpha-blended if possible.
• Keep alpha blended transparency to a minimum.
• Be sure to use texture compression. We recommend using ASTC texture compression as a global setting.
• Check the Disable Depth and Stencil checkbox in the Resolution and Presentation pane in Player
Settings.
• Be sure to initialize GPU throttling, and avoid dangerous values (-1 or >3). See Power Management in our
Mobile SDK Developer Guide for more information.
• Avoid full screen image effects.
• Be careful using multiple cameras with clears - doing so may lead to excessive fill cost.
• Avoid use of LoadLevelAsync or LoadLevelAdditiveAsync. This has a dramatic impact on framerate, and it is
generally better to go to black and load synchronously.
• Avoid use of Standard shader or Standard Specular shader.
• Avoid using Projectors, or use with caution - they can be very expensive.
• Avoid Unity’s Default-Skybox, which is computationally expensive for mobile. We recommend setting
Skybox to None (Material), and Ambient Source to Color in Window > Lighting. You may also wish to set
Camera.clearFlags to SolidColor (never Skybox).
CPU Optimizations
• Be mindful of the total number of Game Objects and components your scenes use.
• Model your game data and objects efficiently. You will generally have plenty of memory.
• Minimize the number of objects that actually perform calculations in Update() or FixedUpdate().
• Reduce or eliminate physics simulations when they are not actually needed.
• Use object pools to respawn frequently-used effects or objects instead of allocating new ones at runtime.
• Use pooled AudioSources versus PlayOneShot sounds, as the latter allocate a Game Object and destroy it
when the sound is done playing.
• Avoid expensive mathematical operations whenever possible.
• Cache frequently-used components and transforms to avoid lookups each frame.
• Use the Unity Profiler to:
Startup Sequence
For good VR experiences, all graphics should be rendered such that the user is always viewing a proper three-
dimensional stereoscopic image. Additionally, head-tracking must be maintained at all times. We recommend
considering using a cubemap overlay for your startup screen (see VR Compositor Layers on page 69), which
will render at a consistent frame rate even if the application is unavailable to update the scene.
An example of how to do this during application start up is demonstrated in the SDKExamples Startup_Sample
scene:
• Solid black splash image is shown for the minimum time possible.
• A small test scene with 3D logo and 3D rotating widget or progress meter is immediately loaded.
• While the small start up scene is active, the main scene is loaded in the background.
• Once the main scene is fully loaded, the start scene transitions to the main scene using a fade.
Reserved Interactions
For more information about the Oculus reserved interactions, see Universal Menu and Reserved User
Interactions in our Mobile Developer Guide.
See the class description of OVRPlatformMenu in our Unity Scripting Reference on page 127 for details about
the relevant public members.
Volume
The volume buttons are reserved, and volume adjustment on the Samsung device is handled automatically.
Volume control dialog is also handled automatically by the VrApi as of Mobile SDK 1.0.3, supported by Utilities
for Unity versions 1.5.0 and later. Do not implement your own volume handling display, or users will see two
juxtaposed displays.
Unity Audio
This guide describes guidelines and resources for creating a compelling VR audio experience in Unity.
If you’re unfamiliar with Unity’s audio handling, we recommend starting with the Unity Audio guide.
For instructions on using Unity and Wwise with the Rift, see Rift Audio in the PC SDK Developer Guide.
Our ability to localize audio sources in three-dimensional space is a fundamental part of how we experience
sound. Spatialization is the process of modifying sounds to make them localizable, so they seem to originate
from distinct locations relative to the listener. It is a key part of creating presence in virtual reality games and
applications.
For a detailed discussion of audio spatialization and virtual reality audio, we recommend reviewing our
Introduction to Virtual Reality Audio guide.
The Oculus Native Spatializer Plugin (ONSP) is an add-on plugin for Unity that allows monophonic sound sources to be
spatialized in 3D relative to the user's head location.
The ONSP is built on Unity’s Native Audio Plugin, which removes redundant spatialization logic and provides a
first-party HRTF.
The ONSP Audio SDK also supports early reflections and late reverberations using a simple 'shoebox model,'
consisting of a virtual room centered around the listener's head, with four parallel walls, a floor, and a ceiling at
varying distances, each with its own distinct reflection coefficient.
The ONSP is available with the Audio SDK Plugins package from our Downloads page. To learn more about it,
see our Oculus Audio SDK Guide and our Oculus Native Spatializer for Unity Guide.
For more information, see VR Audio Spatializers in the Unity Manual, or First-Party Audio Spatialization (Beta) in
our Oculus Native Spatializer for Unity guide.
Rift Core 2.0 introduces substantial changes to Oculus Home and replaces the Universal Menu with Oculus
Dash. This page describes how you can support Dash in your Unity app.
Dash re-implements Universal Menu as a VR compositor layer. Have a look at the “Introducing Oculus Dash”
video in our Welcome to Rift Core 2.0 blog post to get a sense of how it works.
Beginning with runtime 1.22, when users pause an application, instead of rendering the Universal Menu in an
empty room, one of two things will happen:
• If your application supports Dash, the application will pause and the Dash menu UI will be drawn over your
paused application.
• If your application does not support Dash, your application will be paused by the runtime and the user
will be presented with the Dash menu UI in an empty room, similar to the way the Universal Menu was
previously displayed.
When the Dash UI is active, the runtime will render tracked controllers in the scene to interact with the menu.
To check if your app has focus input, query OVRManager.hasInputFocus every frame. If your app has focus
hasInputFocus will return true. If the user's focus is elsewhere, like when the user opens the Dash menu or
removes their HMD, hasInputFocus will return false.
In single-player apps or experiences, you can pause the app, mute audio playback, and stop rendering any
tracked controllers/hands present in the scene (Dash will use a separate set of hands).
Multiplayer experiences may wish to handle the loss of input focus differently. You're required to hide the
hands and ignore any input while the app does not have focus input, but you may wish to continue audio
playback and the match in the background.
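A minimal sketch of per-frame focus handling (the pause behavior and the trackedHands field are illustrative assumptions):
using UnityEngine;

public class InputFocusHandler : MonoBehaviour
{
    public GameObject trackedHands; // hypothetical: your in-scene hand/controller visuals
    bool hadFocus = true;

    void Update()
    {
        bool hasFocus = OVRManager.hasInputFocus;
        if (hasFocus != hadFocus)
        {
            // Pause/unpause, mute/unmute, and hide/show hands when focus changes.
            Time.timeScale = hasFocus ? 1f : 0f;
            AudioListener.pause = !hasFocus;
            if (trackedHands != null) trackedHands.SetActive(hasFocus);
            hadFocus = hasFocus;
        }
    }
}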
For more information on OVRManager.hasInputFocus, see Application Lifecycle Handling on page 86.
Unity custom builds are available, providing Dash support for the 5.6, 2017.1, and 2017.2 release channels. For
a custom build, see Oculus Dash Support (5.6, 2017.1, 2017.2) in the Unity Forum.
Share Depth Buffer: Depth information helps avoid depth conflicts between the Dash UI rendered in the scene
and objects in the scene, and enables compositor layer depth testing.
Dash Support: Check this box to configure your application to signal the Oculus runtime that the application
is Dash-compatible. Do not check this box in builds intended for store submission until you have tested your
application with Dash and verified correct functionality.
Dash support is enabled by default in all custom Unity builds with Dash support as well as Unity 2017.3b11 and
later, and versions 2017.3f1-2. It is off by default in all other versions, and we plan to disable it by default in
2017.3f3 and later.
In addition to these checkbox settings, you may also enable Dash support by launching your application with
the launch parameter -oculus-focus-aware true.
Input Focus
With Dash we introduced the concept of input focus, or whether the user is focused on your app, or elsewhere.
Adding Dash support to your app means correctly handling the times when your app is running, but the user's
focus is elsewhere.
To check if your app has focus input, query OVRManager.hasInputFocus every frame. If your app has focus
hasInputFocus will return true. If the user's focus is elsewhere, like when the user opens the Dash menu or
removes their HMD, hasInputFocus will return false.
In single-player apps or experiences, you can pause the app, mute audio playback, and stop rendering any
tracked controllers/hands present in the scene (Dash will use a separate set of hands).
Multiplayer experiences may wish to handle the loss of input focus differently. You're required to hide the
hands and ignore any input while the app does not have focus input, but you may wish to continue audio
playback and the match in the background.
For more information, see HasInputFocus under OVRManager in our Unity Scripting Reference on page
127.
See the Input Focus sample in the Unity Sample Framework on page 115 for an example of a typical
implementation and the Oculus Dash in Unity on page 83 guide for other information you should know
about being focus aware.
VR Focus
Similar to Input Focus, the runtime will also tell you if your app has VR Focus, or if any part of your app is visible
to the user.
Your app could lose VR focus for a number of reasons; the most common is that the user exits to Home or switches to another app. To check if your app has VR focus, query OVRManager.hasVrFocus every frame. hasVrFocus will return false when your app is no longer visible.
When you lose VR focus the user can no longer see your app. You should stop submitting frames, drop audio,
and stop tracking input. You may also wish to save the game state so you can return the user to where they left
off in your app.
For more information, see HasVRFocus under OVRManager in our Unity Scripting Reference on page 127.
Introduction
Mixed reality capture places real-world people and objects in VR. It allows live video footage of a Rift user to be
composited with the output from a game to create combined video that shows the player in a virtual scene.
Live video footage may be captured with a stationary or tracked camera. For more information and complete
setup instructions, see the Mixed Reality Capture Setup Guide.
Once an application is configured by developers to use mixed reality capture, users can launch applications
with the feature enabled and control several relevant settings with external configuration files or command-line
parameters. See "Using Mixed Reality Capture" and "External Configuration File" below for more information.
Preparation
This guide assumes that you have built a functioning Unity VR app. Verify that Virtual Reality Supported is
selected in Player Settings as described in the Oculus Unity Getting Started Guide on page 5.
Download and import Oculus Utilities for Unity version 1.18 or later from the Downloads page. If you have
previously imported an earlier version of Utilities for Unity into your project, be sure to remove it from your
project before updating by following the steps described in Importing the Oculus Utilities Package on page 11.
You must run the CameraTool prior to launching your mixed reality capture application to configure the
external camera and VR Object. See the Mixed Reality Capture Setup Guide for setup information.
Mixed reality capture may be used by any application that includes an instance of OVRManager. Mixed
reality capture is disabled by default. To enable it, launch the application using the command line argument
-mixedreality, or with external configuration file. For more information, see "Using Mixed Reality" and
"External Configuration File" below.
For more polished composition, we recommend using external composition mode. In this mode, the
application outputs two windows. The MirrorWindow displays the application. The second window displays
the foreground content from the video stream on the left against a green background, and displays the
background content on the right. The second window is illustrated below:
Third-party composition software such as OBS Studio or XSplit is required to clip the green screen and combine
the images.
In direct composition mode, the application streams camera footage to your scene directly and displays the
composited image. Direct composition requires use of a green screen for video capture, and the composited
image may exhibit some latency from the video stream. See "Features" below for a discussion of latency
correction settings.
We recommend using direct composition if complicated transparent objects exist between the player and
camera.
Sandwich composition mode is similar to direct composition mode, but the scene is rendered in three video
layers, with the application providing the foreground and background layers, and the camera providing the
middle layer. This places greater demands on memory than direct composition (see "Latency Control" below
for details), but allows for more powerful latency correction, closer to what can be achieved with third-party
composition software. See "Features" below for more information.
Because they composite the video frame inside of Unity, direct and sandwich composition support more
features than external mode, including virtual green screen and dynamic lighting.
-mixedreality: Opens the mixed reality capture view in the MirrorWindow when the app starts.
-directcomposition, -externalcomposition, -sandwichcomposition: When used with -mixedreality, opens the mixed reality view in the specified composition mode.
Most application settings may be controlled with command-line parameters when launching the application.
This allows users who have no access to Unity or the project to use and configure mixed reality capture on their
systems.
For convenience, you may wish to create a launch shortcut and append the -mixedreality parameter or
other settings. Alternately, you may wish to create a batch file that launches the application with the desired
parameters. Or you may provide a configuration file that users can modify. See "External Configuration File" for
more information.
If you are running an application in the Editor, check the Enable Mixed Reality checkbox while the game
is running to switch dynamically to mixed reality capture mode. You may set your application to launch with
mixed reality capture enabled in the Editor for debugging purposes only. The setting is automatically disabled
when you build your application.
To create an mrc.config file, configure your project with the desired settings and launch the application with the
-create_mrc_config parameter. This will create an mrc.config file in the application’s data folder. You may
modify the file as you wish.
Most settings in the mrc.config file are the same as the corresponding OVRManager component properties.
This table indicates the exceptions:
Setting Options
This is a typical mrc.config file for an application set to sandwich composition mode:
{
"enableMixedReality": true,
"extraHiddenLayers": {
"serializedVersion": "2",
"m_Bits": 0
},
"compositionMethod": 2,
"capturingCameraDevice": 2,
"flipCameraFrameHorizontally": false,
"flipCameraFrameVertically": false,
"handPoseStateLatency": 0.0,
"sandwichCompositionRenderLatency": 0.1,
"sandwichCompositionBufferedFrames": 8,
"chromaKeyColor": {
"r": 0.0,
"g": 1.0,
"b": 0.0,
"a": 1.0
},
"chromaKeySimilarity": 0.6000000238418579,
"chromaKeySmoothRange": 0.029999999329447748,
"chromaKeySpillRange": 0.05999999865889549,
"useDynamicLighting": false,
"depthQuality": 1,
"dynamicLightingSmoothFactor": 8.0,
"dynamicLightingDepthVariationClampingValue": 0.0010000000474974514,
"virtualGreenScreenType": 1,
"virtualGreenScreenTopY": 10.0,
"virtualGreenScreenBottomY": -10.0,
"virtualGreenScreenApplyDepthCulling": false,
"virtualGreenScreenDepthTolerance": 0.20000000298023225
}
Features
The following mixed reality properties are available via the OVRManager component. To unhide these settings
in the Inspector, select OVRManager and check the Show Properties box under Mixed Reality Capture.
Functions and settings for mixed reality capture may be found under OVRManager in our Unity Scripting
Reference on page 127.
Note: If they conflict with command-line parameters, these settings will be overridden.
bool enableMixedReality: Set to true to enable mixed reality capture.
CompositionMethod compositionMethod: Select External, Direct, or Sandwich.
bool useHiddenLayerInMixedReality: If true, objects in layers specified by hiddenLayerInMixedReality will be hidden from mixed reality capture.
int hiddenLayerInMixedReality: Select a layer to hide from mixed reality capture.
LayerMask extraHiddenLayers: Select any layers which you want to hide from mixed reality capture.
Chroma Key
Chroma key settings allow for fine-tuned control of how the video and application streams are composited.
Use these settings to set the reference color of the green screen and control various thresholds at which video
pixels are included or excluded from the final frame.
Dynamic Lighting
When Dynamic Lighting is enabled, video captured by the physical camera is illuminated in the composited
scene by light effects and flashes within the application. For example, a player would briefly be brightly lit
during an explosion in the game.
Lighting is applied to video on a flat plane parallel to the camera unless a depth-sensing camera is used (ZED
camera), in which case object depth is taken into account.
When enabled, Virtual Green Screen crops video footage that falls outside of the Guardian System Outer
Boundary or Play Area configured by the user. The Outer Boundary is the actual perimeter drawn by the user
during Touch setup, while the Play Area is a rectangle calculated from the Outer Boundary. Note that the Outer
Boundary and Play Area are two-dimensional shapes in the x and z axes; the top and bottom y values of the virtual green screen must be specified separately.
Latency Control
When using direct or sandwich composition, there is typically a lag between the camera video and the
rendered application. The amount of lag depends on the equipment and configuration of each system, and for
best results, must be configured on a case-by-case basis.
In direct composition, users may slow the rendering of tracked controllers relative to the video stream by
entering a value for handPoseStateLatency (in milliseconds). The composited scene will then use slightly
stale pose data to render controllers in the composited scene.
The following controls may be configured through OVRManager for preview in the Unity Editor, or configured
by launch parameter or external configuration file.
CameraDevice capturingCameraDevice: Selects the physical camera device used to capture the real-world image.
float dynamicLightingDepthVariationClampingValue: Sets the maximum depth variation across edges (smaller values set smoother edges).
The following properties are available in sandwich composition only. Note that all properties in the above table
are also available for sandwich composition.
Cubemap Screenshots
The OVR Screenshot Wizard allows you to easily export a 360 screenshot in cubemap format.
Cubemap previews may be submitted with applications to provide a static in-VR preview for the Oculus Store.
For more information, see Oculus Store Art Guidelines (PDF).
You may also use OVRCubemapCaptureProbe to take a 360 screenshot from a running Unity app. (see Prefabs
on page 35 for more information).
Basic Usage
When you import the Oculus Utilities OVRScreenshotWizard into your project, it will add a new Tools pull-down
menu to your menu bar. Select Tools > Oculus > OVR Screenshot Wizard to launch the tool.
By default, the screenshot will be taken from the perspective of your Main Camera. To set the perspective to a
different position, assign any Game Object to the Render From field in the Wizard and click Render Cubemap
to save.
The generated cubemap may be saved either as a Unity Game Object, or as a horizontal 2D atlas texture in
PNG or JPG format with the following face order (Horizontal left to right): +x, -x, +y, -y, +z, -z.
Options
Render From: You may use any Game Object as the "camera" that defines the position from which the
cubemap will be captured.
To assign a Game Object to function as the origin perspective, select any instantiated Game Object in the
Hierarchy View and drag it here to set it as the rendering position in the scene. You may then position the
Game Object anywhere in the scene.
If you do not specify a Game Object in this field, the screenshot will be taken from the Main Camera.
Note: If the Game Object extends into the visible area of the scene, it will be included in the capture.
This may be useful if you wish to lock art to the origin point, e.g., if you wished to show looking out on
the scene from a cage, for example. If you do not want the Game Object to be visible, be sure to use a
simple object like a cube or a sphere, or simply use the scene Main Camera.
Size: Sets the resolution for each "tile" of the cubemap face. For submission to the Oculus Store, select 2048
(default, see Oculus Store Art Guidelines for more details).
Save Mode
Cube Map Folder: The directory where OVR Screenshot Wizard creates the Unity format Cubemap. The path
must be under the root asset folder "Assets".
Texture Format: Sets the image format of 2D atlas texture (PNG or JPEG).
Note: If Save Mode is set to Save Cube Map Screenshot or Both, a pop-up dialog allows you to specify
the destination folder where the 2D atlas texture will be generated. You may save it outside of Assets
folder if you wish.
General Tips
VR application debugging is a matter of getting insight into how the application is structured and executed,
gathering data to evaluate actual performance, evaluating it against expectation, then methodically isolating
and eliminating problems.
When analyzing or debugging, it is crucial to proceed in a controlled way so that you know specifically what
change results in a different outcome. Focus on bottlenecks first. Only compare apples to apples, and change
one thing at a time (e.g., resolution, hardware, quality, configuration).
Always be sure to profile, as systems are full of surprises. We recommend starting with simple code, and
optimizing as you go - don’t try to optimize too early.
We recommend creating a non-VR version of your camera rig so you can swap between VR and non-VR
perspectives. This allows you to spot check your scenes, and it may be useful if you want to do profiling with
third-party tools (e.g., Adreno Profiler).
It can be useful to disable Multithreaded Rendering in Player Settings during performance debugging. This
will slow down the renderer, but also give you a clearer view of where your frame time is going. Be sure to turn
it back on when you’re done!
Performance Targets
Before debugging performance problems, establish clear targets to use as a baseline for calibrating your
performance.
These targets can give you a sense of where to aim, and what to look at if you’re not making frame rate or are
having performance problems.
Below you will find some general guidelines for establishing your baselines, given as approximate ranges unless
otherwise noted.
Mobile
• 60 FPS (required by Oculus)
• 50-100 draw calls per frame
• 50,000-100,000 triangles or vertices per frame
PC
• 90 FPS (required by Oculus)
• 500-1,000 draw calls per frame
• 1-2 million triangles or vertices per frame
Unity Profiler
Unity comes with a built-in profiler (see Unity’s Profiler manual). The Unity Profiler provides per-frame
performance metrics, which can be used to help identify bottlenecks.
PC Setup
To use Unity Profiler with a Rift application, select Development Build and Autoconnect Profiler in Build
Settings and build your application. When you launch your application, the Profiler will automatically open.
Mobile Setup
You may profile your application as it is running on your Android device using adb or Wi-Fi. For steps on
how to set up remote profiling for your device, please refer to the Android section of the following Unity
documentation: https://docs.unity3d.com/Documentation/Manual/Profiler.html.
The Unity Profiler displays CPU utilization for the following categories: Rendering, Scripts, Physics,
GarbageCollector, and Vsync. It also provides detailed information regarding Rendering Statistics, Memory
Usage (including a breakdown of per-object type memory usage), Audio and Physics Simulation statistics.
The Unity profiler only displays performance metrics for your application. If your app isn’t performing as
expected, you may need to gather information on what the entire system is doing.
In this mode, translucent colors will accumulate providing an overdraw “heat map” where more saturated
colors represent areas with the most overdraw.
To use this profiler, connect to your device over Wi-Fi using ADB over TCPIP as described in the Wireless usage
section of Android’s adb documentation. Then run adb logcat while the device is docked in the headset.
See Unity’s Measuring Performance with the Built-in Profiler for more information. For more on using adb and
logcat, see Android Debugging in the Mobile SDK documentation.
For example, after running the Performance Auditing Tool, you may be prompted to use ASTC compression,
or to disable the built-in Unity Skybox. For a look at many of the recommendations we used to establish the
auditing baseline, see Best Practices for Rift and Mobile on page 99.
This tool is intended to help verify that your application is performant, and will not specifically evaluate it for
submission to the Oculus Store.
To audit a Rift project, select File > Build Settings…, and under Platform, select PC, Mac, & Linux
Standalone. If Switch Platform is not grayed out, click it.
To audit a mobile project, select File > Build Settings…, and under Platform, select Android. If Switch
Platform is not grayed out, click it.
Use
Once you have verified your Build Settings are configured properly, run the Performance Auditing Tool.
For each issue the tool finds, you will be provided with the option of automatically updating your settings to
bring them in line with our recommendations.
You may perform the same fix to multiple issues of the same type. Some issues allow multiple resolutions, such
as “Optimize Light Baking” shown in the example below:
Click on any object with a reported issue to see the object highlighted in the scene in the Editor so you can
quickly find and take a closer look at any affected object. For example, in the report above, clicking on “Couch
Hot Rect 2 (Box Collider)” on the left will highlight that Game Object in the Editor.
Compositor Mirror
The compositor mirror is an experimental tool for viewing exactly what appears in the headset, with
Asynchronous TimeWarp and distortion applied.
The compositor mirror is useful for development and troubleshooting without having to wear the headset.
Everything that appears in the headset will appear in the mirror, including Oculus Home, Guardian boundaries, in-
game notifications, and transition fades. The compositor mirror is compatible with any game or experience,
regardless of whether it was developed using the native PC SDK or a game engine.
For more details, see the Compositor Mirror section of the PC SDK Guide.
The VrCapture library is automatically included in projects built by Unity 5 or later, so setup and use of the
Oculus Remote Monitor is easy.
Oculus Remote Monitor is available from our Downloads page. For more information about setup, features, and
use, see Oculus Remote Monitor in our Mobile SDK guide.
Feature Highlights
• The Frame Buffer Viewer provides a mechanism for inspecting the frame buffer as the data is received in
real-time, which is particularly useful for monitoring play test sessions. When enabled, the Capture library
will stream a downscaled pre-distortion eye buffer across the network.
• The Performance Data Viewer provides real-time and offline inspection of the following on a single,
contiguous timeline:
• CPU/GPU events
• Sensor readings
• Console messages, warnings, and errors
• Frame buffer captures
• The Logging Viewer provides raw access to various messages and errors tracked by thread IDs.
• Nearly any constant in your code may be turned into a knob that can be updated in real-time during a play
test.
OVR Metrics Tool reports application frame rate, heat, GPU and CPU throttling values, and the number of tears
and stale frames per second. It is available for download from our Downloads page.
OVR Metrics Tool can be run in two modes. In Report Mode, it displays a performance report about a VR session after the session is complete. Report data may be easily exported as CSV files and PNG graphs.
In Performance HUD Mode, OVR Metrics Tool renders performance graphs as a VR overlay over any running
Oculus application.
For more information, see OVR Metrics Tool in our Mobile SDK Guide.
Event Tracing for Windows (ETW) is a trace utility provided by Windows for performance analysis. GPUView
provides a window into both GPU and CPU performance with DirectX applications. It is precise, has low
overhead, and covers the whole Windows system.
Most Unity developers will find the Unity Profiler sufficient, but in some cases ETW and GPUView may be useful
for debugging problems such as system-level contention with background processes. For a detailed description
of how to use ETW with our native Rift SDK, see VR Performance Optimization in our PC SDK Developer Guide.
Not all of the content will be relevant to the Unity developer, but it contains a lot of applicable conceptual
material that may be very useful.
Systrace
NSight is a CPU/GPU debug tool for NVIDIA users, available in a Visual Studio version and an Eclipse version.
Mac OpenGL Monitor
APITrace
https://apitrace.github.io/
Analyzing Slowdown
In this guide, we take a look at three of the areas commonly involved with slow application performance: pixel
fill, draw call overhead, and slow script execution.
Pixel Fill
Pixel fill is a function of overdraw and of fragment shader complexity. Unity shaders are often implemented
as multiple passes (draw diffuse part, draw specular part, and so forth). This can cause the same pixel to be
touched multiple times. Transparency does this as well. Your goal is to touch almost all pixels on the screen
only one time per frame.
Unity's Frame Debugger (described in Unity Profiling Tools on page 101) is very useful for getting a sense
of how your scene is drawn. Watch out for large sections of the screen that are drawn and then covered, or for
objects that are drawn multiple times (e.g., because they are touched by multiple lights).
Z-testing is faster than drawing a pixel. Unity does culling and opaque sorting via bounding box. Therefore,
large background objects (like your Skybox or ground plane) may end up being drawn first (because the
bounding box is large) and filling a lot of pixels that will not be visible. If you see this happen, you can move
those objects to the end of the queue manually. See Material.renderQueue in Unity's Scripting API Reference
for more information.
Frame Debugger will clearly show you shadows, offscreen render targets, et cetera.
Draw Calls
Modern PC hardware can push a lot of draw calls at 90 fps, but the overhead of each call is still high enough
that you should try to reduce them. On mobile, draw call optimization is your primary scene optimization.
Draw call optimization is usually about batching multiple meshes together into a single VBO with the same
material. This is key in Unity because the state change related to selecting a new VBO is relatively slow. If you
select a single VBO and then draw different meshes out of it with multiple draw calls, only the first draw call is
slow.
Unity batches well when given properly formatted source data. Generally:
• Batching is only possible for objects that share the same material pointer.
• Batching doesn't work on objects that have multiple materials.
• Implicit state changes (e.g. lightmap index) can cause batching to end early.
• Use as few textures in the scene as possible. Fewer textures require fewer unique materials, so they are
easier to batch. Use texture atlases.
• Bake lightmaps at the largest atlas size possible. Fewer lightmaps require fewer material state changes. Gear
VR can push 4096 lightmaps without too much trouble, but watch your memory footprint.
• Be careful not to accidentally instance materials. Note that accessing Renderer.material automatically
creates an instance (!) and opts that object out of batching. Use Renderer.sharedMaterial instead
whenever possible (see the sketch after this list).
• Watch out for multi-pass shaders. Add noforwardadd to your shaders whenever you can to prevent more
than one directional light from applying. Multiple directional lights generally break batching.
• Mark every mesh that never moves as Static in the editor. Note that this will cause the mesh to be combined
into a mega mesh at build time, which can increase load time and app size on disk, though usually not in a
material way. You can also create a static batch at runtime (e.g., after generating a procedural level out of
static parts) using StaticBatchingUtility.
• Watch your static and dynamic batch count vs the total draw call count using the Profiler, internal profiler
log, or stats gizmo.
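As a rough illustration of the shared-material and static-batching points above (the level-root field is a hypothetical placeholder), a sketch along these lines:

using UnityEngine;

// Illustrative sketch of two batching-friendly habits.
public class BatchingHelpers : MonoBehaviour
{
    // Hypothetical parent of static, non-moving level geometry.
    public GameObject proceduralLevelRoot;

    void Start()
    {
        // Renderer.material silently creates a per-object material instance and
        // opts the object out of batching; sharedMaterial does not.
        Renderer rend = GetComponent<Renderer>();
        Debug.Log(rend.sharedMaterial.color);

        // Combine the static children under the root into a static batch at
        // runtime, e.g. after generating a level out of static parts.
        StaticBatchingUtility.Combine(proceduralLevelRoot);
    }
}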
Script Performance
Unity's C# implementation is fast, and slowdown from scripts is usually the result of a mistake and/or an
inadvertent block on slow external operations such as memory allocation. The Unity Profiler can help you find
and fix these scripts.
Try to avoid foreach, lambda, and LINQ constructs, as these needlessly allocate memory at runtime. Use a for
loop instead. Also, be wary of loops that concatenate strings.
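A brief sketch of the allocation-friendly alternatives (the list and label here are placeholders):

using System.Collections.Generic;
using System.Text;
using UnityEngine;

// Illustrative sketch: allocation-friendly per-frame code.
public class AllocationFriendlyLoops : MonoBehaviour
{
    readonly List<int> scores = new List<int>();
    readonly StringBuilder label = new StringBuilder(32);

    void Update()
    {
        // Prefer an indexed for loop over foreach/LINQ in hot paths.
        int total = 0;
        for (int i = 0; i < scores.Count; i++)
        {
            total += scores[i];
        }

        // Reuse a StringBuilder instead of concatenating strings in a loop,
        // which allocates a new string on every iteration.
        label.Length = 0;
        label.Append("Total: ").Append(total);
    }
}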
Game Object creation and destruction takes time. If you have a lot of objects to create and destroy (say, several
hundred in a frame), we recommend pooling them.
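A minimal pooling sketch along these lines (the prefab field and pool size are placeholders):

using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch of a minimal pool: instantiate once, then reuse objects
// instead of calling Instantiate/Destroy hundreds of times per frame.
public class SimplePool : MonoBehaviour
{
    public GameObject prefab;     // hypothetical pooled prefab
    public int initialSize = 200;

    readonly Stack<GameObject> inactive = new Stack<GameObject>();

    void Awake()
    {
        for (int i = 0; i < initialSize; i++)
        {
            GameObject go = Instantiate(prefab);
            go.SetActive(false);
            inactive.Push(go);
        }
    }

    public GameObject Spawn(Vector3 position)
    {
        GameObject go = inactive.Count > 0 ? inactive.Pop() : Instantiate(prefab);
        go.transform.position = position;
        go.SetActive(true);
        return go;
    }

    public void Despawn(GameObject go)
    {
        go.SetActive(false);
        inactive.Push(go);
    }
}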
Don't move colliders unless they have a rigidbody on them. Creating a rigidbody and setting isKinematic
will stop physics from doing anything but will make that collider cheap to move. This is because Unity maintains
two collider structures, a static tree and a dynamic tree, and the static tree has to be completely rebuilt every
time any static object moves.
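For example, a small sketch of the kinematic-Rigidbody approach described above:

using UnityEngine;

// Illustrative sketch: give a moving collider a kinematic Rigidbody so it is
// tracked in the dynamic collision tree and moving it stays cheap.
public class CheapMovingCollider : MonoBehaviour
{
    void Awake()
    {
        Rigidbody body = gameObject.AddComponent<Rigidbody>();
        body.isKinematic = true;   // no physics simulation, just cheap movement
        body.useGravity = false;
    }
}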
Note that coroutines execute in the main thread, and you can have multiple instances of the same coroutine
running on the same script.
We recommend targeting around 1-2 ms maximum for all Mono execution time.
PC Debug Workflow
In this guide, we'll use the example of a hypothetical stuttering app scene and walk through basic debugging
steps.
Where to Start
Begin by running the scene with the Oculus Performance HUD.
If the scene drops more than one frame every five seconds, check the render time. If it’s more than 8 ms, have a
look at GPU utilization. Otherwise, look at optimizing CPU utilization. If observed latency is greater than 30 ms,
have a look at queuing.
If you find garbage collection spikes, don’t allocate memory each frame.
Mobile Tips
This section describes basic techniques for performance analysis for mobile development.
Use Oculus Remote Monitor (Mobile) on page 106 for VRAPI, render times, and latency. Systrace shows CPU
queueing.
It is a common problem to see Gfx.WaitForPresent appear frequently in Oculus Remote Monitor. This reports
the amount of time the render pipeline is stalled, so begin troubleshooting by understanding how your scene is
assembled by Unity - the Unity Frame Debugger is a good starting place. See Unity Profiling Tools on page
101 for more information.
Rift
The app does not launch as a VR app.
Verify that you have installed the Oculus app and completed setup as described in Preparing for Rift
Development on page 8.
Verify that you have selected Virtual Reality Supported in Player Settings.
Verify that you are using a compatible runtime - see Compatibility and Requirements for more details.
Mobile
The app does not launch as a VR app.
Verify that you selected Virtual Reality Supported in Player Settings before building your APK.
Applications fail to launch on Gear VR with error message "thread priority security exception make sure
the apk is signed”.
You must sign your application with an Oculus Signature File (osig). See "Sign your App with an Oculus
Signature File" in Preparing for Mobile Development on page 9 for instructions.
General Issues
Unity 5 hangs while importing assets from SDKExamples.
Be sure to delete any previously-imported Utilities packages from your Unity project before importing a new
version. If you are receiving errors and have not done so, delete the relevant folders in your project and re-
import Utilities. For more information, please see Importing the Oculus Utilities Package on page 11.
Contact Information
Questions?
The Unity Sample Framework can guide developers in producing reliable, comfortable applications and
avoiding common mistakes. The assets and scripts included with the Sample Framework may be reused in your
applications per the terms of our SDK 3.4 license. Note that some folders of the Sample Framework include
a more permissive BSD license - this license supersedes the SDK 3.4 license in folders in which it occurs. See
LICENSE.txt in the relevant folders for additional details.
The Unity Sample Framework is available as a Unity Package for developers who wish to examine how the
sample scenes were implemented, and as binaries for Rift, Oculus Go, and Gear VR for developers to explore
the sample scenes entirely in VR.
The Sample Framework Unity Package is available from our Downloads Center and from the Unity Asset Store
here. The Rift executable and mobile application are available from the Oculus Store in the Gallery section.
The Unity Sample Framework works on all currently-supported versions of Unity. Please check Compatibility
and Requirements for supported versions.
Sample Scenes
In the Unity project, the following scenes are found in /Assets/SampleScenes:
• Per-Eye Cameras (Cameras/): Using different cameras for each eye for a specific object in the scene.
• Outdoor Motion (First Person/): Basic forms of movement, and the effects a variety of design choices may have on comfort.
• Teleport (First Person/Locomotion/): A teleportation locomotion scene that reduces the risk of discomfort.
• TeleportAvatar (First Person/Locomotion2/): A teleportation scene that supports switching between any combination of teleports and linear motion at run time.
• Guardian Boundary System (Guardian Boundary System/): Illustrates use of the OVRBoundary API to interact with the Guardian System Outer Boundary and Play Area.
• AvatarWithGrab (Hands/): Uses the Unity Avatar SDK and the scripts OVRGrabber and OVRGrabbable to illustrate hands presence with Touch. Pick up and throw blocks from a table using the Touch grip buttons. This sample requires importing the Oculus Avatar SDK.
• CustomHands (Hands/): Uses low-resolution custom hand models and the scripts OVRGrabber and OVRGrabbable to illustrate hands presence with Touch. Pick up and throw blocks from a table using the Touch grip buttons. May be used as a reference for implementing your own hand models.
• Distance Grab (Hands/): Illustrates selecting and grabbing distant objects with hand models using Touch. For more information, see Distance Grab Sample on our Developer Blog.
• Input Tester (Input/): Assists with testing input devices, displaying axis values in real time.
• Input Focus (Input Focus/): Illustrates typical handling for loss of Input Focus, such as when a Dash overlay is present. The application is paused, muted, and tracked controllers are hidden.
• Splash Screen (OVRHarness/Scenes/Loading.unity*): Splash screen modal that supports custom images and content.
• Movie Player (Rendering/): Video rendering to a 2D textured quad using the Android Media Surface Plugin. Source for the plugin ships with the Mobile SDK in \VrAppSupport\MediaSurfacePlugin.
• Surface Detail (Rendering/): Different ways to create surface detail with normal, specular, parallax, and displacement mapping.
• Pointers (UI/): How UI elements can be embedded in a scene and interact with different gaze controllers.
• Pointers - Gaze Click (UI/): An extension of the Pointers scene, with gaze selection.
• Tracking Volume (UI/): Different ways to indicate the user is about to leave the position tracking volume.
Note: * The Splash Screen sample is found in a different location from the other samples listed in this
table.
A Note on Comfort
These samples are intended to be tools for exploring design ideas in VR, and should not necessarily be
construed as design recommendations. The Sample Framework allows you to set some parameters to values
that will reliably cause discomfort in most users - they are available precisely to give developers an opportunity
to find out how much is too much.
It is also important to play-test your game on a range of players throughout development to ensure your game
is a comfortable experience. We have provided in-game warnings to alert you to potentially uncomfortable
scenes.
Note: You will need to enable running applications from unknown sources in the Oculus app settings.
Launch the Oculus app, and in the “gear” pull-down menu in the upper right, select Settings > General
and toggle Unknown Sources on to allow. You may wish to disable this setting after use for security
reasons.
1. Download the Unity Sample Framework PC Binary.
2. Unzip the contents.
3. Run the OculusSampleFramework.exe. Note that it must be run from a directory location that also includes
the OculusSampleFramework_Data folder.
1. Verify that you have installed the latest-recommended version of Unity 5 or later (see Compatibility and
Requirements for up-to-date information).
2. Download the Unity Sample Framework Project from our Downloads Center.
3. Launch the Unity Editor and create a new project.
4. Import the Unity Sample Framework Unity Package by selecting Assets > Import Package > Custom
Package… and selecting the Unity Sample Framework.
Note: You will need to enable running applications from unknown sources in the Oculus app settings.
Launch the Oculus app, and in the “gear” pull-down menu in the upper right, select Settings > General
and toggle Unknown Sources on to allow. You may wish to disable this setting after use for security
reasons.
1. Open the Sample Framework project as described above.
2. From the Editor menu bar, select OVR > Samples Build Config > Configure Build.
3. Build and run the project normally.
f. Close Settings.
g. Open Apps.
h. Select Gear VR Service.
i. Select Oculus Sample Framework to launch.
We have provided a Windows executable for use with the Oculus Rift or DK2. A mobile app may be downloaded
for free from the Gallery Apps section of the Oculus Store. These applications are simply builds of the Unity
project.
Navigation
Launch the Sample Framework on Rift, Oculus Go, or Gear VR to load the startup scene. You will see the
Inspector, a three-pane interface providing controls for scene settings, documentation, and navigation controls
for browsing to other scenes. Making selections on the top-level menu on the left panel changes the content of
the other two panels. The center panel is a contextual menu, and the right panel displays notes and instructions
for the current scene.
Inspector navigation is primarily gaze-controlled, supplemented by a mouse and keyboard, a gamepad (PC or
Gear VR), or the Gear VR touchpad.
To launch a scene from the center panel, you may select and click the scene with a mouse, gaze at the scene
name and press the A button on a gamepad, or tap the Gear VR touchpad.
Some scenes are grouped into folders (displayed as buttons). When browsing from a folder, select the “..”
button to navigate one level up in the scene hierarchy.
Scrolling
Some panels support vertical scrolling. Several methods of scrolling are supported in order to illustrate some of
the available options for implementing this feature. The following methods are supported:
1. Gaze at the bottom or top of the panel.
2. Gaze over the panel and swipe up or down on the Gear VR touch pad.
3. Position the mouse pointer over the panel and use the mouse scroll wheel.
4. Click and drag the panel using the mouse pointer.
5. Click and drag by using the gaze pointer and “clicking” with the Gear VR touch pad, the gamepad A button,
or your space bar.
6. Gaze over the panel and scroll with the right thumbstick.
We currently support Unity 5.6.x, 2017.1.x, 2017.2.x, and 2017.3. For developers in the 5.6.x release channel,
we strongly recommend using 5.6.4p2. Our Release Notes describe known issues with any version.
The following chart describes which Oculus SDKs are included in each OVRPlugin version, and which Oculus
SDKs they utilize. Any Unity version may be updated to a later version of Oculus support by importing the
Utilities package. OVRPlugin is released first with Utilities for Unity, and is typically bundled in the next Unity
Editor version one to three weeks later. For more information, see OVRPlugin on page 33.
[Table columns: Release Date | Utilities Version | OVRPlugin Version | PC SDK | Mobile SDK | Min. Required Unity Version]
Legacy Compatibility
This section describes Oculus SDK and Integration compatibility with Unity Editor versions prior to the Oculus
1.14.0 release, when we began shipping OVRPlugin in the Utilities for Unity package.
Release Notes
This section describes changes for each version release.
New Features
• The Oculus Unity Integration supports the following new Oculus Go features -
Integration Changes
• The Oculus Unity packaging structure has changed. When upgrading to 1.25 we recommend deleting your
old copy of the Utilities and restarting Unity, then adding the 1.25 package.
Bug Fixes
• Fixed a bug where the screen briefly flashes white when launching a Go application.
• Fixed a bug where UnityEngine.GL rendered over a scene; switched to using a MeshRenderer that respects
depth and the render queue.
• Fixed a race condition that could result in inconsistent handling of system UI requests, such as transitioning
to the Confirm Quit UI menu.
• Fixed an issue where OVRScreenFade MonoBehaviour was incorrectly enabling MeshRenderer when the
camera was switched and not setting the alpha to 0. This may result in the user’s view of the scene being
blocked.
Known Issues
• If you experience long UI stalls or poor performance with the Unity Editor when targeting Oculus Rift on
Windows 10, please run Windows Update to ensure that you have the latest version of Windows 10.
• Unity 5 5.6.3p2 - 5.6.4p1, Unity 2017.1 - 2017.1.2p1, 2017.2.0f3 - 2017.2.0p2:
• The Unity engine uses projection matrix calculations that are at variance with the Oculus SDK, causing VR
scenes to have the wrong parallax, which may cause discomfort.
• All Unity versions with Oculus runtime 1.17+ and Windows 10 + Creators Update
• This combination results in spurious WM_DEVICECHANGE reports in the Editor, even in non-VR projects.
Many users will notice no impact, but users connected to certain USB devices may find the Editor
becomes non-responsive and needs to be terminated from Task Manager. To mitigate, please update to
the Beta runtime available on our Public Test channel. We are currently working with Unity and Microsoft
on a permanent solution.
• Cubemap VR compositor layers currently do not work in mobile applications, or in Rift applications using
Unity versions earlier than 2017.1.
• Adaptive Resolution currently works only with Unity 2017.1.1p1 and later. You might experience slight pixel
shaking when the resolution changes; this is a known issue, and we are working with Unity to resolve it.
• Unity 5.6 and later: If you have updated your OVRPlugin version from Utilities, you may see a spurious error
message when the Editor first launches saying “Multiple plugins with the same name 'ovrplugin'”. Please
disregard.
• Unity has a known issue such that parenting one VR camera to another will compound tracking twice. As a
workaround, make them siblings in the GameObject hierarchy.
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Transparent VR Compositor Layers do not currently support multiple layers of occlusion.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift
builds will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in
output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Gear VR
• 5.4.6p2 and 2017.3: A Unity issue may cause mobile builds to fail with the error "Failed to Repackage
Resources" due to the erroneous insertion of the keyword density into the Android manifest. Until this
is fixed in the engine, as a workaround you can install Android Build Tools v.24 or later. Note that Build
Tools v24 requires JDK 1.8 or later.
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5.3.4-5.3.6p3 and Unity 5.4.0b16-Unity 5.4.0p3: Do not set DSP Buffer Size to Best in Audio
Manager in the Inspector for now or you will encounter audio distortion. Set it to Good or Default
instead.
• Gear VR applications built with Unity 5.6.0f2 crash immediately upon launch, and Gear VR applications
built with 5.6.0p1 may crash when Multi-View is enabled.
• A known bug in Unity causes a deterioration of performance in mobile applications when the back
button is used to enter the Universal Menu, and then to return to the application. It particularly affects
applications that use multi-threading or which use high CPU utilization, and S7 (Europe) and S8 (global)
phones. This bug is fixed in 5.6.4p2, 2017.1.2p4, 2017.3.0b9, and 2017.2.0p3.
• Do not use Utilities 1.11.0 due to a crash when returning to focus from Universal Menu or Quit to Home
dialog.
• Single-Pass Stereo Rendering
• When Single Pass is enabled, building mobile projects will fail with the error message “Shader error in
'Mobile/Bumped Detail Diffuse'” in certain cases. For more information, see “Known Issues” in Single
Pass Stereo Rendering (Preview, Mobile Only) on page 75.
• Two graphics driver issues affect mobile applications with Single Pass enabled using some S8 or S8+
phones with Unity 5.6.0p2-3. They can occur when Standard Shader Quality is set to low, or when you
are using tree objects. For more information and workarounds, see “Known Issues” in Single Pass Stereo
Rendering (Preview, Mobile Only) on page 75.
• Mixed Reality Capture
• ZED Camera users should upgrade their SDK version to 2.3.1. All previous versions are not compatible.
New Features
• The Oculus Unity Integration now supports 64bit development.
Integration Changes
• Removed support for OVRTouchpad; use OVRInput instead. This will remove support for Android
MotionEvents.
• Updates to the Android Manifest settings to support the latest mobile APIs.
Bug Fixes
• Fixed a bug where GetThreadState returned an error when called before any updates are made.
• Fixed a bug where UnityEngine.GL rendered over a scene; switched to using a MeshRenderer that respects
depth and the render queue.
• Fixed an issue where ovrpLayerSubmitFlag_HeadLocked was set when rotation tracking was disabled.
• Fixed a bug where Unity may crash when an incompatible version of the ZED SDK is installed.
Known Issues
• If you experience long UI stalls or poor performance with the Unity Editor when targeting Oculus Rift on
Windows 10, please run Windows Update to ensure that you have the latest version of Windows 10.
• Unity 5 5.6.3p2 - 5.6.4p1, Unity 2017.1 - 2017.1.2p1, 2017.2.0f3 - 2017.2.0p2:
• The Unity engine uses projection matrix calculations that are at variance with the Oculus SDK, causing VR
scenes to have the wrong parallax, which may cause discomfort. Note that object size in affected scenes
is not quite correct, and take this into consideration when thinking about design. We are working with
Unity to fix this as soon as possible.
• All Unity versions with Oculus runtime 1.17+ and Windows 10 + Creators Update
• This combination results in spurious WM_DEVICECHANGE reports in the Editor, even in non-VR projects.
Many users will notice no impact, but users connected to certain USB devices may find the Editor
becomes non-responsive and needs to be terminated from Task Manager. To mitigate, please update to
the Beta runtime available on our Public Test channel. We are currently working with Unity and Microsoft
on a permanent solution.
• Cubemap VR compositor layers currently do not work in mobile applications, or in Rift applications using
Unity versions earlier than 2017.1.
• Adaptive Resolution currently works only with Unity 2017.1.1p1 and later. You might experience slight pixel
shaking when the resolution changes; this is a known issue, and we are working with Unity to resolve it.
• Unity 5.6 and later: If you have updated your OVRPlugin version from Utilities, you may see a spurious error
message when the Editor first launches saying “Multiple plugins with the same name 'ovrplugin'”. Please
disregard.
• Unity has a known issue such that parenting one VR camera to another will compound tracking twice. As a
workaround, make them siblings in the GameObject hierarchy.
• Go
• Oculus Go apps may experience a performance regression in the following versions of Unity:
• 2017.1.3p2 +
• 2017.2.2p1 +
• 2017.3.1p3 +
• 2018.1.0b9 +
• 2018.2.0a3 +
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Transparent VR Compositor Layers do not currently support multiple layers of occlusion.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift
builds will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in
output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Gear VR
• When Single Pass is enabled, building mobile projects will fail with the error message “Shader error in
'Mobile/Bumped Detail Diffuse'” in certain cases. For more information, see “Known Issues” in Single
Pass Stereo Rendering (Preview, Mobile Only) on page 75.
• Two graphics driver issues affect mobile applications with Single Pass enabled using some S8 or S8+
phones with Unity 5.6.0p2-3. They can occur when Standard Shader Quality is set to low, or when you
are using tree objects. For more information and workarounds, see “Known Issues” in Single Pass Stereo
Rendering (Preview, Mobile Only) on page 75.
Bug Fixes
• Fixed an OVROverlay crash with error “Graphics.CopyTexture called with mismatching sizes “.
• Fixed an issue where adding OVROverlay to any Quad (for example the OverlayUIDemo scene in the
Sample Framework) and enabling single-pass on Android, the texture image appeared halved (without
mipmapping) and shifted to the upper-right quadrant.
Known Issues
• Unity 5 5.6.3p2 - 5.6.4p1, Unity 2017.1 - 2017.1.2p1, 2017.2.0f3 - 2017.2.0p2:
• The Unity engine uses projection matrix calculations that are at variance with the Oculus SDK, causing VR
scenes to have the wrong parallax, which may cause discomfort. Note that object size in affected scenes
is not quite correct, and take this into consideration when thinking about design. We are working with
Unity to fix this as soon as possible.
• Unity Utilities 1.18 includes a VR Compositor Overlay bug which causes mobile applications using gamma
space lighting to be too dark. This bug is fixed in 1.19.0.
• All Unity versions with Oculus runtime 1.17+ and Windows 10 + Creators Update
• This combination results in spurious WM_DEVICECHANGE reports in the Editor, even in non-VR projects.
Many users will notice no impact, but users connected to certain USB devices may find the Editor
becomes non-responsive and needs to be terminated from Task Manager. To mitigate, please update to
the Beta runtime available on our Public Test channel. We are currently working with Unity and Microsoft
on a permanent solution.
• Cubemap VR compositor layers currently do not work in mobile applications, or in Rift applications using
Unity versions earlier than 2017.1.
• Adaptive Resolution currently works only with Unity 2017.1.1p1 and later. You might experience slight pixel
shaking when the resolution changes; this is a known issue, and we are working with Unity to resolve it.
• Unity 5.6 and later: If you have updated your OVRPlugin version from Utilities, you may see a spurious error
message when the Editor first launches saying “Multiple plugins with the same name 'ovrplugin'”. Please
disregard.
• Unity has a known issue such that parenting one VR camera to another will compound tracking twice. As a
workaround, make them siblings in the GameObject hierarchy.
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Transparent VR Compositor Layers do not currently support multiple layers of occlusion.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift
builds will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in
output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Gear VR
• Gear VR applications built with Unity 5.6.0f2 crash immediately upon launch, and Gear VR applications
built with 5.6.0p1 may crash when Multi-View is enabled.
• A known bug in Unity causes a deterioration of performance in mobile applications when the back
button is used to enter the Universal Menu, and then to return to the application. It particularly affects
applications that use multi-threading or which use high CPU utilization, and S7 (Europe) and S8 (global)
phones. This bug is fixed in 5.6.4p2, 2017.1.2p4, 2017.3.0b9, and 2017.2.0p3.
• Do not use Utilities 1.11.0 due to a crash when returning to focus from Universal Menu or Quit to Home
dialog.
• Single-Pass Stereo Rendering
• When Single Pass is enabled, building mobile projects will fail with the error message “Shader error in
'Mobile/Bumped Detail Diffuse'” in certain cases. For more information, see “Known Issues” in Single
Pass Stereo Rendering (Preview, Mobile Only) on page 75.
• Two graphics driver issues affect mobile applications with Single Pass enabled using some S8 or S8+
phones with Unity 5.6.0p2-3. They can occur when Standard Shader Quality is set to low, or when you
are using tree objects. For more information and workarounds, see “Known Issues” in Single Pass Stereo
Rendering (Preview, Mobile Only) on page 75.
Version 1.22 also increased the recommended MSAA level to 4x to reduce aliasing issues and improve quality.
New Features
• Increased the recommended MSAA level to 4x to reduce aliasing issues and improve quality.
Bug Fixes
• Improved Occlusion Mesh handling that provides a significant GPU performance improvement.
• Fixed an UnderlayQuad crash caused by insufficient mip levels.
• Fixed an issue in version 2017.3.x where the active controller switches to gamepad when using the GearVR
Controller or HMT input.
• Fixed an issue where OVRPlugin's PreInitialize() failed to return an error on initialization failure.
Known Issues
• Do not use Utilities 1.16.0-beta. If you are using that version, please update to a later version.
• Unity 5 5.6.3p2 - 5.6.4p1, Unity 2017.1 - 2017.1.2p1, 2017.2.0f3 - 2017.2.0p2:
• The Unity engine uses projection matrix calculations that are at variance with the Oculus SDK, causing VR
scenes to have the wrong parallax, which may cause discomfort. Note that object size in affected scenes
is not quite correct, and take this into consideration when thinking about design. We are working with
Unity to fix this as soon as possible.
• Unity Utilities 1.18 includes a VR Compositor Overlay bug which causes mobile applications using gamma
space lighting to be too dark. This bug is fixed in 1.19.0.
• All Unity versions with Oculus runtime 1.17+ and Windows 10 + Creators Update
• This combination results in spurious WM_DEVICECHANGE reports in the Editor, even in non-VR projects.
Many users will notice no impact, but users connected to certain USB devices may find the Editor
becomes non-responsive and needs to be terminated from Task Manager. To mitigate, please update to
the Beta runtime available on our Public Test channel. We are currently working with Unity and Microsoft
on a permanent solution.
• Cubemap VR compositor layers currently do not work in mobile applications, or in Rift applications using
Unity versions earlier than 2017.1.
• Adaptive Resolution currently works only with Unity 2017.1.1p1 and later. You might experience slight pixel
shaking when the resolution changes; this is a known issue, and we are working with Unity to resolve it.
• Unity 5.6 and later: If you have updated your OVRPlugin version from Utilities, you may see a spurious error
message when the Editor first launches saying “Multiple plugins with the same name 'ovrplugin'”. Please
disregard.
• Unity has a known issue such that parenting one VR camera to another will compound tracking twice. As a
workaround, make them siblings in the GameObject hierarchy.
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Transparent VR Compositor Layers do not currently support multiple layers of occlusion.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift
builds will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in
output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Gear VR
• When Single Pass is enabled, building mobile projects will fail with the error message “Shader error in
'Mobile/Bumped Detail Diffuse'” in certain cases. For more information, see “Known Issues” in Single
Pass Stereo Rendering (Preview, Mobile Only) on page 75.
• Two graphics driver issues affect mobile applications with Single Pass enabled using some S8 or S8+
phones with Unity 5.6.0p2-3. They can occur when Standard Shader Quality is set to low, or when you
are using tree objects. For more information and workarounds, see “Known Issues” in Single Pass Stereo
Rendering (Preview, Mobile Only) on page 75.
• If you add OVROverlay to any Quad (for example the OverlayUIDemo scene in the Sample Framework)
and enable single-pass on Android, the texture image may appear halved (without mipmapping) and
shifted to the upper-right quadrant.
We recommend adding Dash support to your application to provide the best possible user experience.
Developers who would like to add Dash support before it rolls out in early 2018 can test it using Oculus runtime
1.21, which includes preview support. Runtime 1.21 is now available through our opt-in Public Test Channel
(PTC).
Note that Dash support is enabled by default in Unity 2017.3b11 and later, in versions 2017.3f1-2, and in Unity
Dash custom builds. It is off by default in all other versions, and we plan to disable it by default in 2017.3f3 and
later.
For more information, please see Oculus Dash in Unity on page 83.
New Features
• Added Equirectangular support to VR Compositor Layers (mobile only).
Bug Fixes
• Unity 2017.2: Fixed issue causing OVRInput.GetDown() to return incorrect values.
Known Issues
• Do not use Utilities 1.16.0-beta. If you are using that version, please update to a later version.
• Unity 5.6 versions 5.6.3p2 and later, and Unity 2017.1 versions 2017.1.0p3 and later:
• The Unity engine uses projection matrix calculations that are at variance with the Oculus SDK, causing VR
scenes to have the wrong parallax, which may cause discomfort. Note that object size in affected scenes
is not quite correct, and take this into consideration when thinking about design. We are working with
Unity to fix this as soon as possible.
• Unity Utilities 1.18 includes a VR Compositor Overlay bug which causes mobile applications using gamma
space lighting to be too dark. This bug is fixed in 1.19.0.
• All Unity versions with Oculus runtime 1.17+ and Windows 10 + Creators Update
• This combination results in spurious WM_DEVICECHANGE reports in the Editor, even in non-VR projects.
Many users will notice no impact, but users connected to certain USB devices may find the Editor
becomes non-responsive and needs to be terminated from Task Manager. To mitigate, please update to
the Beta runtime available on our Public Test channel. We are currently working with Unity and Microsoft
on a permanent solution.
• Cubemap VR compositor layers currently do not work in mobile applications, or in Rift applications using
Unity versions earlier than 2017.1.
• Adaptive Resolution currently works only with Unity 2017.1.1f2 and later.
• Unity 5.6 and later: If you have updated your OVRPlugin version from Utilities, you may see a spurious error
message when the Editor first launches saying “Multiple plugins with the same name 'ovrplugin'”. Please
disregard.
• Unity has a known issue such that parenting one VR camera to another will compound tracking twice. As a
workaround, make them siblings in the GameObject hierarchy.
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Transparent VR Compositor Layers do not currently support multiple layers of occlusion.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift
builds will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in
output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Gear VR
• 5.4.6p2 and 2017.3: A Unity issue may cause mobile builds to fail with the error "Failed to Repackage
Resources" due to the erroneous insertion of the keyword density into the Android manifest. Until this
is fixed in the engine, as a workaround you can install Android Build Tools v.24 or later. Note that Build
Tools v24 requires JDK 1.8 or later.
• Unity 2017.2/OVRPlugin 1.18.1: GetActiveController switches to gamepad following Gear VR Controller
button release. With OVRPlugin 1.21 or later, this is only expected to occur if the user has a gamepad
and Gear VR Controller connected at the same time.
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5.3.4-5.3.6p3 and Unity 5.4.0b16-Unity 5.4.0p3: Do not set DSP Buffer Size to Best in Audio
Manager in the Inspector for now or you will encounter audio distortion. Set it to Good or Default
instead.
• Gear VR applications built with Unity 5.6.0f2 crash immediately upon launch, and Gear VR applications
built with 5.6.0p1 may crash when Multi-View is enabled.
• A known bug in Unity 2017.x causes a deterioration of performance in mobile applications when the back
button is used to enter the Universal Menu, and then to return to the application. It particularly affects
applications that use multi-threading or which use high CPU utilization, and S7 (Europe) and S8 (global)
phones.
• Do not use Utilities 1.11.0 due to a crash when returning to focus from Universal Menu or Quit to Home
dialog.
• Single-Pass Stereo Rendering
• When Single Pass is enabled, building mobile projects will fail with the error message “Shader error in
'Mobile/Bumped Detail Diffuse'” in certain cases. For more information, see “Known Issues” in Single
Pass Stereo Rendering (Preview, Mobile Only) on page 75.
• Two graphics driver issues affect mobile applications with Single Pass enabled using some S8 or S8+
phones with Unity 5.6.0p2-3. They can occur when Standard Shader Quality is set to low, or when you
are using tree objects. For more information and workarounds, see “Known Issues” in Single Pass Stereo
Rendering (Preview, Mobile Only) on page 75.
Bug Fixes
• Fixed regression of MRC sandwich composition in Unity 5.6.1p2 in which the foreground output was
missing.
Known Issues
• Do not use Utilities 1.16.0-beta. If you are using that version, please update to a later version.
• Unity 5.6 versions 5.6.3p2 and later, and Unity 2017.1 versions 2017.1.0p3 and later:
• The Unity engine uses projection matrix calculations that are at variance with the Oculus SDK, causing VR
scenes to have the wrong parallax, which may cause discomfort. Note that object size in affected scenes
is not quite correct, and take this into consideration when thinking about design. We are working with
Unity to fix this as soon as possible.
• Unity Utilities 1.18 includes a VR Compositor Overlay bug which causes mobile applications using gamma
space lighting to be too dark. This bug is fixed in 1.19.0.
• All Unity versions with Oculus runtime 1.17+ and Windows 10 + Creators Update
• This combination results in spurious WM_DEVICECHANGE reports in the Editor, even in non-VR projects.
Many users will notice no impact, but users connected to certain USB devices may find the Editor
becomes non-responsive and needs to be terminated from Task Manager. To mitigate, please update to
the Beta runtime available on our Public Test channel. We are currently working with Unity and Microsoft
on a permanent solution.
• Cubemap VR compositor layers currently do not work in mobile applications, or in Rift applications using
Unity versions earlier than 2017.1.
• Adaptive Resolution currently works only with Unity 2017.1.1p1 and later.
• Unity 5.6 and later: If you have updated your OVRPlugin version from Utilities, you may see a spurious error
message when the Editor first launches saying “Multiple plugins with the same name 'ovrplugin'”. Please
disregard.
• Unity has a known issue such that parenting one VR camera to another will compound tracking twice. As a
workaround, make them siblings in the GameObject hierarchy.
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Transparent VR Compositor Layers do not currently support multiple layers of occlusion.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift
builds will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in
output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Gear VR
• When Single Pass is enabled, building mobile projects will fail with the error message “Shader error in
'Mobile/Bumped Detail Diffuse'” in certain cases. For more information, see “Known Issues” in Single
Pass Stereo Rendering (Preview, Mobile Only) on page 75.
• Two graphics driver issues affect mobile applications with Single Pass enabled using some S8 or S8+
phones with Unity 5.6.0p2-3. They can occur when Standard Shader Quality is set to low, or when you
are using tree objects. For more information and workarounds, see “Known Issues” in Single Pass Stereo
Rendering (Preview, Mobile Only) on page 75.
Dash Support
At Oculus Connect 4 we announced Rift Core 2.0, which includes substantial changes to Oculus Home and
will replace our Universal Menu with Oculus Dash. We plan to make it available with the 1.21 runtime in early
December.
Dash re-implements Universal Menu as a VR compositor layer. Have a look at our Blog announcement and
watch the Dash video to get a sense of how it works.
We have added new application lifecycle support to our 1.18 integration in preparation for Dash. When Dash
draws a menu overlay, the new Has Input Focus flag will return false, indicating that the application should
pause and mute, and that tracked controllers and hands should be hidden from the scene, as Dash provides its
own UI in the foreground. Depending on the application, additional action may also be warranted when input
focus is lost (e.g., during a multi-player combat game, you may wish to freeze the player's avatar and make it
temporarily invulnerable to other players).
For more information, see Application Lifecycle Handling on page 86. See the Input Focus sample in the Unity
Sample Framework for an example of a typical implementation.
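A simple sketch of this pattern, using the OVRManager.hasInputFocus property noted under API Changes below; the pause, mute, and visibility handling shown here is application-specific and only illustrative:

using UnityEngine;

// Illustrative sketch: react to Dash taking and returning input focus.
public class DashFocusHandler : MonoBehaviour
{
    bool hadFocus = true;

    void Update()
    {
        bool hasFocus = OVRManager.hasInputFocus;
        if (hasFocus == hadFocus)
            return;
        hadFocus = hasFocus;

        // Pause and mute while Dash has input focus; resume when it returns.
        Time.timeScale = hasFocus ? 1f : 0f;
        AudioListener.pause = !hasFocus;

        // Application-specific placeholder: hide tracked controller and hand
        // renderers while focus is lost, since Dash draws its own UI.
        SetTrackedControllersVisible(hasFocus);
    }

    void SetTrackedControllersVisible(bool visible)
    {
        // Enable/disable your controller and hand GameObjects here.
    }
}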
API Changes
• Removed OVRManager.hasSystemOverlayPresent. All Dash lifecycle state information may now be
queried with OVRManager.hasInputFocus.
Bug Fixes
• Fixed VR compositor layer cubemap support on Rift only in Unity versions 2017.1 and later.
• Fixed Oculus integration 1.18 bug causing Occlusion Mesh to be rendered incorrectly. The lower side was
blocked instead of the upper side, causing the Editor mirror window to show a black border on the top.
• Fixed VR Compositor Overlay bug affecting Utilities 1.18.x, causing mobile applications using gamma space
lighting to be too dark. Rift applications and mobile apps using linear lighting were unaffected.
Known Issues
• Do not use Utilities 1.16.0-beta. If you are using that version, please update to 1.18.
• Adaptive Resolution currently works only with Unity 2017.1.1p1 and later.
• Cubemap VR compositor layers currently do not work in mobile applications, or in Rift applications using
Unity versions earlier than 2017.1.
• Unity 5.6 and later: If you have updated your OVRPlugin version from Utilities, you may see a spurious error
message when the Editor first launches saying “Multiple plugins with the same name 'ovrplugin'”. Please
disregard.
• Unity has a known issue such that parenting one VR camera to another will compound tracking twice. As a
workaround, make them siblings in the GameObject hierarchy.
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Transparent VR Compositor Layers do not currently support multiple layers of occlusion.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift builds
will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Mobile
• Do not use Utilities 1.11.0 due to a crash when returning to focus from Universal Menu or Quit to Home
dialog.
• Due to a Unity bug, the Camera pose can be corrupted by scripts in the first frame after being enabled
with VR support. As a workaround, use the latest Utilities version or zero out the eye anchor poses when
a new OVRCameraRig is spawned and the first frame after usePerEyeAnchors changes.
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5.3.4-5.3.6p3 and Unity 5.4.0b16-Unity 5.4.0p3: Do not set DSP Buffer Size to Best in Audio
Manager in the Inspector for now or you will encounter audio distortion. Set it to Good or Default
instead.
• Mobile applications built with Unity 5.6.0f2 crash immediately upon launch, and mobile applications built
with 5.6.0p1 may crash when Multi-View is enabled.
• Single-Pass Stereo Rendering
• When Single Pass is enabled, building mobile projects will fail with the error message “Shader error in
'Mobile/Bumped Detail Diffuse'” in certain cases. For more information, see “Known Issues” in Single
Pass Stereo Rendering (Preview, Mobile Only) on page 75.
• Two graphics driver issues affect mobile applications with Single Pass enabled using some S8 or S8+
phones with Unity 5.6.0p2-3. They can occur when Standard Shader Quality is set to low, or when you
are using tree objects. For more information and workarounds, see “Known Issues” in Single Pass Stereo
Rendering (Preview, Mobile Only) on page 75.
• Fixed bug causing VR Compositor Layers to persist after changing scenes in a multi-scene game.
The Utilities package includes the most recent version of the Oculus OVRPlugin that is also included in the
Unity Editor. When you import the Utilities package into a project, if the OVRPlugin included with Utilities is
greater than the version in your Editor, a pop-up dialog will give you the option to update it. We recommend
always using the latest-available OVRPlugin version.
Note: The latest OVRPlugin version number may be a version or two behind the Utilities version
number.
For information on which versions of the Unity Editor are compatible with which versions of Utilities for Unity,
please see Compatibility and Version Requirements.
Be sure to review our Downloads page for other useful tools to assist development, such as the Unity Sample
Framework. For more information on Oculus resources for Unity developers, please see Other Oculus
Resources for Unity Developers.
The Oculus Integration, available through the Unity Asset Store, provides several unityPackages in a single
download, including our Utilities for Unity, Oculus Platform SDK Unity plugin, Oculus Avatar SDK Unity Plugin,
Oculus Native Spatializer Plugin, and the Unity Sample Framework.
Unity 2017
2017.1 is officially supported, and we recommend developers update to this version when convenient. Our
support for 2017.2 is currently in beta, and we do not recommend shipping applications using it at this time.
Our support for 2017.3 is in alpha and may be unstable. See the “Unity 2017” section of “Known Issues” below
for more information.
New Features
• Mixed Reality Capture - see Unity Mixed Reality Capture on page 86 for more information.
• Added sandwich composition mode, which is similar to direct composition in that camera and application
content are composited by Unity. However, in sandwich composition, three distinct video layers are
preserved - foreground and background content from the application, and a middle camera layer.
This mode places greater demands on memory than Direct Composition, but allows for finer latency
correction.
• Added latency correction controls for direct and sandwich Composition.
• Added Chroma Key options for direct and sandwich composition, replacing the previous green screen
control parameters.
• Removed green screen tolerance, alpha cutoff, and color shadows settings.
• Added support for configuration by JSON file.
• Added a dynamic lighting option, which illuminates video content with application lights and flashes (e.g.,
the interpolated person in the scene is illuminated by explosions).
• Added ZED camera support, which provides depth information that can be used to present more realistic
dynamic lighting in direct or sandwich composition.
• Added Virtual Green Screen, which confines interpolated camera stream content to an area defined by
Guardian System configuration.
• Added selective layer hiding and capture camera selection options.
• OVRManager
• Added input focus and system overlay support. When input focus or system overlay status changes,
applications may implement appropriate handling, such as pausing and muting applications when a
system overlay is present, and hiding tracked controllers when input focus is lost. For an illustration of
typical use, see the Input Focus System Overlay sample in our Unity Sample Framework. For additional
documentation, see OVRManager in our Unity Scripting Reference on page 127.
Bug Fixes
• Fixed Gear VR black screen issue affecting Utilities 1.16.0-beta.
• Fixed “Mobile/Bumped detail diffuse” shader error when using Single-Pass Stereo Rendering with some
Unity versions.
• Fixed quad layer failure to render with Unity 5.4 and 5.6.
• Fixed issue limiting mipmap textures passed to OVROverlay to mip level 0.
Known Issues
• Unity 2017
• APKs built with Unity 2017 versions earlier than 2017.1.0p5 fail submission to the Oculus Store, which
does not accept APK Signature Scheme V2.
• Do not use Utilities 1.16.0-beta. If you are using that version, please update to 1.18.
• VR Compositor Overlays will remain viewable after switching scenes in multi-scene applications unless all
OVROverlay instances are explicitly disabled. Fixed in 1.18.1.
• Adaptive Resolution currently works only with Unity 2017.1.1p1 and later.
• Unity 5.6 and later: If you have updated your OVRPlugin version from Utilities, you may see a spurious error
message when the Editor first launches saying “Multiple plugins with the same name 'ovrplugin'”. Please
disregard.
• Unity has a known issue such that parenting one VR camera to another will compound tracking twice. As a
workaround, make them siblings in the GameObject hierarchy.
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Transparent VR Compositor Layers do not currently support multiple layers of occlusion.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift builds
will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Mobile
• Do not use Utilities 1.11.0 due to a crash when returning to focus from Universal Menu or Quit to Home
dialog.
• Due to a Unity bug, the Camera pose can be corrupted by scripts in the first frame after being enabled
with VR support. As a workaround, use the latest Utilities version or zero out the eye anchor poses when
a new OVRCameraRig is spawned and the first frame after usePerEyeAnchors changes.
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5.3.4-5.3.6p3 and Unity 5.4.0b16-Unity 5.4.0p3: Do not set DSP Buffer Size to Best in Audio
Manager in the Inspector for now or you will encounter audio distortion. Set it to Good or Default
instead.
• Mobile applications built with Unity 5.6.0f2 crash immediately upon launch, and mobile applications built
with 5.6.0p1 may crash when Multi-View is enabled.
• There is a known Unity bug causing a deterioration of performance in mobile applications when the back
button is used to enter the Universal Menu, and then to return to the application. It particularly affects
applications that use multi-threading or which use high CPU utilization, and S7 (Europe) and S8 (global)
phones.
• Single-Pass Stereo Rendering
• When Single Pass is enabled, building mobile projects will fail with the error message “Shader error in
'Mobile/Bumped Detail Diffuse'” in certain cases. For more information, see “Known Issues” in Single
Pass Stereo Rendering (Preview, Mobile Only) on page 75.
• Two graphics driver issues affect mobile applications with Single Pass enabled using some S8 or S8+
phones with Unity 5.6.0p2-3. They can occur when Standard Shader Quality is set to low, or when you
are using tree objects. For more information and workarounds, see “Known Issues” in Single Pass Stereo
Rendering (Preview, Mobile Only) on page 75.
Overview of 1.16-beta
This beta release adds support for mixed reality capture, which allows live video footage of a Rift user to be
composited with the output from a game to create combined video that shows the player in a virtual scene.
Unity 5.4 is not supported by 1.16-beta, and it introduces known issues regarding VR Compositor Layers - see
below for more details.
There is a known black screen issue with Android, and we recommend that you do not use this version for
mobile development.
We recommend only updating to this release if you need mixed reality capture support or cylinder VR
Compositor Layer support for Rift.
The Oculus integration includes preliminary support for Unity 2017 Beta. If you have any problems or questions,
please let us know in our Unity Developer Forum.
New Features
• Added support for mixed reality capture (Rift only). For more information, see Unity Mixed Reality Capture
on page 86.
• Added cylinder layer support to OVROverlay on Rift.
• Added R16G16B16A16_FP / R11G11B10_FP support to OVRManager, which can remove banding
from dark colors. To enable, use OVRManager.eyeTextureFormat = R11G11B10_FP. Note: if
you need alpha channel in your frame buffer, you must use OVRManager.eyeTextureFormat =
R16G16B16A16_FP.
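A minimal sketch of the setting described above; the exact enum location (assumed here to be OVRManager.EyeTextureFormat) should be confirmed against the Utilities scripts in your project:

using UnityEngine;

// Sketch only: switch the eye buffer to a floating-point format to reduce
// banding in dark colors. The enum name OVRManager.EyeTextureFormat is an
// assumption; check OVRManager.cs for the exact type in your Utilities version.
public class EyeBufferFormat : MonoBehaviour
{
    void Awake()
    {
        // Use R16G16B16A16_FP instead if an alpha channel is required.
        OVRManager.eyeTextureFormat = OVRManager.EyeTextureFormat.R11G11B10_FP;
    }
}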
Bug Fixes
• Fixed a crash on Windows when another application toggles exclusive full-screen mode.
• Fixed bug with Unity 5.6 where Gear VR would report the wrong field of view for the first frame after launch.
• Fixed pose race condition when OVRCameraRig.useFixedUpdateForTracking is true.
Known Issues
• Unity 2017
• When using the Utilities package, you may encounter the following error when adding scripts
to your project: "Assets/OVR/Scripts/OVROverlay.cs(385,20): error CS1501: No overload for
method `CreateExternalTexture' takes `6' arguments". As a workaround, open OVROverlay.cs
in your script editor and change et = Cubemap.CreateExternalTexture(size.w,
size.h, txFormat, mipLevels > 1, isSrgb, externalTex); to et =
Cubemap.CreateExternalTexture(size.w, txFormat, mipLevels > 1, externalTex);.
• 1.16-beta issues
• Due to a black screen issue, we do not recommend using this version for mobile development.
• If you are passing a texture with mipmaps to OVROverlay, only mip level 0 will be used. You will
experience aliasing if your texture is excessively high-resolution. Do not use resolutions above 1024 for
now.
• Projects using Utilities 1.16-beta only support cubemap VR Compositor Layers in Unity 2017.1 or later.
• Mixed reality capture
• Recentering a Rift mixed reality capture application will corrupt the camera pose when a static camera
was configured with the CameraTool. As a temporary workaround, you may attach a VR Object to your
camera (e.g., by using a third Touch), and it will recenter normally.
• Unity 5.6 and later: If you have updated your OVRPlugin version from Utilities, you may see a spurious error
message when the Editor first launches saying “Multiple plugins with the same name 'ovrplugin'”. Please
disregard.
• When using Android SDK Tools 25.3.1 or newer, we recommend using Oculus Utilities 1.15 or newer in
combination with Unity 5.4.5p2 or newer, 5.5.3p3 or newer, 5.6.0p3 or newer, or 2017.1.0b5 or newer.
• When Single Pass is enabled, building mobile projects will fail with the error message “Shader error in
'Mobile/Bumped Detail Diffuse'” in certain cases. For more information, see “Known Issues” in Single Pass
Stereo Rendering (Preview, Mobile Only) on page 75.
• Two graphics driver issues affect mobile applications with Single Pass enabled using some S8 or S8+ phones
with Unity 5.6.0p2-3. They can occur when Standard Shader Quality is set to low, or when you are using
tree objects. For more information and workarounds, see “Known Issues” in Single Pass Stereo Rendering
(Preview, Mobile Only) on page 75.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift builds
will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Unity has a known issue such that parenting one VR camera to another will compound tracking twice. As a
workaround, make them siblings in the GameObject hierarchy.
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Transparent VR Compositor Layers do not currently support multiple layers of occlusion.
• Gear VR
• Gear VR applications built with Unity 5.6.0f2 crash immediately upon launch, and Gear VR applications
built with 5.6.0p1 may crash when Multi-View is enabled.
• Do not use Utilities 1.11.0 due to a crash when returning to focus from Universal Menu or Quit to Home
dialog.
• Due to a Unity bug, the Camera pose can be corrupted by scripts in the first frame after being enabled
with VR support. As a workaround, use the latest Utilities version or zero out the eye anchor poses when
a new OVRCameraRig is spawned and the first frame after usePerEyeAnchors changes.
• With Unity 5.3, the world may appear tilted. As a workaround, use the latest Utilities version or disable
the virtual reality splash image.
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5.3.4-5.3.6p3 and Unity 5.4.0b16-Unity 5.4.0p3: Do not set DSP Buffer Size to Best in Audio
Manager in the Inspector for now or you will encounter audio distortion. Set it to Good or Default
instead.
• Mobile App Submission to Oculus Store
• All mobile applications using Utilities 1.9 and 1.10 will fail Oculus Store submission due to a bug affecting
reserved interaction handling for the Universal Menu. Please remove previously-imported project files
as described in Importing the Oculus Utilities Package on page 11 and import the latest Utilities version,
and update your Unity editor to a compatible version if necessary.
Known Issues
• Unity 5.6 and later: If you have updated your OVRPlugin version from Utilities, you may see a spurious error
message when the Editor first launches saying “Multiple plugins with the same name 'ovrplugin'”. Please
disregard.
• When using Android SDK Tools 25.3.1 or newer, we recommend using Oculus Utilities 1.15 or newer in
combination with Unity 5.4.5p2 or newer, 5.5.3p3 or newer, 5.6.0p3 or newer, or 2017.1.0b5 or newer.
• When Single Pass is enabled, building mobile projects will fail with the error message “Shader error in
'Mobile/Bumped Detail Diffuse'” in certain cases. For more information, see “Known Issues” in Single Pass
Stereo Rendering (Preview, Mobile Only) on page 75.
• Two graphics driver issues affect mobile applications with Single Pass enabled using some S8 or S8+ phones
with Unity 5.6.0p2-3. They can occur when Standard Shader Quality is set to low, or when you are using
tree objects. For more information and workarounds, see “Known Issues” in Single Pass Stereo Rendering
(Preview, Mobile Only) on page 75.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift builds
will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Unity has a known issue such that parenting one VR camera to another will compound tracking twice. As a
workaround, make them siblings in the GameObject hierarchy.
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Transparent VR Compositor Layers do not currently support multiple layers of occlusion.
• Gear VR
• Gear VR applications built with Unity 5.6.0f2 crash immediately upon launch, and Gear VR applications
built with 5.6.0p1 may crash when Multi-View is enabled.
• Do not use Utilities 1.11.0 due to a crash when returning to focus from Universal Menu or Quit to Home
dialog.
• Due to a Unity bug, the Camera pose can be corrupted by scripts in the first frame after being enabled
with VR support. As a workaround, use the latest Utilities version or zero out the eye anchor poses when
a new OVRCameraRig is spawned and the first frame after usePerEyeAnchors changes.
• With Unity 5.3, the world may appear tilted. As a workaround, use the latest Utilities version or disable
the virtual reality splash image.
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5.3.4-5.3.6p3 and Unity 5.4.0b16-Unity 5.4.0p3: Do not set DSP Buffer Size to Best in Audio
Manager in the Inspector for now or you will encounter audio distortion. Set it to Good or Default
instead.
• Mobile App Submission to Oculus Store
• All mobile applications using Utilities 1.9 and 1.10 will fail Oculus Store submission due to a bug affecting
reserved interaction handling for the Universal Menu. Please remove previously-imported project files
as described in Importing the Oculus Utilities Package on page 11 and import the latest Utilities version,
and update your Unity editor to a compatible version if necessary.
Beginning with this release, the Utilities package will also include the latest version of OVRPlugin, allowing us to
provide the latest features as quickly as possible.
When you import Utilities for Unity into a Unity project, if the OVRPlugin version included with the Utilities is
later than the version built into your editor, a pop-up dialog will give you the option to update it. We always
recommend using the latest available OVRPlugin version.
If you install OVRPlugin from the Utilities package and later wish to roll back to the version included with the
Editor for any reason, you may easily do so by selecting Tools > Oculus > Disable OVR Utilities Plugin.
For more information, please see “OVRPlugin” in Oculus Utilities for Unity on page 33.
Note: The update feature is currently not supported on OS X/macOS.
New Features
• Added OVRPlugin auto-updating (see above).
• Added support for preview Single Pass stereo rendering to Unity 5.6 (mobile only), which can provide a
significant reduction to CPU overhead. When enabled, objects are rendered to the left buffer and then
duplicated with minor adjustment to the right buffer, rather than drawing them in two separate passes. For
more information, see Single Pass Stereo Rendering (Preview, Mobile Only) on page 75.
• Performance Auditing Tool improvements. See Performance Auditing Tool (OVRLint) on page 104 for more
information.
• Now allows fixes to be applied to multiple instances of the same issue at once.
• Click on any reported object with a problem to highlight the relevant object in your scene.
API Changes
• Added OVRInput.RecenterController() to recenter the Gear VR Controller.
Known Issues
• Unity 5.6 and later: If you have updated your OVRPlugin version from Utilities, you may see a spurious error
message when the Editor first launches saying “Multiple plugins with the same name 'ovrplugin'”. Please
disregard.
• Gear VR applications built with Unity 5.6.0f2 crash immediately upon launch, and Gear VR applications built
with 5.6.0p1 may crash when Multi-View is enabled.
• When Single Pass is enabled, building mobile projects will fail with the error message “Shader error in
'Mobile/Bumped Detail Diffuse'” in certain cases. For more information, see “Known Issues” in Single Pass
Stereo Rendering (Preview, Mobile Only) on page 75.
• Two graphics driver issues affect mobile applications with Single Pass enabled using some S8 or S8+ phones
with Unity 5.6.0p2-3. They can occur when Standard Shader Quality is set to low, or when you are using
tree objects. For more information and workarounds, see “Known Issues” in Single Pass Stereo Rendering
(Preview, Mobile Only) on page 75.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift builds
will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Unity has a known issue such that parenting one VR camera to another will compound tracking twice. As a
workaround, make them siblings in the GameObject hierarchy.
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Transparent VR Compositor Layers do not currently support multiple layers of occlusion.
• Gear VR
• Gear VR applications built with Unity 5.6.0f2 crash immediately upon launch, and Gear VR applications
built with 5.6.0p1 may crash when Multi-View is enabled.
• Do not use Utilities 1.11.0 due to a crash when returning to focus from Universal Menu or Quit to Home
dialog.
• Due to a Unity bug, the Camera pose can be corrupted by scripts in the first frame after being enabled
with VR support. As a workaround, use the latest Utilities version or zero out the eye anchor poses when
a new OVRCameraRig is spawned and the first frame after usePerEyeAnchors changes.
• With Unity 5.3, the world may appear tilted. As a workaround, use the latest Utilities version or disable
the virtual reality splash image.
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5.3.4-5.3.6p3 and Unity 5.4.0b16-Unity 5.4.0p3: Do not set DSP Buffer Size to Best in Audio
Manager in the Inspector for now or you will encounter audio distortion. Set it to Good or Default
instead.
• Mobile App Submission to Oculus Store
• All mobile applications using Utilities 1.9 and 1.10 will fail Oculus Store submission due to a bug affecting
reserved interaction handling for the Universal Menu. Please remove previously-imported project files
as described in Importing the Oculus Utilities Package on page 11 and import the latest Utilities version,
and update your Unity editor to a compatible version if necessary.
Unity 5.6.0f3 is the only supported version of 5.6 at this time. Earlier versions of 5.6.0 are not supported, and
5.6.0p1 has a crashing bug affecting Gear VR applications.
New Features
• Added android:installLocation="auto" to store-compatible AndroidManifest.xml.
Bug Fixes
• Fixed double-counting of orientation when recentering OVRPlayerController.
• Fixed NullReferenceExceptions on edit-and-continue.
• Fixed signature checking on non-English platforms.
• Fixed white flash.
• Fixed incorrect overlay states for ATW splash.
• Fixed VRAPI crash: "Invalid parms passed to vrapi_SubmitFrame".
• Fixed crashes in OVR::D3DUtil::Blitter::Blt due to inconsistent overlay lifetimes.
Known Issues
• Unity 5.6.0f3 is the only supported version of 5.6 at this time. Gear VR applications built with Unity 5.6.0f2
crash immediately upon launch, and Gear VR applications built with 5.6.0p1 may crash when Multi-View is
enabled.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift builds
will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Unity has a known issue such that parenting one VR camera to another will compound tracking twice. As a
workaround, make them siblings in the GameObject hierarchy.
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Transparent VR Compositor Layers do not currently support multiple layers of occlusion.
• Gear VR
• Do not use Utilities 1.11.0 due to a crash when returning to focus from Universal Menu or Quit to Home
dialog.
• Due to a Unity bug, the Camera pose can be corrupted by scripts in the first frame after being enabled
with VR support. As a workaround, use the latest Utilities version or zero out the eye anchor poses when
a new OVRCameraRig is spawned and the first frame after usePerEyeAnchors changes.
• With Unity 5.3, the world may appear tilted. As a workaround, use the latest Utilities version or disable
the virtual reality splash image.
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5.3.4-5.3.6p3 and Unity 5.4.0b16-Unity 5.4.0p3: Do not set DSP Buffer Size to Best in Audio
Manager in the Inspector for now or you will encounter audio distortion. Set it to Good or Default
instead.
• Mobile App Submission to Oculus Store
• All mobile applications using Utilities 1.9 and 1.10 will fail Oculus Store submission due to a bug affecting
reserved interaction handling for the Universal Menu. Please remove previously-imported project files
as described in Importing the Oculus Utilities Package on page 11 and import the latest Utilities version,
and update your Unity editor to a compatible version if necessary.
Version Compatibility
On initial release, Utilities v 1.12.0 is compatible with our recommended Unity 5.3.8f1 and 5.4.5f1. For up-to-
date compatibility information, see Compatibility and Version Requirements on page 7.
New Features
• Added support for the Gear VR Controller to OVRInput; a short sketch follows below. For more information, see OVRInput on page 44.
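As a rough illustration of reading the Gear VR Controller through OVRInput (a sketch only; the controller enum values such as RTrackedRemote and the touchpad axis name should be confirmed against the OVRInput documentation for your Utilities version):

using UnityEngine;

public class GearVrControllerExample : MonoBehaviour
{
    void Update()
    {
        // Touchpad position and index trigger state on the right-handed remote.
        Vector2 touch = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad, OVRInput.Controller.RTrackedRemote);
        bool trigger = OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTrackedRemote);

        // Orientation of the remote, useful for a laser-pointer style cursor.
        Quaternion rotation = OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTrackedRemote);

        Debug.Log("Touchpad: " + touch + " Trigger: " + trigger + " Rotation: " + rotation.eulerAngles);
    }
}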
API Changes
• Added OVRPlugin.GetAppFramerate() to OVRDisplay.cs; returns frame rate reported by Oculus
plugin (Rift and Gear VR). Requires a Unity Editor version we recommend for use with Utilities 1.12 - see
Compatibility and Version Requirements on page 7 for details.
Bug Fixes
• Changed OVRInput.GetAngularVelocity(..) and OVRInput.GetAngularAcceleration(..) to return Vector3
instead of Quaternion, avoiding issues for rates above 2*pi. Developers who need the old behavior for any
reason may use Quaternion.Euler(..); see the sketch after this list.
• Gear VR:
• Fixed bug causing mobile applications built with Unity versions compatible with Utilities 1.11.0 to crash
when returning to focus from Universal Menu or Quit to Home dialog, or when the Gear VR is taken off
for several seconds, then put back on.
• Fixed black screen during launch with developer mode enabled in some Unity 5.3 versions using our 1.11
integration.
• Fixed bug causing mobile apps using Utilities 1.11.0 to appear tilted when enabling a virtual reality
splash screen.
• VR Compositor Layers:
• Added OVRUnderlayTransparentOccluder, which was missing in previous versions.
• Fixed issue causing layer colors to appear washed out when using render targets as input textures on PC.
• Fixed issue where right-side textures were lost when using stereo pairs of OVROverlays.
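A sketch of the Vector3-based angular velocity read and the Quaternion.Euler fallback mentioned in the first bug-fix item above. The method name shown (GetLocalControllerAngularVelocity) and the units of the returned rate vary between Utilities versions, so treat both as assumptions to verify against OVRInput.cs:

using UnityEngine;

public class AngularVelocityExample : MonoBehaviour
{
    void Update()
    {
        // New behavior: angular velocity as a Vector3 rate, which stays meaningful above 2*pi.
        Vector3 angularVelocity = OVRInput.GetLocalControllerAngularVelocity(OVRInput.Controller.RTouch);

        // Rough stand-in for the old Quaternion-returning behavior, as suggested in the note above.
        Quaternion asQuaternion = Quaternion.Euler(angularVelocity);

        Debug.Log("Angular velocity: " + angularVelocity + " as quaternion: " + asQuaternion);
    }
}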
Known Issues
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift builds
will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Unity has a known issue such that parenting one VR camera to another will compound tracking twice. As a
workaround, make them siblings in the GameObject hierarchy.
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Transparent VR Compositor Layers do not currently support multiple layers of occlusion.
• Gear VR
• Do not use Utilities 1.11.0 due to a crash when returning to focus from Universal Menu or Quit to Home
dialog.
• Due to a Unity bug, the Camera pose can be corrupted by scripts in the first frame after being enabled
with VR support. As a workaround, use Utilities 1.12 or zero out the eye anchor poses when a new
OVRCameraRig is spawned and the first frame after usePerEyeAnchors changes.
• With Unity 5.3, the world may appear tilted. As a workaround, use Utilities 1.12 or disable the virtual
reality splash image.
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5.3.4-5.3.6p3 and Unity 5.4.0b16-Unity 5.4.0p3: Do not set DSP Buffer Size to Best in Audio
Manager in the Inspector for now or you will encounter audio distortion. Set it to Good or Default
instead.
• Mobile App Submission to Oculus Store
• All mobile applications using Utilities 1.9 and 1.10 will fail Oculus Store submission due to a bug affecting
reserved interaction handling for the Universal Menu. Please remove previously-imported project files
as described in Importing the Oculus Utilities Package on page 11 and import Utilities version 1.12, and
update your Unity editor to a compatible version if necessary.
• When building a mobile application for submission to the Oculus Store, you must set Install Location to
Auto in addition to generating a custom manifest as described in Building Mobile Applications on page
17.
If you have previously imported a Unity integration package, you must delete all Oculus Integration content
before importing the new Unity package. For more information, see Importing the Oculus Utilities Package on
page 11.
New Features
• Added Performance Auditing Tool for Rift and mobile development. This tool verifies that your VR project
configuration and settings are consistent with our recommendations. For more information, see Performance
Auditing Tool (OVRLint) on page 104.
• Added OVRGrabber and OVRGrabbable scripts for Oculus Touch to the Room sample in Assets/Scenes/.
For details, see our Unity Reference Content on page 123.
• (Mobile only) Added off-center cubemap support to OVROverlay, allowing you to display an overlay as a
cubemap with a texture coordinate offset to increase resolution for areas of interest. For more information,
see OVROverlay in our Unity Reference Content on page 123.
API Changes
• Deprecated OVRProfile.
Bug Fixes
• Fixed bug affecting reserved interaction handling for the Universal Menu that caused all mobile applications
using Utilities 1.9 and 1.10 to fail Oculus Store submission.
Known Issues
• Adaptive Resolution is not currently working and should be disabled. If applications using Adaptive
Resolution reach 45 Hz, they will remain stuck at that frame rate until relaunched. A fix is planned for the Rift
1.12 runtime release and should not require any application changes.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift builds
will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Rift
• All Unity versions prior to 5.4.3p3 leak 5MB/s if you have a Canvas in your scene, enable Run In
Background, and dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to
disable your Canvases while the HMD is dismounted.
• Gear VR
• Due to a Unity bug, the Camera pose can be corrupted by scripts in the first frame after being enabled
with VR support. As a workaround, use Utilities 1.11 or zero out the eye anchor poses when a new
OVRCameraRig is spawned and the first frame after usePerEyeAnchors changes.
• With Unity 5.3, the world may appear tilted. As a workaround, use Utilities 1.10 or disable the virtual
reality splash image.
• All mobile applications using Utilities 1.9 and 1.10 will fail Oculus Store submission due to a bug affecting
reserved interaction handling for the Universal Menu. Please remove previously-imported project files
as described in Importing the Oculus Utilities Package on page 11 and import Utilities version 1.11, and
update your Unity editor to a compatible version if necessary.
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5.3.4-5.3.6p3 and Unity 5.4.0b16-Unity 5.4.0p3: Do not set DSP Buffer Size to Best in Audio
Manager in the Inspector for now or you will encounter audio distortion. Set it to Good or Default
instead.
If you have previously imported a Unity integration package, you must delete all Oculus Integration content
before importing the new Unity package. For more information, see Importing the Oculus Utilities Package on
page 11.
Utilities 1.10 adds an option to build in APK for submission to the Oculus Store in Tools > Oculus. It also
includes a fix for an issue that caused poor performance or freezing when using multiple VR cameras or VR
Compositor underlays with Gear VR. Any mobile application using either of these should update to this version.
New Features
• Added option to Tools > Oculus to build APK for submission to Oculus Store.
• Added Rift support for cubemap overlays to VR Compositor Layers.
Bug Fixes
• Fixed poor performance or freezing bug when using multiple VR cameras or VR Compositor underlays with
Gear VR.
• Fixed memory leak in OVROverlay.
• Fixed uncommon issue in which setting mirror to full screen caused app rendering to freeze in Rift.
Known Issues
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift builds
will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Rift
• All Unity versions leak 5MB/s if you have a Canvas in your scene, enable Run In Background, and
dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to disable your
Canvases while the HMD is dismounted; see the sketch after this list.
• Gear VR
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5.3.4-5.3.6p3 and Unity 5.4.0b16-Unity 5.4.0p3: Do not set DSP Buffer Size to Best in Audio
Manager in the Inspector for now or you will encounter audio distortion. Set it to Good or Default
instead.
• Touch
• For PCs using Oculus runtime 1.10, OVRInput.GetConnectedControllers() does not mark Touch
controllers as disconnected when batteries are removed, and the input mask returns Touch (Left+Right)
active when only one controller is on. This issue will resolve automatically when runtime 1.11 is released.
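A minimal sketch of the Canvas workaround described in the Rift item above, assuming a script that holds references to the Canvases you want hidden while the headset is dismounted:

using UnityEngine;

public class CanvasFocusGuard : MonoBehaviour
{
    public Canvas[] canvases;   // Canvases to disable while the Rift is dismounted.

    void Update()
    {
        // hasVrFocus is false while the HMD is dismounted or the app otherwise loses VR focus.
        bool focused = OVRManager.hasVrFocus;

        foreach (Canvas canvas in canvases)
        {
            if (canvas != null && canvas.enabled != focused)
                canvas.enabled = focused;
        }
    }
}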
If you have previously imported a Unity integration package, you must delete all Oculus Integration content
before importing the new Unity package. For more information, see Importing the Oculus Utilities Package on
page 11.
This release adds Gear VR touchpad and back button support to OVRInput.
Be sure to check out the new Mono Optimization sample included in v1.5.1 of the Unity Sample Framework on
page 115. Monoscopically rendering distant content in a scene can offer significant rendering performance
improvements.
New Features
• Added Gear VR Touchpad and back button support to OVRInput.
• OVRInput.Controller.Active now automatically switches away from Touch if the user is not holding it.
• OVRBoundary now supports more than 256 Guardian System bounds points.
• Improved image quality for higher values in VRSettings.renderScale due to mipmapping. (Rift)
API Changes
• OVRInput.Button and OVRInput.RawButton events now report Gear VR touchpad swipes and back button
presses.
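A sketch of polling the new touchpad and back button input. The tap and swipe mappings shown here (Button.One for a tap, the Dpad buttons for swipes, on the Touchpad controller) are assumptions to verify against the OVRInput mapping tables for your version:

using UnityEngine;

public class GearVrTouchpadExample : MonoBehaviour
{
    void Update()
    {
        // Back button short-press.
        if (OVRInput.GetDown(OVRInput.Button.Back))
            Debug.Log("Back button pressed");

        // Touchpad tap and swipe-up gestures.
        if (OVRInput.GetDown(OVRInput.Button.One, OVRInput.Controller.Touchpad))
            Debug.Log("Touchpad tap");
        if (OVRInput.GetDown(OVRInput.Button.DpadUp, OVRInput.Controller.Touchpad))
            Debug.Log("Swipe up");
    }
}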
Bug Fixes
• Unity 5.3.6p8, 5.4.2p2, and 5.5.0b9 correct a failure to report shoulder button events in OVRInput when
used with Utilities 1.9.0.
• Fixed dependency on the Visual C++ Redistributable for Visual Studio 2015 causing Rift builds to fail to run
in VR in some versions of Unity (see Known Issues for more information).
• Fixed 5MB/s memory leak when using OVROverlay.
Known Issues
• OVRInput fails to report shoulder button events when Utilities 1.9.0 is used with Unity versions 5.4.2p1 and
5.5.0b8 or earlier.
• The following versions of Unity require the Visual C++ Redistributable for Visual Studio 2015 or Rift builds
will fail to run in VR, and the error “Security error. This plugin is invalid!” will be reported in output_log.txt:
• 5.3.6p3-5.3.6p7
• 5.4.0f1-5.4.2p1
• 5.5.0b1-5.5.0b8
• Rift
• All Unity versions leak 5MB/s if you have a Canvas in your scene, enable Run In Background, and
dismount the Rift. You can check OVRManager.hasVrFocus in an Update function to disable your
Canvases while the HMD is dismounted.
• Gear VR
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5 automatically generates manifest files with Android builds that will cause them to be
automatically rejected by the Oculus Store submission portal. If this is blocking your application
submission, please let us know on our Developer forum and we will work with you on a temporary
workaround.
• Unity 5.3.4-5.3.6p3 and Unity 5.4.0b16-Unity 5.4.0p3: Do not set DSP Buffer Size to Best in Audio
Manager in the Inspector for now or you will encounter audio distortion. Set it to Good or Default
instead.
Mobile SDK Examples has been deprecated; use the Unity Sample Framework on page 115 instead.
Note: Due to issues with earlier releases, we now recommend all developers update to 5.3.6p5 or
version 5.4.1p1 or later.
New Features
• Added support for the Oculus Guardian System, which visualizes the bounds of a user-defined Play Area.
Note that it is currently unsupported by public versions of the Oculus runtime. See OVRBoundary Guardian
System API on page 63 for more information.
• Added underlay support allowing VR compositor layers to be rendered behind the eye buffer.
• Added support for stereoscopic cubemap VR compositor layers (mobile only).
API Changes
• Added OVRBoundary API for interacting with the Oculus Guardian System.
• Removed OVRTrackerBounds.
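A brief sketch of querying the Guardian System through the new API, assuming OVRBoundary is reached through OVRManager.boundary as in later Utilities releases (verify the member names against OVRBoundary.cs):

using UnityEngine;

public class GuardianExample : MonoBehaviour
{
    void Start()
    {
        // Bail out if the user has not set up a Guardian boundary.
        if (!OVRManager.boundary.GetConfigured())
            return;

        // Size (x = width, z = depth, in meters) and outline points of the user-defined Play Area.
        Vector3 dimensions = OVRManager.boundary.GetDimensions(OVRBoundary.BoundaryType.PlayArea);
        Vector3[] points = OVRManager.boundary.GetGeometry(OVRBoundary.BoundaryType.PlayArea);

        Debug.Log("Play Area: " + dimensions + " (" + points.Length + " boundary points)");
    }
}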
Bug Fixes
• Fixed black screen issue related to unplugging the HDMI cable and re-plugging it back in.
• Fixed Touch judder issue.
Known Issues
• Due to issues with earlier releases, we now recommend all developers update to 5.3.6p5 or version 5.4.1p1
or later. 5.3.6p3-5.3.6p4 are also known to work.
• Gear VR
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5 automatically generates manifest files with Android builds that will cause them to be
automatically rejected by the Oculus Store submission portal. If this is blocking your application
submission, please let us know on our Developer Forum and we will work with you on a temporary
workaround.
• Gear VR developers using Unity 5.3.4 or later, or using Unity 5.4.0b16 and later: Do not set DSP Buffer
Size to Best in Audio Manager in the Inspector for now or you will encounter audio distortion. Set it to
Good or Default instead.
If you have previously imported a Unity integration package, you must delete all Oculus Integration content
before importing the new Unity package. For more information, see Importing the Oculus Utilities Package on
page 11.
Note: Unity versions prior to 5.3.4p5 (or 5.3.3p3 + OVRPlugin 1.3) are no longer supported. In addition,
Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
New Features
• Updated OVROverlay.cs to support cubemap (skybox) and hemicylinder overlay shapes (mobile only) in
addition to the existing quadrilateral Game Object shape.
• Added OVRRTOverlayConnector to stream Render Texture contents to an OVROverlay. For more
information, see VR Compositor Layers on page 69.
• Added runtime support for Adaptive Resolution (see description in OVRManager).
API Changes
• Unity versions prior to 5.3.4p5 (or 5.3.3p3 + OVRPlugin 1.3) are no longer supported.
Bug Fixes
• Fixed backwards head-neck model z translation on Gear VR.
• When targeting Oculus Touch, OVRInput.SetControllerVibration calls are now limited to 30 per second
due to performance issues; additional calls are discarded. (Note: we recommend using OVRHaptics
to control Touch vibrations; it provides better haptics quality without the performance issues of
OVRInput.SetControllerVibration(). A sketch follows this list.)
• Fixed crash in CreateDirect3D11SurfaceFromDXGISurface after eye buffer re-allocation on Rift.
• Fixed OVRInput Xbox controller detection with Unity on Windows 10 Anniversary Edition.
• Fixed delay when loading with “Run in Background” enabled.
• Fixed missing runtime support for adaptive viewport scaling.
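A rough sketch of driving Touch vibration through OVRHaptics instead of SetControllerVibration, as recommended in the note above. It assumes OVRHapticsClip can be built from an AudioClip and that per-hand channels are exposed as OVRHaptics.LeftChannel/RightChannel; confirm both against OVRHaptics.cs and OVRHapticsClip.cs:

using UnityEngine;

public class TouchHapticsExample : MonoBehaviour
{
    public AudioClip rumbleClip;        // Short clip used as the vibration waveform.
    private OVRHapticsClip hapticsClip;

    void Start()
    {
        hapticsClip = new OVRHapticsClip(rumbleClip);
    }

    void Update()
    {
        // Play the clip on the right Touch controller when its index trigger is pulled.
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
            OVRHaptics.RightChannel.Preempt(hapticsClip);
    }
}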
Known Issues
• Gear VR
• Mobile developers should not use Unity versions 5.3.6p1-2 and 5.4.0p1-2 due to incorrect positional
movement of the head.
• Unity 5 automatically generates manifest files with Android builds that will cause them to be
automatically rejected by the Oculus Store submission portal. If this is blocking your application
submission, please let us know on our Developer Forum and we will work with you on a temporary
workaround.
• Gear VR developers using Unity 5.3.4 or later, or using Unity 5.4.0b16 and later: Do not set DSP Buffer
Size to Best in Audio Manager in the Inspector for now or you will encounter audio distortion. Set it to
Good or Default instead.
New Features
• Added Adaptive Resolution to OVRManager, which automatically scales down app resolution when GPU
utilization exceeds 85%. See “OVRManager” in Unity Components for details. (Rift only, requires Unity v 5.4
or later)
• The OVR Screenshot Wizard size parameter is now a freeform field instead of a dropdown selection, for greater flexibility.
• Added recommended anti-aliasing level to help applications choose the right balance between performance
and quality.
• Added support for more than one simultaneous OVROverlay. Now apps can show up to 3 overlay quads on
Gear VR and 15 on Rift.
API Changes
• Added OVRHaptics.cs and OVRHapticsClip.cs to programmatically control haptics for Oculus Touch
controller. See OVRHaptics for Oculus Touch on page 64 for more information.
• Added public members Enable Adaptive Resolution, Max Render Scale, and Min Render Scale to
OVRManager.
• Added OVRManager.useRecommendedMSAALevel to enable auto-selection of anti-aliasing level based on
device performance.
• Added OVRManager.useIPDInPositionTracking to allow apps to separately disable head position tracking
(see OVRManager.usePositionTracking) and stereopsis.
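A sketch of toggling these new members from a script. The field names below (enableAdaptiveResolution, minRenderScale, maxRenderScale) are assumed to correspond to the inspector labels listed above; check OVRManager.cs for the exact spellings:

using UnityEngine;

public class RenderScaleTuning : MonoBehaviour
{
    void Start()
    {
        OVRManager manager = OVRManager.instance;

        // Let the app drop resolution automatically under GPU load (Rift, Unity 5.4 or later).
        manager.enableAdaptiveResolution = true;
        manager.minRenderScale = 0.7f;
        manager.maxRenderScale = 1.0f;

        // Pick an anti-aliasing level appropriate for the device.
        manager.useRecommendedMSAALevel = true;

        // Keep stereo separation even if head position tracking is disabled.
        manager.useIPDInPositionTracking = true;
    }
}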
Bug Fixes
• Fixed bug preventing power save from activating on Gear VR.
• Fixed counter-intuitive behavior where disabling OVRManager.usePositionTracking prevented proper eye
separation by freezing the eye camera positions at their original offset.
Known Issues
• Gear VR
• Unity 5 automatically generates manifest files with Android builds that will cause them to be
automatically rejected by the Oculus Store submission portal. If this is blocking your application
submission, please let us know on our Developer Forum and we will work with you on a temporary
workaround.
• Gear VR developers using Unity 5.3.4 or later, or using Unity 5.4.0b16 and later: Do not set DSP Buffer
Size to Best in Audio Manager in the Inspector for now or you will encounter audio distortion. Set it to
Good or Default instead.
New Features
• Added the OVR Screenshot and OVR Capture Probe tools, which export 360 screenshots of game scenes in
cubemap format. See Cubemap Screenshots on page 96 for more information.
• Switched to built-in volume indicator on mobile.
• Exposed OVRManager.vsyncCount to allow half or third-frame rate rendering on mobile.
• Added bool OVRManager.instance.isPowerSavingActive (Gear VR).
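A small sketch of the mobile frame-rate and power-save members listed above, written with the member placement used in the notes (static vsyncCount, instance isPowerSavingActive); both placements are assumptions to verify against OVRManager.cs:

using UnityEngine;

public class MobileFrameRateExample : MonoBehaviour
{
    void Start()
    {
        // Render every 2nd vsync (30 fps on a 60 Hz display); use 3 for third-rate rendering.
        OVRManager.vsyncCount = 2;
    }

    void Update()
    {
        if (OVRManager.instance.isPowerSavingActive)
            Debug.Log("Power saving is active; consider reducing scene complexity.");
    }
}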
Bug Fixes
• Repeatedly changing resolution or MSAA level no longer causes slowdown or crashing.
• Fixed scale of OVRManager.batteryLevel and OVRManager.batteryTemperature.
• Fixed race condition leading to black screens on Rift in some CPU-heavy cases.
• Fixed memory bloat due to unpooled buffers when using MSAA.
Known Issues
• Gear VR developers using Unity 5.3.4 or later, or using Unity 5.4.0b16 and later: Do not set DSP Buffer
Size to Best in Audio Manager in the Inspector for now or you will encounter audio distortion. Set it to
Good or Default instead.
New Features
• OVRInput may now be used without an OVRManager instance in the scene.
API Changes
• Restored OVRVolumeControl.
Bug Fixes
• OVRManager.instance.usePositionTracking now toggles the head model on Gear VR.
• Fixed incorrect fog interaction with transparent UI shader.
• Fixed crash on start with Unity 5.4.0b14 and b15 on Gear VR.
• Restored OVRVolumeControl, which was accidentally removed in 1.3.0.
Known Issues
• Utilities 1.3.0: Volume control will be missing on mobile applications until the release of Mobile SDK 1.0.2.
OVRVolumeControl is available with Utilities v 0.1.3.0 and earlier. It was also restored in Utilities v 1.3.2.
Note: Floor-level tracking will often be used with standing experiences, but there may be situations in
which eye-level tracking is a better fit for a standing experience, or floor-level tracking is a better fit for a
seated experience.
With floor-level tracking, any Unity application should now be able to pull the correct height information.
New Features
• Added support for PC SDK 1.3, including support for Rift consumer version hardware.
• Added support for Asynchronous TimeWarp and Phase Sync.
• Added Rift Remote controller support.
• Added application lifecycle management, including VR-initiated backgrounding and exit.
• Exposed proximity sensor.
• Added support for multiple trackers.
• Exposed velocity and acceleration for all tracked nodes.
• Added support for EyeLevel and FloorLevel tracking origins.
• Audio input and output now automatically use the Rift microphone and headphones (if enabled in the
Oculus app).
• Rift’s inter-axial distance slider now affects the distance between Unity’s eye poses.
• Splash screen now uses Asynchronous TimeWarp for smooth tracking during load.
• Added experimental D3D 12 rendering support.
• Added events for focus change.
• Added events for audio device changes requiring sound restart.
• Added ControllerTracked state.
• Reduced head tracking latency on Gear VR by updating on the render thread.
• Improved performance by reducing lock contention.
• Exposed power management (CPU and GPU levels) on Android.
• Exposed queue-ahead on Android to trade latency for CPU-GPU parallelism.
API Changes
• OVRTracker.GetPose() no longer takes a prediction time. It now takes an optional index specifying the
tracker whose pose you want.
• OVRTracker.frustum has been replaced by OVRTracker.GetFrustum(), which takes an optional
index specifying the tracker whose frustum you want.
• OVRManager.isUserPresent is true when the proximity sensor detects the user.
• OVRInput.GetControllerLocal[Angular]Velocity/Acceleration exposes the linear and angular
velocity and rotation of each Touch controller.
• OVRDisplay.velocity exposes the head’s linear velocity.
• OVRDisplay.angularAcceleration exposes the head’s angular acceleration.
• Removed OVRGamepadController.cs and OVRInputControl.cs scripts, which have been replaced by the new
OVRInput.cs script. Refer to OVRInput for more information.
• Added public member Tracking Origin Type to OVR Manager.
• Added “floor level” reference frame for apps that need accurate floor height.
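A sketch combining the new tracking origin with the per-tracker queries described in this list. It assumes the Tracking Origin Type member is exposed as trackingOriginType and that OVRTracker is reached through OVRManager.tracker; verify both against OVRManager.cs and OVRTracker.cs:

using UnityEngine;

public class TrackingSetupExample : MonoBehaviour
{
    void Start()
    {
        // Use the floor as the tracking origin so user height comes from the real world.
        OVRManager.instance.trackingOriginType = OVRManager.TrackingOrigin.FloorLevel;

        // Query the pose and frustum of the first tracker (the index is optional).
        OVRPose pose = OVRManager.tracker.GetPose(0);
        OVRTracker.Frustum frustum = OVRManager.tracker.GetFrustum(0);

        Debug.Log("Tracker 0 position: " + pose.position + ", FOV: " + frustum.fov);
    }
}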
Bug Fixes
• Removed redundant axial deadzone handling from Xbox gamepad input.
• Fixed OVRManager.monoscopic to display left eye buffer to both eyes and use center eye pose.
• Application lifetime management now works, even without the Utilities.
• Fixed crash when running VR apps with Rift disconnected.
• OVRManager.isUserPresent now correctly reports proximity sensor output.
• OVRManager.isHMDPresent now correctly reports Gear VR docking state.
• We now prevent AFR SLI on NVIDIA hardware to avoid black screens/flicker.
• Fixed drift between TimeWarp start matrix and view matrix.
• Fixed crash when main monitor is on one adapter and Rift is on another.
• Fixed crash on Mac due to OVRPlugin being uninitialized before first access.
• Increased dead zone on OVRInput stick input to prevent drift.
• Fixed handedness issue with angular head velocity on Rift.
• Fixed handedness issue with rotation of OVROverlay quads.
• Fixed crash in OVROverlay when using D3D 12 and compressed textures.
• Fixed crashes due to thread synchronization checks on Gear VR.
• Fixed loss of input handling while paused.
• Fixed artifact in which bars appeared around mirror image when using occlusion mesh.
• Fixed Android logcat spam about OVR_TW_SetDebugMode.
• Gear VR logs now report the VRAPI loader version, not the SystemActivities version.
• The MSVC runtime is now statically linked to avoid missing DLL dependencies.
• Fixed black screen when the HMD was reconnected by notifying Unity that the display was lost.
Known Issues
• Volume control will be missing on mobile applications until the release of Mobile SDK 1.0.2. To restore
OVRVolumeControl, please use an older copy of the Utilities.
You must download and install OVRPlugin from our website if you are using the following Unity versions:
After you have downloaded and installed Unity, take these steps to install OVRPlugin:
1. Close the Unity Editor if it is currently running.
2. Navigate to C:\Program Files\Unity\Editor\Data\VR\oculus.
3. Delete all contents of the directory.
4. Extract the OVRPlugin zip, open the folder 5.3\oculus, and copy all of its contents into C:\Program Files\Unity\Editor\Data\VR\oculus.
Note: Do not install OVRPlugin version 1.3.2 with any version of Unity 5.3 prior to 5.3.3p3 or it will not
work properly.
After you have downloaded and installed Unity, take these steps to install OVRPlugin:
Note: Do not install OVRPlugin version 1.3.2 with any version of Unity 5.4 prior to 5.4.0b11, or it will not
work properly.
To use Unity 5.3.4p1 with the Oculus Rift or Samsung Gear VR, you must download and install our OVRPlugin
for Unity 1.3.0, available from our Downloads Page.
After you have downloaded and installed Unity 5.3.4p1, take these steps to install OVRPlugin:
Note: Do not install OVRPlugin version 1.3.0 with any version of Unity other than 5.3.4p1, or it will not
work properly.
This document provides an overview of new features, improvements, and fixes included in the latest version of
the Utilities for Unity 5.x. For information on first-party changes to Unity VR support for Oculus, see the Unity
Release Notes for the appropriate version.
Utilities for Unity 0.1.3 extends OVRInput support to mobile. OVRInputControl and OVRGamepadController are
now deprecated and will be removed in a future release.
Mobile input bindings are now automatically added to InputManager.asset if they do not already exist - it is
no longer required to replace InputManager.asset with Oculus’ version. However, this asset is still provided for
now to maintain legacy support for the deprecated OVRGamepadController and OVRInputControl scripts.
New Features
• Default mobile input bindings are now programmatically generated when projects are imported if they do
not already exist.
• Replacing InputManager.asset is no longer required to enable gamepad support on mobile.
API Changes
• Added mobile support to OVRInput.
• Deprecated OVRInputControl and OVRGamepadController.
Bug Fixes
• Fixed mobile gamepad thumbstick deadzone/drift handling and axis scaling.
• Fixed mobile gamepad support when multiple gamepads are paired.
• Fixed mobile gamepad bindings for triggers, D-pad, thumbstick presses, etc.
New Features
• Redesigned input API for Oculus Touch controllers and Xbox gamepads.
• Added h264 hardware-decoder plugin for Gear VR.
• Added “face-locked” layer support to OVROverlay when parented to the camera.
• Reduced latency in the pose used by the main thread for raycasting, etc.
• Updated to PC SDK 0.7 and Mobile SDK 0.6.2.0.
• Enabled VRSettings.renderScale on Gear VR.
API Changes
• The Utilities package now requires Unity 5.1 or higher.
• Added OVRInput API alpha. Refer to documentation for usage.
• Exposed LeftHand/RightHand anchors for tracked controllers in OVRCameraRig.
Bug Fixes
• Restored ability to toggle settings such as monoscopic rendering and position tracking.
• HSWDismissed event is now correctly raised when the HSW is dismissed.
• Fixed handedness of reported velocity and acceleration values.
• OVRPlayerController now moves at a consistent speed regardless of scale.
Known Issues
• Tearing in OS X: Editor preview and standalone players do not vsync properly, resulting in a vertical tear
and/or judder on DK2.
• When switching between a mobile application and System Activities screen, the back button becomes stuck
in the "down" state. For more information and workarounds, please see Troubleshooting and Known Issues.
Overview
This is the initial release of Oculus Utilities for Unity, for use with Unity versions 5.1.2 and later. The Utilities
extend Unity's built-in virtual reality support with the following features:
The Oculus Utilities for Unity expose largely the same API as the Oculus Unity Integration, but they offer all the
benefits of Unity's built-in VR support:
• Improved rendering efficiency with less redundant work performed for each eye.
• Seamless integration with the Unity Editor, including in-Editor preview and direct mode support.
• Improved stability and tighter integration with features like anti-aliasing and shadows.
• Non-distorted monoscopic preview on the main monitor.
• Oculus SDK 0.6.0.1 support (PC and mobile).
Known Issues
• Pitch, roll, and translation are off for the tracking reference frame in Unity 5.1.1, especially in apps with
multiple scenes.
• Mac OS X tearing. VSync is currently broken on the Mac, but works when you build for Gear VR.
• Performance loss. CPU utilization may be slightly higher than in previous versions of Unity.
• OVRPlayerController might end up in an unexpected rotation after OVRDisplay.RecenterPose() is called. To
fix it, call RecenterPose() again.
New Features
• Disabled eye texture anti-aliasing when using deferred rendering. This fixes the black screen issue.
• Eliminated the need for the DirectToRift.exe in Unity 4.6.3p2 and later.
• Removed the hard dependency on the Oculus runtime. Apps now render in mono without tracking when
VR isn't present.
As with Mobile SDK v 0.5.0, Unity developers using this SDK version must install the Oculus Runtime for
Windows or OS X. This requirement will be addressed in a future release of the SDK.
Bug Fixes
• Reworked System Activities event handling to prevent per-frame allocations that could trigger the Garbage
Collector.
Known Issues
• For use with the Mobile SDK, we recommend Unity version 4.6.3. The Mobile SDK is compatible with Unity
5.0.1p2, which addresses a problem with OpenGL ES 3.0, but there is still a known Android ION memory
leak. Please check back for updates.
New Features
Bug Fixes
• Health and Safety Warning no longer displays in editor Play Mode if a DK2 is not attached.
Known Issues
• For use with the Mobile SDK, we recommend Unity version 4.6.3, which includes Android 5.0 - Lollipop
support as well as important Android bug fixes. While the Mobile SDK is compatible with Unity 5.0.0p2
and higher, several issues are still known to exist, including an Android ION memory leak and compatibility
issues with OpenGL ES 3.0. Please check back for updates.
VrPlatform entitlement checking is now disabled by default in Unity; handling for native development is
unchanged. If your application requires this feature, please refer to the Mobile SDK Documentation for
information on how to enable entitlement checking.
New Features
• Synced with the Oculus PC SDK 0.5.0.1 Beta.
• VrPlatform entitlement checking is now disabled by default.
Bug Fixes
• Health and Safety Warning no longer displays in editor Play Mode if a DK2 is not attached.
Known Issues
• For use with the Mobile SDK, we recommend Unity version 4.6.3, which includes Android 5.0 - Lollipop
support as well as important Android bug fixes. While the Mobile SDK is compatible with Unity 5.0.0p2
and higher, several issues are still known to exist, including an Android ION memory leak and compatibility
issues with OpenGL ES 3.0. Please check back for updates.
New Features
• New Mobile Unity Integration Based on Oculus PC SDK 0.4.4
We would like to highlight the inclusion of the new Mobile Unity Integration with full DK2 support based on
the Oculus PC SDK 0.4.4. As this is a significant API refactor, please refer to the Unity Development Guide:
Migrating From Earlier Versions section for information on how to upgrade projects built with previous versions
of the Mobile Unity Integration.
API Changes
• Fix for camera height discrepancies between the Editor and Gear VR device.
• Moonlight Debug Util class names now prefixed with OVR to prevent namespace pollution.
• Provide callback for configuring VR Mode Parms on OVRCameraController; see OVRModeParms.cs for an
example.
New Features
• Added Unity Free support for Gear VR developers.
Bug Fixes
• Unity vignette rendering updated to match native (slightly increases effective FOV).
• Unity volume pop-up distance updated to match native.
API Changes
The following are changes to Unity components:
Table 5: Events
Behavior Changes
• OVRCameraRig’s position is always the initial center eye position.
• Eye anchor Transforms are tracked in OVRCameraRig’s local space.
• OVRPlayerController’s position is always at the user’s feet.
• IPD and FOV are fully determined by profile (PC only).
• Layered rendering: multiple OVRCameraRigs are fully supported (not advised for mobile).
• OVRCameraRig.*EyeAnchor Transforms give the relevant poses.
Upgrade Procedure
To upgrade, follow these steps:
1. Ensure you didn’t modify the structure of the OVRCameraController prefab. If your eye cameras are on
Game Objects named “CameraLeft” and “CameraRight” which are children of the OVRCameraController
Game Object (the default), then the prefab should cleanly upgrade to OVRCameraRig and continue to work
properly with the new integration.
2. Write down or take a screenshot of your settings from the inspectors for OVRCameraController,
OVRPlayerController, and OVRDevice. You will have to re-apply them later.
3. Remove the old integration by deleting the following from your project:
• OVR folder
• OVR Internal folder (if applicable)
• Any file in the Plugins folder with “Oculus” or “OVR” in the name
• Android-specific assets in the Plugins/Android folder, including: vrlib.jar, libOculusPlugin.so, res/raw and
res/values folders
4. Import the new integration.
5. Click Assets -> Import Package -> Custom Package…
6. Open OculusUnityIntegration.unitypackage
7. Click Import All.
8. Fix any compiler errors in your scripts. Refer to the API changes described above. Note that the substitution
of prefabs does not take place until after all script compile errors have been fixed.
9. Re-apply your previous settings to OVRCameraRig, OVRPlayerController, and OVRManager. Note that the
runtime camera positions have been adjusted to better match the camera positions set in the Unity editor. If
this is undesired, you can get back to the previous positions by adding a small offset to your camera:
eye-height (which has changed from 1.85 to 1.675); or (2) uncheck Use Profile Data on the converted
OVRPlayerController and then manually set the height of the OVRCameraRig to 1.85 by setting its y-
position. Note that if you decide to go with (1), then this height should be expected to change when
profile customization is added with a later release.
c. If you previously used an OVRPlayerController with Use Player Eye Height unchecked
on its OVRCameraController, then be sure to uncheck Use Profile Data on your converted
OVRPlayerController. Then, add 0.15 to the y-position of the converted OVRCameraController.
b. Adjust the camera's x/z-position. If you previously used an OVRCameraController without an
OVRPlayerController, add 0.09 to the camera z-position relative to its y rotation (i.e. +0.09 to z if it has
0 y-rotation, -0.09 to z if it has 180 y-rotation, +0.09 to x if it has 90 y-rotation, -0.09 to x if it has 270 y-
rotation). If you previously used an OVRPlayerController, no action is needed.
10. Restart Unity.
----------------------------------------------------------------------
if ( cameraController.GetCameraForward( ref cameraForward ) &&
cameraController.GetCameraPosition( ref cameraPosition ) )
{
...
to
if (OVRManager.display.isPresent)
{
OVRDevice.ResetOrientation();
to
OVRManager.display.RecenterPose();
----------------------------------------------------------------------
cameraController.ReturnToLauncher();
to
OVRManager.instance.ReturnToLauncher();
----------------------------------------------------------------------
OVRDevice.GetBatteryTemperature();
OVRDevice.GetBatteryLevel();
to
OVRManager.batteryTemperature
OVRManager.batteryLevel
----------------------------------------------------------------------
OrientationOffset
----------------------------------------------------------------------
FollowOrientation
----------------------------------------------------------------------
The Sample Framework is also available from the Unity Asset Store.
Integration Changes
• The Oculus Unity packaging structure has changed. When upgrading to 1.25 we recommend deleting your
old copy of the Sample Framework and restarting Unity, then adding the 1.25 package.
Known Issues
• Sample Framework Android builds use a custom manifest and are not visible from Applications, and cannot
be launched from Oculus Home or the Android Application Launcher. To launch:
It is compatible with Unity 5.4 and later - please check Compatibility and Version Requirements on page 7 for
up-to-date Unity version recommendations.
The Sample Framework is also available from the Unity Asset Store.
Bug Fixes
• Fixed compatibility issue with the Avatars SDK and certain versions of the Unity Editor (Unity 2017.2 and up).
• Updates to the Android Manifest settings to support the latest mobile APIs.
Known Issues
• In versions earlier than 1.18.1, layers of the UI Overlay sample may remain visible after exiting the scene with
the Sample Framework UI.
• Building the Sample Framework project for mobile using Unity 5.6 creates an APK that immediately crashes.
• Sample Framework Android builds use a custom manifest and are not visible from Applications, and cannot
be launched from Oculus Home or the Android Application Launcher. To launch:
The Sample Framework is also available from the Unity Asset Store.
Bug Fixes
• Sample Framework 1.23 fixes an issue that caused blurry world geometry textures.
Known Issues
• In versions earlier than 1.18.1, layers of the UI Overlay sample may remain visible after exiting the scene with
the Sample Framework UI.
• Building the Sample Framework project for mobile using Unity 5.6 creates an APK that immediately crashes.
• Sample Framework Android builds use a custom manifest and are not visible from Applications, and cannot
be launched from Oculus Home or the Android Application Launcher. To launch:
6. Close Settings.
7. Open Apps.
8. Select Gear VR Service.
9. Select Oculus Sample Framework to launch.
The Sample Framework is also available from the Unity Asset Store.
New Features
• Added a new sample that demonstrates a variety of teleport and locomotion behaviors using an architecture
that can be extended to fit various designs. This sample supports switching between any combination of
teleports and linear motion at run time.
Known Issues
• In versions earlier than 1.18.1, layers of the UI Overlay sample may remain visible after exiting the scene with
the Sample Framework UI.
• Building the Sample Framework project for mobile using Unity 5.6 creates an APK that immediately crashes.
• Sample Framework Android builds use a custom manifest and are not visible from Applications, and cannot
be launched from Oculus Home or the Android Application Launcher. To launch:
1.20.1
This version of the Oculus Sample Framework for Unity pulls in the most recent Avatar SDK and Utilities for
Unity versions. Importing the Avatar SDK and Utilities for Unity separately is not required. It is compatible with
Unity 5.4 and later - please check Compatibility and Version Requirements on page 7 for up-to-date Unity
version recommendations.
The Sample Framework is also available from the Unity Asset Store. VR builds of the Sample Framework are
available for Rift and Gear VR from the Gallery section of the Oculus Store.
Bug Fixes
• Fixed Avatar SDK shader issue resulting in very long import time.
Known Issues
• The mobile Sample Framework APK available through the Gallery is currently out of date. For now, we
recommend using the Unity project.
• In versions earlier than 1.18.1, layers of the UI Overlay sample may remain visible after exiting the scene with
the Sample Framework UI.
• Building the Sample Framework project for mobile using Unity 5.6 creates an APK that immediately crashes.
• Sample Framework Android builds use a custom manifest and are not visible from Applications, and cannot
be launched from Oculus Home or the Android Application Launcher. To launch:
1.20.0
This version of the Oculus Sample Framework for Unity pulls in the most recent Avatar SDK and Utilities for
Unity versions. Importing the Avatar SDK and Utilities for Unity separately is not required. It is compatible with
Unity 5.4 and later - please check Compatibility and Version Requirements on page 7 for up-to-date Unity
version recommendations.
The Sample Framework is also available from the Unity Asset Store. VR builds of the Sample Framework are
available for Rift and Gear VR from the Gallery section of the Oculus Store.
Bug Fixes
• Fixed mobile black screen with MTRendering.
• Fixed auto install location issue.
Known Issues
• The mobile Sample Framework APK available through the Gallery is currently out of date. For now, we
recommend using the Unity project.
• In versions earlier than 1.18.1, layers of the UI Overlay sample may remain visible after exiting the scene with
the Sample Framework UI.
• Building the Sample Framework project for mobile using Unity 5.6 creates an APK that immediately crashes.
• Sample Framework Android builds use a custom manifest and are not visible from Applications, and cannot
be launched from Oculus Home or the Android Application Launcher. To launch:
6. Close Settings.
7. Open Apps.
8. Select Gear VR Service.
9. Select Oculus Sample Framework to launch.
1.19.0
This version of the Oculus Sample Framework for Unity pulls in the most recent Avatar SDK and Utilities for
Unity versions. Importing the Avatar SDK and Utilities for Unity separately is not required. It is compatible with
Unity 5.4 and later - please check Compatibility and Version Requirements on page 7 for up-to-date Unity
version recommendations.
The Sample Framework is also available from the Unity Asset Store. VR builds of the Sample Framework are
available for Rift and Gear VR from the Gallery section of the Oculus Store.
New Features
• Renamed “Input Focus System Overlay” sample to “Input Focus.” Removed Is System Overlay
Present? handling. That flag was removed in our Unity Integration 1.19.
Known Issues
• The mobile Sample Framework APK available through the Gallery is currently out of date. For now, we
recommend using the Unity project.
• In versions earlier than 1.18.1, layers of the UI Overlay sample may remain visible after exiting the scene with
the Sample Framework UI.
• Building the Sample Framework project for mobile using Unity 5.6 creates an APK that immediately crashes.
• Sample Framework Android builds use a custom manifest and are not visible from Applications, and cannot
be launched from Oculus Home or the Android Application Launcher. To launch:
1.18.1
This version of the Oculus Sample Framework for Unity pulls in the most recent Avatar SDK and Utilities for
Unity versions. Importing the Avatar SDK and Utilities for Unity separately is not required. It is compatible with
Unity 5.4 and later - please check Compatibility and Version Requirements on page 7 for up-to-date Unity
version recommendations.
The Sample Framework is also available from the Unity Asset Store. VR builds of the Sample Framework are
available for Rift and Gear VR from the Gallery section of the Oculus Store.
Bug Fixes
• Fixed bug causing UI Overlay sample layers to remain visible after exiting the scene with the Sample
Framework UI.
1.18.0
New Features
• Added the Distance Grab sample illustrating how to select and grab distant objects using Touch controllers.
For more information, see Distance Grab sample now available in Oculus Unity Sample Framework on our
Developer Blog.
• Added Guardian Boundary System sample illustrating use of the OVRBoundary API to interact with the
Guardian System.
• Added Input Focus System Overlay sample to illustrate Input Focus and System Overlay handling. A simple
application is paused and muted when it loses input focus, and tracked controllers are hidden when a menu
VR Compositor Layer is displayed.
Bug Fixes
• Fixed missing quad layer in OverlayUIDemo.
Known Issues
• The mobile Sample Framework APK available through the Gallery is currently out of date. For now, we
recommend focusing on the Unity project.
• Layers of the UI Overlay sample may remain visible after exiting the scene with the Sample Framework UI.
Fixed in 1.18.1.
• Building the Sample Framework project for mobile using Unity 5.6 creates an APK that immediately crashes.
• Sample Framework Android builds use a custom manifest and are not visible from Applications, and cannot
be launched from Oculus Home or the Android Application Launcher. To launch:
1.16.0-beta
This version of the Oculus Sample Framework for Unity 5 pulls in the most recent Avatar SDK and Utilities for
Unity versions. Importing the Avatar SDK and Utilities for Unity 5 separately is not required. It is compatible with
Unity 5.4 and up - please check Compatibility and Version Requirements on page 7 for up-to-date Unity version
recommendations.
For complete instructions on downloading and using the Sample Framework, see Unity Sample Framework on
page 115 in our developer documentation.
New Features
Known Issues
• Building the Sample Framework project for mobile using Unity 5.6 creates an APK that immediately crashes.
The APK is available through the Gallery section of the Oculus Store.
• Sample Framework Android builds use a custom manifest and are not visible from Applications, and cannot
be launched from Oculus Home or the Android Application Launcher. To launch:
1.14.0
This version of the Oculus Sample Framework for Unity 5 pulls in the most recent Avatar SDK and Utilities for
Unity versions. Importing the Avatar SDK and Utilities for Unity 5 separately is not required. It is compatible with
Unity 5.4 and up - please check Compatibility and Version Requirements on page 7 for up-to-date Unity version
recommendations.
For complete instructions on downloading and using the Sample Framework, see Unity Sample Framework on
page 115 in our developer documentation.
New Features
• Added Gear VR Controller menu navigation support for mobile binary and project.
Bug Fixes
• Fixed Gear VR touchpad navigation issues.
Known Issues
• Building the Sample Framework project for mobile using Unity 5.6 creates an APK that immediately crashes.
The APK is available through the Gallery section of the Oculus Store.
• Sample Framework Android builds use a custom manifest and are not visible from Applications, and cannot
be launched from Oculus Home or the Android Application Launcher. To launch:
1.12.0
This version of the Oculus Sample Framework for Unity 5 pulls in the most recent Avatar SDK and Utilities for
Unity versions. Importing the Avatar SDK and Utilities for Unity 5 separately is not required. It is compatible with
Unity 5.4 and up - please check Compatibility and Version Requirements on page 7 for up-to-date Unity version
recommendations.
For complete instructions on downloading and using the Sample Framework, see Unity Sample Framework on
page 115 in our developer documentation.
New Features
• Added OverlayUIDemo, which demonstrates creating a UI with a VR Compositor Layer to improve image
quality and anti-aliasing. Includes a quad overlay for Rift, and a quad and a cylinder overlay for mobile.
Known Issues
• In Oculus Rift builds, Oculus Remote support is currently limited to opening and closing the menu system.
• Sample Framework Android builds use a custom manifest and are not visible from Applications, and cannot
be launched from Oculus Home or the Android Application Launcher. To launch:
1.11.0
The Oculus Unity Sample Framework assists developers in implementing Unity applications by providing
sample scenes and guidelines for common VR-specific features such as hand presence, crosshairs, driving, and
first-person movement.
This download includes a Unity Package of the Sample Framework. VR applications of the Sample Framework
are also available for the Oculus Rift from our Downloads page, and for the Samsung Gear VR from the Gallery
section of the Oculus Store.
This version of the Oculus Sample Framework for Unity 5 pulls in the most recent Avatar SDK and Utilities for
Unity versions. Importing the Avatar SDK and Utilities for Unity 5 separately is not required. It is compatible with
Unity 5.4 and up - please check Compatibility and Version Requirements on page 7 for up-to-date Unity version
recommendations.
For complete instructions on downloading and using the Sample Framework, see Unity Sample Framework on
page 115 in our developer documentation.
New Features
Known Issues
• In Oculus Rift builds, Oculus Remote support is currently limited to opening and closing the menu system.
• Sample Framework Android builds use a custom manifest and are not visible from Applications, and cannot
be launched from Oculus Home or the Android Application Launcher. To launch:
1.5.1
The Oculus Unity Sample Framework assists developers in implementing Unity applications by providing
sample scenes and guidelines for common VR-specific features such as crosshairs, driving, and first-person
movement.
This download includes a Unity Project of the Sample Framework. VR applications of the Sample Framework
are also available for the Oculus Rift from our Downloads page, and for the Samsung Gear VR from the Gallery
section of the Oculus Store.
This version of the Oculus Sample Framework for Unity 5 pulls in Utilities for Unity version 1.9. Importing
Utilities for Unity 5 separately is not required. It is compatible with Unity 5.3 and up - please check
Compatibility and Version Requirements on page 7 for up-to-date Unity version recommendations.
For complete instructions on downloading and using the Sample Framework, see Unity Sample Framework on
page 115 in our developer documentation.
New Features
• Added two Mono Optimization sample scenes in which near content is rendered stereoscopically and
distant content is rendered monoscopically. Depending on the scene, this approach may produce significant
rendering performance improvements. For a detailed explanation, see Hybrid Mono Rendering in UE4 and
Unity in our Developer Blog.
Known Issues
• In Oculus Rift builds, Oculus Remote support is currently limited to opening and closing the menu system.
• Sample Framework Android builds use a custom manifest and are not visible from Applications, and cannot
be launched from Oculus Home or the Android Application Launcher. To launch:
1.5.0
The Oculus Unity Sample Framework assists developers in implementing Unity applications by providing
sample scenes and guidelines for common VR-specific features such as crosshairs, driving, and first-person
movement.
This download includes a Unity Project of the Sample Framework. VR applications of the Sample Framework
are also available for the Oculus Rift from our Downloads page, and for the Samsung Gear VR from the Gallery
Apps section of the Oculus Store.
This version of the Oculus Sample Framework for Unity 5 pulls in Utilities for Unity version 1.5. Importing
Utilities for Unity 5 separately is no longer required. It is compatible with Unity 5.3 and up - please check
Compatibility and Version Requirements on page 7 for up-to-date Unity version recommendations.
For complete instructions on downloading and using the Sample Framework, see Unity Sample Framework on
page 115 in our developer documentation.
New Features
• Added Movie Player, Per-Eye Cameras, Multiple Cameras, and Render Frame Rate scenes.
• Reorganized in-VR scenes list structure to de-clutter the scenes menu.
• Now includes Utilities for Unity 5 v 1.5; separately importing the Utilities unitypackage is no longer required.
Known Issues
• In Oculus Rift builds, Oculus Remote support is currently limited to opening and closing the menu system.
The Oculus Unity Sample Framework assists developers in implementing Unity applications by providing
sample scenes and guidelines for common VR-specific features such as crosshairs, driving, and first-person
movement. The Sample Framework can guide developers in producing reliable, comfortable applications and
avoiding common mistakes.
The Oculus Unity Sample Framework consists of a Unity project as well as application binaries for playing the
sample scenes in VR. Sample scenes are navigated and controlled in-app with a simple UI, which also provides
explanatory notes.
This download includes a Unity Project of the Sample Framework. VR applications of the Sample Framework are
also available for the Oculus Rift/DK2 from our Downloads page, and for the Samsung Gear VR from the Gallery
Apps section of the Oculus Store.
For complete instructions on downloading and using the Sample Framework, see Unity Sample Framework on
page 115 in our developer documentation.
Please let us know about any issues you encounter in the Oculus Unity Forum, and keep your eye out for
updates.
Upgrade Procedure
1. Replace any usage of OVRManager.instance.virtualTextureScale or
OVRManager.instance.nativeTextureScale with UnityEngine.VR.VRSettings.renderScale. The value of
renderScale is equal to nativeTextureScale * virtualTextureScale. If you set renderScale to a value that is less
than or equal to any value it has had since the application started, then virtual texture scaling will be used. If
you increase it higher, to a new maximum value, then the native scale is increased and the virtual scale is set
to 1.
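As a rough illustration of this replacement (a minimal sketch assuming the Unity 5.x UnityEngine.VR.VRSettings API; the class name below is illustrative and not part of the Utilities):

using UnityEngine;
using UnityEngine.VR;

// Illustrative only: sets the combined render scale that previously came from
// nativeTextureScale * virtualTextureScale on OVRManager.
public class RenderScaleExample : MonoBehaviour
{
    [Range(0.5f, 2.0f)]
    public float renderScale = 1.0f;

    void Update()
    {
        // Values at or below a previously used maximum are applied as virtual (viewport) scaling;
        // raising the value above that maximum re-allocates the eye textures at the larger native size.
        if (!Mathf.Approximately(VRSettings.renderScale, renderScale))
            VRSettings.renderScale = renderScale;
    }
}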
Note: Don't forget to move any scripts, image effects, tags, or references from the left and right eye
anchors to the center eye anchor as noted above.
Introduction
This guide covers developing Unity 4 games and applications for the Oculus Rift and Samsung Gear VR.
The legacy integration provides VR support for development with Unity 4.x Professional and Free editions. It
also includes Unity prefabs, C# scripts, sample scenes, and more to assist with development.
Projects using Unity 5 and later should use the Oculus Utilities for Unity package instead of this integration.
Developers beginning new projects should use Unity 5 or later.
For information on recommended and supported Unity versions, see Compatibility and Requirements.
• Getting started
• Downloading and installing the Oculus Unity Integration
• Package contents
• Input Mapping
• How to use the provided samples, assets, and sample applications
• Configuring Unity VR projects for build to various targets
• Getting Started Frequently Asked Questions
• Debugging and Performance Analysis
Most information in this guide applies to the use of the Utilities package for either Rift or Mobile development.
Any exceptions are clearly indicated where they occur.
Our Release Notes describe known issues with any specific version.
All Unity versions 5.1 and later ship with the Oculus OVRPlugin, providing built-in support for Rift, Oculus Go,
and Samsung Gear VR.
The optional Oculus Utilities for Unity package offers additional developer resources, and includes the latest
version of OVRPlugin. When you import Utilities for Unity into a project, if the OVRPlugin version included with
the Utilities is later than the version built into your editor, a pop-up dialog will give you the option to update it
in your project. We always recommend using the latest available OVRPlugin version. For more information, see
OVRPlugin on page 33.
Legacy support is available for Unity 4 - see our Unity 4.x Legacy Integration Developer Guide on page 181
for more information.
For complete details on Oculus SDK or Integration version compatibility with Unity, see Unity-SDK Version
Compatibility.
OS Compatibility
• Windows: Windows 7, 8, 10
• Mac: OS X Yosemite, El Capitan, Sierra
OS X development requires the Oculus Rift Runtime for OS X, available from our Downloads page. Note that
runtime support for OS X is legacy only. It does not support consumer versions of Rift.
Controllers
You may wish to have a controller for development or to use with the supplied demo applications. Available
controllers include the Oculus Touch or Xbox 360 controller for Rift, and the Gear VR Controller for mobile
development.
SDK Examples is a resource for mobile developers that includes Unity example scenes illustrating the use
of common resources such as menus, crosshairs, and more. See Oculus Mobile SDKExamples for more
information.
Also Available
Additional development resources are available separately from our Downloads page.
• Legacy Oculus Spatializer for Unity (included with Oculus Audio SDK Plugins; the Native OSP is for use with
Unity 5 and later)
• OVRMonitor (includes the VrCapture library and Oculus Remote Monitor client for mobile performance
analysis and debugging; see our OVRMonitor documentation for more information).
Recommended Configuration
• On Windows, enable Direct3D 11 - it exposes the most advanced VR rendering capabilities. Direct3D 9 and
Windows OpenGL are not supported. D3D 12 is currently available as an experimental feature.
• Use the Linear Color Space. Linear lighting is not only more correct for shading, it also causes Unity to
perform sRGB read/write to the eye textures. This helps reduce aliasing during VR distortion rendering,
where the eye textures are interpolated with slightly greater dynamic range.
• Never clone displays. When the Rift is cloned with another display, the application may not vsync properly.
This leads to visible tearing or judder (stuttering or vibrating motion).
OVRMonitor is available to Unity 4 developers as a separate download. However, Unity 4 developers must
download the Oculus Mobile SDK for the SDK Examples.
When developing for mobile, please be sure to fully review all of the relevant performance and design
documentation, especially Unity Best Practices: Mobile. Mobile apps are subject to more stringent computational
limitations and requirements, which should be taken into consideration from the ground up.
Application Signing
Mobile applications require two different signatures at different stages of development. Be sure to read the
Application Signing section of the Mobile SDK documentation for more information.
Getting Started
This section describes how to begin working in Unity.
Otherwise, create a new project into which you will import the Oculus assets:
Assets/Plugins              Oculus.*, OVR.*
Assets/Plugins/Android/     *Oculus*, AndroidManifest.xml, *vrapi*, *vrlib*, *vrplatlib*
Assets/Plugins/x86/         Oculus.*, OVR.*
Assets/Plugins/x86_64/      Oculus.*, OVR.*
When the Importing package dialog box opens, leave all of the boxes checked and select Import. The import
process may take a few minutes to complete.
To import the SDKExamples into Unity, select Assets > Import Package > Custom Package... and select
BlockSplosion.unityPackage to import the assets into your new project. Alternately, you can simply find
the .unityPackage file in your file system and double-click to launch.
When the Importing package dialog box opens, leave all of the boxes checked and select Import. The import
process may take a few minutes to complete.
Each sample application project includes a ProjectSettings folder, which provides default settings for the VR
mobile application. Copy these files to your [Project]/Assets/ProjectSettings folder.
The Unity Integration package may be used to integrate Oculus VR into an existing project. This may be useful
as a way of getting oriented to VR development, but dropping a VR camera into a Unity game that wasn't
designed with VR best practices in mind is unlikely to produce a great experience.
1. Import package.
2. Instantiate OVRCameraRig if you already have locomotion figured out or instantiate OVRPlayerController to
walk around.
3. Copy any scripts from the non-VR camera to the OVRCameraRig. Any image effect should go to both the
Left/RightEyeAnchor GameObjects. These are children of a TrackingSpace GameObject, which is itself a
child of OVRCameraRig. The TrackingSpace GameObject allows clients to change the frame of reference
used by tracking, e.g., for use with a game avatar.
4. Disable your old non-VR camera.
5. Build your project and run normally.
Note: This is one simple method for adding VR to an existing application, but is by no means the only
way. For example, you may not always wish to use OVRPlayerController.
Contents
OVR
The contents of the OVR folder in OculusUnityIntegration.unitypackage are uniquely named and should be safe
to import into an existing project.
Editor Contains scripts that add functionality to the Unity Editor, and enhance several C#
component scripts.
Materials Contains materials that are used for graphical components within the integration, such
as the main GUI display.
Moonlight Contains classes designed for mobile development. Holds sub-folders with mobile
equivalents of all top-level folders (Editor, Materials, Prefabs, et cetera).
Prefabs Contains the main Unity prefabs that are used to provide the VR support for a Unity
scene: OVRCameraRig and OVRPlayerController.
Resources Contains prefabs and other objects that are required and instantiated by some OVR
scripts, such as the main GUI.
Scripts Contains the C# files that are used to tie the VR framework and Unity components
together. Many of these scripts work together within the various Prefabs.
Textures Contains image assets that are required by some of the script components.
Note: We strongly recommend that developers not directly modify the included OVR scripts.
Plugins
The Plugins folder contains vrapi.so and the OculusPlugin.dll, which enables the VR framework to communicate
with Unity on Windows (both 32 and 64-bit versions).
This folder also contains the plugins for other platforms: OculusPlugin.bundle for MacOS; and Android/
libOculusPlugin.so, vrlib.jar, and AndroidManifest.xml for Android.
Prefabs
The current integration for adding VR support into Unity applications is based on two prefabs that may be
added into a scene:
• OVRCameraRig
• OVRPlayerController
To use, simply drag and drop one of the prefabs into your scene.
OVRCameraRig
OVRCameraRig replaces the regular Unity Camera within a scene. You can drag an OVRCameraRig into your
scene and you will be able to start viewing the scene with the Gear VR and Rift.
Note: Make sure to turn off any other Camera in the scene to ensure that OVRCameraRig is the only
one being used.
OVRCameraRig contains two Unity cameras, one for each eye. It is meant to be attached to a moving object
(such as a character walking around, a car, a gun turret, etc.) This replaces the conventional Camera.
• OVRCameraRig.cs
• OVRManager.cs
OVRPlayerController
The OVRPlayerController is the easiest way to start navigating a virtual environment. It is basically an
OVRCameraRig prefab attached to a simple character controller. It includes a physics capsule, a movement
system, a simple menu system with stereo rendering of text fields, and a cross-hair component.
To use, drag the player controller into an environment and begin moving around using a gamepad, or a
keyboard and mouse.
• OVRPlayerController.cs
• OVRGamepadController.cs
Unity Components
This section gives a general overview of the Components provided by the legacy integration.
OVRCameraRig
OVRCameraRig is a component that controls stereo rendering and head tracking. It maintains three child
"anchor" Transforms at the poses of the left and right eyes, as well as a virtual "center" eye that is halfway
between them.
This component is the main interface between Unity and the cameras. This is attached to a prefab that makes it
easy to add comfortable VR support to a scene.
Important: All camera control should be done through this component. You should understand this script when
implementing your own camera control mechanism.
Updated Anchors Allows clients to filter the poses set by tracking. Used to modify or ignore positional
tracking.
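For example, a script can hook the UpdatedAnchors callback to adjust the rig after each tracking update. The sketch below assumes UpdatedAnchors is exposed as a C# event taking the rig as a parameter (the exact signature may differ between integration versions) and simply cancels the tracked head position:

using UnityEngine;

// Illustrative sketch: keep head orientation from tracking but ignore positional tracking
// by offsetting the tracking space against the tracked head position each frame.
public class IgnorePositionExample : MonoBehaviour
{
    public OVRCameraRig rig;

    void Awake()
    {
        rig.UpdatedAnchors += OnUpdatedAnchors;
    }

    void OnUpdatedAnchors(OVRCameraRig updatedRig)
    {
        updatedRig.trackingSpace.localPosition = -updatedRig.centerEyeAnchor.localPosition;
    }
}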
GameObject Structure
TrackingSpace A GameObject that defines the reference frame used by tracking. You can move this
relative to the OVRCameraRig for use cases in which the rig needs to respond to
tracker input. For example, OVRPlayerController changes the position and rotation of
TrackingSpace to make the character controller follow the yaw of the current head pose.
OVRManager
OVRManager is the main interface to the VR hardware. It is a singleton that exposes the Oculus SDK to Unity,
and includes helper functions that use the stored Oculus variables to help configure camera behavior.
This component is added to the OVRCameraRig prefab. It can be part of any application object. However, it
should only be declared once, because there are public members that allow for changing certain values in the
Unity inspector.
Monoscopic If true, rendering will try to optimize for a single viewpoint rather than rendering once
for each eye. Not supported on all platforms.
Eye Texture Format Sets the format of the eye RenderTextures. Normally you should use Default or
DefaultHDR for high-dynamic range rendering.
Eye Texture Depth Sets the depth precision of the eye RenderTextures. May fix z-fighting artifacts at the
expense of performance.
Eye Texture Antialiasing Sets the level of antialiasing for the eye RenderTextures.
Native Texture Scale Each camera in the camera controller creates a RenderTexture that is the ideal size for
obtaining the sharpest pixel density (a 1-to-1 pixel size in the center of the screen post
lens distortion). This field can be used to permanently scale the cameras' render targets
to any multiple of the ideal pixel fidelity, which gives you control over the trade-off
between performance and quality.
Virtual Texture Scale This field can be used to dynamically scale the cameras' render targets to values lower
than the ideal pixel fidelity, which can help reduce GPU usage at run-time if necessary.
Use Position Tracking Disables the IR tracker and causes head position to be inferred from the current rotation
using the head model. To fully ignore tracking or otherwise modify tracking behavior,
see OVRCameraRig.UpdatedAnchors above
Mirror to Display When enabled, the undistorted rendered output appears on your desktop in addition to
the Rift. If disabled, you may add your own scripts next to OVRCameraRig and set that
GameObject's Camera component to render whatever you like. Disabling may slightly
improve performance.
Time Warp (desktop only) Time warp is a technique that adjusts the on-screen position of rendered images
based on the latest tracking pose at the time the user will see it. Enabling this will force
vertical-sync and make other timing adjustments to minimize latency.
Freeze Time Warp (desktop only) If enabled, this illustrates the effect of time warp by temporarily freezing the
rendered eye pose.
Reset Tracker On Load This value defaults to True. When turned off, subsequent scene loads will not reset the
tracker. This will keep the tracker orientation the same from scene to scene, as well as
keep magnetometer settings intact.
Helper Classes
In addition to the above components, your scripts can always access the HMD state via static members of
OVRManager.
OVRTracker Provides the pose, frustum, and tracking status of the infrared tracking sensor.
OVRCommon
Utilities
The following classes are optional. We provide them to help you make the most of virtual reality, depending on
the needs of your application.
The controller will interact properly with a Unity scene, provided that the scene has
collision detection assigned to it.
OVRPlayerController contains a few variables attached to sliders that change the physics
properties of the controller. This includes Acceleration (how fast the player will increase
speed), Dampening (how fast a player will decrease speed when movement input is not
activated), Back and Side Dampen (how much to reduce side and back Acceleration),
Rotation Amount (the amount in degrees per frame to rotate the user in the Y axis)
and Gravity Modifier (how fast to accelerate player down when in the air). When HMD
Rotates Y is set, the actual Y rotation of the cameras will set the Y rotation value of the
parent transform that it is attached to.
Note: currently native XInput-compliant gamepads are not supported on Mac OS.
Please use the conventional Unity input methods for gamepad input.
OVRCrosshair OVRCrosshair is a helper class that renders and controls an on-screen cross-hair. It is
currently used by the OVRMainMenu component.
OVRGUI OVRGUI is a helper class that encapsulates basic rendering of text in either 2D or 3D.
The 2D version of the code will be deprecated in favor of rendering to a 3D element
(currently used in OVRMainMenu).
OVRGridCube OVRGridCube is a helper class that shows a grid of cubes when activated. Its main
purpose is to be used as a way to know where the ideal center of location is for the
user's eye position. This is especially useful when positional tracking is activated. The
cubes will change color to red when positional data is available, and will remain blue if
position tracking is not available, or change back to blue if vision is lost.
OVRTrackerBounds Warns players when the HMD moves outside the trackable range.
GameObject Structure
To import SDKExamples into Unity, begin by creating a new, empty project. Then select Assets > Import
Package > Custom Package... and select SDKExamples.unityPackage to import the assets into your project.
Alternately, you can locate the SDKExamples.unityPackage and double-click to launch, which will have the
same effect.
Once imported, replace your Unity project's ProjectSettings folder with the ProjectSettings folder included with
SDKExamples.
Note: If you don't replace the ProjectSettings folder, imported scenes will show console errors.
OVRChromaticAberration.cs Drop-in component for toggling chromatic aberration correction on and off for Android.
OVRDebugGraph.cs Drop-in component for toggling the TimeWarp debug graph, which is no longer
available.
OVRDebugHeadController.cs
A simple behavior that can be attached to the parent of the CameraRig to provide
movement via the gamepad, useful for testing applications in Unity without an HMD.
OVRModeParms.cs Example code for de-clocking your application to reduce power and thermal load as
well as how to query the current power level state.
OVRMonoscopic.cs Drop-in component for toggling Monoscopic rendering on and off for Android.
OVROverlay.cs Add to an object with a Quad mesh filter to have the quad rendered as a TimeWarp
overlay instead of drawing it into the eye buffer.
OVRPlatformMenu.cs Helper component for detecting Back Key long-press to bring-up the Universal Menu
and Back Key short-press to bring up the Confirm-Quit to Home Menu. Additionally
implements a Wait Timer for displaying Long Press Time. For more information on
interface guidelines and requirements, please review Interface Guidelines and Universal
Menu in the Mobile SDK documentation.
OVRTimeWarpUtils.cs Demonstrates the interface calls for setting important configs, including min vsyncs,
enable tw debug graph, and enable cac.
OVRVolumeControl.cs An on-screen display that shows the current system audio volume.
See our Oculus Utilities for Unity Reference Manual for a more detailed look at these and other C# scripts.
Undocumented scripts may be considered internal, and should generally never be modified.
OVRInput
OVRInput exposes a unified input API for multiple controller types.
It may be used to query virtual or raw controller state, such as buttons, thumbsticks, triggers, and capacitive
touch data. It currently supports the Oculus Touch and Microsoft Xbox controllers on desktop platforms.
Gamepads compatible with Samsung Gear VR, such as the Samsung EI-GP20 and Moga Pro, must be Android
compatible and support Bluetooth 3.0. For more details on supported mobile gamepad features, see System
and Hardware Requirements in our Mobile SDK documentation.
When used with tracked controllers such as Oculus Touch, OVRInput also provides position and orientation
data through GetLocalControllerPosition() and GetLocalControllerRotation(), which return a
Vector3 and Quaternion, respectively.
Controller poses are returned by the constellation tracking system and are predicted simultaneously with
the headset. These poses are reported in the same coordinate frame as the headset, relative to the initial
center eye pose, and may be used for rendering hands or objects in the 3D world. They are also reset by
OVRManager.display.RecenterPose(), similar to the head and eye poses.
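For example, a minimal sketch that drives a hand transform from the left Touch controller's tracked pose (the handAnchor field is illustrative; in practice it would be parented under the camera rig's tracking space):

using UnityEngine;

// Updates a transform from the tracked pose of the left Touch controller every frame.
public class TrackedHandExample : MonoBehaviour
{
    public Transform handAnchor; // illustrative: a hand mesh under the rig's tracking space

    void Update()
    {
        handAnchor.localPosition = OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch);
        handAnchor.localRotation = OVRInput.GetLocalControllerRotation(OVRInput.Controller.LTouch);
    }
}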
OVRInput provides control of haptic vibration feedback on compatible controllers. For example,
SetControllerVibration() sets vibration frequency and amplitude.
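A short sketch of this (the frequency and amplitude values shown are illustrative):

// Vibrate the right Touch controller while its index trigger is held, and stop it otherwise.
if (OVRInput.Get(OVRInput.Button.SecondaryIndexTrigger))
{
    OVRInput.SetControllerVibration(0.5f, 0.8f, OVRInput.Controller.RTouch);
}
else
{
    OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.RTouch);
}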
Mobile input bindings are now automatically added to InputManager.asset if they do not already exist.
For more information, see OVRInput in the Unity Developer Reference. Unity’s input system and Input
Manager are documented here: http://docs.unity3d.com/Manual/Input.html and
http://docs.unity3d.com/ScriptReference/Input.html.
Note: The term Touch in OVRInput refers to actual Oculus Touch controllers.
See OVRTouchpad.cs in Assets/OVR/Moonlight/Scripts for our interface class to the touchpad. The Gear VR
HMD touchpad is not currently exposed by OVRInput.
OVRInput Usage
The primary usage of OVRInput is to access controller input state through Get(), GetDown(), and GetUp().
Control Enumerations
Virtual:
OVRInput.Button
OVRInput.Touch
OVRInput.NearTouch
OVRInput.Axis1D
OVRInput.Axis2D
Raw:
OVRInput.RawButton
OVRInput.RawTouch
OVRInput.RawNearTouch
OVRInput.RawAxis1D
OVRInput.RawAxis2D
The first set of enumerations provides a virtualized input mapping that is intended to assist developers with
creating control schemes that work across different types of controllers. The second set of enumerations
provides raw unmodified access to the underlying state of the controllers. We recommend using the first set of
enumerations, since the virtual mapping provides useful functionality, as demonstrated below.
More on Controls
Example Usage:
// returns true if the primary button (typically “A”) was pressed this frame.
OVRInput.GetDown(OVRInput.Button.One);
// returns a Vector2 of the primary (typically the Left) thumbstick’s current state.
// (X/Y range of -1.0f to 1.0f)
OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
// returns true if the primary thumbstick has been moved upwards more than halfway.
// (Up/Down/Left/Right - Interpret the thumbstick as a D-pad).
OVRInput.Get(OVRInput.Button.PrimaryThumbstickUp);
// returns a float of the secondary (typically the Right) index finger trigger’s current state.
// (range of 0.0f to 1.0f)
OVRInput.Get(OVRInput.Axis1D.SecondaryIndexTrigger);
// returns true if the left index finger trigger has been pressed more than halfway.
// (Interpret the trigger as a button).
OVRInput.Get(OVRInput.RawButton.LIndexTrigger);
// returns true if the secondary gamepad button, typically “B”, is currently touched by the user.
OVRInput.Get(OVRInput.Touch.Two);
In addition to specifying a control, Get() also takes an optional controller parameter. The list of supported
controllers is defined by the OVRInput.Controller enumeration (for details, refer to OVRInput in the Unity
Developer Reference).
Specifying a controller can be used if a particular control scheme is intended only for a certain controller type.
If no controller parameter is provided to Get(), the default is to use the Active controller, which corresponds
to the controller that most recently reported user input. For example, a user may use a pair of Oculus Touch
controllers, set them down, and pick up an Xbox controller, in which case the Active controller will switch to
the Xbox controller once the user provides input with it. The current Active controller can be queried with
OVRInput.GetActiveController() and a bitmask of all the connected Controllers can be queried with
OVRInput.GetConnectedControllers().
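As a brief sketch of querying those (before the controller-specific examples below):

// Query the most recently used controller and the bitmask of connected controllers.
OVRInput.Controller active = OVRInput.GetActiveController();
OVRInput.Controller connected = OVRInput.GetConnectedControllers();

// Only read Touch-specific input if a Touch pair is actually connected.
if ((connected & OVRInput.Controller.Touch) == OVRInput.Controller.Touch)
{
    bool pressed = OVRInput.GetDown(OVRInput.Button.One, active);
}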
Example Usage:
// returns a float of the Hand Trigger’s current state on the Left Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.Touch);
// returns a float of the Hand Trigger’s current state on the Right Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.SecondaryHandTrigger, OVRInput.Controller.Touch);
Note that the Oculus Touch controllers may be specified either as the combined pair (with
OVRInput.Controller.Touch), or individually (with OVRInput.Controller.LTouch and RTouch). This
is significant because specifying LTouch or RTouch uses a different set of virtual input mappings that allow
more convenient development of hand-agnostic input code. See the virtual mapping diagrams in Touch Input
Mapping for an illustration.
Example Usage:
// returns a float of the Hand Trigger’s current state on the Left Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.LTouch);
// returns a float of the Hand Trigger’s current state on the Right Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.RTouch);
This can be taken a step further to allow the same code to be used for either hand by specifying the controller
in a variable that is set externally, such as on a public variable in the Unity Editor.
Example Usage:
// public variable that can be set to LTouch or RTouch in the Unity Inspector
public Controller controller;
…
// returns a float of the Hand Trigger’s current state on the Oculus Touch controller
// specified by the controller variable.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);
// returns true if the primary button (“A” or “X”) is pressed on the Oculus Touch controller
// specified by the controller variable.
OVRInput.Get(OVRInput.Button.One, controller);
This is convenient since it avoids the common pattern of if/else checks for Left/Right hand input mappings.
Raw Mapping
The raw mapping directly exposes the Touch controllers. The layout of the Touch controllers closely matches
the layout of a typical gamepad split across the Left and Right hands.
Virtual Mapping
Raw Mapping
Virtual Mapping
This diagram shows a common implementation of Xbox controller input bindings using
OVRInput.Controller.Gamepad.
Raw Mapping
The raw mapping directly exposes the Xbox controller.
Click on File > Build Settings... and select one of the following:
For Windows, set Target Platform to Windows and set Architecture to either x86 or x86_64.
Within the Build Settings pop-up, click Player Settings. Under Resolution and Presentation, set the values to
the following:
In the Build Settings pop-up, select Build. If prompted, specify a name and location for the build.
If you are building in the same OS, the demo should start to run in full screen mode as a standalone
application.
Quality Settings
You may notice that the graphical fidelity is not as high as the pre-built demo. You will need to change some
additional project settings to get a better looking scene.
Navigate to Edit > Project Settings > Quality. Set the values in this menu to the following:
The most important value to modify is Anti aliasing - it must be increased to compensate for the stereo
rendering, which reduces the effective horizontal resolution by 50%. An anti-aliasing value of 2X is ideal - 4x
may be used if you have performance to spare, and 8x usually isn't worth it.
Note: A quality setting called Fastest has been added to address a potential performance issue
with Unity 4.5 and OS X 10.9. This setting turns off effects and features that may cause the drop in
performance.
Now rebuild the project again, and the quality should be at the same level as the pre-built demo.
PC builds create a single executable file that may be used in either Direct Display or Extended Display modes.
Build Settings
1. From the File menu, select Build Settings…. From the Build Settings… menu, select Android as the
platform. Set Texture Compression to ETC2 (GLES 3.0).
2. Add any scenes you wish to include in your build to Scenes In Build.
Player Settings
1. Click the Player Settings… button and select the Android tab. Set Default Orientation to Landscape Left
in Settings for Android (may be collapsed).
Note: The Use 24-bit Depth Buffer option appears to be ignored for Android. A 24-bit window
depth buffer always appears to be created.
2. As a minor optimization, 16-bit color and/or depth buffers may be used. Most VR scenes should be built to
work with a 16-bit depth buffer resolution and 2x MSAA. If your world is mostly pre-lit to compressed textures,
there will be little difference between 16-bit and 32-bit color buffers.
3. Select the Splash Image section. For Mobile Splash image, choose a solid black texture.
Note: Custom Splash Screen support is not available with Unity Free. A head-tracked Unity logo
screen is provided instead.
4. While still in Player Settings, select Other Settings and verify that Rendering Path is set to Forward,
Multithreaded Rendering is selected, and Install Location is set to Force Internal.
5. Set the Stripping Level to the maximum level your app allows. It will reduce the size of the installed .apk
file.
Note: This feature is not available for Unity Free.
Checking Optimize Mesh Data may improve rendering performance if there are unused components in
your mesh data.
Quality Settings
1. Go to the Edit menu and choose Project Settings, then Quality. In the Inspector, set Vsync Count to
Don’t Sync. The TimeWarp rendering performed by the Oculus Mobile SDK already synchronizes with the
display refresh.
Note: Anti Aliasing should not be enabled for the main framebuffer.
2. Anti Aliasing should be set to Disabled. You may change the camera render texture antiAliasing by
modifying the Eye Texture Antialiasing parameter on OVRManager. The current default is 2x MSAA.
Be mindful of the performance implications. 2x MSAA runs at full speed on chip, but may still increase the
number of tiles for mobile GPUs which use variable bin sizes, so there is some performance cost. 4x MSAA
runs at half speed, and is generally not fast enough unless the scene is very undemanding.
3. Pixel Light Count is another attribute which may significantly impact rendering performance. A model is re-
rendered for each pixel light that affects it. For best performance, set Pixel Light Count to zero. In this case,
vertex lighting will be used for all models depending on the shader(s) used.
Time Settings
Note: The following Time Settings advice is for applications which hold a solid 60 FPS, updating all
game and/or application logic with each frame. The following Time Settings recommendations may be
detrimental for apps that don’t hold 60FPS.
Go to the Edit -> Project Settings -> Time and change both Fixed Timestep and Maximum Allowed
Timestep to “0.0166666” (i.e., 60 frames per second).
Fixed Timestep is the frame-rate-independent interval at which the physics simulation calculations are
performed. In script, it is the interval at which FixedUpdate() is called. The Maximum Allowed Timestep
sets an upper bound on how long physics calculations may run.
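If you prefer to apply the same values from script, a minimal, optional sketch (equivalent to the editor settings above):

using UnityEngine;

public class TimeSettingsExample : MonoBehaviour
{
    void Awake()
    {
        // 0.0166666 seconds = one 60 Hz frame; matches Fixed Timestep and Maximum Allowed Timestep above.
        Time.fixedDeltaTime = 0.0166666f;
        Time.maximumDeltaTime = 0.0166666f;
    }
}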
Open the AndroidManifest.xml file located under Assets/Plugins/Android/. You will need to configure your
manifest with the necessary VR settings, described below:
• Replace <packagename> in the first line with your actual package name, such as "com.oculus.cinema".
• Unity will overwrite the required setting android:installLocation="internalOnly" if the Player
Setting Install Location is not set to Force Internal.
• The Android theme should be set to the solid black theme for comfort during application transitioning:
Theme.Black.NoTitleBar.Fullscreen
• The vr_only meta data tag should be added for VR mode detection.
• The required screen orientation is landscape: android:screenOrientation="landscape"
• We recommend setting your configChanges as follows:
android:configChanges="screenSize|orientation|keyboardHidden|keyboard"
• The minSdkVersion and targetSdkVersion are set to the API level supported by the device. For the
current set of devices, the API level is 19.
• Do not add the noHistory attribute to your manifest.
Applications written for development are not launched through the Oculus Home menu system. Instead, build
the application directly to your phone, and you will be prompted to insert your phone into the Gear VR headset
to launch the application automatically.
To run the application in the future, remove your phone from the Gear VR headset, launch the application
directly from the phone desktop or Apps folder, and insert the device into the Gear VR when prompted to do
so.
1. Copy an Oculus Signature File specific to your mobile device to the folder Project/Assets/Plugins/Android/
assets/ or the application will not run. See the Application Signing section of the Mobile SDK documentation
for more information.
2. Be sure the project settings from the steps above are saved with File > Save Project.
3. If you are not already connected to your phone via USB, connect now. Unlock the phone lock screen.
4. From the File menu, select Build Settings…. While in the Build Settings menu, add the Main.scene to
Scenes in Build. Next, verify that Android is selected as your Target Platform and select Build and Run. If
asked, specify a name and location for the .apk.
The .apk will be installed and launched on your Android device.
Connect to the device via USB and open a command prompt. Run the installToPhone.bat script included
with the SDK. This script will copy and install both the Unity and Native sample applications as well as any
sample media to your Android device. You should now see application icons for the newly-installed apps on the
Android Home screen.
For more information about these sample apps please review the Initial SDK Setup section in Device and
Environment Setup Guide.
• From the Android Home screen, press the icon of the VR app you wish to run.
• A toast notification will appear with a dialog like the following: “Insert Device: To open this application,
insert your device into your Gear VR”
• Insert your device into the supported Gear VR hardware.
• 1-dot Button or Samsung gamepad tap launches a block in the facing direction.
• 2-dot Button resets the current level.
• Left Shoulder Button (L) skips to the next level.
• If you have a compliant gamepad controller for your platform, you can control the movement of the player
controller with it.
• The left analog stick moves the player around as if you were using the W,A,S,D keys.
• The right analog stick rotates the player left and right as if you were using the Q and E keys.
• The left trigger allows you to move faster, or run through the scene.
• The Start button toggles the scene selection. Pressing D-Pad Up and D-Pad Down scrolls through available
scenes. Pressing the A button starts the currently selected scene.
• If the scene selection is not turned on, pressing the D-Pad Down resets the orientation of the tracker.
Mouse Control
Using the mouse will rotate the player left and right. If the cursor is enabled, the mouse will track the cursor and
not rotate the player until the cursor is off screen.
Shadowgun
In Shadowgun, locomotion allows the camera position to change.
• Left Analog Stick will move the player forward, back, left, and right.
• Right Analog Stick will rotate the player view left, right, up, and down. However, you will likely want to rotate
your view just by looking with the VR headset.
Rift
The app does not launch as a VR app.
Verify that you have installed the Oculus app and completed setup as described in Preparing for Rift
Development on page 8.
Verify that you have selected Virtual Reality Supported in Player Settings.
Verify that you are using a compatible runtime - see Compatibility and Requirements for more details.
Verify that you have not selected D3D 9 or Windows GL as the renderer (Legacy Integration only).
Mobile
The app does not launch as a VR app.
Verify that you selected Virtual Reality Supported in Player Settings before building your APK.
Applications fail to launch on Gear VR with error message "thread priority security exception make sure
the apk is signed".
You must sign your application with an Oculus Signature File (osig). See "Sign your App with an Oculus
Signature File" in Preparing for Mobile Development on page 9 for instructions.
General Issues
Unity 5 hangs while importing assets from SDKExamples.
Be sure to delete any previously-imported Utilities packages from your Unity project before importing a new
version. If you are receiving errors and have not done so, delete the relevant folders in your project and re-
import Utilities. For more information, please see Importing the Oculus Utilities Package on page 11.
Contact Information
Questions?
General Tips
VR application debugging is a matter of getting insight into how the application is structured and executed,
gathering data to evaluate actual performance, evaluating it against expectation, then methodically isolating
and eliminating problems.
When analyzing or debugging, it is crucial to proceed in a controlled way so that you know specifically what
change results in a different outcome. Focus on bottlenecks first. Only compare apples to apples, and change
one thing at a time (e.g., resolution, hardware, quality, configuration).
Always be sure to profile, as systems are full of surprises. We recommend starting with simple code, and
optimizing as you go - don’t try to optimize too early.
We recommend creating a non-VR version of your camera rig so you can swap between VR and non-VR
perspectives. This allows you to spot check your scenes, and it may be useful if you want to do profiling with
third-party tools (e.g., Adreno Profiler).
It can be useful to disable Multithreaded Rendering in Player Settings during performance debugging. This
will slow down the renderer, but also give you a clearer view of where your frame time is going. Be sure to turn
it back on when you’re done!
Performance Targets
Before debugging performance problems, establish clear targets to use as a baseline for calibrating your
performance.
These targets can give you a sense of where to aim, and what to look at if you’re not making frame rate or are
having performance problems.
Below you will find some general guidelines for establishing your baselines, given as approximate ranges unless
otherwise noted.
Mobile
• 60 FPS (required by Oculus)
• 50-100 draw calls per frame
• 50,000-100,000 triangles or vertices per frame
PC
• 90 FPS (required by Oculus)
• 500-1,000 draw calls per frame
• 1-2 million triangles or vertices per frame
For more information, see:
Unity Profiler
Unity comes with a built-in profiler (see Unity’s Profiler manual). The Unity Profiler provides per-frame
performance metrics, which can be used to help identify bottlenecks.
PC Setup
To use Unity Profiler with a Rift application, select Development Build and Autoconnect Profiler in Build
Settings and build your application. When you launch your application, the Profiler will automatically open.
Mobile Setup
You may profile your application as it is running on your Android device using adb or Wi-Fi. For steps on
how to set up remote profiling for your device, please refer to the Android section of the following Unity
documentation: https://docs.unity3d.com/Documentation/Manual/Profiler.html.
The Unity Profiler displays CPU utilization for the following categories: Rendering, Scripts, Physics,
GarbageCollector, and Vsync. It also provides detailed information regarding Rendering Statistics, Memory
Usage (including a breakdown of per-object type memory usage), Audio and Physics Simulation statistics.
The Unity profiler only displays performance metrics for your application. If your app isn’t performing as
expected, you may need to gather information on what the entire system is doing.
In this mode, translucent colors will accumulate providing an overdraw “heat map” where more saturated
colors represent areas with the most overdraw.
To use this profiler, connect to your device over Wi-Fi using ADB over TCPIP as described in the Wireless usage
section of Android’s adb documentation. Then run adb logcat while the device is docked in the headset.
See Unity’s Measuring Performance with the Built-in Profiler for more information. For more on using adb and
logcat, see Android Debugging in the Mobile SDK documentation.
Compositor Mirror
The compositor mirror is an experimental tool for viewing exactly what appears in the headset, with
Asynchronous TimeWarp and distortion applied.
The compositor mirror is useful for development and troubleshooting without having to wear the headset.
Everything that appears in the headset will appear, including Oculus Home, Guardian boundaries, in-
game notifications, and transition fades. The compositor mirror is compatible with any game or experience,
regardless of whether it was developed using the native PC SDK or a game engine.
For more details, see the Compositor Mirror section of the PC SDK Guide.
The Remote Monitor client uses VrCapture, a low-overhead remote monitoring library. VrCapture is designed
to help debug behavior and performance issues in mobile VR applications. VrCapture is included automatically
in any project built with Unity 5 or later, or compiled with the Legacy Integration.
For more information on setup, configuration, and usage, please see VrCapture and Oculus Remote Monitor.
Event Tracing for Windows (ETW) is a trace utility provided by Windows for performance analysis. GPUView
provides a window into both GPU and CPU performance with DirectX applications. It is precise, has low
overhead, and covers the whole Windows system. It also supports custom event manifests.
ETW profiles the whole system, not just the GPU. For a sample debug workflow using ETW to investigate
queuing and system-level contention, see Example Workflow: PC below.
Systrace
NVIDIA NSight
NSight is a CPU/GPU debug tool for NVIDIA users, available in a Visual Studio version and an Eclipse version.
APITrace
https://apitrace.github.io/
Analyzing Slowdown
In this guide, we take a look at three of the areas commonly involved with slow application performance: pixel
fill, draw call overhead, and slow script execution.
Pixel Fill
Pixel fill is a function of overdraw and of fragment shader complexity. Unity shaders are often implemented
as multiple passes (draw diffuse part, draw specular part, and so forth). This can cause the same pixel to be
touched multiple times. Transparency does this as well. Your goal is to touch almost all pixels on the screen
only one time per frame.
Unity's Frame Debugger (described in Unity Profiling Tools on page 101) is very useful for getting a sense of
how your scene is drawn. Watch out for large sections of the screen that are drawn and then covered, or for
objects that are drawn multiple times (e.g., because they are touched by multiple lights).
Z-testing is faster than drawing a pixel. Unity does culling and opaque sorting via bounding box. Therefore,
large background objects (like your Skybox or ground plane) may end up being drawn first (because the
bounding box is large) and filling a lot of pixels that will not be visible. If you see this happen, you can move
those objects to the end of the queue manually. See Material.renderQueue in Unity's Scripting API Reference
for more information.
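As a rough sketch, a large background object could be pushed later in the opaque queue like this (the offset value is illustrative):

using UnityEngine;

// Draws this renderer after other opaque geometry so covered pixels fail the depth test
// instead of being shaded and then overdrawn.
public class DrawLateExample : MonoBehaviour
{
    void Start()
    {
        // Opaque geometry defaults to queue 2000; a higher value draws later.
        // Accessing .material instances the material, which is acceptable for a single
        // unique background object such as a ground plane.
        GetComponent<Renderer>().material.renderQueue = 2100;
    }
}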
Frame Debugger will clearly show you shadows, offscreen render targets, et cetera.
Draw Calls
Modern PC hardware can push a lot of draw calls at 90 fps, but the overhead of each call is still high enough
that you should try to reduce them. On mobile, draw call optimization is your primary scene optimization.
Draw call optimization is usually about batching multiple meshes together into a single VBO with the same
material. This is key in Unity because the state change related to selecting a new VBO is relatively slow. If you
select a single VBO and then draw different meshes out of it with multiple draw calls, only the first draw call is
slow.
Unity batches well when given properly formatted source data. Generally:
• Batching is only possible for objects that share the same material pointer.
• Batching doesn't work on objects that have multiple materials.
• Implicit state changes (e.g. lightmap index) can cause batching to end early.
• Use as few textures in the scene as possible. Fewer textures require fewer unique materials, so they are
easier to batch. Use texture atlases.
• Bake lightmaps at the largest atlas size possible. Fewer lightmaps require fewer material state changes. Gear
VR can push 4096 lightmaps without too much trouble, but watch your memory footprint.
• Be careful not to accidentally instance materials. Note that accessing Renderer.material automatically
creates an instance (!) and opts that object out of batching. Use Renderer.sharedMaterial instead
whenever possible (see the sketch after this list).
• Watch out for multi-pass shaders. Add noforwardadd to your shaders whenever you can to prevent more
than one directional light from applying. Multiple directional lights generally break batching.
• Mark all meshes that never move as Static in the editor. Note that this will cause the mesh to be combined
into a mega mesh at build time, which can increase load time and app size on disk, though usually not in a
material way. You can also create a static batch at runtime (e.g., after generating a procedural level out of
static parts) using StaticBatchingUtility (see the sketch after this list).
• Watch your static and dynamic batch count vs the total draw call count using the Profiler, internal profiler
log, or stats gizmo.
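A rough sketch of the two points flagged above, assuming a proceduralLevelRoot object that parents runtime-generated static geometry (the names are illustrative):

using UnityEngine;

public class BatchingExample : MonoBehaviour
{
    public GameObject proceduralLevelRoot; // illustrative: parent of static parts generated at runtime

    void Start()
    {
        // Read the shared material so this renderer keeps batching with others;
        // Renderer.material would silently create a per-object material instance.
        Material mat = GetComponent<Renderer>().sharedMaterial;
        Debug.Log("Batching with material: " + mat.name);

        // Combine the generated, non-moving geometry into a static batch at runtime.
        StaticBatchingUtility.Combine(proceduralLevelRoot);
    }
}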
Script Performance
Unity's C# implementation is fast, and slowdown from script is usually the result of a mistake and/or an
inadvertent block on slow external operations such as memory allocation. The Unity Profiler can help you find
and fix these scripts.
Try to avoid foreach, lambda, and LINQ structures, as these allocate memory needlessly at runtime. Use a for
loop instead. Also, be wary of loops that concatenate strings.
Game Object creation and destruction takes time. If you have a lot of objects to create and destroy (say, several
hundred in a frame), we recommend pooling them.
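A minimal pooling sketch along these lines (class and member names are illustrative):

using System.Collections.Generic;
using UnityEngine;

// Reuses prefab instances instead of calling Instantiate/Destroy repeatedly at runtime.
public class SimplePool : MonoBehaviour
{
    public GameObject prefab;
    private readonly Stack<GameObject> inactive = new Stack<GameObject>();

    public GameObject Spawn(Vector3 position)
    {
        GameObject go = inactive.Count > 0 ? inactive.Pop() : Instantiate(prefab);
        go.transform.position = position;
        go.SetActive(true);
        return go;
    }

    public void Despawn(GameObject go)
    {
        go.SetActive(false);
        inactive.Push(go);
    }
}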
Don't move colliders unless they have a rigidbody on them. Creating a rigidbody and setting isKinematic
will stop physics from doing anything but will make that collider cheap to move. This is because Unity maintains
two collider structures, a static tree and a dynamic tree, and the static tree has to be completely rebuilt every
time any static object moves.
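For example, a small sketch that prepares a moving collider this way (illustrative only):

using UnityEngine;

// Adds a kinematic Rigidbody so Unity treats this moving collider as dynamic,
// avoiding a rebuild of the static collider tree whenever it changes position.
public class MovingColliderExample : MonoBehaviour
{
    void Awake()
    {
        Rigidbody rb = gameObject.AddComponent<Rigidbody>();
        rb.isKinematic = true; // no simulated forces, just cheap movement
        rb.useGravity = false;
    }
}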
Note that coroutines execute in the main thread, and you can have multiple instances of the same coroutine
running on the same script.
We recommend targeting around 1-2 ms maximum for all Mono execution time.
PC Debug Workflow
In this guide, we’ll use the example of a hypothetical stuttering app scene and walk through basic
debugging steps.
Where to Start
Begin by running the scene with the Oculus Performance HUD.
If the scene drops more than one frame every five seconds, check the render time. If it’s more than 8 ms, have a
look at GPU utilization. Otherwise, look at optimizing CPU utilization. If observed latency is greater than 30 ms,
have a look at queuing.
If you find garbage collection spikes, don’t allocate memory each frame.
Check for hogs in your hierarchy or timeline view, such as any single object that takes 8 ms to render. The GPU
may also stall for long periods waiting on the CPU. Other potential problem areas are mesh rendering, shadows,
vsync, and subsystems.
Good performance is critical for all VR applications, but the limitations inherent to mobile development warrant
special consideration.
We recommend that you also review Design Guidelines and Mobile VR Design and Performance Guidelines in
the Mobile SDK documentation.
Design Considerations
Startup Sequence
For good VR experiences, all graphics should be rendered such that the user is always viewing a proper three-
dimensional stereoscopic image. Additionally, head-tracking must be maintained at all times.
An example of how to do this during application startup is demonstrated in the SDKExamples Startup_Sample
scene:
• A solid black splash image is shown for the minimum time possible.
• A small test scene with a 3D logo and a 3D rotating widget or progress meter is immediately loaded.
• While the small startup scene is active, the main scene is loaded in the background.
• Once the main scene is fully loaded, the startup scene transitions to the main scene using a fade.
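A minimal sketch of the background load, using Unity's SceneManager API (Unity 5.3 or later); the "MainScene" name and the fade hook are illustrative assumptions:

using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Runs inside the small startup scene.
public class StartupLoader : MonoBehaviour
{
    IEnumerator Start()
    {
        AsyncOperation load = SceneManager.LoadSceneAsync("MainScene");
        load.allowSceneActivation = false;

        // Keep rendering the 3D logo / progress widget while the main scene loads.
        // progress stops at 0.9 while activation is disallowed.
        while (load.progress < 0.9f)
        {
            yield return null;
        }

        // Trigger the fade here, then allow the main scene to activate.
        load.allowSceneActivation = true;
    }
}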
Universal Menu
An example of handling the universal menu is demonstrated in the SDKExamples GlobalMenu_Sample scene. For more
information about application menu options and access, see Universal Menu in the Mobile SDK documentation.
Best Practices
• Be Batch Friendly. Share materials and use a texture atlas when possible.
• Prefer lightmapped, static geometry.
• Prefer lightprobes instead of dynamic lighting for characters and moving objects.
• Bake as much detail into the textures as possible. E.g., specular reflections, ambient occlusion.
• Only render one view per eye. No shadow buffers, reflections, multi-camera setups, et cetera.
• Keep the number of rendering passes to a minimum. No dynamic lighting, no post effects, don't resolve
buffers, don’t use grabpass in a shader, et cetera.
• Avoid alpha tested / pixel discard transparency. Alpha-testing incurs a high performance overhead.
Replace with alpha-blended if possible.
• Keep alpha blended transparency to a minimum.
• Use Texture Compression. Favor ETC2.
• MSAA may be enabled on the Eye Render Textures.
Recommendations
• Be mindful of the total number of GameObjects and components your scenes use.
• Model your game data and objects efficiently. You will generally have plenty of memory.
• Minimize the number of objects that actually perform calculations in Update() or FixedUpdate().
• Reduce or eliminate physics simulations when they are not actually needed.
• Use object pools to respawn frequently-used effects or objects instead of allocating new ones at runtime.
• Use pooled AudioSources versus PlayOneShot sounds, as the latter allocate a GameObject and destroy it
when the sound is done playing.
• Avoid expensive mathematical operations whenever possible.
• Cache frequently-used components and transforms to avoid lookups each frame.
• Use the Unity Profiler to identify expensive code, per-frame memory allocations, and spikes in performance
during normal play.
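As an illustration of the component-caching recommendation above, a minimal sketch (the rotation and visibility check are placeholders for your own per-frame work):

using UnityEngine;

public class CachedReferences : MonoBehaviour
{
    Transform cachedTransform;
    Renderer cachedRenderer;

    void Awake()
    {
        // Resolve these once; GetComponent calls and repeated property lookups
        // in Update add up across many objects.
        cachedTransform = transform;
        cachedRenderer = GetComponent<Renderer>();
    }

    void Update()
    {
        cachedTransform.Rotate(0f, 45f * Time.deltaTime, 0f);

        if (!cachedRenderer.isVisible)
        {
            // Skip expensive work for off-screen objects.
        }
    }
}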
Rendering Optimization
Be conservative on performance from the start.
Unity provides several built-in features to help reduce draw calls such as batching and culling.
Static batching is used for objects that will not move, rotate or scale, and must be set explicitly per object. To
mark an object static, select the Static checkbox in the object Inspector.
Dynamic batching is used for moving objects and is applied automatically when objects meet certain criteria,
such as sharing the same material and not using real-time shadows or multi-pass shaders. More
information on dynamic batching criteria may be found here: https://docs.unity3d.com/Documentation/Manual/DrawCallBatching.html
Culling
Unity offers the ability to set manual per-layer culling distances on the camera via Per-Layer Cull Distance.
This may be useful for culling small objects that do not contribute to the scene when viewed from a given
distance. More information about how to set up culling distances may be found here:
https://docs.unity3d.com/Documentation/ScriptReference/Camera-layerCullDistances.html.
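A minimal sketch of setting per-layer cull distances from script (layer 8 and the 20-meter distance are illustrative assumptions; attach the script to the rendering camera):

using UnityEngine;

[RequireComponent(typeof(Camera))]
public class PerLayerCulling : MonoBehaviour
{
    void Start()
    {
        // One entry per layer; 0 means "use the camera far clip plane".
        float[] distances = new float[32];
        distances[8] = 20f;   // cull objects on layer 8 beyond 20 meters

        Camera cam = GetComponent<Camera>();
        cam.layerCullDistances = distances;
        cam.layerCullSpherical = true;   // measure distance radially rather than along the view axis
    }
}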
Unity also has an integrated Occlusion Culling system. The advice to early VR titles is to favor modest “scenes”
instead of “open worlds,” and Occlusion Culling may be overkill for modest scenes. More information about
the Occlusion Culling system can be found here: http://blogs.unity3d.com/2013/12/02/occlusion-culling-in-unity-4-3-the-basics/.
Verify model vert counts are mobile-friendly. Typically, assets from the Asset Store are high-fidelity and will
need tuning for mobile.
Unity provides a built-in Level of Detail (LOD) system (in Unity 4, this required Unity Pro), allowing lower-resolution
meshes to be displayed when an object is viewed from a certain distance. For more information on how
to set up a LODGroup for a model, see the following: https://docs.unity3d.com/Documentation/Manual/LevelOfDetail.html
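LOD levels can also be configured from script. A minimal two-level sketch (the high- and low-detail renderers are hypothetical fields assigned in the Inspector; setting this up directly in the LODGroup Inspector works just as well):

using UnityEngine;

public class LodSetup : MonoBehaviour
{
    public Renderer highDetail;
    public Renderer lowDetail;

    void Awake()
    {
        LODGroup group = gameObject.AddComponent<LODGroup>();

        // Each LOD entry stores the screen-relative height below which it is replaced.
        LOD[] lods = new LOD[]
        {
            new LOD(0.5f, new Renderer[] { highDetail }),
            new LOD(0.1f, new Renderer[] { lowDetail })
        };

        group.SetLODs(lods);
        group.RecalculateBounds();
    }
}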
Verify that your vertex shaders are mobile friendly and, when using built-in shaders, favor the Mobile or Unlit
version of the shader.
Bake as much detail into the textures as possible to reduce the computation per vertex, for example, baked
bumpmapping as demonstrated in the Shadowgun project: https://docs.unity3d.com/430/Documentation/Manual/iphone-PracticalRenderingOptimizations.html
Be mindful of GameObject counts when constructing your scenes. The more GameObjects and Renderers in
the scene, the more memory consumed and the longer it will take Unity to cull and render your scene.
Verify that your fragment shaders are mobile friendly and, when using built-in shaders, favor the Mobile or Unlit
version of the shader.
Overdraw: Objects in the Unity opaque queue are rendered in front to back order using depth-testing to
minimize overdraw. However, objects in the transparent queue are rendered in a back to front order without
depth testing and are subject to overdraw.
Avoid overlapping alpha-blended geometry (e.g., dense particle effects) and full-screen post processing
effects.
This tutorial is intended to serve as a basic introduction for developers who are new to VR development and to Unity.
Once the necessary tools are set up, the process should take a few hours to complete. By the end, you will
have a working application that you can play and demonstrate on your Oculus Rift or Gear VR device, to
the amazement of your friends and loved ones.
We will build and modify the Unity game Roll-a-ball to add VR capability. The game is controllable by keyboard
or by the Samsung EI-GP20 gamepad.
Requirements
• Oculus Rift or Gear VR with compatible Samsung phone
• Samsung EI-GP20 gamepad (required for Mobile; optional for Desktop)
• PC running Windows 7, 8 or 10, or a Mac running OS X 10
• Unity 4 (for version compatibility, see Compatibility and Requirements)
You will also need to refer to the relevant Oculus SDK documentation, available for download here:
https://developer.oculus.com/documentation/
1. Install the Oculus SDK.
Desktop: Download and install the Oculus PC SDK and Unity Integration from Oculus PC SDK Downloads.
Prepare for development as described in the Oculus Rift Getting Started Guide. By the time you have
completed this process, you should be able to run the Demo Scene as described in that guide.
Mobile: Download and install the Oculus Mobile SDK from Oculus Mobile SDK Downloads. Prepare for
development as described by the Device and Environment Setup Guide. By the time you have completed
this process, you should be able to communicate with your Samsung phone via USB. To verify this, retrieve
the device ID from your phone by connecting via USB and sending the command adb devices from a
command prompt. If you are communicating successfully, the phone will return its device ID. You may wish
to make a note of it - you will need it later to request an Oculus Signature File (see step four in Modify Roll-a-
ball for VR for more information).
2. Install Unity.
Check which version of the Unity editor you should download and install in our Compatibility and Version
Requirements on page 7, then download and install the appropriate version. Unity provides extensive
documentation to introduce users to the environment (http://docs.unity3d.com/Manual/index.html). You may wish
to begin by reviewing their documentation to gain a basic familiarity with core concepts such as the Editor,
GameObjects, prefabs, projects, and scenes.
3. Build the Unity Roll-a-ball application.
Unity provides a number of video tutorials that walk you through the process of creating a simple game. The
first in the series provides instructions for creating the Roll-a-ball application, in which you use the keyboard
or gamepad to control a ball that rolls around a game board and picks up floating token counters:
http://unity3d.com/learn/tutorials/projects/roll-a-ball
The development process is covered in eight short video tutorials which run from around five to fifteen
minutes in length. Allow for a few hours to complete the procedure.
The final video in the series, "107. Publishing the game," describes building the Roll-a-ball game for play
in a web browser. You may skip this lesson if you wish for the purposes of this exercise, as we will follow a
different procedure for building a playable application (PC/Mac) or APK (Android).
Note: We refer to the assets, folders, and so forth by the names used in the Unity tutorial, so it is
helpful to follow the names they use in their example.
4. Duplicate your Roll-a-ball project (optional).
Once you have completed building Roll-a-ball, you may wish to create a duplicate Roll-a-ball project
specifically for VR development. It can be useful to retain a backup of the original unmodified Roll-a-ball
project in case you make mistakes or wish to work with it later without the VR assets.
To duplicate the Roll-a-ball project, simply navigate in your OS to the Unity project folder containing your
Roll-a-ball project, copy the folder and all of its contents, and rename it. For this tutorial, we will use the
project folder name Roll-a-ball-VR.
5. Launch the new project and prepare the game scene.
1. Launch Unity and select File > Open Project... and select the project folder location for Roll-a-ball-VR in
order to launch the project.
2. In your Project tab, open Assets > _Scenes and select "MiniGame."
3. Press F2 and rename the scene "VRMiniGame."
4. Open the scene "VRMiniGame."
Modify Roll-a-ball for VR
1. Import the Unity Integration package.
In Unity, select Assets > Import Package > Custom Package.... Navigate to the folder where you have
installed the Oculus SDK.
Mobile SDK: open the UnityIntegration folder in VrUnity, select UnityIntegration.unityPackage, and click
Open.
This will open Unity's Import Package dialog box, which lists the assets included in the package. Leave all
boxes checked and select Import.
For more information on the contents of the integration package, see A Detailed Look at the Unity
Integration.
2. Replace Main Camera with OVRCameraRig.
We will replace the Roll-a-ball Main Camera with OVRCameraRig, an Oculus prefab VR camera included
with our Unity Integration. OVRCameraRig renders two stereoscopic images of the game scene with the
appropriate distortion.
Main Camera tracks the player ball, but we will modify our camera to overlook the game board from a fixed
position, so the player may look around at whatever angle they choose.
Rather than deleting the Main Camera, simply deselect it to make it inactive in the scene. Select Main
Camera in your Hierarchy view and uncheck the check box at the top left of the Inspector view.
Note: Only one camera should be enabled and rendering the scene at any given time.
In the Project view, open the OVR folder and select the Prefabs folder. Select OVRCameraRig and drag it
into the Hierarchy view to instantiate it.
3. Elevate OVRCameraRig above the game board.
Select OVRCameraRig in the Hierarchy view and set the Position fields of the OVRCameraRig Transform to
the following values: X = 0; Y = 10; Z = -15.
4. Rotate OVRCameraRig forward for a better view.
Set the Rotation field of the OVRCameraRig Transform to the following value: X = 35; Y = 0; Z = 0.
Enter Play mode by pressing the play button, and note that the Game view now shows the image rendered
in two distorted and stereoscopic views as illustrated below. If you are using the PC SDK, you will see the
Health and Safety Warning appear over the game; press any key to continue past it.
To access your Samsung phone's VR capabilities, you will need to sign your application with an Oculus
Signature File (osig). If you recorded your device ID earlier, you may use it now to request your osig file.
Note that you need only one osig per mobile device.
You may obtain an osig from our self-service portal here: https://dashboard.oculus.com/tools/osig-generator/.
Once you have received an osig, copy it to your Unity project folder in /Roll-a-ball-VR/Assets/
Plugins/Android/assets/.
More information on application signing may be found in Application Signing in our Mobile guide.
Play
Go ahead and try it out! You may use your keyboard or a paired Samsung gamepad to control your ball and
collect the game pickup objects.
Note: Because the GUIText display we built in the Roll-a-ball tutorial will not work with OVRCameraRig
without substantial modification, you will not see the score counter or "You win!" message when all the
pieces have been collected.
When you're ready to get into the trenches, find out which version of Unity we recommend on our
Compatibility and Requirements page, then download and install it. Next, build your own simple VR game by
following the instructions in our Tutorial: Build Your First VR App on page 22.
Question: What are the system requirements for Unity development for Oculus? What operating systems are
supported for Unity development?
Answer: For the most up-to-date information, see Unity Compatibility and Requirements. We currently support
Windows and OS X for development. The Oculus Rift requires Windows 7, 8 or 10.
Question: Which versions of Unity and the Oculus SDK should I use?
Answer: Our latest version recommendations may be found in our Unity Compatibility and Requirements
document. Be sure to check back regularly, as we update it frequently as new SDKs and Unity versions are
released. You can find an archive of information in our Unity-SDK Version Compatibility list.
Question: What other tools and resources do you provide to Unity developers?
Answer: To find the latest tools we provide, check out our Other Oculus Resources for Unity Developers on
page 30.
Question: What do I need to run Rift applications that I build with Unity?
Answer: You will need a compatible Windows PC, a Rift, and the Oculus software. For more details, see
Preparing for Rift Development on page 8.
Question: I want to focus on mobile development for the Samsung Gear VR. What do I need to do to get
started? Do I need to download the Oculus Mobile SDK?
Answer: The Android SDK is required for mobile development with Unity. However, most Unity developers
do not need to download the Oculus Mobile SDK, or to install Android Studio or NDK. For more details, see
Preparing for Mobile Development on page 9.
Question: Can I develop a single application for both Samsung Gear VR and the Oculus Rift?
Answer: Yes, but when developing for both Rift and mobile platforms, keep in mind that the requirements for
PC and mobile VR applications differ substantially. If you would like to generate builds for both PC and mobile
from a single project, it is important to follow the more stringent mobile development best practices, as well as
meeting the 90 fps requirement of the Rift.
Question: What is the difference between the Oculus Unity 4 Legacy Integration and the Oculus Utilities for
Unity? How do the differences between Unity 4 and 5 affect development?
Answer: All developers should use Unity 5 or later. The Unity 4 Integration is maintained for legacy purposes
only.
In Unity 4, you must import our Legacy Integration for Unity 4 unitypackage and add the supplied VR camera
and player controller to your application to add Rift or Gear VR support.
Question: Where can I ask questions or get help?
Answer: Visit our developer support forums at https://developer.oculus.com. Our Support Center can be
accessed at https://support.oculus.com.