Oculus Rift

From ETC Public Wiki

The Oculus Rift family (Oculus Rift and Oculus Rift S) is a virtual reality system that completely immerses you inside virtual worlds. It ships with two Touch controllers and, on the original Oculus Rift, two external tracking sensors.

Getting Started

Basic Setup

  1. Open your Unity Project.
  2. Go to Edit -> Project Settings.
  3. Go to Player -> PC Standalone -> XR Settings. Check the Virtual Reality Supported option, and make sure Oculus ranks first in the Virtual Reality SDKs list. It is highly recommended to remove OpenVR from the SDK list to avoid potential conflicts with SteamVR.
  4. Go to Other Settings section. Under Rendering change Color Space to Linear. Then under Configuration set Api Compatibility Level to .NET 4.x. Sometimes an editor restart is required.
  5. If you want to hide the splash image when starting your application, go to the Splash Image section and uncheck Show Splash Screen.
  6. If you want to hide the config window when starting your application, go to Resolution and Presentation and, under Standalone Player Options, set Display Resolution Dialog to Disabled or Hidden by Default.
  7. Download Oculus Integration plugin from Unity Asset Store.
  8. Import it into your project, and click "Yes"/"Upgrade" in the two dialogs titled [Update Oculus Utilities Plugin] and [Update Spatializers Plugin]. Restart Unity afterwards; skipping the restart can cause strange issues later.

Scene Setup

  1. After you finish the Basic Setup, delete [Main Camera] from the hierarchy.
  2. From your project window, select Assets -> Oculus -> VR -> Prefabs.
  3. Drag [OVRCameraRig] into your hierarchy.
  4. You are all set! If you need a more complete camera set with player movement control, use [OVRPlayerController] instead of [OVRCameraRig].

Example Scenes

Oculus examples are under Assets -> Oculus -> SampleFramework -> Usage. Check them out to get started with the basics of the Oculus platform.

Programming Information

Using Camera Rig

The Camera Rig includes several pivot points that you can attach game objects to, much like SteamVR's camera rig. If you want to move the camera rig, create an external object that wraps it and move that external object instead, as [OVRPlayerController] does. The transform of [OVRCameraRig] itself should not be modified by game code, because it is the tracking-space origin maintained by the Oculus Runtime.
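A minimal sketch of the wrapper approach described above (the class name and speed value are hypothetical; it assumes the script sits on a parent object that wraps [OVRCameraRig]):

```csharp
using UnityEngine;

// Hypothetical sketch: move the wrapper object, never the rig itself.
// Assumes a hierarchy like  PlayerRoot (this script) -> OVRCameraRig.
public class PlayerRootMover : MonoBehaviour
{
    public float speed = 2f;

    void Update()
    {
        // Read the left thumbstick and translate the wrapper object.
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
        Vector3 delta = new Vector3(stick.x, 0f, stick.y) * speed * Time.deltaTime;
        transform.Translate(delta, Space.World);
    }
}
```

[OVRPlayerController] implements a more complete version of this same idea, including collision via a CharacterController.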

The Camera Rig carries an [OVRManager] script that holds most of the settings. For the camera, you usually only need to work with [CenterEyeAnchor]; for the two controllers, look at [LeftHandAnchor] and [RightHandAnchor]. Oculus's hand-based interaction system is configured separately from these anchors, so in most cases you will never need to change them directly.
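A minimal sketch of reading those anchors at runtime (the class name is hypothetical; `centerEyeAnchor`, `leftHandAnchor`, and `rightHandAnchor` are fields exposed by the OVRCameraRig component):

```csharp
using UnityEngine;

// Hypothetical sketch: reading the rig's anchor transforms each frame.
public class AnchorReader : MonoBehaviour
{
    public OVRCameraRig rig;   // assign the [OVRCameraRig] in the Inspector

    void Update()
    {
        // Head pose from the center eye anchor.
        Vector3 headPos = rig.centerEyeAnchor.position;
        // Controller poses from the hand anchors.
        Vector3 leftHand = rig.leftHandAnchor.position;
        Vector3 rightHand = rig.rightHandAnchor.position;
        Debug.Log($"Head {headPos}  L {leftHand}  R {rightHand}");
    }
}
```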

To show a controller model in the scene, drag [OVRControllerPrefab] onto the hand anchor points; the tracked controllers will then appear.

In most cases you will never need to change these, but if you switch between scenes that include locomotion behaviours, you may need to check the Camera Rig's tracking and recenter options. The relevant options are:

  • Tracking Origin Type: How the origin of the tracking space is calculated. The default, Eye Level, makes the origin of [OVRCameraRig] the initial device/eye position. Floor Level instead places the origin of [OVRCameraRig] at the center of the floor; this means that if you want to place an object on the virtual floor, you can give it a y coordinate of zero inside the camera rig. On Oculus Rift S both modes work equally well, but on Oculus Rift CV1 the floor level is derived from the height you configured in the Oculus Runtime and may be inaccurate.
  • Reset Tracker on Load: Disabled by default, which means the center of the camera rig matches the center of the physical play area. Enable it if you want the camera rig's center in the virtual scene to map to wherever the player is standing when the scene loads. For example, if the player should always start in a virtual seat or cockpit, this option forces the rig center to map to the player's current real-world position.
  • Allow Recenter: If enabled, the player can reset the center of the camera rig with the Reset View button in the Oculus system menu.
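These options can also be set from code through [OVRManager]. A minimal sketch, assuming the `trackingOriginType` property of the current Oculus Integration (the class name is hypothetical):

```csharp
using UnityEngine;

// Hypothetical sketch: switching the tracking origin from code.
public class FloorLevelSetup : MonoBehaviour
{
    void Start()
    {
        // Place the rig origin on the floor, so y = 0 inside the rig
        // corresponds to the physical floor.
        OVRManager.instance.trackingOriginType = OVRManager.TrackingOrigin.FloorLevel;
    }
}
```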

Using Avatar

Avatar is a feature of the Oculus SDK that provides expressive facial geometry and hands linked to the player's Oculus account. Players can customize their avatars and present themselves in social games.

While Avatar is designed for social, networked games, it also provides a local avatar that works without networking. This gives you body and hand meshes automatically, which simplifies the development of Oculus games.

To create a local avatar in the scene, drag the [LocalAvatar] prefab into the scene. With [Show First Person] enabled (the default), you will see your hands and body immediately. Check out the SampleFramework examples to learn how to integrate avatars with other parts of the Oculus SDK.

Note: The avatar feature currently does not work on Oculus Rift S with the latest Unity plugin. We recommend avoiding this feature until it is fixed.

Using Oculus Touch

Getting Input

Getting input from Oculus Touch controllers is very straightforward. The code snippet below shows how it works:

// Getting trigger and grip axes
float leftHandTrigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger);
float rightHandGrip = OVRInput.Get(OVRInput.Axis1D.SecondaryHandTrigger);

// Getting thumbstick axis
Vector2 leftThumbstick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);

// Getting button input
bool isAPressed = OVRInput.Get(OVRInput.RawButton.A);
// One, Two, Three, Four are mapped to A, B, X, Y
bool isAPressedDown = OVRInput.GetDown(OVRInput.Button.One);
bool isThumbstickPressed = OVRInput.Get(OVRInput.Button.PrimaryThumbstick);

You can find the complete input mapping in Touch Input Mapping.
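In a game, these polling calls typically run once per frame. A minimal MonoBehaviour sketch (the class name and log messages are hypothetical):

```csharp
using UnityEngine;

// Hypothetical sketch: polling Touch input once per frame from Update().
public class TouchInputSample : MonoBehaviour
{
    void Update()
    {
        // Axes return a value in [0, 1] (or [-1, 1] per component for thumbsticks).
        float trigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger);
        if (trigger > 0.5f)
            Debug.Log("Left index trigger more than half pressed");

        // Get() reports the held state; GetDown() fires only on the
        // single frame the button transitions to pressed.
        if (OVRInput.GetDown(OVRInput.Button.One))
            Debug.Log("A pressed this frame");
    }
}
```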

Triggering Haptics

The code below demonstrates haptics. Note that `yield return` only works inside a coroutine, so the calls are wrapped in an IEnumerator method here:

IEnumerator RumbleRightHand()
{
    // Start haptics at maximum frequency and amplitude; both parameters range from 0 to 1
    OVRInput.SetControllerVibration(1, 1, OVRInput.Controller.RTouch);

    yield return new WaitForSeconds(0.5f);

    // Stop haptics
    OVRInput.SetControllerVibration(0, 0, OVRInput.Controller.RTouch);
}

Start it from any MonoBehaviour with StartCoroutine(RumbleRightHand()).

Using Hands

Oculus provides a hand-based interaction system built around grabbing with the grip button (hand trigger). It can also drive virtual hand models with individually posed fingers: the thumb maps to the surface buttons, the index finger to the index trigger, and the remaining fingers to the hand trigger.

The two basic components behind this system are [OVRGrabber] for hands/controllers and [OVRGrabbable] for objects. Both can be subclassed to override behaviours such as GrabBegin and GrabEnd. Check example scenes such as AvatarGrab and DistanceGrab to learn how to set up these components.
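As a sketch of subclassing [OVRGrabbable] (the class name and highlight behaviour are hypothetical; verify the virtual method signatures against your plugin version):

```csharp
using UnityEngine;

// Hypothetical sketch: a grabbable that changes color while held.
public class HighlightGrabbable : OVRGrabbable
{
    public Color heldColor = Color.yellow;
    private Color normalColor;
    private Renderer rend;

    protected override void Start()
    {
        base.Start();
        rend = GetComponent<Renderer>();
        normalColor = rend.material.color;
    }

    public override void GrabBegin(OVRGrabber hand, Collider grabPoint)
    {
        base.GrabBegin(hand, grabPoint);      // kinematic while held
        rend.material.color = heldColor;      // highlight while grabbed
    }

    public override void GrabEnd(Vector3 linearVelocity, Vector3 angularVelocity)
    {
        base.GrabEnd(linearVelocity, angularVelocity);  // applies throw velocity
        rend.material.color = normalColor;
    }
}
```

Always call the base implementations, since they handle the kinematic state and throw velocity.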

If you wish to use a customized hand model, the example scene [CustomHands] shows how to implement it.

Note: On Oculus Rift S, when you use avatars the avatar and controller models are not displayed, although interaction still works. Until this is fixed, we recommend attaching a controller model to the anchor points or using custom hands.

Stereo Rendering

Stereo rendering can cause problems for shaders that use render textures and the camera position. In most cases you need to modify the shader and related scripts manually so that the camera poses are correct for each eye. Please check Unity's reference and this post for more information.

Useful Links

  1. https://www.youtube.com/watch?v=r1kF0PhwQ8E&list=PLrk7hDwk64-Y7ELKfkw8ox8TaT9y3gNpS&index=7

Art (2D & 3D) Information

Currently there is no art information to know about the Oculus Rift.

Sound Information

Oculus provides a spatializer that can improve audio quality. You can learn more about this feature and guidelines for spatial audio at this link.

Design Guidelines & Tips

  • Oculus Touch is designed to act like a hand and supports virtual hand gestures in the input module. Prefer these gestures over the X/Y and A/B buttons where possible.
  • For the Oculus Rift family, floor calibration is not as accurate as Vive's Lighthouse. Take special care when designing interactions with objects at floor level.
  • For Oculus Rift CV1 with a two-sensor setup, the headset is tracked correctly only while at least one sensor can see it.
  • An Oculus Touch controller is tracked only while it is in the headset's camera view (Oculus Rift S) or visible to a sensor (Oculus Rift CV1).

Example Projects

Currently there are no example projects for the Oculus Rift.

Previous ETC Projects

For previous projects that use Leap Motion with Oculus, please check the Leap Motion page.

  1. https://www.youtube.com/watch?v=Xj1we7ZGdiI
  2. https://www.youtube.com/watch?v=SVxwAUZltbw
  3. https://www.youtube.com/watch?v=5BROfIE7Su8
  4. https://www.youtube.com/watch?v=9kDKViIlSg8
  5. https://www.youtube.com/watch?v=cSnc4Qp19Cc
  6. https://www.youtube.com/watch?v=Xe7gFQItzFw

BVW

Currently there is no BVW-specific information to know about the Oculus Rift.