Leap Motion is a depth-sensing controller used for hand tracking. The device can optionally be mounted on an HMD for hand tracking on VR/AR platforms.
- 1 Getting Started
- 2 Programming Information
- 3 Art (2D & 3D) Information
- 4 Design Guidelines & Tips
- 5 Example Projects
- 6 Previous ETC Projects
- 7 BVW
Getting Started
- Download and install the Orion SDK.
- If using the device with an Oculus Rift or HTC Vive, follow the setup instructions on the website to mount the Leap Motion on the headset. (For BVW classes, TAs will do this for you.)
- Go to the Unity Assets page for Leap Motion (you will need a developer account to download the files) and download the Core Unity Assets package. If you need a high-level interaction system (e.g. grab, attach), also download the Interaction Engine module from that page. If you want to use and rig your own hand model, also download the Hands module from that page.
- Import all the packages you have downloaded. The Leap Motion package will present a window with the required scene checks; fix anything it flags, then close the window.
- Go to Assets -> LeapMotion -> Core -> Prefabs and drag [LeapHandController] into the scene. One script reference may be missing; removing it is fine.
- Drag the two hand model prefabs (Left and Right, under Assets -> LeapMotion -> Core -> Prefabs) onto [LeapHandController], set the Model Pool size to 1, and configure the pool entry with these hand models.
The transform of the [LeapHandController] is where your hands will be mapped into the virtual scene.
- Enable [Virtual Reality Supported] under Project Settings -> Player -> PC -> XR Settings. Remove OpenVR from the SDK list if you are using Oculus, or Oculus if you are using SteamVR.
- Remove [Main Camera] from the scene.
- Drag [Leap Rig] (under Assets -> LeapMotion -> Core -> Prefabs) into the scene.
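Once the rig is set up, hand data can be read from the provider in a script. A minimal sketch, assuming the Core Unity Assets are imported; the class and property names here follow the Leap Unity Core API, but exact names can vary between SDK versions:

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Sketch: logs tracked hand data each frame from the LeapProvider in the scene
// (e.g. the one on [LeapHandController] or [Leap Rig]).
public class HandLogger : MonoBehaviour
{
    [SerializeField] private LeapProvider provider; // assign in the Inspector

    void Update()
    {
        Frame frame = provider.CurrentFrame;
        foreach (Hand hand in frame.Hands)
        {
            string side = hand.IsLeft ? "Left" : "Right";
            // GrabStrength ranges from 0 (open hand) to 1 (closed fist)
            Debug.Log(side + " hand, grab strength: " + hand.GrabStrength);
        }
    }
}
```

This runs inside Unity only; attach it to any GameObject and drag the scene's provider into the `provider` field.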
Programming Information
The documentation for the Leap Motion Unity Modules does a good job of explaining the intricacies of using the SDK, particularly for the Interaction Engine.
Configuring Hand Models
The Hands module provides a tool to bind Leap Motion scripts to a rigged hand model. You can check the Hands module doc to learn more about how to do so.
Detection Utilities
The Detection Utilities are a deprecated (but still useful) Leap Motion feature that provides finger-based gestures and low-level gesture detection through Detector scripts. You can subscribe to the OnActivate and OnDeactivate events on detectors to receive gesture input.
An earlier version of the example scenes can be found at this link. The object hierarchy is slightly different in that early version, but the basic idea remains the same. You can refer to the doc here to learn more about this feature.
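Subscribing to a detector can be sketched as follows. This assumes a Detector-derived component (such as an ExtendedFingerDetector) has been configured in the Inspector; OnActivate/OnDeactivate are UnityEvents per the Detection Utilities docs, though the API may differ in other SDK versions:

```csharp
using Leap.Unity;
using UnityEngine;

// Sketch: reacting to a detector's activate/deactivate events.
public class GestureHandler : MonoBehaviour
{
    [SerializeField] private Detector detector; // e.g. an ExtendedFingerDetector

    void OnEnable()
    {
        detector.OnActivate.AddListener(HandleActivate);
        detector.OnDeactivate.AddListener(HandleDeactivate);
    }

    void OnDisable()
    {
        detector.OnActivate.RemoveListener(HandleActivate);
        detector.OnDeactivate.RemoveListener(HandleDeactivate);
    }

    void HandleActivate()   { Debug.Log("Gesture started"); }
    void HandleDeactivate() { Debug.Log("Gesture ended"); }
}
```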
Interaction Engine
The Interaction Engine provides high-level interaction with physical or pseudo-physical objects in your scene, handling hover, touch, and grasp gestures. To use this module, drag the [Interaction Manager] prefab from Assets -> LeapMotion -> Modules -> InteractionEngine -> Prefabs into the scene.
To make an object interactable, add an [InteractionBehaviour] component to it and register Hover/Contact/Grasp callbacks on that component. Note that all interaction behaviours in Leap Motion use Unity physics; every object meant to be interacted with by the hands must have a Rigidbody.
You can learn more about this module from the official doc.
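Registering the callbacks described above can be sketched as follows, assuming the Interaction Engine module is imported and an [Interaction Manager] is in the scene; the event names follow the Interaction Engine documentation:

```csharp
using Leap.Unity.Interaction;
using UnityEngine;

// Sketch: responding to grasp and contact events on an interactable object.
// The object also needs a Rigidbody, since the Interaction Engine uses Unity physics.
[RequireComponent(typeof(InteractionBehaviour))]
public class GraspResponder : MonoBehaviour
{
    private InteractionBehaviour _intObj;

    void Awake()
    {
        _intObj = GetComponent<InteractionBehaviour>();
        _intObj.OnGraspBegin   += () => Debug.Log("Grasped");
        _intObj.OnGraspEnd     += () => Debug.Log("Released");
        _intObj.OnContactBegin += () => Debug.Log("Touched");
    }
}
```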
Art (2D & 3D) Information
Leap Motion has a few default hand representations -- Capsule Hands, Rigged Hands, and Attachment Hands. You can also rig your own hand models using the Hands Module.
Hand representation has a large effect on the perceived affordance of hands in VR. If you choose to use a custom hand representation, think carefully about what abilities are conveyed intrinsically by the visual representation of your hands.
Design Guidelines & Tips
- The Leap Motion's field of view (FOV) is 150 degrees wide and 120 degrees deep (averaging 135 degrees). In most cases you will not need to worry about this.
- The valid range for hand detection is about 25 mm to 600 mm (1 inch to 2 feet) above the device.
- Turning your palms toward the sensor helps Leap Motion recognize your hands quickly.
- Leap Motion tracks each finger accurately and offers general gesture support through the Interaction Engine. It is always a good idea to give clear feedback for hand actions (either physically or in the virtual world).
- While the Leap Motion tracker is fairly robust, you should always be prepared for loss of hand tracking and inaccurate hand poses, particularly when the hands are placed together. Displaying clear feedback about the state of the tracking will be essential to maintaining a responsive experience.
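Tracking-loss feedback can be as simple as toggling a hint when no hands are visible. A minimal sketch, assuming the Core Assets are imported; `trackingLostHint` is a hypothetical UI object you would create yourself:

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Sketch: show a hint whenever no hands are tracked, so players know to
// bring their hands back into the sensor's field of view.
public class TrackingFeedback : MonoBehaviour
{
    [SerializeField] private LeapProvider provider;       // the scene's provider
    [SerializeField] private GameObject trackingLostHint; // hypothetical UI element

    void Update()
    {
        bool anyHands = provider.CurrentFrame.Hands.Count > 0;
        trackingLostHint.SetActive(!anyHands);
    }
}
```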
Example Projects
Currently there are no example projects for Leap Motion.
Previous ETC Projects
BVW
Currently there is no BVW-specific information about Leap Motion.