The Leap Motion Controller tracks hands and fingers and reports position, velocity, and orientation with low latency and good accuracy. The controller can be mounted on a VR headset or used on a tabletop.
The Leap Motion Controller system consists of a hardware device and a software component that runs as a service or daemon on the host computer. The software component analyses images produced by the hardware and sends tracking information to applications. The Leap Motion Unity plugin connects to this service to get data. Scripts included with the plugin translate Leap Motion coordinates to the Unity coordinate system. These scripts and additional graphic assets make it easy to add 3D, motion-controlled hands to a Unity scene.
Unity3D uses a left-handed convention for its coordinate system, whereas the Leap Motion API uses a right-handed convention. (Essentially, the z-axis points in the opposite direction.) Unity also uses a default unit of meters, whereas the Leap Motion API uses millimeters. The plugin scripts internally transform the tracking data to use the left-handed coordinate system and scale distance values to meters.
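Conceptually, the conversion the plugin performs for a position vector amounts to flipping the z-axis and scaling millimeters to meters. The following is a minimal sketch of that math in Python (for illustration only; `leap_to_unity` is a hypothetical helper, not part of the plugin API, and the real scripts also apply the provider's rotation and translation):

```python
def leap_to_unity(position_mm):
    """Convert a Leap Motion position (right-handed, millimeters)
    to Unity conventions (left-handed, meters).

    Flipping the sign of z switches handedness; multiplying by
    0.001 converts millimeters to meters.
    """
    x, y, z = position_mm
    return (x * 0.001, y * 0.001, -z * 0.001)


# A point 100 mm right, 200 mm up, 50 mm toward the user in Leap
# coordinates becomes (0.1, 0.2, -0.05) in Unity coordinates.
print(leap_to_unity((100.0, 200.0, 50.0)))
```

In practice you should not apply this conversion yourself; as noted below, frames obtained from the LeapServiceProvider are already transformed.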
Note: When working in Unity, always get Frame objects from the LeapServiceProvider. Otherwise, the data in the frame will still be in the native Leap coordinate system – not the Unity coordinate system. The LeapServiceProvider performs all the necessary scaling, rotation, and translation.
The Leap Motion Controller uses optical sensors and infrared light. The sensors have a field of view of about 150 degrees. The effective range of the Leap Motion Controller extends from approximately 0.03 to 0.6 meters above the device (1 inch to 2 feet).
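To make these figures concrete, a rough geometric test for whether a point falls inside the nominal tracking zone can be sketched as follows. This is an illustrative approximation only (`in_interaction_zone` is a hypothetical helper; it models the zone as a simple cone above the device, while the real tracking volume is more irregular):

```python
import math

def in_interaction_zone(x, y, z,
                        min_range=0.03,   # ~1 inch above the device
                        max_range=0.6,    # ~2 feet above the device
                        fov_deg=150.0):   # approximate field of view
    """Rough check: is the point (in meters, device at the origin,
    y pointing up) inside the controller's nominal tracking zone,
    modeled as a cone of the given field of view?"""
    dist = math.sqrt(x * x + y * y + z * z)
    if not (min_range <= dist <= max_range):
        return False
    if y <= 0.0:
        return False  # below the device plane
    # Angle between the point and the device's vertical axis
    angle = math.degrees(math.acos(y / dist))
    return angle <= fov_deg / 2.0


print(in_interaction_zone(0.0, 0.3, 0.0))   # directly above, mid-range
print(in_interaction_zone(0.0, 1.0, 0.0))   # too far above
print(in_interaction_zone(0.0, 0.02, 0.0))  # too close
```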
Detection and tracking work best when the controller has a clear, high-contrast, silhouette view of the hands and fingers. The Leap Motion software combines its sensor data with an internal model of the human hand to help cope with challenging tracking conditions.