API Overview

The Leap Motion system recognizes and tracks hands, fingers, and finger-like tools. The device operates at close range with high precision and a high tracking frame rate, and reports discrete positions, gestures, and motion.

The Leap Motion controller uses optical sensors and infrared light. The sensors are directed along the y-axis – upward when the controller is in its standard operating position – and have a field of view of about 150 degrees. The effective range of the Leap Motion Controller extends from approximately 25 to 600 millimeters above the device (1 inch to 2 feet).

https://di4564baj7skl.cloudfront.net/documentation/v2/images/Leap_View.jpg

The Leap Motion controller’s view of your hands

Detection and tracking work best when the controller has a clear, high-contrast view of an object’s silhouette. The Leap Motion software combines its sensor data with an internal model of the human hand to help cope with challenging tracking conditions.

Coordinate system

The Leap Motion system employs a right-handed Cartesian coordinate system. The origin is centered at the top of the Leap Motion Controller. The x- and z-axes lie in the horizontal plane, with the x-axis running parallel to the long edge of the device. The y-axis is vertical, with positive values increasing upwards (in contrast to the downward orientation of most computer graphics coordinate systems). The z-axis has positive values increasing toward the user.

https://di4564baj7skl.cloudfront.net/documentation/v2/images/Leap_Axes.png

The Leap Motion right-handed coordinate system.

The Leap Motion API measures physical quantities with the following units:

  • Distance: millimeters
  • Time: microseconds (unless otherwise noted)
  • Speed: millimeters/second
  • Angle: radians
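
To make these conventions concrete, here is a minimal polling sketch using the v2 C++ bindings. It prints a palm position in millimeters (right-handed axes as described above) and the frame timestamp in microseconds. The busy-wait is a simplification; a real application would receive frames through a Listener.

    #include <iostream>
    #include "Leap.h"

    int main() {
        Leap::Controller controller;
        while (!controller.isConnected()) {}  // simplification; use a Listener in practice

        const Leap::Frame frame = controller.frame();  // most recent tracking frame
        if (!frame.hands().isEmpty()) {
            const Leap::Vector palm = frame.hands()[0].palmPosition();
            // x: toward the user's right, y: up from the device, z: toward the user (mm)
            std::cout << "Palm at (" << palm.x << ", " << palm.y << ", " << palm.z << ") mm\n";
        }
        std::cout << "Timestamp: " << frame.timestamp() << " us\n";  // microseconds
        return 0;
    }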

Motion tracking data

As the Leap Motion controller tracks hands, fingers, and tools in its field of view, it provides updates as a set – or frame – of data. Each Frame object contains lists of the tracked entities, such as hands, fingers, and tools, as well as recognized gestures and factors describing the overall motion in the scene. The Frame object is essentially the root of the Leap Motion data model.
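
As a sketch of that structure (again using the v2 C++ bindings and the simplified polling pattern from above), the lists hanging off a Frame can be read directly:

    #include <iostream>
    #include "Leap.h"

    int main() {
        Leap::Controller controller;
        while (!controller.isConnected()) {}

        const Leap::Frame frame = controller.frame();  // latest frame (index 0)
        // Older frames remain available: controller.frame(1) is one step back in history.

        std::cout << "Frame "      << frame.id()
                  << ": hands="    << frame.hands().count()
                  << ", fingers="  << frame.fingers().count()
                  << ", tools="    << frame.tools().count()
                  << ", gestures=" << frame.gestures().count() << "\n";
        return 0;
    }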

To read more about Frames, see Frames.

Hands

The hand model provides information about the identity, position, and other characteristics of a detected hand, the arm to which the hand is attached, and lists of the fingers associated with the hand.

Hands are represented by the Hand class.

https://di4564baj7skl.cloudfront.net/documentation/v2/images/Leap_Palm_Vectors.png

The Hand PalmNormal and Direction vectors define the orientation of the hand.

The Leap Motion software uses an internal model of a human hand to provide predictive tracking even when parts of a hand are not visible. The hand model always provides positions for five fingers, although tracking is optimal when the silhouette of a hand and all its fingers are clearly visible. The software uses the visible parts of the hand, its internal model, and past observations to calculate the most likely positions of the parts that are not currently visible. Note that subtle movements of fingers tucked against the hand or shielded from the Leap Motion sensors are typically not detectable. A Hand.Confidence rating indicates how well the observed data fits the internal model.

More than two hands can appear in the hand list for a frame if more than one person’s hands or other hand-like objects are in view. However, we recommend keeping at most two hands in the Leap Motion Controller’s field of view for optimal motion tracking quality.
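
A short sketch of reading these per-hand properties with the v2 C++ bindings (Leap::Vector values can be streamed directly):

    #include <iostream>
    #include "Leap.h"

    int main() {
        Leap::Controller controller;
        while (!controller.isConnected()) {}

        const Leap::HandList hands = controller.frame().hands();
        for (int h = 0; h < hands.count(); ++h) {
            const Leap::Hand hand = hands[h];
            std::cout << (hand.isLeft() ? "Left" : "Right") << " hand"
                      << ", confidence: "  << hand.confidence()   // 0-1 fit against the hand model
                      << ", palm normal: " << hand.palmNormal()   // unit vector out of the palm
                      << ", direction: "   << hand.direction()    // palm center toward the fingers
                      << "\n";
        }
        return 0;
    }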

Arms

An Arm is a bone-like object that provides the orientation, length, width, and end points of an arm. When the elbow is not in view, the Leap Motion controller estimates its position based on past observations as well as typical human proportions.
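
A sketch of reading the arm attached to each hand (v2 C++ bindings, same polling pattern as above):

    #include <iostream>
    #include "Leap.h"

    int main() {
        Leap::Controller controller;
        while (!controller.isConnected()) {}

        const Leap::HandList hands = controller.frame().hands();
        for (int h = 0; h < hands.count(); ++h) {
            const Leap::Arm arm = hands[h].arm();
            std::cout << "Elbow: "   << arm.elbowPosition()  // estimated when out of view
                      << ", wrist: " << arm.wristPosition()
                      << ", width: " << arm.width() << " mm\n";
        }
        return 0;
    }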

Fingers

The Leap Motion controller provides information about each finger on a hand. If all or part of a finger is not visible, the finger characteristics are estimated based on recent observations and the anatomical model of the hand. Fingers are identified by type name: thumb, index, middle, ring, and pinky.

Fingers are represented by the Finger class, which is a kind of Pointable object.

https://di4564baj7skl.cloudfront.net/documentation/v2/images/Leap_Finger_Model.png

The Finger tipPosition and direction vectors provide the position of a finger tip and the general direction in which a finger is pointing.

A Finger object provides a Bone object describing the position and orientation of each anatomical finger bone. All fingers contain four bones ordered from base to tip.

The bones are identified as:

  • Metacarpal – the bone inside the hand connecting the finger to the wrist (except the thumb)
  • Proximal Phalanx – the bone at the base of the finger, connected to the palm
  • Intermediate Phalanx – the middle bone of the finger, between the tip and the base
  • Distal Phalanx – the terminal bone at the end of the finger

This model for the thumb does not quite match the standard anatomical naming system. A real thumb has one fewer bone than the other fingers. However, for ease of programming, the Leap Motion thumb model includes a zero-length metacarpal bone so that the thumb has the same number of bones, at the same indexes, as the other fingers. As a result, the thumb’s anatomical metacarpal is labeled as the proximal phalanx, and its anatomical proximal phalanx is labeled as the intermediate phalanx in the Leap Motion finger bone model.

(Original diagram by Marianna Villareal.)
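
The bone hierarchy can be walked with Finger.bone(), as in this sketch (v2 C++ bindings). Note that the loop still visits four bones for the thumb, whose metacarpal simply reports zero length:

    #include <iostream>
    #include "Leap.h"

    int main() {
        Leap::Controller controller;
        while (!controller.isConnected()) {}

        const Leap::FingerList fingers = controller.frame().fingers();
        for (int f = 0; f < fingers.count(); ++f) {
            const Leap::Finger finger = fingers[f];
            // Bone types run base to tip: TYPE_METACARPAL, TYPE_PROXIMAL,
            // TYPE_INTERMEDIATE, TYPE_DISTAL.
            for (int b = Leap::Bone::TYPE_METACARPAL; b <= Leap::Bone::TYPE_DISTAL; ++b) {
                const Leap::Bone bone = finger.bone(static_cast<Leap::Bone::Type>(b));
                std::cout << "Finger type " << finger.type()
                          << ", bone " << b
                          << ", length " << bone.length() << " mm\n";
            }
        }
        return 0;
    }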

Tools

A tool is an object like a pencil.

Tools are represented by the Tool class, which is a kind of Pointable object.

https://di4564baj7skl.cloudfront.net/documentation/v2/images/Leap_Tool.png

A tool is longer, thinner, and straighter than a finger.

Only thin, cylindrical objects are tracked as tools.

Note that as of version 2, tools are independent of hands.
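
Tools are read from the frame just like fingers, as in this sketch (v2 C++ bindings):

    #include <iostream>
    #include "Leap.h"

    int main() {
        Leap::Controller controller;
        while (!controller.isConnected()) {}

        const Leap::ToolList tools = controller.frame().tools();
        for (int t = 0; t < tools.count(); ++t) {
            const Leap::Tool tool = tools[t];
            std::cout << "Tool tip: "    << tool.tipPosition()
                      << ", direction: " << tool.direction()
                      << ", length: "    << tool.length() << " mm\n";
        }
        return 0;
    }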

Gestures

The Leap Motion software recognizes certain movement patterns as gestures that may indicate a user intent or command. Gestures are observed for each finger or tool individually. The Leap Motion software reports gestures observed in a frame in the same way that it reports other motion tracking data, such as fingers and hands.

Gestures are represented by the Gesture class and its subclasses, CircleGesture, KeyTapGesture, ScreenTapGesture, and SwipeGesture.

The following movement patterns are recognized by the Leap Motion software:

  • Circle – A finger tracing a circle.
  • Swipe – A long, linear movement of a hand and its fingers.
  • Key Tap – A tapping movement by a finger as if tapping a keyboard key.
  • Screen Tap – A tapping movement by the finger as if tapping a vertical computer screen.

Important: before using gestures in your application, you must enable recognition for each gesture type you intend to use. The Controller class provides the enableGesture() method for this purpose.
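
A minimal sketch of enabling recognition and polling the resulting gestures (v2 C++ bindings; only two of the four gesture types are enabled here):

    #include <iostream>
    #include "Leap.h"

    int main() {
        Leap::Controller controller;
        while (!controller.isConnected()) {}

        // Recognition is off by default; enable each type you intend to use.
        controller.enableGesture(Leap::Gesture::TYPE_CIRCLE);
        controller.enableGesture(Leap::Gesture::TYPE_SWIPE);

        const Leap::GestureList gestures = controller.frame().gestures();
        for (int g = 0; g < gestures.count(); ++g) {
            const Leap::Gesture gesture = gestures[g];
            switch (gesture.type()) {
                case Leap::Gesture::TYPE_CIRCLE: {
                    const Leap::CircleGesture circle = gesture;  // specialize to read circle data
                    std::cout << "Circle, progress: " << circle.progress() << " turns\n";
                    break;
                }
                case Leap::Gesture::TYPE_SWIPE: {
                    const Leap::SwipeGesture swipe = gesture;
                    std::cout << "Swipe, direction: " << swipe.direction() << "\n";
                    break;
                }
                default:
                    break;
            }
        }
        return 0;
    }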

Motions

Motions are estimates of the basic types of movements inherent in the change of a user’s hands over a period of time. Motions include scale, rotation, and translation (change in position).

https://di4564baj7skl.cloudfront.net/documentation/v2/images/Motion_Graphic.png

Motions are computed between two frames. You can get the motion factors for the scene as a whole from a Frame object. You can also get factors associated with a single hand from a Hand object.

You can use the reported motion factors to design interactions within your application. For example, instead of tracking the change in position of individual fingers across several frames of data, you could use the scale factor computed between two frames to let the user change the size of an object.

  • Scale – Frame scaling reflects the motion of scene objects toward or away from each other (for example, one hand moving closer to the other). Hand scaling reflects the change in finger spread.
  • Rotation – Frame rotation reflects differential movement of objects within the scene (for example, one hand moving up while the other moves down). Hand rotation reflects the change in orientation of a single hand.
  • Translation – Frame translation reflects the average change in position of all objects in the scene (for example, both hands moving left, up, or forward). Hand translation reflects the change in position of that hand.
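
A sketch of reading these factors with the v2 C++ bindings, comparing the current frame to one about ten frames earlier:

    #include <iostream>
    #include "Leap.h"

    int main() {
        Leap::Controller controller;
        while (!controller.isConnected()) {}

        const Leap::Frame frame = controller.frame();
        const Leap::Frame start = controller.frame(10);  // about ten frames earlier

        std::cout << "Scale: "         << frame.scaleFactor(start)    // >1 spreading, <1 contracting
                  << ", rotation: "    << frame.rotationAngle(start)  // radians
                  << ", translation: " << frame.translation(start)    // mm
                  << "\n";

        // The same factors exist per hand, e.g. frame.hands()[0].scaleFactor(start).
        return 0;
    }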

Sensor Images

Along with the computed tracking data, you can get the raw sensor images from the Leap Motion cameras.

https://di4564baj7skl.cloudfront.net/documentation/v2/images/Leap_Image_Raw.png

A raw sensor image with superimposed calibration points.

The image data contains the measured IR brightness values and the calibration data required to correct for the complex lens distortion. You can use the sensor images for augmented reality applications, especially when the Leap Motion hardware is mounted to a VR headset.
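
Access to the images is opt-in. A sketch, assuming the v2 image policy API (Controller::POLICY_IMAGES):

    #include <iostream>
    #include "Leap.h"

    int main() {
        Leap::Controller controller;
        controller.setPolicy(Leap::Controller::POLICY_IMAGES);  // opt in to raw images
        while (!controller.isConnected()) {}

        const Leap::ImageList images = controller.frame().images();
        if (images.count() > 0) {
            const Leap::Image image = images[0];             // one image per camera
            const unsigned char* brightness = image.data();  // 8-bit IR values, width*height bytes
            std::cout << "Image: " << image.width() << "x" << image.height()
                      << ", first pixel: " << static_cast<int>(brightness[0]) << "\n";
            // image.distortion() exposes the calibration map needed to undo lens distortion.
        }
        return 0;
    }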

For more information, see Camera Images.