Leap Motion’s ‘Interaction Engine’ Aims for Effortless VR Input


Leap Motion continues to refine their hand-tracking tech for intuitive controller-free interactivity. The company’s latest focus has been on an ‘Interaction Engine’ which supersedes the standard physics engine when it comes to defining interactions between user input and virtual objects.

Leap Motion’s Caleb Kruse demonstrated the company’s work on the Interaction Engine, which is said to form the foundation of intuitive and accurate interactions with a variety of objects. With that foundation in place, developers can focus on creating useful experiences rather than having to work out the best way to program interactions from the ground up. The Interaction Engine is still internal at Leap Motion, but the company tells us that they plan to release it widely to developers in the near future.

The Interaction Engine is a sort of intermediary between the user’s input and the physics engine, Kruse told us. Left to physics alone, grabbing an object too tightly might cause it to fly out of your hand as your fingers phase through it. The Interaction Engine, on the other hand, tries to establish your intent (like grabbing, throwing, or pushing) based on what the Leap Motion tracker knows about your hand movements, rather than treating your hand in VR like any other object in the physics simulation.
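The intent-based approach described above can be sketched in a few lines. This is purely illustrative and not Leap Motion’s actual API or algorithm; the `HandState` fields and thresholds are assumptions, loosely modeled on the kinds of per-hand signals (pinch strength, grab strength, palm velocity) that hand trackers typically expose.

```python
# Illustrative sketch: deciding user intent from tracked hand signals
# instead of simulating fingers as rigid colliders in the physics engine.
from dataclasses import dataclass

@dataclass
class HandState:
    pinch_strength: float   # 0.0 (open) .. 1.0 (fully pinched)
    grab_strength: float    # 0.0 (open) .. 1.0 (closed fist)
    palm_velocity: float    # magnitude of palm movement, m/s

def classify_intent(hand: HandState, holding: bool) -> str:
    """Map raw tracking signals to a high-level interaction intent."""
    if holding:
        # A fast palm movement as the hand opens reads as a throw,
        # rather than letting physics eject the object from the hand.
        if hand.grab_strength < 0.3 and hand.palm_velocity > 1.5:
            return "throw"
        if hand.grab_strength < 0.3:
            return "release"
        return "hold"   # keep the object attached even if fingers clip it
    if hand.grab_strength > 0.7 or hand.pinch_strength > 0.8:
        return "grab"
    if hand.palm_velocity > 0.5:
        return "push"
    return "idle"
```

The key design point is the `"hold"` branch: once an intent to grab is established, the object stays attached until the hand clearly opens, which is how squeezing “too tightly” can be prevented from launching the object out of your hand.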

The result is more intuitive and consistent control when interacting with objects in VR—something that’s been a major hurdle for Leap Motion’s computer-vision based input. Now it’s easier and more predictable to grab, throw, and push objects.


While developing the Interaction Engine, Leap wanted to be able to quantify the efficacy of their hand input, so they created a simple demo task in VR where users reach out to grab a highlighted ball and place it in a randomly indicated position. Through testing hundreds of users, Kruse said the company found people to be around 96% accurate in this task when using the Interaction Engine.

See Also: Phase Between the Real and Virtual World With Leap Motion and a Swipe of Your Hand

Another demo which utilized the Interaction Engine allows you to create cubes of varying sizes by pinching your thumb and index finger together to form a recognizable gesture. Then, when moving your hands close together, the outline of a cube forms and you can move your hands back and forth (like a pinch zoom) to set your desired scale.
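The two-handed scaling gesture described above can be sketched simply: a pinch on each hand starts the gesture, and the distance between the two pinch points sets the cube’s edge length, like a 3D pinch-zoom. The function names, thresholds, and clamp range here are hypothetical, not taken from Leap Motion’s demo code.

```python
# Hypothetical sketch of the two-handed cube-scaling gesture:
# pinch with both hands, then the distance between pinch points
# sets the edge length of the previewed cube.

def is_pinching(pinch_strength: float, threshold: float = 0.8) -> bool:
    """A pinch 'gesture' begins once thumb-index pinch strength is high enough."""
    return pinch_strength >= threshold

def cube_scale(left_pinch_pos, right_pinch_pos,
               min_edge: float = 0.02, max_edge: float = 0.5) -> float:
    """Edge length (meters) of the preview cube: the distance between the
    two pinch points, clamped to a usable range."""
    dist = sum((a - b) ** 2 for a, b in zip(left_pinch_pos, right_pinch_pos)) ** 0.5
    return max(min_edge, min(max_edge, dist))
```

Clamping matters in practice: without a minimum edge length, momentary tracking jitter when the hands nearly touch would collapse the cube to nothing.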

When I tried these demos myself, I noted how the system was impressively able to understand that I was still holding objects even when I occluded my fingers with the back of my hand. The cube demo was fun and easy to use (especially with gravity turned off), and while I wasn’t quite as adept as Kruse in manipulating objects, his skills are a demonstration that it’s possible to get better at using the system over time (which means, by necessity, there’s a vital aspect of consistency to the system).

Grasping virtual objects which have no physical representation is still a strange affair, but the Interaction Engine definitely enhances predictability and consistency in object interactions, which is incredibly important for the practicality of any input method.

This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. See here for more information.

  • Super Game-guru


  • Zach Gray

    I tried working with leap and the UE4 plugin+dk2. The solve on the hands was so poor as to be unusable. Fingers crossing, getting stuck, not able to determine front/back of hand, etc. I see they are using pass through. Leap is a great idea, but I didn’t find it stable enough to make any substantive progress.

    • teknx

      How long ago was this? Aren’t they constantly putting out software updates?

      • Zach Gray

        I used the most recent version of the community version here: https://github.com/getnamo/leap-ue4. I had trouble with the official one working in HMD mode, but perhaps it’s time to try that again.

    • Same here.

    • Mickaël Fourgeaud

      Been working on an update to the UE4 unofficial plugin (from getnamo); so far it seems a bit more accurate and faster than the original version, but there’s definitely room for improvement on the rigging of the bones.

    • Yah, did they recently improve the hardware in the last 6 months or is he in some sort of “perfect-lighting” scenario? Because when I used UE4 and Unity demos that accessed the Leap mounted on my DK2, the results ranged from terrible to unusable. It lost my hands and fingers constantly.


    This looks quite promising. The Interaction Engine. Impressive. Kudos to Leap Motion for showing that there is still lots of room for VR innovation.

  • Chris Blackburn

    Weren’t they working on a new sensor specifically for VR called the Dragonfly sensor or something? I’d love to see that since it was supposed to be even more advanced.

  • Andrew Jakobs

    Is this still working on the original Leap Motion, or is this with a newer version with better sensors.. Well, looking forward to the latest SDK.. hehe.. The Leap Motion has become so much better over the time since I bought it for around $30 on Amazon.. Yes, it sucked as the block behind your keyboard, but it excels at being stuck to the front of the DK2 (or later the CV1).. Hmm, which reminds me, will the CV1 have a USB passthrough that this time IS actually useful…

  • Martin

    Wooot! I’d really like to do that as well – where is the download button!?!

  • DrN00b

    I am very curious whether a smartphone version would be possible, considering it already has a rear-facing camera. Especially since most people have smartphones, and considering the Oculus price tag.

  • Trailmix

    Is this only compatible with the Oculus? Or does it work with the Vive and maybe the PlayStation VR as well?