Experiment #2: Telekinetic Powers

While the first experiment handled summoning and dismissing one static object along a predetermined path, we also wanted to explore summoning dynamic physics-enabled objects. What if we could launch the object towards the user, having it land either in their hand or simply within their direct manipulation range? This execution drew inspiration from Force-pulling, Magneto yanking guns out of his enemies’ hands, wizards disarming each other, and many other fantasy powers seen in modern media.

In this experiment, the summonable objects were physics-enabled. This means that instead of sitting at eye level, like on a shelf, they were most likely resting on the ground. To make selecting them a lower-effort task, we changed the selection-mode hand pose from an overhanded, open, palm-facing-the-target pose to a more relaxed open, palm-facing-up pose with the fingers pointed toward the target.
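
As a rough illustration (not Leap Motion's actual API), a palm-up selection pose like this could be detected from two tracked unit vectors, the palm normal and the palm direction. The `Hand` fields and the 35° tolerance below are assumptions, not values from the experiment:

```python
import numpy as np

# Hypothetical hand-tracking data: unit vectors in world space.
# palm_normal points out of the palm; palm_direction points from
# the wrist toward the fingers.
class Hand:
    def __init__(self, palm_normal, palm_direction):
        self.palm_normal = np.asarray(palm_normal, dtype=float)
        self.palm_direction = np.asarray(palm_direction, dtype=float)

WORLD_UP = np.array([0.0, 1.0, 0.0])

def is_palm_up_selection_pose(hand, to_target, angle_deg=35.0):
    """Palm roughly facing up, fingers roughly pointed at the target."""
    cos_limit = np.cos(np.radians(angle_deg))
    to_target = to_target / np.linalg.norm(to_target)
    palm_up = np.dot(hand.palm_normal, WORLD_UP) > cos_limit
    pointing = np.dot(hand.palm_direction, to_target) > cos_limit
    return palm_up and pointing
```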

To allow for quicker, more dynamic summoning, we decided to condense hovering and selecting into one action. We kept the same underlying raycast selection method and simply removed the need for a separate selection gesture. Keeping the same finger-curling summon gesture meant the user could quickly select and summon an object by pointing toward it with an open, upward-facing palm, then curling the fingers.
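
A minimal sketch of that condensed loop might look like the following, assuming a hypothetical `scene.raycast()` helper, per-finger curl values in [0, 1] from the tracker, and a curl threshold tuned by feel:

```python
CURL_THRESHOLD = 0.7  # assumed value, tuned by feel in practice

def update_summon(hand, scene, curl_threshold=CURL_THRESHOLD):
    """Hover whatever the open, upward-facing palm points at; curling
    the fingers both selects and summons it in a single step."""
    hovered = scene.raycast(origin=hand.palm_position,
                            direction=hand.palm_direction)
    mean_curl = sum(hand.finger_curls) / len(hand.finger_curls)
    if hovered is not None and mean_curl > curl_threshold:
        hovered.summon()  # hypothetical hook; see the launch sketch below
    return hovered
```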

Originally, we used the hand as the target for the ballistics calculation that launched a summoned object towards the user. This felt interesting, but having the object always land perfectly in one's hand felt less physics-based and more like the animated summon from the first experiment. To counter this, we changed the target to a point offset in front of the user and added a slight random torque to the object to simulate an explosive launch. Adding a small shockwave and a point light at the launch point, as well as having each object's current speed drive its emission, completed the explosive effect.
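
For the launch itself, the standard closed-form ballistic solve works: pick a landing point offset in front of the user, choose a flight time, and solve the projectile equation for the initial velocity. The sketch below is an assumption about how this could be done, using hypothetical rigid-body fields (`position`, `velocity`, `angular_velocity`) and a fixed flight time:

```python
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])  # m/s^2

def launch_velocity(start, target, flight_time=0.8):
    """Initial velocity so a projectile launched at `start` lands at
    `target` after `flight_time` seconds under gravity:
        target = start + v0*t + 0.5*g*t^2
        =>  v0 = (target - start - 0.5*g*t^2) / t
    """
    start, target = np.asarray(start, float), np.asarray(target, float)
    return (target - start - 0.5 * GRAVITY * flight_time**2) / flight_time

def launch_toward_user(obj, user_position, user_forward,
                       offset=0.6, torque_scale=2.0, rng=np.random):
    """Aim at a point slightly in front of the user (not the hand itself)
    and add a small random spin so the launch reads as an explosive pop."""
    target = np.asarray(user_position) + offset * np.asarray(user_forward)
    obj.velocity = launch_velocity(obj.position, target)
    obj.angular_velocity = rng.uniform(-1.0, 1.0, size=3) * torque_scale
```

A fixed flight time keeps the arc feeling consistent regardless of distance; scaling it with distance would be the obvious alternative.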

Since the interaction had been condensed so much, it was possible to summon one object after another in quick succession before the first had even landed.

Users could even summon an object again in mid-air, while it was still flying towards them.

This experiment was successful in feeling far more dynamic and physics-based than the animated summon. Condensing the stages of interaction made it feel more casual, and the added variation provided by enabling physics made it more playful and fun. One byproduct of this variation was that objects would occasionally land just out of reach, but simply summoning them again would bring them close enough. While we still used a gesture to summon, this method felt much more physically based than the previous one.

Continued on Page 3: Extendo Hands! »

  • Great interaction methods here. I also think that more needs to be done to reduce arm fatigue, so all the above could be done with arms at rest and just finger/wrist flicks, with no forearm movement. In addition to that, using both hands simultaneously working together for a single interaction, or multi-tasking with separate but simultaneous actions.

  • SoftBody

    Leap Motion should go into software development instead. Their hand-tracking/interaction research is going to waste if nobody can use it because it's tied to their hardware.

    Right now, hand-tracking is a novelty. But soon it will be an assumed feature. People will go with whatever's cheapest and most accessible. Their tech will simply be left behind.

    • They already are, in part. Their SDKs work even without Leap Motion hardware, they told me. You can use their input system even with your regular Vive or Rift.

  • This series is damn interesting

  • Lucidfeuer

    Are Leap Motion the only hardware company whose work makes sense? This is exactly the work to be done in VR/AR, and a hundred times more of it.

    The whole industry's set-up and future was compromised from the day Oculus vaporwared all their common-sense plans to integrate NimbleBit hand-tracking and 13thLab inside-out tracking from Gen1, when they were bought out by the mediocre, greedy blue-collars of Facebook, cost and risk managers who decided that making a crap, limited product was a good idea to make money…

  • Yoan Conet

    Glad someone finally did it! Thanks Leap Motion.
    Imagine grabbing your enemies at a distance, releasing them in mid-air, and then suddenly stopping them like puppets!
    To feel real when grabbing them, at the start everything but their center of gravity (legs, arms, head) would have to try to keep its original position, like in real life.
    And when suddenly stopping the flying enemy, everything but their center of gravity would keep its momentum, so the enemy would move like a puppet! Haha

  • No Spam

    How much do you want to bet that there’s an internal demo at Leap that uses extendo-hands to simulate Vader’s Force choke?

    “VR is a fad? I find your lack of faith…disturbing.”

  • M Stauffer

    This is very cool! Is the code available? I'd like to try it in an upcoming academic data visualization project, rather than spending the time coding it myself, of course.