Because VR can take over our entire reality, it can be great for entertainment. With AR, however, the hope is that the tech will be a transient and beneficial addition to reality rather than a complete takeover of your world. Figuring out how that works means first understanding how we can interact with AR information at a basic level, like doing the same kinds of simple, information-driven tasks that you do hundreds of times per day on your smartphone. Leap Motion, a maker of hand-tracking software and hardware, has been experimenting with exactly that, and is teasing some very interesting results.

Smartphones are essential to our everyday lives, but the valuable information inside them is constrained by small screens, unable to interact directly with us or with the world around us. There's a widespread belief in the immersive computing sector that AR's capacity to co-locate digital information with the physical world makes it the next big step for the smartphone.

Leap Motion has shown lots of cool stuff that can be done with its hand-tracking technology, but most of it has been seen through the lens of VR. The company's VP of Design, Keiichi Matsuda, however, has recently begun teasing prototypes showing how the tech can be applied to AR, and the results offer a compelling glimpse of what our smartphones may eventually become. Matsuda calls this prototype the 'virtual wearable':

The video is shot through an unidentified AR headset which is using Leap Motion’s camera-based hand-tracking module to understand the position of the user’s hands and fingers. Matsuda has envisioned some interesting affordances which uniquely work with the limitations of Leap Motion: the ‘flick tab’ menus are a smart stand-in for what most of us would think to represent as simple buttons; the visual lack of resistance helps reduce the expectation of tactile feedback. The footage also shows impressive occlusion, where the system understands the shape of the user’s hands and appropriately renders ‘clipping’ to make the AR menu feel like it really exists in the same plane as the user’s hands.
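The occlusion effect described above amounts to a per-pixel depth test between the tracked hand and the virtual menu. As a minimal sketch (purely illustrative, not Leap Motion's actual rendering pipeline, and with hypothetical function names), the idea can be expressed like this: the hand mesh is rasterized into a depth map, and at each pixel the UI is only drawn where the hand is not closer to the camera.

```python
# Illustrative sketch of depth-based hand occlusion, NOT Leap Motion's
# real pipeline: UI pixels are suppressed wherever the tracked hand sits
# closer to the camera, producing the 'clipping' effect described above.

def composite_pixel(ui_depth, hand_depth, ui_color, passthrough_color):
    """Decide what is shown at a single pixel.

    ui_depth / hand_depth: distance from the camera in meters;
    float('inf') means nothing is present at that pixel.
    """
    if hand_depth < ui_depth:
        # The hand occludes the virtual menu, so the real hand shows through.
        return passthrough_color
    # Otherwise the menu is rendered on top of the scene.
    return ui_color

# Hand held in front of the menu hides it:
print(composite_pixel(0.5, 0.3, "menu", "hand"))          # hand
# With no hand at this pixel, the menu renders normally:
print(composite_pixel(0.5, float("inf"), "menu", "scene"))  # menu
```

In a real renderer this test happens in the depth buffer on the GPU rather than per pixel in application code, but the logic is the same.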


The design also draws on existing, well-established touchscreen interface affordances, like a line indicating the grab point of a sliding 'drawer' menu; it's easy to see how this approach could be used to effectively convey, and act upon, the same sort of basic 'notification' information that we frequently deal with on our smartphones.

Another video from Matsuda shows what the underlying hand-model, as tracked by Leap Motion, looks like to the system behind the scenes:
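To give a rough sense of what a hand-tracking system like this exposes to developers, here is a small hypothetical sketch (the names and structure are illustrative assumptions, not Leap Motion's actual API): a hand model is typically a set of tracked 3D positions, such as the palm and the fingertips, from which gestures like a pinch can be derived.

```python
from dataclasses import dataclass
from math import dist

# Hypothetical hand model sketch -- names are illustrative, not the real
# Leap Motion API. Positions are (x, y, z) in meters in headset space.

@dataclass
class Hand:
    palm: tuple
    finger_tips: dict  # e.g. {"thumb": (x, y, z), "index": (x, y, z), ...}

    def is_pinching(self, threshold_m=0.025):
        """A common gesture test: thumb tip close to the index fingertip."""
        return dist(self.finger_tips["thumb"],
                    self.finger_tips["index"]) < threshold_m

hand = Hand(
    palm=(0.0, -0.1, 0.3),
    finger_tips={"thumb": (0.02, -0.05, 0.25),
                 "index": (0.03, -0.05, 0.25)},
)
print(hand.is_pinching())  # tips are ~0.01 m apart, so this prints True
```

Interactions like the 'flick tab' menus in the video would be built on top of exactly this kind of per-frame joint data.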

Leap Motion shared a sketch showing an expanded vision of the ‘virtual wearable’ concept:

Matsuda found his way to Leap Motion following the creation of two excellent short films which envision a future where AR is completely intertwined with our day to day lives: Augmented (hyper)Reality and its follow up, HYPER-REALITY (both definitely worth a watch). Now as Leap Motion’s VP of Design, he’s turning his ideas into (augmented) reality.

Leap Motion designers Barrett Fox and Martin Schubert have recently published a series of guest articles on Road to VR which are worth checking out:



Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • dexkz

    I really hope Leap Motion's tech will be integrated into future standalone VR/AR headsets, as it is the only viable solution for natural input

    • Laurence Nairne

      Yeah, it takes a bit of getting used to in VR because of a lack of tactile feedback on interaction behaviours, but for AR it’s the most robust system for hand input I’ve seen so far.

      That being said, I think the public API is not as consistent as these short clips suggest – hopefully there's going to be a new release soon that delivers this level of accuracy.


    • The V2 version will ONLY be embedded into headsets. It is not clear which headsets, though

  • Adrian Meredith

    Leap really needs new hardware though. The tracking was very wonky when I tried it at The Void, with a very narrow usable volume

    • Laurence Nairne

      They created an improved version, but only for mobile VR and only as a built-in module.

      Tracking is still a little wonky and you have to seriously consider what forms of interaction you use and how robust they are, but I think that can mostly be fixed in software.

      The narrow volume is a noticeable issue though. I don't think their heart is in the hardware game; they seem very committed to getting HMD manufacturers to pick it up and build it into their own tech, whilst LM just supplies the APIs.

  • impurekind

    And at some point all of this will eventually come together to make AR something more than the limited and mostly clunky gimmick it currently is…

  • We all asked for further details, but he didn’t answer. I guess they are building some hype before answering

  • anibalhenrique

    I am very happy with this super interesting prototype; I can already imagine how useful a cyber tool like this will be for us humans. We could build our own functionality on a whole world of APIs, with an attractive graphical interface for everyone. This is a major evolution: we keep compressing the physical artifacts of technology and gaining advantages in global communication. The automation in a device like this would let us put our functions into computation with much more interactivity with the physical world, and maybe even help us fight the waste that happens on our planet daily. We have the technologies for the survival of humanity.