Oculus finally revealed that Quest will get controllerless hand-tracking input. We got to test the feature at Oculus Connect 6 last week and came away impressed. Hand-tracking will open the door to a range of casual use-cases that will broaden the headset’s appeal, but core gaming experiences will still rely on the precision and reliability of controllers.

Hand-tracking is coming to Quest early next year, Facebook announced last week at Oculus Connect 6. The appeal is clear: using your hands instead of controllers makes the headset that much easier to use; instead of learning the placement and functions of buttons and sticks, users will be able to throw on the headset and simply point and touch their way around virtual worlds.

At Connect, Oculus showed off the hand-tracking with a demo called Elixir made by VR studio Magnopus. The setting was a fantastical witch’s workshop where I got to poke, prod, and play with spells and magic.

Quest Hand-tracking Demo

It was clear from the outset that the entire demo was designed and built specifically for hand-tracking. The demo revolved around touching, poking, and pinching interactions but, notably, included no ‘holding’ or direct manipulation of objects, except in the case of a pen (a very purposeful choice that I’ll discuss more later).

Elixir is a sandbox-ish experience designed to show some of the ways that hand-tracking could be used for gaming interactions. In the experience I was asked to touch my hands to various objects, which would transform them from human hands into magical hands with unique powers.

For instance, I was prompted to touch my hand to something which looked like a hot-plate crossed with a grill, which caused my hands to turn into fire-infused hands. Then when I waved my fire hands over some candles, they ignited. Another object I touched turned my hands into the hands of a creature with Wolverine-like claws which extended when I made a fist.

The coolest and most inventive of these hand transformations was when I dunked my hands into a cauldron and they turned into octopus tentacles! The tentacles were longer than my actual fingers and they were physics-enabled, so when I wiggled my fingers the tentacles wobbled about in a gross but oddly satisfying way.
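
For those curious how that kind of wobble works under the hood: the usual trick is to have each tentacle segment chase the tracked finger joint with a damped spring rather than following it rigidly, so it lags and oscillates instead of snapping into place. Here’s a toy sketch in Python (the constants and per-frame fingertip targets are my own assumptions, not anything Magnopus has disclosed):

    class WobblySegment:
        """A tentacle segment that springs toward a target point (where
        the rigid finger bone is) with damping, producing the lag-and-
        wobble feel described above."""

        def __init__(self, pos, stiffness=40.0, damping=6.0):
            self.pos = list(pos)
            self.vel = [0.0, 0.0, 0.0]
            self.stiffness = stiffness
            self.damping = damping

        def step(self, target, dt):
            for i in range(3):
                accel = (self.stiffness * (target[i] - self.pos[i])
                         - self.damping * self.vel[i])
                self.vel[i] += accel * dt
                self.pos[i] += self.vel[i] * dt
            return self.pos

    # Each frame, feed the segment the tracked fingertip position.
    segment = WobblySegment(pos=(0.0, 0.0, 0.0))
    for frame in range(3):
        print(segment.step(target=(0.1, 0.0, 0.0), dt=1 / 72))  # Quest renders at 72 Hz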

Designing Around Haptics Limitations

Aside from the hand transformations (and the ‘make fist’ gestures which powered them), a ‘pinch’ gesture was used at several points throughout the experience which, from a design standpoint, served as a sort of virtual button press. For instance, hanging above the cauldron was a liquid dispenser with an eye-dropper top which squirted liquid into the cauldron when squeezed. Elsewhere, a miniature bellows could be squeezed with a pinch to brighten a flame.
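
To make the ‘virtual button’ idea concrete, here’s a minimal sketch of pinch detection in Python. This is not Oculus’s actual implementation, and the threshold values are assumptions, but hand-tracking systems generally expose comparable per-joint fingertip positions:

    import math

    PINCH_START_M = 0.02  # fingertips closer than ~2 cm: press
    PINCH_END_M = 0.035   # must separate past ~3.5 cm to release (hysteresis)

    def _dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    class PinchButton:
        """Turns a pinch gesture into press/release states. Using two
        thresholds keeps the 'button' from flickering when the
        fingertips hover right at the detection boundary."""

        def __init__(self):
            self.pressed = False

        def update(self, thumb_tip, index_tip):
            d = _dist(thumb_tip, index_tip)
            if not self.pressed and d < PINCH_START_M:
                self.pressed = True    # fingers closed: press
            elif self.pressed and d > PINCH_END_M:
                self.pressed = False   # fingers opened: release
            return self.pressed

    # e.g. squeezing the eye-dropper: squirt while the pinch is held
    button = PinchButton()
    if button.update((0.0, 0.0, 0.0), (0.015, 0.0, 0.0)):
        print("squirt liquid into the cauldron")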

Because the pinching gesture naturally involves touching your own fingers, it provides a sort of self-haptic sensation which feels more natural than pressing a virtual button with no feedback at all.

This is the same reason why the magic pen—which was offered to me at the end of the experience to sign my name on a big scroll—was the only object in the experience which could actually be directly held and manipulated. It uses effectively the same pinching gesture as before (mostly replicating a real pen grip), which offers a haptic sensation since you’re actually touching your own fingers.

A good general purpose ‘grab’ gesture for arbitrarily sized and shaped objects—as has been attempted with gestures like making a fist or a ‘C’ shape with your hand—has remained quite elusive, as the lack of feedback just doesn’t feel quite right (not to mention issues with consistently detecting the pose to prevent items from dropping). That said, even a gesture-based ‘grab’ that doesn’t feel immersive could be perfectly useful for non-entertainment applications where immersion isn’t a high priority.
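
One common mitigation for the dropping problem is hysteresis: require a strongly curled hand to begin a grab but a clearly open one to end it, so tracking jitter near the threshold doesn’t release the object. A rough sketch in Python (the curl heuristic and thresholds here are illustrative, not a published algorithm):

    import math

    GRAB_CURL = 0.6      # average curl above this starts a grab
    RELEASE_CURL = 0.35  # must open well past this to drop the object

    def finger_curl(knuckle, middle, tip):
        """Rough per-finger curl: 0.0 = straight, ~1.0 = fully folded,
        from the angle between the two finger segments."""
        v1 = [m - k for k, m in zip(knuckle, middle)]
        v2 = [t - m for m, t in zip(middle, tip)]
        dot = sum(a * b for a, b in zip(v1, v2))
        mag = (math.sqrt(sum(a * a for a in v1))
               * math.sqrt(sum(b * b for b in v2))) or 1e-9
        angle = math.acos(max(-1.0, min(1.0, dot / mag)))
        return angle / math.pi

    class GrabDetector:
        def __init__(self):
            self.holding = False

        def update(self, fingers):
            """fingers: list of (knuckle, middle, tip) 3D joint triples."""
            curl = sum(finger_curl(*f) for f in fingers) / len(fingers)
            if not self.holding and curl > GRAB_CURL:
                self.holding = True
            elif self.holding and curl < RELEASE_CURL:
                self.holding = False
            return self.holding

Even with tricks like this, the lack of any touch feedback when your fingers close around empty air is what keeps a general-purpose grab feeling unconvincing.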

Continued on Page 2: Performance & Limitations »




  • Really excited to try this out. It seems like it’ll be great for simple input situations like browsing the internet or watching Netflix.
    I also have to wonder how the LBE crowd feels about this feature. There are a number of experiences out there that use Leap Motion for hand tracking. Having it essentially baked into the headset with a much larger tracking volume seems like potentially a big win.

  • Pablo C

    If this is good enough for finger-tracking on a digital keyboard, it will be a game-changer. It could make the Quest a production tool.

  • Mateusz Pawluczuk

    “the headset only knows your intended input when your hands are clearly visible. This makes it difficult for the system to know if you’re choosing to continue to hold an object (…)”

    That’s where Ctrl-Labs brain reading wristband comes in ( ͡° ͜ʖ ͡°)

  • As someone said on Twitter, this is not comparable with Leap Motion yet. But Facebook has plenty of money to invest. And this is running on a mobile headset.

  • kontis

    It’s extremely strange that this article doesn’t mention the most important information: a comparison with Leap Motion.

    • Trip

      I suspect that it will be superior to Leap Motion due to the form factor if nothing else. We know Leap Motion has only two cameras that are very close together. It also appears to have far inferior camera resolution compared to Quest and Rift S. Most importantly, Leap Motion has been around since before VR and it still has had very little integration since the adoption rate is so low. Every year or so I dig mine out and do battle with it for a couple of days trying to get decent controller emulation out of it, then put it back away in disappointment. The problem is the lack of official support, not the device. A feature that every Quest (and hopefully Rift S) user has by default would start to give devs a reason to support this method of input. I’m also very much hoping Oculus will allow the hand tracking to do its best to emulate Touch controls across all apps. That would be a huge selling point.

  • Trip

    Hi Ben. Do you think that Oculus will give the option to have your hands emulate Touch controllers across all apps? I know it would be terrible in many applications so I wouldn’t be surprised if they don’t but I really wish they’d give us the tools and let us decide what we want to use it in. Might you be able to put in an inquiry with Oculus?

  • Maddy

    This would be cool in racing games! Like Dirt Rally 2.0.
    I want to see my hands synchronized there :)

  • C B

    I hope this will be released for the Rift S as well; that unit is starting to feel like an afterthought for Oculus.

  • TwowittnesesForerunner7

    Aside from trademark and copyright infringements, can Oculus, Valve, or some other company create a “FALLOUT”-type Pip-Boy VR wrist controller with a CPU, GPU, and extra battery power for the VR headset? Or some kind of wearable vest / haptic body shield / backpack to hold a more powerful CPU, GPU, and extra battery cartridges, and still be untethered / autonomous?