Meta Releases ‘First Hand’ Demo to Showcase Quest Hand-tracking to Developers


Meta this week released a new demo app called First Hand to showcase the kind of experiences that developers can build with the company’s controllerless hand-tracking tools.

Controllerless hand-tracking has been available on Quest for years, and while it's a more accessible input modality than controllers, controllers remain the primary form of input for the vast majority of games and apps on the headset.

Meta has been increasingly pushing for developers to embrace hand-tracking as more than a novelty, and to that end has been building tools to make it easier for developers to take advantage of the feature. But what’s better than a good hands-on example?

This week Meta released a new demo exclusively built around hand-tracking called First Hand (named in reference to an early Oculus demo app called First Contact). Although the demo is largely designed to showcase hand-tracking capabilities to developers, First Hand is available for anyone to download for free from App Lab.

Over at the Oculus developer blog, the team behind the app explains that it was built with the ‘Interaction SDK’ which is part of the company’s ‘Presence Platform’, a suite of tools made to help developers harness the mixed reality and hand-tracking capabilities of Quest. First Hand is also released as an open source project, giving developers a way to look under the hood and borrow code and ideas for building their own hand-tracking apps.

The development team explained some of the thinking behind the app’s design:

First Hand showcases some of the Hands interactions that we’ve found to be the most magical, robust, and easy to learn but that are also applicable to many categories of content. Notably, we rely heavily on direct interactions. With the advanced direct touch heuristics that come out of the box with Interaction SDK (like touch limiting, which prevents your finger from accidentally traversing buttons), interacting with 2D UIs and buttons in VR feels really natural.

We also showcase several of the grab techniques offered by the SDK. There’s something visceral about directly interacting with the virtual world with your hands, but we’ve found that these interactions also need careful tuning to really work. In the app, you can experiment by interacting with a variety of object classes (small, large, constrained, two-handed) and even crush a rock by squeezing it hard enough.
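The ‘touch limiting’ heuristic quoted above (a fingertip that pokes through a floating panel is treated as if it stopped at the surface, so it can’t trigger elements behind it) can be sketched in a few lines. This is a conceptual illustration only, not the Interaction SDK’s actual API; the function name and plane math here are assumptions for the sketch.

```python
# Conceptual sketch of "touch limiting": once a fingertip crosses a UI
# panel's plane, its effective touch point is clamped to the surface so
# the finger can't pass through and press elements behind the panel.
# Illustrative only; not Interaction SDK code.

def clamp_to_panel(finger_pos, panel_origin, panel_normal):
    """Project a fingertip back onto the panel plane if it has penetrated.

    finger_pos, panel_origin, panel_normal are (x, y, z) tuples;
    panel_normal is a unit vector pointing out of the panel toward the user.
    """
    # Signed distance of the fingertip from the panel plane.
    offset = tuple(f - o for f, o in zip(finger_pos, panel_origin))
    depth = sum(c * n for c, n in zip(offset, panel_normal))
    if depth >= 0:
        return finger_pos  # still in front of the panel: no clamping
    # Penetrated: project the point back onto the surface along the normal.
    return tuple(f - depth * n for f, n in zip(finger_pos, panel_normal))
```

A fingertip 3 cm behind the panel would be reported as resting on the surface instead, which is why a button press can’t “fall through” to UI stacked behind it.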

The team also shared 10 tips for developers looking to make use of the Interaction SDK in their Quest apps; check them out in the developer blog post.
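The rock-crushing interaction described in the quote above boils down to gating an event on sustained squeeze strength. A minimal sketch, assuming a 0-to-1 squeeze estimate derived from the hand-tracking data (the function name, threshold, and hold time are illustrative, not Interaction SDK code):

```python
# Illustrative sketch (not Interaction SDK code) of gating a "crush"
# event on sustained squeeze strength, as with First Hand's rock.

def update_crush(squeeze_strength, held_time, dt,
                 threshold=0.9, hold_required=0.5):
    """Accumulate the time a squeeze stays above the threshold.

    squeeze_strength: 0..1 estimate from hand tracking (e.g. finger curl).
    held_time: seconds the squeeze has been held so far; dt: frame time.
    Returns (new_held_time, crushed).
    """
    if squeeze_strength >= threshold:
        held_time += dt  # keep squeezing: make progress toward the crush
    else:
        held_time = 0.0  # relaxing the grip resets progress
    return held_time, held_time >= hold_required
```

Requiring the squeeze to be held for a short window, rather than firing on a single strong frame, is one way to keep noisy hand-tracking estimates from triggering the event accidentally.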


Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • It’s in App Lab for a reason: it’s a neat first step into physics-reactive hand-tracking apps, but it’s missing a lot of the polish, interactivity, and length that First Contact had. Definitely fun to play through once, but there’s nothing to make me want to come back to it. You play one (very poorly performing) mini game after making the robot hands, then get teleported outside, wave to another robot, and the demo is over. Would love for there to be more to do, and more to pick up, inside the demo. There’s an artistic sci-fi Touch controller, for instance, which is glued to a table and can’t be picked up by the gravity gloves. Same with the small toy robot from First Contact: it’s in the scene but can’t be picked up or interacted with, just sitting offline off to the side, again glued to a shelf.

    • Also, on an unrelated note, the gravity-grab feature here looks like it should work the same way as the Alyx gloves: point a fuzzy energy line at a selected object, pull the grab trigger, and then whip your hand to pull it towards you… but it doesn’t work like that. Instead, you point the palm laser at a selected object, hit the grab button, and… it just snaps to your hand, almost immediately in some cases. It doesn’t feel nearly as nice as the Alyx gloves. There are obviously going to be some drawbacks to making a gravity-gloves-style demo on Quest, but the gravity-grab implementation here just lacks pizzazz.

  • I’ll give this a try at some point

  • Is this going to be another Meta app forced onto my device that I can’t delete?

    They have about ten apps on there now, and half of them are just wasting space for me, particularly the crap around tracking calories/fitness or whatever it is.

    • Richard R Garabedian

      That is every game for me. Even if I delete it, it’s still there and I cannot get rid of it no matter what.

  • Raphael

    A nice intro to mixed reality. Will appeal to kids especially.

  • Till Eulenspiegel

    Hand tracking is still a novelty and not accurate enough; it’s easier to just use the controller. Also, you lose haptic feedback without the controller, so games feel less interactive.

  • Richard R Garabedian

    It’s a garbage app. I don’t know how Oculus made it. Feels like a 12-year-old’s attempt at hand tracking.