Oculus plans to further open up the mixed reality capabilities of Quest with new tools that will allow developers to build apps which more intelligently integrate with the user’s real room. In the near future developers will also be permitted to distribute mixed reality apps to customers via the Quest store or Oculus App Lab for the first time.

Oculus first began unlocking Quest’s mixed reality capabilities earlier this year with the Passthrough API, which allowed developers to tap into the headset’s pass-through video view for the first time. Now the company is announcing a more advanced set of tools, which it calls the Presence Platform, enabling developers to build richer mixed reality applications.

The Presence Platform includes the Insight SDK, Interaction SDK, and Voice SDK.

Insight SDK

The main building block of the Insight SDK is the Passthrough feature, which developers previously had access to in an experimental form. That feature is moving out of its experimental form and into general availability starting with the next developer update.

Additionally, the Insight SDK includes Spatial Anchors, which gives developers the ability to place virtual objects in the scene and allow them to persist between sessions. For instance, a piano learning app could allow you to mark the location of your piano, and the app could then remember where the piano is any time you open it.
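Oculus hasn’t detailed the anchor API itself, but the underlying pattern — saving a named pose in one session and restoring it in the next — can be sketched roughly. This is a minimal Python illustration, not the actual SDK; the file name, function names, and pose format are all hypothetical:

```python
import json
from pathlib import Path

# Hypothetical local store standing in for the headset's anchor storage.
ANCHOR_FILE = Path("anchors.json")

def save_anchor(name, position, rotation):
    """Persist a named anchor pose so a later session can restore it."""
    anchors = json.loads(ANCHOR_FILE.read_text()) if ANCHOR_FILE.exists() else {}
    anchors[name] = {"position": position, "rotation": rotation}
    ANCHOR_FILE.write_text(json.dumps(anchors))

def load_anchor(name):
    """Return the stored pose for `name`, or None if it was never saved."""
    if not ANCHOR_FILE.exists():
        return None
    return json.loads(ANCHOR_FILE.read_text()).get(name)

# Session 1: the piano app marks where the user's piano sits.
save_anchor("piano", position=[1.2, 0.0, -0.5], rotation=[0, 90, 0])

# Session 2 (a later launch): the app restores the piano's location.
print(load_anchor("piano"))
```

The key point is that the pose survives the app closing; in the real SDK this persistence would be handled by the headset rather than a local file.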

The Insight SDK further includes Scene Understanding, which Oculus says allows developers to build “scene-aware experiences that have rich interactions with the user’s environment.” This includes geometric and semantic representation of the user’s space, meaning developers can see the shape of the room and get a useful idea of what’s in it. For instance, the Scene Understanding feature will allow developers to know which parts of the scene are walls, ceilings, floors, furniture, etc., all of which can be used as surfaces on which virtual content can be naturally placed.

Oculus says the developer will see a “single, comprehensive, up-to-date representation of the physical world that is indexable and queryable.” You can think of this like the headset building a map of the space around you that developers can use as a guide upon which to build a virtual experience that understands your physical space.
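To make “indexable and queryable” concrete, here is a rough Python sketch of what such a scene model might look like to an app: labeled surfaces that can be looked up by semantic tag. The class, field names, and room data are invented for illustration and are not the Insight SDK’s API:

```python
from dataclasses import dataclass

@dataclass
class SceneElement:
    label: str     # semantic tag: "wall", "floor", "desk", ...
    center: tuple  # position of the surface in room coordinates (meters)
    size: tuple    # extents of the surface in meters

class SceneModel:
    """A queryable index of the room, keyed by semantic label."""
    def __init__(self, elements):
        self.by_label = {}
        for e in elements:
            self.by_label.setdefault(e.label, []).append(e)

    def query(self, label):
        return self.by_label.get(label, [])

# A toy room: two walls, a floor, and a desk.
room = SceneModel([
    SceneElement("wall",  (0, 1.5, -2),     (4, 3, 0)),
    SceneElement("wall",  (-2, 1.5, 0),     (4, 3, 0)),
    SceneElement("floor", (0, 0, 0),        (4, 0, 4)),
    SceneElement("desk",  (1, 0.75, -1.5),  (1.2, 0.02, 0.6)),
])

walls = room.query("wall")  # surfaces an app could pin virtual posters to
print(len(walls))           # → 2
```

An app could query for “floor” to place a game board, or for “desk” to anchor a virtual keyboard, without ever seeing raw camera imagery.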

However, users will need to do some work on their end in order to generate this map for apps that need it, including marking their walls and tracing over their furniture.

Crucially, Oculus says that the Insight SDK will enable developers to build feature-rich mixed reality apps “without needing access to the raw images or videos from your Quest sensors.” We’ve reached out to the company to clarify whether Oculus itself will send the raw sensor footage off of the headset for any processing, or if it will all happen on-device.

The Scene Understanding portion of the Insight SDK will launch in an experimental form early next year, according to the company.

Interaction SDK

Another part of the Presence Platform is the Interaction SDK which will give Unity developers a ready-made set of simple interactions for hands & controllers, like poking buttons, grabbing objects, targeting, and selecting. This saves developers time in building their own versions of these commonly used interactions in their apps.

Oculus says the goal of the Interaction SDK is to “offer standardized interaction patterns, and prevent regressions [in tracking performance of specific interactions] as the technology evolves,” and further says that the system will make it easier for developers to build their own interactions and gestures.

The company says that the Interaction SDK (and the previously announced Tracked Keyboard SDK) will become available early next year.

Voice SDK

The Voice SDK portion of the Presence Platform will open up voice-control to Quest developers, which Oculus says can drive both simple navigation functions (like quickly launching your favorite Beat Saber song with your voice) and gameplay (like casting a voice-activated spell).
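The examples Oculus gives — launching a song or casting a spell by voice — boil down to mapping a recognized intent to an in-game action. Assuming the recognition step (which the Voice SDK delegates to Wit.ai) has already produced an intent name and entities, the dispatch side might look something like this Python sketch; all of the intent names and handlers here are hypothetical:

```python
def launch_song(entities):
    """Navigation-style command: jump straight to a song."""
    return f"Playing {entities.get('song', 'unknown')}"

def cast_spell(entities):
    """Gameplay-style command: trigger a voice-activated spell."""
    return f"Casting {entities.get('spell', 'unknown')}!"

# Map each intent name the NLU service might return to a game action.
HANDLERS = {
    "play_song": launch_song,
    "cast_spell": cast_spell,
}

def handle_utterance(intent, entities):
    """Dispatch a recognized intent to the matching in-game action."""
    handler = HANDLERS.get(intent)
    return handler(entities) if handler else "Sorry, I didn't catch that."

print(handle_utterance("play_song", {"song": "a Beat Saber favorite"}))
```

The hard part — turning raw speech into the `intent`/`entities` pair — is exactly what the Wit.ai-backed Voice SDK is meant to handle for developers.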

The system is based on Facebook’s Wit.ai natural language platform which is free to use. Oculus says the Voice SDK will arrive in an experimental form in the next developer release.

Mixed Reality Apps on the Quest Store and App Lab

While not all of the Presence Platform SDKs will arrive at the same time, as of the next Quest developer release, devs will be allowed to ship mixed reality apps via the Quest store or App Lab. That release is expected next month.

The World Beyond Sample App

Early next year Oculus says it will make available a sample project called The World Beyond which developers can use as a starting point for building atop the Presence Platform features. The app will also be made available to users.



Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • kontis

    Does the voice recognition in Voice SDK require connection to the internet? Can it be done locally on Quest?


    After using VR for 5 years now, I believe the Insight SDK and Passthrough gaming are what most VR game types need to be usable day to day. Complete blackout VR is enthralling and fun when you have a guide with you, when you are seated, or when you’re really committed to setting up your playspace and playing for a few hours. But wanting to play a few songs in Beat Saber day to day when I have a spare half hour would be so much more accessible if I didn’t have to close off the whole world around me. It will also make onboarding for newcomers a much less nerve-wracking experience for me as their guide, and I won’t have to work so hard to explain every little thing to them to get them to avoid smashing my controller against a wall.