Khronos Group, the consortium behind the OpenXR industry standard, today announced that it has begun officially certifying products that correctly implement the OpenXR standard. Additionally, the group has added new extensions to the standard to support hand-tracking and eye-tracking.

OpenXR is a royalty-free standard that aims to standardize the development of VR and AR applications, making for a more interoperable ecosystem. The standard has been in development since April 2017 and is supported by virtually every major hardware, platform, and engine company in the VR industry, including key AR players.

The Khronos Group has announced the OpenXR Adopters Program, allowing any company building an OpenXR product to apply for the official stamp of approval. Once approved, products can use the OpenXR logo on their implementation and also gain patent protection under the Khronos IP Framework.

To ensure that companies implement the standard correctly, Khronos Group has published the OpenXR Conformance Test Suite, a collection of tests that vendors can run to verify that their OpenXR product correctly implements the standard.

The announcements mean that OpenXR is finally ready to be rolled out widely across the XR industry.

“The time to embrace OpenXR is now,” said Don Box, Technical Fellow at Microsoft. “In the year since the industry came together to publish the OpenXR 1.0 spec and demonstrated working bits at SIGGRAPH 2019, so much progress has happened. Seeing the core platforms in our industry getting behind the standard and shipping real, conformant implementations […] is singularly awesome.”

Khronos says that Facebook has shipped an OpenXR implementation for both Quest and Rift, and Microsoft has shipped an implementation for WMR headsets and HoloLens 2. Valve has also published a preview implementation of OpenXR which developers can begin building with.

“OpenXR is designed to enable VR content compatibility on as many devices as possible, giving developers the confidence of knowing they can focus on one build of their VR title and it will ‘just work’ across the entire PC VR ecosystem,” said Joe Ludwig of Valve. “This release is a huge step forward toward that goal, bringing support from two different implementations in the PC ecosystem. With these and more on the way, including our ongoing developer preview in SteamVR, now is the time for developers and engine vendors to start looking at OpenXR as the foundation for their upcoming content.”

Khronos also announced new extensions to OpenXR which expand the standard to support hand-tracking and eye-tracking.

Hand-tracking company Ultraleap has published an OpenXR preview implementation for its Leap Motion hand-tracking peripheral, and high-end enterprise headset maker Varjo has published an OpenXR preview implementation for its eye-tracking headsets.
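For readers curious what the new extension looks like at the API level, the following is a rough sketch (not from the announcement) of polling hand-joint poses through the multi-vendor XR_EXT_hand_tracking extension. It assumes an already-initialized instance, session, and reference space, that the extension was enabled at instance creation, and it omits error handling and cleanup:

```c
// Sketch: reading hand-joint poses via XR_EXT_hand_tracking.
// Requires an OpenXR runtime that supports the extension.
#include <openxr/openxr.h>

void poll_left_hand(XrInstance instance, XrSession session,
                    XrSpace baseSpace, XrTime predictedTime) {
    // Extension functions are loaded dynamically through the loader.
    PFN_xrCreateHandTrackerEXT xrCreateHandTrackerEXT = NULL;
    PFN_xrLocateHandJointsEXT xrLocateHandJointsEXT = NULL;
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          (PFN_xrVoidFunction*)&xrCreateHandTrackerEXT);
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          (PFN_xrVoidFunction*)&xrLocateHandJointsEXT);

    // Create a tracker for the left hand with the default joint set.
    XrHandTrackerCreateInfoEXT createInfo = {XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
    createInfo.hand = XR_HAND_LEFT_EXT;
    createInfo.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;
    XrHandTrackerEXT handTracker;
    xrCreateHandTrackerEXT(session, &createInfo, &handTracker);

    // Locate all 26 joints relative to the chosen base space.
    XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations = {XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = joints;

    XrHandJointsLocateInfoEXT locateInfo = {XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = predictedTime;
    xrLocateHandJointsEXT(handTracker, &locateInfo, &locations);
    // locations.isActive indicates whether tracking data is valid this frame.
}
```

Because the extension is vendor-neutral, this same code path should work whether the joints come from an Ultraleap peripheral, a Quest's cameras, or any other runtime that exposes the extension.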

The end goal of OpenXR is to standardize the way that VR apps and headsets talk to each other. Doing so simplifies development by allowing tool and app developers to develop against a single specification instead of different specifications from various headset vendors. In many cases, OpenXR compatibility means that the exact same app can be run across several compatible headsets with no modification or even repackaging.
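To make the "single specification" point concrete, here is a minimal sketch of the one entry point every conformant runtime must expose identically; the application name and version below are placeholders, and error handling is kept to a bare minimum:

```c
// Sketch: creating an OpenXR instance. The OpenXR loader dispatches
// this call to whichever runtime is active on the system (e.g. the
// Oculus, Windows Mixed Reality, or SteamVR runtime), so the same
// application code runs against any conformant implementation.
#include <openxr/openxr.h>
#include <string.h>

XrInstance create_instance(void) {
    XrInstanceCreateInfo createInfo = {XR_TYPE_INSTANCE_CREATE_INFO};
    strcpy(createInfo.applicationInfo.applicationName, "MyXRApp");  // placeholder
    createInfo.applicationInfo.applicationVersion = 1;
    createInfo.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    XrInstance instance = XR_NULL_HANDLE;
    XrResult result = xrCreateInstance(&createInfo, &instance);
    return (result == XR_SUCCESS) ? instance : XR_NULL_HANDLE;
}
```
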

  • Sofian

    Does it mean we'll be able to use Quest apps on PC?

    • openxr

      Not exactly. OpenXR enables cross-platform AR/VR feature access, as Khronos' Vulkan API does for graphics features. There would still be operating system and CPU differences that would require a recompile (at a minimum) by developers. But OpenXR helps reduce the source code differences of AR/VR applications between platforms, making developing and testing across multiple platforms such as the Quest and PC easier.

      OpenXR WG

      • Ad

        Hi there! Does this also mean that if a dev makes a hand tracking app compatible with Quest hand tracking on a PC through Link, that app will work with the Leap Motion on PC as well?

      • Thanks for your reply and information.

        Does this mean the only barrier to using Oculus PC applications built using OpenXR on non-Oculus headsets would be a DRM check or similar on Oculus's end?

        • mfx

          Exactly (for PC games). Developers can now make OpenXR apps that could run on all OpenXR hardware, but that doesn't mean publishers will get rid of exclusivity limitations either.
          For Quest games, though, they are so tightly developed for the Quest platform that it's not just a simple DRM difference.

          • Publishers could benefit from time limited exclusivity to their platform, before releasing later to all headsets. I’d be happy to support developers offering this, no problem waiting 6 months.

            Revive hasn't always worked for me (I had an Oculus account from owning a Rift CV1), so I won't spend $40 on an Oculus store game relying on a hack to work.

  • mfx

    That's really cool to see!!

    Soon, instead of having to check compatibility with headset X, Y, or Z, we'll just have to check OpenXR y/n? If yes, any good headset works :)

  • Ad

    Does this mean that any new hand tracking apps intended for Quest that are made with OpenXR will support the Leap Motion or MRTK? That would be great news, because even though I don't think hand tracking is seriously viable, I'd love to see it as open as possible for those cases where it does make sense, like some WebXR.

    • silvaring

      Why don’t you think hand tracking is viable?

      • Ad

        1. Accuracy. Camera hand tracking suffers from serious occlusion issues that make it inconsistent. Even under ideal conditions actuation is quite poor, causing you to not grab objects or to drop them. You have to move slowly, and even then it's unreliable.

        2. Inputs. Gestures don't actually work, and even if they did, you can't have enough inputs for any kind of complex software. It's a bad idea to sell an entire headset or make large programs that can only use hand tracking. Even for something as simple as movement, there are slow basic movement schemes that people have come up with, basically just two-handed teleportation, but nothing that could let you walk and chew gum at the same time.

        3. Interactions. The Kinect is widely considered to have failed because of the above and because it lacked any haptic or tactile feedback whatsoever. And any gestures programmed can be activated accidentally or fail because of the way the user personally moves their body.

        All of this means that hand tracking just fails to deliver on its core premise: a simple, straightforward, and intuitive way to control VR. It obviously can't do anything complex or serve any serious VR user. But even people at the lowest end who want to do anything besides watch Netflix or use menus (things you may as well give them a remote or gaze control for) will just be frustrated and have no recourse. Voice commands were supposed to be a huge thing but they're barely used anywhere as well.

        You can make experimental technologies to help with some of these issues, most of them add some small peripheral to improve tracking or actuation, but that kind of defeats the purpose of having just hand tracking control, and often isn’t scalable. The simplest solution in my view is just to make the best version of something like the knuckles controllers, thumbpad and all, and get everyone acclimated to it with well made detailed mixed reality tutorials. Same as mouse and keyboard.

        • Great points. I've always thoroughly enjoyed hand tracking as a novelty on Leap Motion and found it frustrating on HoloLens.

          Index controllers are very interesting; the Etee controller is also an interesting recent study of mine. Their material-sensing technology is incredible, with 100 levels of input per finger.

          https://uploads.disquscdn.com/images/4efe91bb42e5febfa538fe45c72c029c69f45d5dd69cd9472d5c19864249e6b9.jpg

          • Ad

            I think Etee is a bad idea and they have a terrible attitude. It would have been a simple fix: just make it either modular or allow the user to cover up inputs to simplify it for your use case. But just taking off the buttons and joystick means you're getting less.

          • I like their material technology but didn't gel with the form factor and the lack of adjustment to suit different hand sizes.

            My findings are here if you have time to read; I decided to rebuild:

            https://immersivecomputing.org/2020/07/03/experiments-with-etee/

            https://uploads.disquscdn.com/images/157421dbbc2f95f895621794cb9d0b860ea2a5f18a59795f839fd1a95938bf93.jpg

          • silvaring

            You went through some thorough teardowns and analysis there, pretty interesting stuff. I especially liked the comparison to bicycle handles and how those too are designed in different ways with different trade-offs in mind.

            Have you tried hand tracking on HoloLens 2? I'm curious if it's improved over the original. I imagine that whatever Microsoft has planned for a new VR headset, the hand-tracking module would probably come from parts of HoloLens, so that's mainly why I'm asking. Anyway, as I said to another user above, I don't expect hand tracking to REPLACE controllers, but if they can make it good enough for general menu/object navigation and communication in VR, wouldn't that be good enough? Or is that a pipe dream, and will we NEED some kind of glove interface just to get hand tracking working well enough to hang out in VR without having to hold cumbersome plastic controllers?

          • Ad

            If the Etee allowed 3D finger tracking, attached behind the knuckles instead of in front of them, then it would make a lot of sense as a superior replacement for camera-based finger tracking, since it has more inputs and fewer compromises than what it replaces. But coming at this as a “simpler” form of Knuckles controllers was truly a bizarre choice.

          • I felt the sensing material was very impressive and had a great advantage when fused with a motion controller with buttons and a trigger. The fidelity and accuracy of the finger tracking was very impressive, better than my Knuckles, no doubt.

            But it has a very different feel to the Valve Knuckles (palm sandwich); the Etee controllers clamped to the proximal phalanges result in this:

            https://uploads.disquscdn.com/images/67692786305bb3c9662377a4e66e672a03f0d3f0a9c3864453147c37ac5748e8.jpg

        • silvaring

          I appreciate your post, and it's an insightful look into the issues and various use cases of the technology. I do think you're underselling the potential value in this ‘lowest end’ use case scenario, though: controlling menus / slow and simple manipulation of objects / gesturing to other users. The potential of hand tracking is not that it needs to be a perfect VR tool and replace controller functionality, and I have no idea why you feel that hand tracking needs to fulfill this purpose. Just like you use a hammer for a nail and an electric drill for a similar but different application, hand tracking and controllers are the same. Hands will be used as the tool ‘anyone’ can use in VR, like anyone can use a hammer. Controllers will be more specialized, drills that fewer people can and will use, but essential nonetheless. I do think that for hand tracking to work really well, though (and to avoid some of those issues you mentioned), some kind of decent eye-tracking technology might be needed.

    • openxr

      That would be the expectation. If the application uses the OpenXR extension for hand tracking, then when the application is ported to other AR/VR platforms supporting that extension, hand tracking should “just work”. – OpenXR WG

      • Ad

        Wait, this OpenXR extension for hand tracking was made by Ultraleap, so would it actually do that? Would an app made with it work on Quest? I'm asking if this means you can make an app that works on both Quest and Leap.

        • Ryan Pavlik

          The specification extension is multi-vendor. What Ultraleap made is an implementation of that extension in an API layer which can be added to a PC runtime. There have been no announcements from Oculus about supporting the same OpenXR extension, but it seems like a logical thing to do.

          • Ad

            I would argue it wouldn't be logical; why would they push for an open platform on something they could monopolize?

          • Ryan Pavlik

            I mean, the same logic could apply to participating in OpenXR at all, but there are plenty of reasons why companies do get involved.

          • Ad

            I mean, they did drag their feet quite a lot with that one. But in this case they are the only ones with the hardware (Leap is just not in the consumer space; even a lot of devs don't really think about them), and with WebXR starting out it would be in their interest to steer things toward themselves.

          • Ryan Pavlik

            I don’t want to get into more details, but “drag their feet” is **not** how I’d characterize Oculus’ participation. They have been active participants from the beginning, including drafting the original starting point for the spec, and are one of the first batch of conformant runtimes (which is the press release being discussed here) – those are the already-public participation things I can point to.

            I know it’s easy to be cynical and assume the worst, but in this case I’d suggest just waiting.

          • Ad

            If Apple signs on to this then sure; I just don't see their incentive. Fine on the OpenXR question; I assumed because of OpenVR and how they only allow apps on their store that work with their own runtime. But for this, it seems they are very proud of the Quest's hand tracking, and it puts others in a difficult position because hand tracking isn't that useful a feature (it was demoed on a number of headsets but abandoned)… unless you're marketing a standalone headset to people who are completely new to VR. I would love to know that anyone could have a Leap Motion like me and stick it on any headset; I'm just not optimistic.

  • Graham J ⭐️

    I guess it’s good to have another way to support all the main PCVR headsets, like OpenVR does.

    • Andrew Jakobs

      OpenXR is the successor to OpenVR, as I recall.

  • duck

    Very, very important.
    Should have been done much earlier, but it's ok now.

  • This is HUGE. Let's hope they keep going this way, so OpenXR will be implemented by everyone pretty soon and the wild west of VR will end.

  • Andrew Jakobs

    This is great; it might also open the Oculus Store to other headsets with newer games.

    • Ryan

      Does it though? Or will it just make Revive’s job easier.

  • Ryan Pavlik

    I may as well note this here: OpenXR WG/Khronos is **not** certifying apps or engines. The conformance process (and associated restrictions on trademark usage, etc.) is solely for “implementations” – runtimes, a.k.a. headset drivers. No certification is required for e.g. Unreal to brag about their OpenXR support, since they are not an implementation or partial implementation, just as they need no testing to mention their Vulkan support. Figured I had better clarify this, if only in the comments, lest I get questions about it at the talks I give over the next year. (Of course, I am not a lawyer, and the full info on the trademark policy is here: https://www.khronos.org/legal/khronos-trademark-guidelines )

    Ryan Pavlik
    OpenXR Specification Editor

    Principal Software Engineer, Collabora, Ltd.