Oculus have today released the latest version of their software development kit (SDK) and it’s one of the most significant to date.

Following an announcement as monumental as last week’s from Oculus, that the consumer Rift headset would launch Q1 2016, was always going to be tough. But the team in the Oculus software development boiler room have certainly given it their best shot today, unleashing version 0.6.0.0 (beta 14) of their SDK into the world.

This is no incremental release either. The latest SDK brings with it fundamental changes to the way developers will interface with Oculus hardware and how they’ll approach certain rendering challenges.


The Compositor Service and Texture Sets

One of the most interesting and fundamental changes to Oculus’ render pipeline is the new Compositor. It abstracts hardware-specific final passes away from the application, handling lens distortion and chromatic aberration correction matched to the host hardware (i.e. DK1, DK2, or CV1), with the processing handled by the OVRService.

In order to do this, the Compositor uses Texture Sets to pull application-rendered scenes from a shared stack that both the app and the Compositor service have access to. The Compositor then applies the necessary pre-warping before the result is displayed on the VR headset.
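The shared-texture-set pattern can be sketched roughly as follows. Note this is an illustrative model only, with made-up type and function names, not the SDK’s real `ovr*` structs: the app cycles through a small ring of shared textures so it never overwrites the one the compositor service may still be reading.

```c
/* Hypothetical stand-in for the SDK's swap texture set; the real
 * type holds the shared textures plus an index that the app
 * advances each frame. Names here are illustrative, not the API. */
typedef struct {
    int TextureCount;   /* number of textures in the shared set  */
    int CurrentIndex;   /* texture the app renders into next     */
} SwapTextureSet;

/* Move to the next texture in the ring so the app never writes to
 * the texture the compositor service could still be reading from. */
static int advance_texture_index(SwapTextureSet *set) {
    set->CurrentIndex = (set->CurrentIndex + 1) % set->TextureCount;
    return set->CurrentIndex;
}
```

The point of the ring is simply that producer (app) and consumer (compositor service) never touch the same texture in the same frame.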


Layers

Every ounce of performance counts when developing applications for virtual reality, and with the help of the new layers support, Oculus is giving devs a few more knobs to turn, helping them squeeze more into their VR creations. These are welcome optimizations, considering the consumer version of the Oculus Rift will demand some beefy system specs to run in the first place.


With the addition of layer support, which allows multiple independent application render targets to be sent to the headset, devs have more control over each layer’s size, resolution, and update rate before the layer is sent for distortion and display. So, for example, detail-centric layers like HUDs, which use text etc., can be rendered at a resolution higher than the gameworld beyond, reaping the performance benefits of keeping the 3D scene itself at a lower resolution.
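To make the per-layer resolution idea concrete, here is a hedged sketch, using our own made-up struct rather than the SDK’s actual layer types, of how a per-layer pixel density might scale each layer’s render target relative to a base eye resolution:

```c
/* Illustrative layer description; not the SDK's real layer types.
 * Each layer carries its own render-target size, so a text-heavy
 * HUD layer can use a higher pixel density than the 3D scene. */
typedef struct {
    int width;           /* layer render-target width in pixels   */
    int height;          /* layer render-target height in pixels  */
    float pixel_density; /* scale relative to base eye resolution */
} LayerDesc;

static LayerDesc make_layer(int base_w, int base_h, float density) {
    LayerDesc d;
    d.pixel_density = density;
    d.width  = (int)(base_w * density + 0.5f);  /* round to nearest */
    d.height = (int)(base_h * density + 0.5f);
    return d;
}
```

Under this model a 1024×1024 base eye buffer might pair a 1.5× HUD layer (1536×1536, crisp text) with a 0.75× scene layer (768×768, cheaper to render), each updated and submitted independently.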

However, many functions have changed, so porting older Oculus Rift projects to the new SDK may well be time consuming. From what we’ve seen (and we’re certainly not developers), the effort looks to be well worth it, although we’ll not see this opinion borne out until SDK 0.6.0.0-specific applications start to appear.


Overview of New Features

The following are major new features for the Oculus SDK and runtime:

  • Added the compositor service, which improves compatibility and support for simultaneous applications.
  • Added layer support, which increases flexibility and enables developers to tune settings based on the characteristics and requirements of each layer.
  • Significantly improved error handling and reporting.
  • Added a suite of new sample projects which demonstrate techniques and the new SDK features.
  • Removed application-side DirectX and OpenGL API shims, which results in improved runtime compatibility and reliability.
  • Simplified the API, as described below.
  • Changed Extended mode to use the compositor process. Rendering setup is now identical for extended and direct modes. The application no longer needs to know which mode is being used.
  • Extended mode can now support mirroring, which was previously only supported by Direct mode.
  • Simplified the timing interface and made it more robust by moving to a single function: ovrHmd_GetFrameTiming.
  • Fixed a number of bugs and reliability problems.
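The consolidated timing interface mentioned above can be sketched roughly like this. The helper below is our own illustration, not the actual signature of ovrHmd_GetFrameTiming: the core idea is that, given the last vsync time and the display’s refresh period, the app predicts when a future frame will actually hit the display and runs its head-pose prediction against that time.

```c
/* Illustrative sketch of frame-timing prediction; this is not the
 * real ovrHmd_GetFrameTiming signature. Times in milliseconds. */
static double predict_display_time_ms(double last_vsync_ms,
                                      double refresh_period_ms,
                                      int frames_ahead) {
    /* The frame N frames ahead lands N refresh periods after the
     * last vsync; rendering with the pose predicted for that moment
     * is what keeps perceived latency low. */
    return last_vsync_ms + refresh_period_ms * (double)frames_ahead;
}
```

Collapsing timing into a single query removes the old multi-call begin/end bookkeeping the application previously had to get right itself.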

    Download Oculus SDK v0.6.0.0

For more information on Oculus’ PC SDK 0.6.0, head to the developers section of the Oculus website for full documentation on API changes, bug fixes, Unity specific updates, and a number of known issues caused by the new SDK update.


The latest SDK paves the way for development of consumer virtual reality projects and gives a peek into how Oculus sees the hard problems of gaming that requires heavy rendering whilst maintaining low latency.

We’d love to hear from developers out there. What elements of the latest Oculus SDK do you like or dislike? How much work will you have to put in to port your existing titles, and what benefits do you see that bringing? Give us a shout in the comments below, or at info@roadtovr.com, to let us know.

 

This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. See here for more information.


  • mattnewport

    I spent yesterday updating my D3D11 code from 0.4.4 to 0.6.0 and it was a bit of a chore. I think the API is overall easier to use now with the compositor changes but there was quite a bit of code churn involved in the upgrade. The docs were ok but could have been better on the changes required, particularly regarding changes to the build setup.

    I’ve got things more or less working now but there are a couple of issues I still have to work out. The new Health and Safety Warning is extremely uncomfortable to view due to some seeming stereo disparity issue. This seems to affect the Oculus World demo too so I don’t think it’s my code. There also seems to be a problem correctly handling SRGB texture sets (they are displayed on the HMD with the wrong hands) but that might be a bug on my end.

    One of the supposed benefits of the new compositor architecture is that tools like the Visual Studio Graphics Debugger (the PIX replacement) would now work, but my app still crashes on startup with the Rift enabled. I think it should be a high priority to fix that issue.

    I’m unsure at the moment about the apparent move by Oculus to hide more of the details of HMD display from the developer. Valve is taking a different approach, it seems, and putting more control in the hands of developers, which I think is probably the better way to go even though it makes it easier to shoot yourself in the foot. At this early stage of VR development, I think it’s good for developers to be able to experiment.

    • mattnewport

      Wrong hands should read wrong gamma – damn autocorrect !