Oculus’ PC SDK version 1.19 now supports NVIDIA’s VRWorks Lens Matched Shading (LMS) technique, offering “a performance boost and a slight quality improvement” on supported GPUs.

Lens Matched Shading is part of NVIDIA’s VRWorks toolset, and takes advantage of the architecture of the company’s Pascal-based GPUs to better match the pre-warped VR image to the final output image, thereby reducing the rendering of pixels that would otherwise not be visible after the distortion-correction phase. LMS also provides a more even distribution of pixel sampling between the initially rendered image and what ends up being seen through the lens. Back in 2016, NVIDIA offered an approachable explanation of LMS.
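At its core, LMS modifies the clip-space w coordinate of each vertex so that, after the perspective divide, geometry toward the edges of the image is compressed, approximating the lens distortion profile before rasterization. Below is a rough conceptual sketch of that warp; the coefficient values and the symmetric `abs()` formulation are illustrative assumptions (the real technique uses per-quadrant signed coefficients in hardware), not the SDK's actual interface:

```python
# Conceptual sketch of the Lens Matched Shading clip-space warp.
# Illustrative only: `a` and `b` are assumed example coefficients,
# not values or names from the VRWorks or Oculus SDK APIs.
def lms_warp(x, y, w, a=0.3, b=0.3):
    """Apply an LMS-style w-modification to a clip-space vertex.

    Adding a term proportional to |x| and |y| to w means that after the
    perspective divide, vertices near the periphery land closer to the
    center, so fewer pixels are rasterized in regions the lens
    distortion-correction pass would largely discard anyway.
    """
    w_prime = w + a * abs(x) + b * abs(y)
    return x / w_prime, y / w_prime  # post-divide NDC coordinates

# A vertex at the image center is unchanged, while a peripheral
# vertex is pulled inward (0.9 / (1 + 0.3 * 0.9) is about 0.71).
center = lms_warp(0.0, 0.0, 1.0)
edge = lms_warp(0.9, 0.0, 1.0)
```

The distortion-correction pass then un-warps the image, so the net effect is that pixel density more closely tracks what is actually visible through the lens.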

This week, Oculus announced the addition of LMS to its PC SDK 1.19. The company details the implementation in depth on its developer blog, saying the feature “provides a performance boost and a slight quality improvement.”

NVIDIA’s LMS has been available in special builds of Unity and Unreal Engine for some time, but now that it’s built directly into Oculus’ SDK, developers working outside those engines have easier access to the feature. It’s also likely that Unity and Unreal Engine will see ongoing support for LMS in their main-branch releases, making it more accessible still.


LMS relies on GPUs based on NVIDIA’s Pascal architecture (GTX 1060 and Quadro P4000, and above). With 92.2% of Rift users using NVIDIA GPUs (65.8% Pascal-based), according to Oculus’ hardware report at the time of writing, the addition of LMS seems like a pragmatic way for the company to bring performance gains to many of its users, though investing in NVIDIA-specific technologies certainly doesn’t curry favor with AMD and its users.


  • DaKangaroo

    There really isn’t any reason why the technique couldn’t be implemented manually. It is basically just a reprojection of one frame on one side onto the other side, and then the gaps are filled. There’s no reason why, for example, Unreal Engine couldn’t implement the technique as an option to be enabled, making it available to all hardware.

    • Robert Morris


    • Raphael

      It’s a hardware feature which means it’s much more efficient. That’s the whole point.

    • Ax

      The technique has been available in Unreal Engine for over a year.
      You can just grab a fork of the engine with NVIDIA’s VRWorks enabled, and you get access to lens matched shading as well as multi-resolution rendering (foveated rendering).

      There is nothing particularly new here other than Oculus implementing it as a part of their SDK, really.

  • Very good news…

  • What has Steam been doing to boost performance? I’ve never seen them do anything other than some resolution thing which doesn’t boost anything (well, it makes things blurry for the sake of performance).

  • MeowMito

    This is why I love Oculus, they focus on these innovations. This tech can apply to current-gen hardware; it makes everyone happy.

  • Peter Hansen

    Can’t believe this didn’t happen a long time ago.