First Look: NVIDIA Reveals Iray VR for “Breakthrough Photoreal Virtual Reality”

NVIDIA today at GTC 2016 announced Iray VR, a virtual reality adaptation of the company’s Iray ray-tracing engine, which it says enables “breakthrough photoreal” virtual reality. On stage, CEO Jen-Hsun Huang showed a quick demo of the rendering technology in action.

Iray is Nvidia’s physically-based rendering technology, which can create photorealistic imagery by simulating the physical behavior of light and materials. Rendering such realistic frames takes a long time and therefore can’t be done in real time, let alone at the blistering 90 FPS demanded by desktop-class VR headsets.
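
To get a rough sense of that gap, here’s a back-of-envelope sketch; the per-eye resolution matches the HTC Vive shown on stage, while the sample count is an assumed figure for a converged photoreal frame, not an Nvidia number:

```python
# Rough arithmetic (illustrative assumptions): why offline-quality
# ray tracing can't keep up with desktop VR frame rates.
width, height = 1080, 1200   # per-eye resolution of the HTC Vive
samples_per_pixel = 1000     # assumed count for a converged photoreal frame
fps, eyes = 90, 2

samples_per_second = width * height * samples_per_pixel * fps * eyes
print(f"{samples_per_second:.2e} samples per second required")  # ~2.33e11
```

Even with generous assumptions, that’s hundreds of billions of samples per second, orders of magnitude beyond what a 2016-era GPU can trace.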

But Nvidia wanted to bring ray-traced photoreal visuals to VR, and so the company created Iray VR, a new rendering plugin which produces such imagery in a way that’s compatible with virtual reality.

Huang explained that Iray VR works by creating light probes around the scene which render lightfields: volumetric regions that contain information about the light traveling through that space. The system then chooses which probe to sample from based on the viewer’s location, and calculates what their view should be. This allows the viewer to see photoreal imagery in VR in real time, according to Huang.
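
Nvidia hasn’t published implementation details, so the following is only a minimal sketch of the probe-selection step Huang described; the probe layout, the sample() lookup, and every name here are hypothetical stand-ins rather than Iray VR’s actual API:

```python
import numpy as np

def nearest_probe(probe_positions: np.ndarray, head_position: np.ndarray) -> int:
    """Pick the pre-rendered light probe closest to the viewer's head.

    probe_positions: (N, 3) array of probe centers placed around the scene.
    head_position: (3,) position reported by the headset's tracking system.
    """
    distances = np.linalg.norm(probe_positions - head_position, axis=1)
    return int(np.argmin(distances))

def radiance_for_ray(probes, probe_positions, head_position, ray_direction):
    """Return the radiance seen along one view ray.

    probes[i].sample(direction) stands in for whatever per-ray lightfield
    lookup the real system performs; the actual pipeline may well blend
    between several probes rather than snapping to the nearest one.
    """
    i = nearest_probe(probe_positions, head_position)
    return probes[i].sample(ray_direction)
```

The key property is that all of the expensive ray tracing happens ahead of time, when the probes are rendered; at runtime the headset only performs cheap lookups, which is what would make photoreal imagery viewable at VR frame rates.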

During the keynote, Huang picked up an HTC Vive headset and spun it around to show a view of the company’s not-yet-built future office rendered with Iray VR. From the demo it wasn’t clear whether the system supports positional tracking, which would be a major limitation compared to proper real-time rendering. We’ll go hands-on with Iray VR here at GTC 2016 soon to learn more.

The company also revealed Iray VR Lite, a similar ray-tracing plugin for 3ds Max which renders ray-traced photospheres that can be viewed on mobile devices. Both Iray VR and Iray VR Lite will be available in June.

  • Ian Shook

    C’mon Otoy, get your shit together and release Brigade

    • yes please!

    • Horror Vacui

      Apparently it will be included in the next Octane Render… next year.

  • Killer_Kopy

    I assume it will be like a super photorealistic piece of a world, frozen in time, that I could walk around and look at.

  • crazysapertonight

    And what PC do you need for such graphics in VR?

    oooh

  • Surykaty

    Meh.. Otoy lightfields are better.. I simply have large doubts that 100 simple light probes would suffice for positional tracking.. how far apart are they spaced? I also see a lot of discontinuous blending going on.. what a meh tech.. guys who bake the rendered scene onto 3D models are geniuses in comparison to this abomination

  • Alex Leiva

    So what are they showing? This is not real-time, right? Is it just rendering a stereo equirectangular image? You can do that in V-Ray. Will you be able to walk around, or just get some head positional shifting like in lightfields?

  • Lightfields used in this will definitely allow for positional tracking, at least within a range enough to move your head around if not walk, so realtime-ish. And given that a cluster/server behind the scenes can spit out these fields in perhaps minutes? seconds? and then send them over to a less capable VR machine to present, maybe even mobile, this is very interesting for some solid non-gaming use cases

    • Better than stereo panoramic images in several ways: those lack 3D depth at the sides and stick to your head as you move, yet lightfields are still as easy to render and deploy to low-end devices

      • Horror Vacui

        Stereo panoramic images don’t lack 3D depth when they are well crafted. They just lack positional tracking.

        • They increasingly lack 3D separation as you approach 90 degrees to either side, unless you go to lengths to fake it or project onto a mesh, which all have their own pitfalls

          • Horror Vacui

            Well, they are mostly useless anyway; they make for nice “tech demos” or screenshots of what CGI could look like in VR.

            But I don’t believe in Iray either

  • Mateusz

    I don’t think there’s any positional tracking with Iray VR

    • Ian Shook

      Doesn’t positional tracking come from the headset? What does Iray have to do with tracking a person?

      • Mario Baldi

        You can’t get positional tracking from standard 360 pictures.
        My understanding is that, if these are “lightfield-like” pictures, then you could, to a certain degree.

      • Mateusz

        What I meant was that this isn’t photogrammetry where you can explore the pictures (see SteamVR “postcards” for example). Instead, these are only high-quality 360-degree photos.

        • Ian Shook

          I re-read the article; I didn’t realize it was lightfields.

  • Cuerex

    ray tracing without animation again, oh wonders