At Unity’s Unite keynote in November, Otoy’s Jules Urbach announced that the Octane Renderer will be built into Unity to bake light field scenes. This also sets up the potential for real-time ray tracing of light fields using application-specific integrated circuits (ASICs) from PowerVR, which Urbach says could render up to 6 billion rays per second at 120W. Combining this PowerVR ASIC with foveated rendering and Otoy’s Octane renderer built into Unity provides a technological roadmap toward a photorealistic quality that will be like beaming the Matrix into your eyes.
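To put that 6-billion-rays-per-second figure in perspective, a quick back-of-envelope calculation shows the per-pixel ray budget it buys at VR frame rates. The display resolution and refresh rate below are my own illustrative assumptions (a Rift-CV1-class headset), not figures from the article:

```python
# Back-of-envelope: per-pixel ray budget for the quoted ASIC throughput.
# ASSUMPTIONS (mine, not from the article): a Rift-CV1-class display
# (2160x1200 combined across both eyes) refreshed at 90 Hz.

RAYS_PER_SECOND = 6e9        # Urbach's quoted figure at 120W
REFRESH_HZ = 90              # assumed VR refresh rate
PIXELS = 2160 * 1200         # assumed combined panel resolution

rays_per_frame = RAYS_PER_SECOND / REFRESH_HZ
rays_per_pixel = rays_per_frame / PIXELS

print(f"{rays_per_frame:,.0f} rays per frame")
print(f"{rays_per_pixel:.1f} rays per pixel per frame")
```

At roughly 26 rays per pixel per frame, brute-force path tracing of the full frame would still be noisy, which is why pairing the ASIC with foveated rendering — concentrating the ray budget where the eye is actually looking — matters for the photorealism Urbach describes.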

LISTEN TO THE VOICES OF VR PODCAST

I had a chance to catch up with Urbach at CES 2017, where we talked about the Unity integration, Otoy’s open standards work, overcoming the uncanny valley, the future of the decentralized metaverse, and some of the deeper philosophical thoughts about the Metaverse that drive Otoy’s work toward rendering virtual reality with a visual fidelity that is indistinguishable from reality.

Here’s Otoy’s Unity Integration Announcement:

Here’s the opening title sequence from Westworld that uses Otoy’s Octane Renderer:




  • The Westworld scene is amazing!

  • Jerald Doerr

    This is the most awesome news ever… I suggest you guys add links showing just how fast the Octane renderer is… which actually supports multiple GPUs, CPUs, and even multi-computer rendering per frame… but I’m sure it would be years before the multi-computer support would work with games.
    I am a bit confused though… does this mean you could use the renderer for gameplay, or only for animated 3D playback of your pre-animated scenes? Still a big step forward if not.

    • It appears that Octane will allow you to render non-interactive animated scenes in Unity (with all the gorgeous rendering power of Octane). You can then use these as cut scenes or stand inside them in VR. I assume you will be able to cloud render (GPU) the timeline, because a single frame can take a few seconds on a GTX 980 to appear at rough quality. It can take much longer if you start to add volumes like fire and translucent water, which is why you use multi-GPU setups to render professionally.

    • Brent

      Look at some gameplay of Brigade; it works.

  • OgreTactics

    So basically Brigade preview inside Unity, and ORBX/Lightfield export?

    The only thing I can judge this by is the fact that it’s been available for two years now and I haven’t seen a SINGLE demo, not one. Render the Metaverse stereo cubemaps are one of those things I like to show to clients first hand, dubbing them “screenshots of the future”, but I never saw a single light field demo anywhere, and just rendering a stereo cubemap video limits you to 1080p per eye, which is crap for 360° playback.

    So, is it another vaportech?

    • Raphael

      Looks like you’re complaining again… is there ever anything happy from that miserable troll head of yours?

      • Jerald Doerr

        Joker / lucdfer… with a name like that, I’d expect nothing but jokes and negativity from him…

        • Raphael

          Agree.

          “I’m saying most headset are crap that are no better and as unfinished as 90s VR headset and this is the reason why it doesn’t and won’t sell to the consumer public. Period.” << He said this 3 days ago.

          Now, I do seem to remember using a VR HMD in the ’90s at a public event, because VR was too expensive and bulky to own back then.

          The graphics were little more than filled polygons at 10 frames per second or less. Resolution was low and head-tracking not very good.

          Moving on to around 2007, I owned an IO-Systems PC3d. It cost $1000 new, but I got it for about $300 used. 800 × 600 resolution. Flickery graphics. No contrast on the LCD meant I couldn’t resolve any games with texture-rich detail. The LCD would change color randomly when the unit heated up. The FOV was tiny and it looked like I was sitting at the back of a small movie theater. No head-tracking. No laser positional tracking.

          He believes current VR (CV1 and Vive) is exactly the same as that. The idiot believes that VR is a failure because we haven’t gone from 640 × 480 to 8K per eye (since the DK1 emerged) with foveated rendering.

          Muppet who thinks VR should move at a pace he dictates.

    • Jules Urbach

      LF rendering + 6DOF playback (from downloaded or streamed ORBX media file) will be a key part of the workflow we are introducing with Octane/Unity in the coming months.

      If you are at GDC17 I will be giving talks and hands on demos of all the above (including the latest Brigade work going into Octane 4) at the Unity booth.

      While the LF work has been developed and shown for years at trade shows, the LF stack, like Brigade finally going into V4, needed to wait for the right place and time to fit into the product roadmap. It was always my hope to get it to a point where it was practical for all to use, and I am glad we didn’t rush that process.

      With Octane/ORBX built into Unity going forward, millions of Unity developers will get all the authoring tools needed to start leveraging LF baking and ORBX media workflow, even in the free Unity edition.

      We’ve recently released a full featured version of Octane VR + DCC plugin of your choice for $20/month to expose the same workflow to all other 3D apps at an affordable price (UE4 plugin will be made available through this offering when it is done).

      • OgreTactics

        Hi, thanks man. I was actually happy to stumble on the Octane VR subscription, which I regularly redirect younger 3D designers to (it’s not obvious on the home page yet).

        I’ve been asking people and colleagues about LF/ORBX and how it’s the only way I know of that could enable volumetric animated CG scenes, but I couldn’t find a single example. The same goes for Brigade, whenever I rant about real-time ray-tracing preview still not being a thing.

        While I will not attend GDC, I will get my hands on it ASAP, and I’d like to ask: will Unreal follow? Because I have the feeling that (with C4D/3ds Max/Blender being unrealistically complex, not to say archaic, in terms of conception and interface design given the new accelerated demand) Unreal will become a 3D studio creation tool of choice for lots of people, especially with all the integrated baking/node tools.

  • Raphael

    I use Otoy Octane software: Octane plus Cinema 4d plugin. It’s nice to see the strides they’re making in this field.

  • Surykaty

    6 billion rays/per sec/per pixel!? OMFG.. realtime raytracing confirmed… I’ve waited my whole life for this!!!!!!
