Watch: ILMxLAB’s Star Wars Real-Time Ray-Tracing Demo on Unreal Engine is Stunning

At the Unreal Engine presentation at GDC 2018 today, Epic Games CTO Kim Libreri introduced several projects highlighting the scalability and performance of Unreal technology, including a partnership with ILMxLAB and Nvidia. This was the first live demonstration of interactive, real-time ray tracing with Unreal Engine.

Nvidia announced RTX on Monday, a real-time ray tracing solution designed for its latest Volta GPU architecture. Today, we were treated to a live stage demo using Unreal Engine, Nvidia RTX, and ILMxLAB’s Star Wars assets. Aside from some slight connection problems (the image was being streamed from a PC to the iPad on stage), this was a stunning demonstration, and it seemed far less noisy than the Northlight footage from Remedy.

Tony Tamasi, Senior Vice President of Content and Technology at Nvidia, explained some of the technology behind the scenes, describing ray tracing as “the holy grail” of rendering, one that solves many of the fundamental problems of traditional rasterization.

“We partnered with our friends at Microsoft to deliver an industry standard API called DirectX Raytracing,” he said. “That API is perfectly integrated with RTX to enable ray-tracing through interfaces and APIs that game developers are familiar with, and on top of that we’ve layered in some GameWorks technology to give developers a kickstart for things like de-noising and reconstruction filters.”

Despite the fact that this demo was running on a supercomputer (an Nvidia DGX Station with four Volta GPUs connected via NVLink), Tamasi thinks the technology is not too far away from reaching consumers. “We are at the crux of real-time ray-tracing being a practical reality right now,” he said. “I expect you’re going to see games shipping with real-time ray-tracing this year.”

  • dogtato

    ooook, so not quite runnable at home on my meager lone 1080

  • These latest game engine demos are amazing. Real time rendering that is as good as actual rendered footage! Good work!

    The Unity one shows a much higher poly count and apparently can run on a PS4 in real time, too.

    https://www.youtube.com/watch?v=DDsRfbfnC_A

    • daveinpublic

      Awesome

    • Harry Cox

      Looks cool, but you can get away with a lot when rendering natural environments. It’s a lot more visually forgiving than, say, rendering a person. That’s what I’d like to see these rendering demos have a go at. All in good time though

      • What do you mean, “visually forgiving”? This is showing ray tracing, which is the rendering process for anything, people, environments, etc., at real-time speeds. But if you are talking about modelling skills, anatomically accurate modelling and skin shaders with SSS, and then making it look real through animation, then yeah, that is extremely tough to do for any medium. It has little to do with this article, though.

  • daveinpublic

    I wonder if this is making ray tracing easier to run, or if computers are just now getting powerful enough to run it in real time?

    • ethan

      Probably a bit of both to be honest

    • Advances in all areas really.

      Ray tracing in software like 3D Studio Max can take hours per frame when doing heavy 4K rendering, and that’s “per frame,” where 24 frames = one second of footage. What this real-time stuff is doing is cutting back the quality to “just good enough” and finding clever ways to replace physically accurate effects (which are computationally expensive) with fakes. So speed over accuracy.

      GPUs are getting more powerful.

      Game engines, operating systems, drivers, etc. are all finding new ways to optimize.

      The result is what we are seeing.
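To put the commenter’s offline-vs-real-time gap in concrete terms, here is a quick back-of-envelope calculation. The two-hour render time is an illustrative assumption, not a figure from the article:

```python
# Back-of-envelope: offline ray tracing vs. a real-time frame budget.
# The offline render time below is an assumed, illustrative number.
offline_hours_per_frame = 2                  # assumed heavy 4K offline render
offline_secs_per_frame = offline_hours_per_frame * 3600

realtime_budget_secs = 1 / 24                # one frame of 24 fps footage

# How much faster "just good enough" real-time rendering must be
# than a physically accurate offline render at these assumptions.
speedup_needed = offline_secs_per_frame / realtime_budget_secs
print(f"Speed-up needed: {speedup_needed:,.0f}x")  # → Speed-up needed: 172,800x
```

A gap of five orders of magnitude is why real-time approaches trade accuracy for speed (fewer rays, denoising, approximated effects) rather than simply waiting for faster hardware.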

  • David Herrington

    I’m wondering about the power of those 4 Volta GPUs. Is each of those GPUs closer to a 2070, a 2080, or a 2080 Ti?

    If they are only 4x 2060s, then this may still be closer to reality for us. If they are 4x 2080 Tis, then most consumers will have to wait for the 3000 series before this becomes a reality.

    • Mei Ling

      Ray tracing has been “marketed” for years and it has always been the same: you get a couple of cool concept videos with the technology applied, but then you fast forward half a decade and you’re still getting exactly the same demos.

      • silvaring

        Isn’t the difference now that ray tracing provides enormous benefits for virtual reality film content, where you can move to locations within the scene? Current 360 video not only looks flat and dull, but it also lacks the ability to move around within the scene. Google just released a VR light field demo using GoPro cameras, but it only captured still frames, so it’s not practical for moving scenes. This could probably be used for near-photorealistic CGI in virtual reality: animated shows, maybe some VR short films with performance capture (like the studio Intel has built for this purpose).

  • Darius Pollock

    Does this mean we’re going to get a new generation of slow cards with a new feature that depends on developer implementation (“Titan V”)? I hope not, I’m looking for something powerful enough to power the 8KX.

  • oompah

    Bake the ray tracing tech into the silicon,
    & think of having a chip (or 2, one for each eye) in which
    every pixel has its own processor
    (those times may not be far),
    so that every pixel is ray traced by ITS OWN processor.
    So think of every pixel as a neuron,
    and also have a separate processor
    for calculating ambient lighting.
    The process should be:
    start
    calculate ambient lighting on the dedicated processor
    calculate the rays traced by every neuron (every pixel + dedicated proc + dedicated memory)
    combine & recalculate ambient lighting by incorporating the output of ray tracing
    cycle till acceptable limits are reached

    So this is basically a matter of parallel processing

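The per-pixel parallelism described above already exists in software form: each pixel’s ray depends only on that pixel’s coordinates, so the image is embarrassingly parallel. A minimal sketch of the idea, with a placeholder gradient standing in for a real ray/scene intersection (the function names and gradient are illustrative, not from any real renderer):

```python
from multiprocessing import Pool

def trace_pixel(args):
    # Toy per-pixel "shader": each pixel depends only on its own
    # coordinates, so pixels can be computed in any order, on any worker.
    x, y, width, height = args
    # Placeholder brightness gradient instead of a real ray/scene intersection.
    return (x + y) / (width + height - 2)

def render(width, height):
    # One task per pixel, spread across however many workers the pool has;
    # dedicated per-pixel silicon would be the extreme version of this idea.
    tasks = [(x, y, width, height) for y in range(height) for x in range(width)]
    with Pool() as pool:
        return pool.map(trace_pixel, tasks)

if __name__ == "__main__":
    image = render(8, 8)
    print(len(image))  # → 64 (one brightness value per pixel)
```

In practice GPUs already exploit exactly this structure, running thousands of pixel threads concurrently rather than one physical processor per pixel.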
  • Amazing, but I hoped that it could run on my GTX 1080, too! Damn... :D

  • Lucidfeuer

    I would expect ILM, as part of Disney, to be a little more careful and refined with their branding, but it’s becoming a ridiculous devaluation of their Star Wars franchise.

    Save for a few forgettable token experiences on mobile or VR, ILM is quickly becoming synonymous with (I guess paid) bullshit marketing demos of technology that are here for nothing but PR and speculative valuation, with no actual products. Like Google Seurat, this rtRT thing is absolute BS; having coded any RT engine for fun will tell you that much.

    tl;dr: I can guarantee you that, in today’s state of things, this is absolute BS and you won’t see one single real experience of it.