At the Unreal Engine presentation at GDC 2018 today, Epic Games CTO Kim Libreri introduced several projects highlighting the scalability and performance of Unreal technology, including a partnership with ILMxLAB and Nvidia. This was the first live demonstration of interactive, real-time ray tracing with Unreal Engine.
Nvidia announced RTX on Monday, a real-time ray-tracing solution designed for their latest Volta GPU architecture. Today, we were treated to a live stage demo using Unreal Engine, Nvidia RTX, and ILMxLAB’s Star Wars assets. Aside from some slight connection problems (the image was being streamed from a PC to the iPad on stage), this was a stunning demonstration, and seemed to be far less noisy than the Northlight footage from Remedy.
Tony Tamasi, Senior Vice President of Content and Technology at Nvidia, explained some of the technology behind the scenes, describing ray tracing as “the holy grail” of rendering, one that solves many of the fundamental problems of traditional rasterization.
“We partnered with our friends at Microsoft to deliver an industry standard API called DirectX Raytracing,” he said. “That API is perfectly integrated with RTX to enable ray-tracing through interfaces and APIs that game developers are familiar with, and on top of that we’ve layered in some GameWorks technology to give developers a kickstart for things like de-noising and reconstruction filters.”
Although this demo was running on a supercomputer (an Nvidia DGX Station with four Volta GPUs connected via NVLink), Tamasi thinks the technology is not too far from reaching consumers. “We are at the crux of real-time ray-tracing being a practical reality right now,” he said. “I expect you’re going to see games shipping with real-time ray-tracing this year.”