At GDC last week, Oculus announced it was bringing a VR rendering technique called Asynchronous Timewarp to Windows with version 1.3 of the Oculus PC SDK. The feature (abbreviated ‘ATW’) smooths over the otherwise uncomfortable stuttering that can happen when the PC doesn’t render frames quite fast enough. Valve calls ATW an “ideal safety net,” but suggests a system to avoid relying on it too heavily.

Asynchronous Timewarp is an improved version of Synchronous Timewarp, which Oculus has implemented since the Rift DK2. While Synchronous Timewarp was effective at reducing latency, it was tied to the frame-to-frame render loop.

An illustration of ATW’s ability to provide a reprojected view of the last available frame when rendering exceeds allotted time | Photo courtesy Oculus

ATW, as the name suggests, decouples Timewarp from the render loop, which gives it more flexibility in dealing with frames that aren’t finished rendering by the time they need to be sent to the VR headset. The company explains in more detail in a recent blog post:

On Oculus, ATW is always running and it provides insurance against unpredictable application and multitasking operating system behavior. ATW can smooth over jerky rendering glitches like a suspension system in a car can smooth over the bumps. With ATW, we schedule timewarp at a fixed time relative to the frame, so we deliver a fixed, low orientation latency regardless of application performance.

This consistently low orientation latency allows apps to render efficiently by supporting full parallelism between CPU and GPU. Using the PC resources as efficiently as possible makes it easier for applications to maintain 90fps. Apps that need more time to render will have higher positional latency compared to more efficient programs, but in all cases orientation latency is kept low.
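
In rough terms, the scheduling Oculus describes boils down to a compositor thread running independently of the application: it wakes at a fixed point before every vsync, grabs whichever application frame finished most recently, and re-projects it using the freshest head orientation before scan-out. The sketch below only illustrates that idea; it is not the Oculus runtime. The `Pose`, `Frame`, `samplePose()`, and `presentWarped()` names are placeholders, and a real implementation performs the re-projection on the GPU.

```cpp
// Minimal ATW-style compositor sketch (illustrative only; not the Oculus SDK).
#include <atomic>
#include <chrono>
#include <cstdio>
#include <mutex>
#include <thread>

using Clock = std::chrono::steady_clock;
using std::chrono::microseconds;
using std::chrono::milliseconds;

struct Pose  { float yaw; };                        // stand-in for a full head orientation
struct Frame { int id; Pose renderedWith; };        // stand-in for the app's finished image

std::mutex        frameMutex;
Frame             latestFrame;                      // most recent frame the app completed
std::atomic<bool> running{true};

Pose samplePose() { return Pose{}; }                // placeholder: read the head tracker here

void presentWarped(const Frame& f, const Pose& now) {
    // A real compositor would re-project f's image from f.renderedWith to 'now' on the GPU.
    (void)now;
    std::printf("vsync: presenting frame %d (reprojected)\n", f.id);
}

// Compositor thread, decoupled from the app's render loop (the "asynchronous" part).
void atwThread(microseconds vsyncPeriod, microseconds warpBudget) {
    auto nextVsync = Clock::now() + vsyncPeriod;
    while (running) {
        // Wake at a fixed time before vsync, whether or not the app finished a new frame.
        std::this_thread::sleep_until(nextVsync - warpBudget);
        Frame f;
        { std::lock_guard<std::mutex> lock(frameMutex); f = latestFrame; }
        presentWarped(f, samplePose());             // latest completed image + freshest pose
        nextVsync += vsyncPeriod;                   // ~11.1 ms at 90 Hz
    }
}

int main() {
    std::thread compositor(atwThread, microseconds(11111), microseconds(2000));
    for (int i = 1; i <= 20; ++i) {                 // fake app that misses 90 Hz (~60 Hz here)
        std::this_thread::sleep_for(milliseconds(16));
        std::lock_guard<std::mutex> lock(frameMutex);
        latestFrame = Frame{i, samplePose()};
    }
    running = false;
    compositor.join();
}
```

If the application finishes in time, the compositor warps the brand-new frame; if it doesn’t, it warps the previous one again, which keeps orientation latency low even though position and animation fall a frame behind.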

Oculus seems quite proud of the feature and writes that “Getting [ATW] right took a lot of work.”


“The user experiences much smoother virtual reality with ATW. Early measurements of Rift launch titles running without ATW showed apps missing ~5% of their frames. ATW is able to fill in for the majority of these misses, resulting in judder reduction of 20-100x. This functionality comes at no performance cost to the application and requires no code changes,” the blog post continues.

The company maintains that ATW is “not a silver bullet,” and continues to recommend that developers target a consistent 90 FPS in their VR apps for the best experience.


Valve agrees that ATW should only be used as a last resort for the occasional missed frame. Valve Senior Programmer Alex Vlachos called ATW an “ideal safety net” during his Advanced VR Rendering Performance talk at GDC last week, but suggested an ‘Adaptive Quality’ system which would favor reducing rendering quality over relying too heavily on ATW.

Vlachos said the Adaptive Quality system has two goals:

  1. Reduce the chances of dropping frames and [resorting to timewarp]
  2. Increase quality when there are idle GPU cycles

Such a system opens the door to GPUs below the recommended VR specification hitting 90 FPS even in applications not expressly designed for them, while on the high end it allows VR experiences to look even better on systems that exceed the recommended spec.

He offered an example of settings which can be tweaked up or down automatically based on the measured framerate:

Valve’s example of adaptive quality settings

Using this system, Vlachos says Valve was able to get the Aperture Robot Repair demo to hit 90 FPS consistently on the Nvidia GTX 680, a four-year-old GPU, without relying on ATW.
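
Valve’s actual system measures GPU timings and adjusts several settings at once, but the underlying feedback loop can be sketched roughly as below. The `AdaptiveQuality` struct, the single `renderScale` knob, and the thresholds are illustrative assumptions rather than Valve’s numbers: back off resolution when GPU time creeps toward the 11.1 ms budget, and spend idle GPU cycles on extra resolution when there is headroom.

```cpp
// Rough sketch of an adaptive-quality feedback loop (illustrative; not Valve's code).
#include <algorithm>
#include <cstdio>

struct AdaptiveQuality {
    // Assumed knobs and thresholds, for illustration only.
    float renderScale   = 1.0f;   // fraction of full render-target resolution per axis
    float minScale      = 0.65f;
    float maxScale      = 1.4f;   // super-sampling headroom on fast GPUs
    float frameBudgetMs = 11.1f;  // 90 Hz

    // Call once per frame with the measured GPU time of the previous frame.
    void update(float gpuTimeMs) {
        if (gpuTimeMs > frameBudgetMs * 0.9f) {
            // Danger zone: drop resolution before we miss vsync and fall back on timewarp.
            renderScale = std::max(minScale, renderScale - 0.1f);
        } else if (gpuTimeMs < frameBudgetMs * 0.7f) {
            // Plenty of idle GPU time: spend it on extra resolution/supersampling.
            renderScale = std::min(maxScale, renderScale + 0.02f);
        }
    }
};

int main() {
    AdaptiveQuality aq;
    const float simulatedGpuTimes[] = {12.5f, 11.8f, 10.9f, 9.0f, 7.2f, 6.8f};
    for (float t : simulatedGpuTimes) {
        aq.update(t);
        std::printf("gpu %.1f ms -> render scale %.2f\n", t, aq.renderScale);
    }
}
```

Lowering quality in large steps but raising it in small ones, as in the sketch, is a common way to avoid oscillating around the frame budget.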


Ultimately these techniques allow for smoother VR performance when frames unexpectedly don’t render fast enough, whether it’s because the user’s view at a given moment is particularly render-intensive, or because background processes running on the user’s computer steal a little too much rendering power at the wrong time.


  • Sony’s PSVR would benefit the most from this because of PS4’s lack of computing power. Really looking forward to Sony implementing this!

    • Flamerate1

      PSVR already has a system that promises 120 fps. I don’t know how good it is, but their system seemed like it would be doing something like this as a result.

      • Nice! Didn’t know the 120fps were because of something similar

      • Stephen Marshall

        ATW is more advanced than what Sony is doing.

        I think Sony’s PSVR doesn’t have to worry as much as a PC-based VR system about unexpected dropped frames, because consoles are much more efficient and don’t have lots of other processes running in the background (for example anti-virus software).

        • eck

          I cringe every time I see someone talking about console versus PC efficiency, citing background processes like anti-viruses as a point of argument. Especially in the age of gaming PCs with typically 8+ GB of system memory and plenty of multicore processor resources. It’s not the 90s anymore.

          • Stephen Marshall

            But it’s a fact of life with PCs. They run a lot of different types of software….some of which take away CPU cycles in the background when you are running other software.

            Multi-core processors help, but they don’t totally eliminate those issues.

          • Jim Cherry

            The biggest problem with PCs isn’t antivirus software, it’s all the other drivers and OS overhead. Also, consoles aren’t as efficient as you would like now that they multitask in the OS.

          • Stephen Marshall

            I never said anti-virus was the biggest problem.

            I agree…there are many OS layers of abstraction when it comes to PCs….that speeds up development and increases compatibility and code reuse, but puts a heavier burden on the hardware…

            Also, this generation of consoles isn’t as efficient as previous ones, but they still are more efficient at playing games than PCs, as PCs are general-purpose computing devices….and consoles (mostly) play games…this has obviously changed in recent generations due to consoles running apps, but it still stands that consoles perform very badly with apps compared to PCs.

            But they are still more efficient.

            The thing that makes PC games look and perform much better is sheer brute-forcing it with money. My computer has a GTX 970….it obviously can produce better graphics/frame rates than my Xbone or PS4. When I bought my computer it was $1800.00, compared to my Xbone at $500 when I first bought it…my Xbone can run games at a fairly reasonable performance level…..much better than if I had bought a $500 PC vs my $1800 PC at the time.

  • DonGateley

    I must really be missing something here but what is shown in that diagram is the first solution that any programmer fresh out of college would immediately implement to ameliorate the problem of underrun in video rendering.

    Now if rather than merely copying L2 some clever and very fast motion interpolation were done on it first that would be a different matter.

    • “Now if rather than merely copying L2 some clever and very fast motion interpolation were done on it first that would be a different matter.”

      Yep! The “warp” in ATW comes from the fact that it does re-position the previous frame based on the difference in head orientation. On Gear VR, sometimes when an app is loading, it’ll ATW a single frame for seconds at a time. When this happens, you can look around and the image doesn’t follow your gaze. Instead, it stays stuck in black space, showing your last rendered viewpoint.
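
      Roughly, the “warp” is just the rotation between the orientation the frame was rendered with and the orientation at scan-out, turned into a matrix and applied to the old image’s view rays in a full-screen pass. Here’s a toy sketch of that delta in plain C++ (my own illustration, not Oculus’ actual code), assuming unit quaternions that rotate eye space into world space:

      ```cpp
      // Toy sketch of the orientation-only "warp" delta; illustrative only.
      #include <cmath>
      #include <cstdio>

      struct Quat { float w, x, y, z; };

      // Hamilton product: composes two rotations.
      Quat mul(const Quat& a, const Quat& b) {
          return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
                   a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
                   a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
                   a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
      }
      Quat conj(const Quat& q) { return { q.w, -q.x, -q.y, -q.z }; }

      // Rotation that maps view rays in the *current* head frame back into the frame
      // the image was rendered in; the warp pass samples the old image along these
      // re-rotated rays. With no new frames submitted, the old image stays pinned in
      // world space, which is exactly the Gear VR loading behavior described above.
      Quat warpDelta(const Quat& renderedPose, const Quat& currentPose) {
          return mul(conj(renderedPose), currentPose);
      }

      int main() {
          Quat rendered = {1.f, 0.f, 0.f, 0.f};                       // pose at render time
          float yaw = 0.2f;                                           // head turned ~11 degrees since
          Quat current  = {std::cos(yaw / 2), 0.f, std::sin(yaw / 2), 0.f};
          Quat d = warpDelta(rendered, current);
          std::printf("warp delta: %.3f %.3f %.3f %.3f\n", d.w, d.x, d.y, d.z);
      }
      ```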

      Also, I’m fairly certain that actually implementing this wasn’t even possible until Oculus worked with Nvidia and AMD to have their drivers changed (perhaps to allow frame scheduling intervention that wasn’t possible before).