At Connect 2021 last week, Meta revealed a new Quest rendering technology called Application Spacewarp which it says can increase the performance of Quest apps by a whopping 70%. While similar to the Asynchronous Spacewarp tech available to Oculus PC apps, Meta says Application Spacewarp will produce even better results.

Update (November 12th, 2021): Meta has now released Application Spacewarp for developers. The tools include support for Unity, Unreal Engine, and native applications. The company has also updated its OVR Metrics Tool with Application Spacewarp metrics.

The original article, which overviews the Application Spacewarp rendering tech, continues below.

Original Article (November 5th, 2021): Given that Quest is powered by a mobile processor, developers building VR apps need to think carefully about performance optimization in order to hit the minimum bar of 72 FPS to match the headset’s 72Hz display. It’s even harder if they want to use the 90Hz or 120Hz display modes (which make apps look smoother and reduce latency).

Considering the high bar for performance on Quest’s low-powered hardware, anything that can help boost app performance is a boon for developers.

That’s why at Connect 2021 last week, Meta introduced a new Quest rendering technology called Application Spacewarp, which it says can increase the render time available to apps by up to 70%.

The technique achieves this by letting applications render at half framerate (for instance, 36 FPS instead of 72 FPS); the system then generates a synthetic frame, extrapolated from the motion in previous frames, which is inserted every other frame. Visually, the app appears to run at the same rate as a full-framerate app, but only half of the usual rendering work needs to be done.

Image courtesy Meta
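
To make the mechanics concrete, here is a minimal sketch (in Python with NumPy) of what motion-vector-based frame extrapolation looks like. This illustrates the general idea only, not Meta’s implementation; the buffer shapes and the per-pixel motion format are assumptions:

```python
import numpy as np

def extrapolate_frame(color: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Forward-project the last rendered frame along per-pixel motion.

    color:  (H, W, 3) last fully rendered frame
    motion: (H, W, 2) assumed app-supplied motion vectors, in pixels/frame
    """
    h, w = color.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Destination of each pixel one frame into the future.
    nx = np.clip((xs + motion[..., 0]).round().astype(int), 0, w - 1)
    ny = np.clip((ys + motion[..., 1]).round().astype(int), 0, h - 1)
    synthetic = np.zeros_like(color)
    synthetic[ny, nx] = color[ys, xs]   # naive scatter; a real system must
    return synthetic                    # also fill disocclusion holes
```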

An application targeting 36 FPS has twice as much time to render each frame compared to running at 72 FPS; that extra time can be spent by developers however they’d like (for instance, to render at a higher resolution, use better anti-aliasing, increase geometric complexity, put more objects on screen, etc.).
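
The arithmetic is easy to sanity-check; a trivial illustration:

```python
# Per-frame render budget in milliseconds: budget = 1000 / fps.
for fps in (72, 36):
    print(f"{fps:>2} FPS -> {1000 / fps:.1f} ms per frame")
# 72 FPS -> 13.9 ms per frame
# 36 FPS -> 27.8 ms per frame (twice the budget)
```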

Of course, Application Spacewarp itself needs some of the freed-up compute time to do its work. Meta, having tested the system with a number of existing Quest applications, says the technique increases the render time available to developers by up to 70%, even after accounting for Application Spacewarp’s own overhead.

Developer Control

Developers using Application Spacewarp can target 36 FPS for 72Hz display, 45 FPS for 90Hz, or 60 FPS for 120Hz.

Meta Tech Lead Neel Bedekar posits that 45 FPS for 90Hz display is the “sweet spot” for developers using Application Spacewarp because it requires less compute than the current minimum bar (45 FPS instead of 72 FPS) and results in a higher refresh rate (90Hz instead of 72Hz). That makes it a fairly easy ‘drop-in’ solution which makes the app run better without requiring any additional optimization.

Of course, 60 FPS for 120Hz display would be even better from a refresh rate standpoint, but in this case a 60 FPS app using Application Spacewarp would require additional optimization compared to a native 72 FPS app (because of the compute overhead used by Application Spacewarp).
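
One way to see both points at once is to compare each half-rate target’s frame budget against the native 72 FPS baseline after subtracting Application Spacewarp’s own per-frame cost. Meta hasn’t published that cost, so the ~3 ms figure in this sketch is purely hypothetical, chosen to illustrate the trade-off:

```python
ASW_COST_MS = 3.0        # hypothetical per-frame overhead; not a published figure
baseline_ms = 1000 / 72  # ~13.9 ms, the native minimum bar

for fps, hz in ((36, 72), (45, 90), (60, 120)):
    net_ms = 1000 / fps - ASW_COST_MS
    verdict = "more" if net_ms > baseline_ms else "less"
    print(f"{fps} FPS/{hz}Hz: {net_ms:.1f} ms net budget ({verdict} than native)")

# 36 FPS/72Hz:  24.8 ms (more) -> big headroom at the same refresh rate
# 45 FPS/90Hz:  19.2 ms (more) -> headroom AND a higher refresh rate
# 60 FPS/120Hz: 13.7 ms (less) -> can demand extra optimization vs. native 72
```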

Meta emphasizes that Application Spacewarp is fully controllable by the developer on a frame-by-frame basis. That gives developers the flexibility to use the feature when they need it or disable it when it isn’t wanted, even on the fly.

Developers also have full control over the key data that goes into Application Spacewarp: depth buffers and motion vectors. Meta says this control can help developers deal with edge cases and even find creative solutions to best take advantage of the system.
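
As a rough mental model (the names below are invented for illustration; this is not the actual Quest SDK API), each frame the application hands the compositor its color, depth, and motion-vector buffers, and can flip the feature per frame:

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class FrameSubmission:
    color: Any           # rendered eye buffers
    depth: Any           # depth buffer, used for positional reprojection
    motion_vectors: Any  # app-generated per-pixel motion for this frame
    spacewarp: bool      # per-frame opt-in/out, toggleable on the fly

class Compositor:
    """Stub standing in for the system compositor (illustrative only)."""
    def queue(self, frame: FrameSubmission) -> None:
        if frame.spacewarp:
            # Extrapolate a synthetic in-between frame from depth + motion
            # vectors; skipped entirely on frames where the app opts out.
            pass
```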

Lower Latency Than Full Framerate

Combined with other techniques, Meta says that Quest applications using Application Spacewarp can have even lower latency than their full framerate counterparts (that aren’t using the extra tech).

That’s thanks to additional techniques available to Quest developers—Phase Sync, Late Latching, and Positional Timewarp—all of which work together to minimize the time between sampling the user’s motion input and displaying a frame.
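
A back-of-the-envelope way to think about it (the timings below are invented for illustration): motion-to-photon latency is the gap between the last pose sample that influenced the image and the moment the image reaches the display, and each of the three techniques pushes that sample later:

```python
FRAME_MS = 1000 / 72  # ~13.9 ms between display refreshes at 72Hz

# Naive pipeline: the head pose is sampled once, at the start of rendering,
# so the displayed image is a full frame old by the time it is scanned out.
naive_latency = FRAME_MS - 0.0

# Phase Sync starts rendering as late as possible so it finishes just-in-time;
# Late Latching refreshes the pose right before submission; Positional
# Timewarp re-projects the frame with a final pose just before scanout.
# The 1 ms gap here is a hypothetical, illustrative figure.
late_sample_at = FRAME_MS - 1.0
warped_latency = FRAME_MS - late_sample_at

print(f"naive: ~{naive_latency:.1f} ms, late-sampled + timewarp: ~{warped_latency:.1f} ms")
```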

Differences Between Application Spacewarp (Quest) and Asynchronous Spacewarp (PC)

While a similar technique, called Asynchronous Spacewarp, has been employed previously on Oculus PC, Meta Tech Lead Neel Bedekar says that the Quest version (Application Spacewarp) can produce “significantly” better results because applications generate their own highly accurate motion vectors, which inform the creation of synthetic frames. In the Oculus PC version, motion vectors were estimated from finished frames, which makes for less accurate results.

Application Spacewarp Availability

Application Spacewarp will be available to Quest developers beginning in the next two weeks or so. Meta is promising the technique will support Unity, Unreal Engine, and native Quest development right out of the gate, including a “comprehensive developer guide.”

Per the update at the top of the article, Application Spacewarp is now available to Quest developers.

  • Shem

    Should we expect higher polygon/res versions of all the major apps once this is out in the wild?

    • Foreign Devil

      From reading the description I’m thinking it just means a smoother and faster refresh rate/frame rate rather than an increased ability to handle more polys or complex lighting.

      • benz145

        It’s not just smoothing out head tracking, it’s actually increasing the amount of time that each frame has to render, which is basically a direct increase in performance.

        An application targeting 36 FPS has twice as much time to render each frame compared to running at 72 FPS; that extra time can be spent by developers however they’d like (for instance, to render at a higher resolution, use better anti-aliasing, increase geometric complexity, put more objects on screen, etc.).

      • grindathotte .

        This could be great when porting from PC; they won’t have to kill the polygon count by quite so much.

        • Mario Baldi

          Yep, and poly count isn’t everything.
          Complex shading FX, like PBR, global illumination of any kind, shadows, etc. are just as important to the final quality of the frame.
          This is great news really.

      • Jistuce

        It means the game doesn’t have to actually be capable of hitting 72 Hz natively, which means they can add details that would otherwise be dropped to meet framerate targets.

    • Marc-André Désilets

      If developers of already released apps want to integrate this feature, yes. It means they have more time to render each frame, so technically they could add better lighting, a higher polycount, and better overall rendering quality. But they’d still need to make a major update to their existing game. I think that for most of them what we will see is a simple 90 or 120 FPS mode that wasn’t there before, and more stability in the FPS.

  • xyzs

    +70 percent optimisation with better quality and latency… who could moan about that?!
    Even at half the claim, I’d take it.

    • Christian Schildwaechter

      In theory it should be possible to render a higher resolution in VR than the same hardware could handle on a flat screen, because in VR with eye tracking the actual view axis is known, as well as the viewer’s distance from the screen. So you only have to render objects at the center of vision in high resolution, and only those that are still visible to the average user. Pancake vision always has to render everything at the same resolution, as we don’t know what the user is currently looking at, and always at the same detail level, as we don’t know whether they are sitting at arm’s length or halfway across the room.

      In reality optimization is hard and eye tracking doesn’t yet work as well as we need it to, so rendering the same quality in VR requires much faster hardware. This will change over time, but it will take a couple of years.

      • kraeuterbutter

        hmm… imagine glasses that have no lenses, no display…
        only cams for eye tracking…
        so: ultra light (less than 100g)

        You look at your normal 32″ 4K 16:9 monitor, but with
        foveated rendering thanks to the eye tracking information from the glasses…

        When, in some years, eye tracking and foveated rendering work well, I don’t see why it shouldn’t technically be possible to implement this for a “normal” flat monitor too, in combination with tiny, light glasses for the eye tracking.

        • kebo

          There is already an eye tracker for PC gaming: the Tobii Eye Tracker 5. It’s basically a camera on your monitor.

          It’s already being used in some games. In The Division, for example, the game checks where you want to move behind cover.

          However, I don’t think PC eye tracking will become widespread because it’s an extra device that no one will buy. In VR it’s already in the headset, so no problem. And in VR people want it… not only for rendering but also for presence in multiplayer games (face tracking).

          • d0x360

            I agree but for different reasons. Well I agree with you that people playing flat likely won’t buy it, especially not for rendering.

            In fact that’s the main reason I wouldn’t ever buy it for rendering enhancements… It wouldn’t work.

            The eye tracking has to keep up with basically no latency, which gets harder as the display gets bigger or the further away you are.

            Games would be better off using DX12 Ultimate’s function that lowers object detail and geometry based on where it is and what’s being occluded.

            There’s a great demo of it in the DX12 Ultimate launch video. You never see a single LOD change despite objects going from 1/4 detail and complexity to incredibly detailed and complex. It’s like Nanite in UE5, but since it’s at the API level it offers more performance and more detail.

        • Christian Schildwaechter

          You can use eye tracking on a flat screen, but it has fewer benefits, mostly due to the small FoV. If you are sitting about 0.6m/2ft from a 27″ display, you only get a 15° FoV. If you are looking at a 100″ TV from the recommended viewing distance of about 1.85m/6ft, the FoV is 7.5°. Pretty hard to reduce the render resolution anywhere without it being very obvious. In VR, the much larger FoV means there are big areas sufficiently far away from where you are currently looking that it makes sense to render them differently.

          The other problem is that multiple people can look at the same TV at the same time. If you only use the eye tracking to determine at which spots you render the image in the highest resolution, you could still render two, five, or ten spots, matching the number of people. But eye tracking also allows rendering the image according to binocular cues like convergence, i.e. the closer an object is, the more inward your eyes turn. Currently we render the image in VR at the same sharpness for every depth, while in reality your eyes focus at a certain depth, and areas before and beyond it appear blurry. Eye tracking will enable this type of display for a single person, which in turn enables reducing detail in the blurry areas, but this is simply impossible for a flatscreen and multiple people, because they would blur out each other’s area of focus.

          There still can be benefits for eye tracking on flatscreens, e.g. saying “put that there” while first looking at the icon to be moved and then looking at the folder you want it moved to. There is actually an MIT Media Lab interface prototype from 1980 that does this with hand instead of eye tracking; worth looking up on YouTube. The project was actually called “Put-That-There”. You don’t even need special glasses for this; a moveable camera that can zoom onto your face/eyes will be enough at some point. You can do simple eye tracking with your laptop webcam today.

      • Anthony Kenneth Steele

        With the 70% increase the Quest is more powerful than the original PS4, so we should be seeing games that look as good as Resident Evil 7, with slightly higher resolution.

        • Christian Schildwaechter

          The 70% increase is a best-case scenario, not an average. Sounds great, but if you only have to render every second frame, you’d expect a theoretical increase of 100%, meaning that at least 30% of the performance is still needed just to create the extrapolated frames, and typically it is more.

          The PS4 is also a much more balanced system when it comes to high end graphics, e.g. RAM speed is much higher. So purely from a current hardware perspective, there is still quite a distance between them, even with all the nifty tricks.

          The bigger problem is the development budgets. Even if you could create a highly polished AAA game like RE7 on Quest, there are currently no 3rd party companies willing to invest the amount of money needed due to the small user base (5mn Quest 2 vs. 115mn PS4).

          So “we should be seeing games look as good as Resident Evil 7” is not particularly realistic. The fact that Meta paid Capcom to port a 2011 PS3 remaster of the 2005 GameCube RE4 instead of the 2017 PS4 RE7 is more a sign of what “we should be seeing”, not only regarding the size, but also visual fidelity.

          • Frédéric Lormois

            Quest 2 is near 10 million units now. Meta can pay for AA/AAA games.

          • Christian Schildwaechter

            Do you have any source for that? The only hard numbers we got are 4.2mn sold in North America from the recall in late July, which was estimated to mean around 5mn worldwide. After that they didn’t sell the headset for a month. And from comparing activity on Quest app sales we can see that most Quest 2s were sold in December and January, and by October activity (number of new ratings) had fallen to just 40% of January’s level. All this indicates it is very unlikely that Meta suddenly managed to double sales in three months during which all other signs indicated that sales had dropped drastically.

    • kontis

      If you combine that + their Neural super sampling + dynamic foveated rendering

      You can’t.

      The idea of timewarp is to WARP an already rendered picture. You can’t re-warp things that you purposefully didn’t render.

      who could moan about that ?!

      I don’t know, maybe a dev that has a game with a lot of transparent objects and glass-type materials. This method basically begs devs to give up on that kind of aesthetic and focus on opaque materials. The depth buffer is a single layer; transparency isn’t supported.

      • xyzs

        Hmm, you can.
        -The eye tracking gives info on where to render the foveated zone at full res / the edge zone at lower quality.
        -The neural supersampling does its job of supersampling the foveated area / generating a sufficient res for the surrounding area.
        -Then the spacewarp doubles the previous input framerate.

      • Christopher Stockman

        Right on the money. We can’t use it because of this very reason. It would require us to go back and alter a lot of assets.

  • Nothing to see here

    This could solve the problem of needing high frame rate for smooth head tracking but lower frame rates for actual 3D motion. This would increase the comfort level of VR without the cost of rendering detail.

    • kontis

      This isn’t about separating view framerate from object framerate. What you are describing is more like texture space rendering which is a completely different thing, but also quite promising for VR.

  • doug

    You know what else uses motion vectors? DLSS 2.0. This sounds like DLSS 2.0 with Zuckerberg’s branding stamped on the press release.

    • Christian Schildwaechter

      The motion vectors are actually used for the temporal component introduced in DLSS 2, which technically makes it an ML-supported TAA method. And no, AppSW has nothing to do with antialiasing; both just use the motion vectors to better estimate where not-actually-rendered pixels are or will be. In TAA these are fine details that got lost and are restored from the last frames; in AppSW it is the future position of the current pixels in the next frame that will not be rendered, but interpolated.

      • guest

        Cannot imagine how pixels can ever get close to matching the tracking with any of these methods.

      • Zerofool

        in Application Spacewarp it is the future position of the current pixels in the next frame that will not be rendered, but interpolated.

        You mean extrapolated ;)

        • Christian Schildwaechter

          Indeed.

        • David

          hm I think “interpolated” works here — this is basically what motion interpolation is, right?

          • Christian Schildwaechter

            Most people probably understood what was meant, but extrapolation is the correct terminology for how AppSW creates extra frames. You can only interpolate between two known points, i.e. you can make an estimate what lay in the middle. AppSW doesn’t have the required second frame yet, as it still has to be rendered in the future, after the next pseudo frame that AppSW generates. So instead it uses past frames to determine where objects were headed and projects that movement into the future.

            inter = Latin for “within, between”
            extra = Latin for “out of, outside”
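
            A two-line illustration with made-up numbers:

            ```python
            p0, p1 = 10.0, 12.0         # object position in the last two rendered frames
            midpoint = (p0 + p1) / 2    # interpolation: BETWEEN known frames -> 11.0
            projected = p1 + (p1 - p0)  # extrapolation: BEYOND the newest frame -> 14.0
            # AppSW needs the latter: the frame after p1 hasn't been rendered yet.
            ```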

    • Jistuce

      Asynchronous Spacewarp, which this is an upgrade of, predates DLSS 1.0, though.
      X Spacewarp also fakes a higher frame rate rather than faking a higher resolution.

      And X Spacewarp is also implemented on a general-purpose CPU rather than requiring dedicated neural network hardware. (DLSS literally only exists so nVidia can justify forcing gamers to pay for hardware that is only of use to business and science customers.)

      So no, it is really nothing like DLSS.
      “But motion vectors!”
      Yeah, well… a physics engine uses motion vectors, that doesn’t mean it is a ripoff of DLSS.

      • kontis

        DLSS literally only exists so nVidia can justify forcing gamers to pay for hardware that is only of use to business and science customers.

        The fact that Nvidia already has multiple desirable consumer features using Tensor cores, and the fact that similar hardware accelerators already exist in smartphones (even the XR2 chip in Quest 2 has one), proves you wrong; you are just pushing a very biased AMD fanboy opinion, because AMD doesn’t have something that even mobile chips have.

        • Jistuce

          More of an S3 fanboy, really. Savage5’s comin’ out any day, just you watch!

    • guest

      literally the most inane thing i’ve ever read

    • benz145

      These are two totally different things serving different purposes.

  • Till Eulenspiegel

    Due to the underpowered nature of standalone VR devices, it’s better to artificially double the frame rate (like PSVR) and use something like DLSS to up the resolution. Brute-force rendering kills the battery and heats up the headset, and even then it still can’t produce a high quality image with good performance.

    • Marc-André Désilets

      It’s important to understand that this is very advanced frame interpolation based on motion vectors, not like the crappy thing we see on fake 120/240Hz televisions. Oculus has been working on this tech for a while; the older version was ASW 2.0, released more than 2 years ago on PC. It’s actually a pretty amazing concept; can’t wait to see how it renders on mobile in real-life scenarios.

      https://www.oculus.com/blog/introducing-asw-2-point-0-better-accuracy-lower-latency/?locale=fr_FR

    • kontis

      People really don’t understand that algorithms aren’t free and there are big overheads to all tricks.

      DLSS takes more power per pixel than the Quest has for its entire rendering.

      Also, VR games, especially on mobile, generally don’t use screen-space temporal pixel manipulation techniques, and DLSS is an extension of those.

  • Ad

    Valve needs to jump on this and enable things like choosing “super ultra settings” to enable 1:2-4 ASW, not just 1:1. This is a big threat to PC but also a chance to really force forward what VR can be.

  • johnyjazz

    Is this going to breathe some new life into the old Quest 1 so it will be supported for longer?

    • Fox Vulpe Neacsu

      Yes! Definitely, I hope it gets implemented on the Quest 1.

    • kontis

      NO, it has a significant GPU compute overhead, so on Quest 1 the overhead could become so large (due to the weaker GPU) that it may not even be worth enabling.

      In other words: this method sacrifices some performance, but saves even more than it sacrifices, so it’s a net win. On Quest 1 that “sacrifice” part would be much larger than on Quest 2.

      • Christian Schildwaechter

        That is true if the game is GPU bound. Technically AppSW isn’t that much different from the reprojection techniques we’ve used so far, so this doesn’t magically improve performance. But with ASW developers still had to try to render everything in 1/72nd of a second; reprojection just kicked in if this failed.

        With AppSW you can now allocate almost twice that time. In a theoretical case where you are trying to render only a cube falling down stairs in the most accurate way, the time available to the physics engine might have been the limiting factor. Not sure how many Quest games are more CPU bound, but those could benefit significantly on both Quest 1 and 2.

  • Wow, as a VR developer, I can say that this news sounds amazing!

  • Aragon

    Still wondering why we have all these motion interpolation technologies to boost framerate only in VR and not in “normal” games.

    • Christian Schildwaechter

      Mostly because people can play normal games at 30fps without getting motion sickness, and even tolerate drops below that. These are all sorts of hacks to keep VR frame rates/head tracking response fast enough to be comfortable for most users, even if this means increased input lag for the controllers.

      All techniques for creating extrapolated frames also come at a cost: they reduce the actual rendered frame rate. So for pancake games it makes more sense to instead render as many full frames as possible, while at the same time keeping input lag as low as possible.

      • silvaring

        Thank you, great comment.

    • kontis

      Normal games already use variable rate shading, dynamic resolution, and temporal techniques to build frames based on data from previous frames.

      The same principles, just different implementations that are better optimized for the visual aspects of normal games, which are quite a bit different from VR.

  • Alex Makes 3D

    That sounds interesting)

  • d0x360

    Not a fan of that idea. It sounds good in theory but I don’t think it will look that good in reality. Faking frames never works quite right. Plus a genuine high frame rate will always be better.

    I suppose spacewarp might be acceptable if you are already running the game at 90fps or above. I’m skeptical based on experience. Prove me wrong, Oculus. Do that AND release a nice high-end PC OLED HMD. I get why people like the Quest but it’s not for me. I want high end but I don’t want to use base stations ever again. The Rift S spoiled me on inside-out tracking. They just need to add 2 rear-facing cameras and it would be perfect. Hell, it’s damn close already thanks to Carmack’s mathematical genius.

    What they really need to do (in general) is support DLSS, FSR, XeSS and any other modern upscaling.

    Combine that with foveated rendering when eye tracking is faster and you’d need significantly less power to run something, or, if you’re using a high-end PC, you’d end up with absurdly good image quality and no problem hitting 120fps+.

    • Anthony Kenneth Steele

      In slow-paced eye-candy games it will work well. But I use ASW in Fallout 4 on PC and you can hardly tell it’s on, and for some reason it looks better at 36fps than 45fps.

    • benz145

      I don’t think you should write this off with the notion that “faking frames never works quite right,” as there’s lots of frame-faking already that you just don’t notice because it works so well. For instance, timewarp (which reduces latency by modifying the frame with more up-to-date sensor data) is essentially frame-faking, but it’s nearly invisible and forms the cornerstone of VR compositors across all major headsets.

      Modern video formats like MPEG make heavy use of very similar motion-prediction techniques in order to achieve much better compression ratios than would be possible otherwise.

  • Anthony Kenneth Steele

    What we need is dynamic ASW that does what it can at 36Hz and runs the rest at 72Hz. Because in my experience ASW (frame interpolation) always has problems rendering moving wheels. But if it looks as good as ASW on PC I will be happy.

  • Anthony Kenneth Steele

    I’d love to see The Walking Dead on Quest 1 with smooth framerates on that beautiful black screen. And up the resolution slightly.

  • Anthony Kenneth Steele

    Effectively this makes the Quest 2… 2.12 TFLOPS, and the PS4 is only 1.84 TFLOPS running Resident Evil 7 in VR!… much lower resolution mind you… but if they used FSR as well then you have Resident Evil 7-level games at much higher resolution :D

    • Christian Schildwaechter

      That’s not how that works at all. AppSW doesn’t increase the performance of the Quest one bit; in fact it reduces the number of frames that can be fully rendered. Instead, this is a way to make VR gaming usable at only 36fps without users immediately having to throw up. Games can render more complex scenes if they have more time to finish the calculations, but PS4 games usually run fine at 30fps and a much lower resolution without being nauseating, so at best the Quest 2 is catching up a little.

    • Duckman

      The base PS4 runs Resident Evil 7 in 540p, and even then they had to disable some graphical effects. I know. I played it. It looked terrible. PS4 Pro runs it in 720p.

  • ncvr

    Not sure about something, and would appreciate your opinion: the Fresnel lens currently installed in the Quest 2 provides poor image quality outside a small FOV in the center of the lens. Shifting the gaze anywhere outside the small center FOV results in a significantly blurrier image. So, even based on HMD orientation only, with no gaze tracking, it should be possible to significantly reduce the rendered resolution outside the center circle without losing observed quality. Is this done on the Quest?

    • jiink

      Many games already use fixed foveated rendering. This was done on Quest 1 as well, maybe even the Go. PCVR games like The Lab use it too, but definitely not every game has this implemented.

  • Rupert Jung

    I just don’t get why we still don’t see this kind of tech for 2D gaming, too.

  • Andrew Jakobs

    Why can’t you f-ing stop posting everything in bold? I know you only do it to get noticed… It’s freaking annoying.

  • oomph2

    Don’t tell anyone

  • Rupert Jung

    Hoping for a 90 Hz Upgrade for Resident Evil 4.

  • David Barlia

    As a developer, I’m excited about Application Spacewarp and how it will likely open up possibilities for rendering quality on Q2.

    Just a note: although this tech is currently available, Unity developers at least will have to use bleeding-edge versions of Unity and its components to make use of it, something nobody should rely on for a public release. Also, changing the target framerate to the sweet spot of 45fps will have unwanted effects on *some* kinds of animation, which will take a bit of reworking.

    So it’ll likely not be popping up in all your favorite games immediately.

    Nonetheless, Application Spacewarp is a really exciting development for mobile VR that is likely to seriously boost how ambitious developers can be with their projects in the near future.

  • Jonathan Winters III

    Green Hell is the first Quest game using this tech. The graphics and physics are jaw-dropping and set a new standard for Quest 2 games.