Cambridge & Meta Study Raises the Bar for ‘Retinal Resolution’ in XR


It’s been a long-held assumption that the human eye is capable of detecting a maximum of 60 pixels per degree (PPD), which is commonly called ‘retinal’ resolution. Any more than that, and you’d be wasting pixels. Now, a recent University of Cambridge and Meta Reality Labs study published in Nature maintains the upper threshold is actually much higher than previously thought.

The News

As the University of Cambridge’s news site explains, the research team measured participants’ ability to detect specific display features across a variety of scenarios: both in color and greyscale, looking at images straight on (aka ‘foveal vision’), through their peripheral vision, and from both close up and farther away.

The team used a novel sliding-display device (seen below) to precisely measure the visual resolution limits of the human eye, which seem to overturn the widely accepted benchmark of 60 PPD commonly considered as ‘retinal resolution’.

Image courtesy University of Cambridge, Meta

Essentially, PPD measures how many display pixels fall within one degree of a viewer’s visual field; it’s sometimes seen on XR headset spec sheets to better communicate exactly what the combination of field of view (FOV) and display resolution actually means to users in terms of visual sharpness.
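As a rough, illustrative sketch (not a formula from the study): if pixels were spread evenly across the field of view, average PPD would simply be the horizontal resolution divided by the horizontal FOV. Real headset lenses concentrate pixels toward the center, so peak PPD runs higher than this average.

```python
def average_ppd(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Approximate average pixels per degree, assuming pixels are spread
    uniformly across the FOV (real lenses concentrate them centrally,
    so peak PPD is higher than this average)."""
    return horizontal_pixels / horizontal_fov_deg

# Illustrative numbers only, not any headset's official specs:
# a 2064-pixel-wide panel viewed through a 110-degree horizontal FOV
print(f"{average_ppd(2064, 110):.1f} PPD")  # well short of 60 PPD
```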

According to the researchers, foveal vision can actually perceive much more than 60 PPD—more like up to 94 PPD for black-and-white patterns, 89 PPD for red-green, and 53 PPD for yellow-violet. Notably, the study had a few outliers in the participant group, with some individuals capable of perceiving as high as 120 PPD—double the upper bound for the previously assumed retinal resolution limit.

The study also holds implications for foveated rendering, which is used with eye-tracking to reduce rendering quality in an XR headset user’s peripheral vision. Foveated rendering has traditionally been tuned around black-and-white (luminance) sensitivity; the study maintains it could further reduce bandwidth and computation by lowering resolution further for specific color channels.
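To put rough numbers on that idea: treating the study’s foveal thresholds as per-channel resolution targets, the chromatic channels (especially yellow-violet) could in principle be rendered at a fraction of the luminance resolution. A back-of-the-envelope sketch, using the thresholds quoted above; the squared pixel-count scaling is my assumption, not the paper’s:

```python
# Foveal detection thresholds reported by the study, in pixels per degree
thresholds_ppd = {"luminance": 94, "red_green": 89, "yellow_violet": 53}

# Linear resolution each channel needs relative to luminance; pixel count
# (and thus bandwidth) scales roughly with the square of linear resolution
for channel, ppd in thresholds_ppd.items():
    linear = ppd / thresholds_ppd["luminance"]
    print(f"{channel}: {linear:.2f}x linear resolution, {linear**2:.2f}x pixels")
```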

So, for XR hardware engineers, the team’s findings point to a new target for true retinal resolution. For a more in-depth look, you can read the full paper in Nature.


My Take

While you’ll be hard pressed to find accurate info on each headset’s PPD—some manufacturers believe in touting pixels per inch (PPI), while others focus on raw resolution numbers—not many come close to reaching 60 PPD, let alone the revised retinal resolution suggested above.

According to data obtained from XR spec comparison site VRCompare, consumer headsets like Quest 3, Pico 4, and Bigscreen Beyond 2 tend to have a peak PPD of around 22-25, which describes the most pixel-dense area at dead center.

Meta ‘Butterscotch’ varifocal prototype (left), ‘Flamera’ passthrough prototype (right) | Image courtesy Meta

Prosumer and enterprise headsets fare slightly better, but only just. Estimating from available data, Apple Vision Pro and Samsung Galaxy XR boast a peak PPD of between 32 and 36.

Headsets like Shiftall MeganeX Superlight “8K” and Pimax Dream Air have around 35-40 peak PPD. On the top end of the range is Varjo, which claims its XR-4 ($8,000) enterprise headset can achieve 51 peak PPD through an aspheric lens.

Then, there are prototypes like Meta’s ‘Butterscotch’ varifocal headset, shown off in 2023, which is said to sport 56 PPD (unconfirmed whether average or peak).

Still, there’s a lot more to achieving ‘perfect’ visuals than PPD, peak or otherwise. Optical artifacts, refresh rate, subpixel layout, binocular overlap, and eye box size can all sour even the best displays. What is certain, though: there is still plenty of room to grow in the spec sheet department before any manufacturer can confidently call their displays retinal.


Well before the first modern XR products hit the market, Scott recognized the potential of the technology and set out to understand and document its growth. He has been professionally reporting on the space for nearly a decade as Editor at Road to VR, authoring more than 4,000 articles on the topic. Scott brings that seasoned insight to his reporting from major industry events across the globe.
  • xyzs

    Let’s just make a 180° FOV HMD with 16K per eye and everyone will be satisfied! Done

    • Sven Viking

      Surprisingly that’s still a bit low on FOV, which can range up to around a maximum of 270 degrees with eye movement (probably varying with face shape).

      • JanO

        Just like many enthusiasts, I would really like a larger FOV, but feel it might not happen that soon because it's been clearly associated with more motion sickness from our peripheral vision…

        • Christian Schildwaechter

          This has been sort of solved since 2016, when Ubisoft released Eagle Flight for PCVR/PSVR, a 1st person "eagle in Paris" simulator using head movement as its primary control. Its locomotion approach made it a prime candidate for triggering motion sickness. Ubisoft as an AAA studio wanted to ensure comfort, so they did a lot of research and ended up with two main solutions.

          First they always showed the eagle's beak as a visual fixture, providing an "explanation" for the mismatch between what the eyes and the vestibular system in the ears report, which is the main cause for motion sickness. This also works in car simulations by keeping the cockpit fixed to the users head rotation, even if it doesn't make physical sense.

          And secondly by dynamically limiting the FoV. A higher FoV causes more motion sickness especially during head turns, so they introduced a dynamic vignette reducing the FoV only during head turns. They somewhat overdid it, and AFAIR there was no way to turn it down, but this actually helped with reducing motion sickness.

          We have known for a decade what triggers motion sickness, what can be done to prevent it for at least most people, and what developers should avoid doing. Most games now come with a number of comfort features, though often these should be more fine-tunable by the users, and should be properly explained to those who, in their search for maximum immersion, switch off everything without necessarily understanding why these things are even there. And of course a lot of developers never bothered to learn how to properly design for and implement comfort in VR games.

    • Christian Schildwaechter

      Let's start with first making eye tracking and ETFR the default on every headset. Otherwise the computational demands for rendering all the things suddenly visible in your peripheral view, which increase exponentially with a higher FoV, will prevent you from playing anything with decent graphics. And you probably don't want to waste your 180° 16K display on stylized cartoon visuals.

      As long as there is enough compute power for accurate eye tracking, this is a very simple and cheap optimization for increasing the FoV. Much easier than solving the optical distortion problems with the higher magnification lenses needed for a larger FoV. Or increasing the pixel count of current 4K displays by a factor of 16.

      Properly done, ETFR could enable something like 200% supersampling just for the foveated region even on mobile HMDs. This would result in a visually much sharper image, making the reduced PPD that would come with increasing the FoV without also increasing physical pixel count less noticeable.

      • xyzs

        You're taking it really literally, I was joking.

        I agree, eye-tracked foveated rendering should not be an option but a default requirement for VR from now on.

        Since the foveated area is less than 5 percent of the vision, by having multiple circles of progressively lower rendering quality around the center of gaze, it should in theory be possible to render human-perception quality with even less than 8 million pixels (the equivalent of a 4K screen)

  • XRC

    35 PPD peak here with my original Crystal running 100% render resolution (4312×5100 per eye), which requires an RTX 5090 to hit a constant 90Hz. If resolution is reduced (with my old RTX 4080) it loses sharpness quite noticeably, but at full resolution the clarity is very impressive, especially at long distance.

    Was fortunate enough to take part in the Almalence "digital lens" software trial for the Crystal a while ago; very noticeable increase in perceived sharpness and resolution in OpenXR titles – I'd estimate PPD in the mid-40s.

    Dynamic foveated rendering is injected by Pimax Play runtime using Tobii 120hz eye tracking, most of my openvr DX 11 titles work as intended with it letting me increase super resolution by 1.3x in Aircar for example which just looks astonishing with a liquidity to the graphics.

    It's also more comfortable optically with the eye tracking providing in essence a dynamic pupil which works very well with the aspherical lenses.

    Pimax's latest Crystal Super has 50 PPD and 57 PPD optical engine options, but will require a next-generation Nvidia X90 GPU to exploit.

    • Christian Schildwaechter

      I hadn't heard about Almalence DLVR, and thanks to your comment I now found out that just last month they also released an open-source eye-tracking solution on GitHub. It promises the same aberration and distortion compensation as their OpenXR plugin while consuming only 2% of the compute power on an XR2 Gen 2, so a lot less resource-hungry than Meta's implementation on the Quest Pro, and therefore more suitable for mobile HMDs.

      The (no longer supported) OpenXR plugin seems to mostly add eye tracked supersampling to what ETFR/DFR usually does. Kind of a logical extension to reducing the render resolution outside of the foveated area, using the gained compute to increase the perceived resolution inside the foveated area.

      • XRC

        Pimax now have both quad views and DFR integrated into their Play runtime, eliminating the need for third-party tools; they are aware of the Almalence open source news ;)

        Got the Super coming soon for testing; will report back, as I'm very interested to see the difference between my existing headset (35 PPD) and the Super's 50 PPD.

  • Sven Viking

    This is not surprising to me, particularly some people being more sensitive than most. I expect most people would have difficulty noticing a difference above 60ppd in many or most circumstances, though (e.g. depending on content).

    • Christian Schildwaechter

      Seeing more details not only requires very good eyesight and looking straight ahead, but also not moving the head and looking at a static object. There may be (some) room for improvement beyond 60 PPD for productivity apps that require seeing fine details/text fixed in space.

      But it would be wasted on things like games with lots of head, object, and world movement, where not even the person with the best vision in the world will be able to really recognize anything moving at even close to a 60 PPD detail level. Seeing things during movement is mostly the brain interpreting a much lower-resolution perceived blur. For games it makes a lot more sense to use extra render performance to increase geometry or texture detail instead of upping the resolution beyond 60 PPD. So far we have only a few 4K HMDs, but these already seem to be more than enough for displaying games, while things like reading or working with a virtual monitor clearly still need significantly higher resolutions.

  • brandon9271

    I'd settle for higher FOV at same ppd. What good are crystal clear visuals when you're looking through binoculars?

  • Someone on Reddit said that 60 is the point of diminishing returns

  • ShaneMcGrath

    Not really interested in more resolution; my Quest 3 is already good enough.
    I won't upgrade unless we see more FOV. The biggest thing that breaks VR immersion for me is no longer the resolution, it's when you see the periphery for a split second and realise you are looking through swimming goggles.

  • Sofian

    I am happy with my Play for Dream for now, but I am more interested in much brighter headsets with HDR support.
    I hope the next generation of pancake lenses will make this possible.

  • psuedonymous

    This has been well known for decades, and nobody other than marketers took the Apple definition of 'retinal resolution' seriously.
    e.g. Capability of the Human Visual System (DOI:10.1117/12.502607), which cites minimum perceptual acuity down to the sub-arcsecond level for some tasks. It also clarifies how other aspects of a display affect acuity, in particular illumination levels.
    Amusingly, Capability also lists Minimum Separable Acuity at 30 arcseconds under ideal illumination, meaning the new Nature paper is a replication of previous work.
    Even this 'new' figure of 120 PPD (AKA 30 arcseconds) is far too coarse to approach even Vernier acuity (1 arcsecond under ideal illumination), let alone minimum perceptible.