Oculus Research Reveals “Groundbreaking” Focal Surface Display

New tech could even eliminate the need for prescription glasses in VR

Oculus Research, the company’s VR and AR R&D division, today announced a new display technology that they’re calling the “Focal Surface Display.” The display aims to mitigate the vergence-accommodation conflict that plagues today’s VR headsets. The company calls the work “groundbreaking.”

Oculus Research’s Focal Surface Display prototype | Photo courtesy Oculus

Oculus Research has published a paper and will present the research on the focal surface display at the SIGGRAPH conference this July. A video released by Oculus (heading this article) gives a brief explanation of what the display achieves. An accompanying blog post offers additional detail.

Focal surface displays mimic the way our eyes naturally focus at objects of varying depths. Rather than trying to add more and more focus areas to get the same degree of depth, this new approach changes the way light enters the display using spatial light modulators (SLMs) to bend the headset’s focus around 3D objects—increasing depth and maximizing the amount of space represented simultaneously.

All of this adds up to improved image sharpness and a more natural viewing experience in VR.

[…]

By combining leading hardware engineering, scientific and medical imaging, computer vision research, and state-of-the-art algorithms to focus on next-generation VR, this project takes a highly interdisciplinary approach—one that, to the best of our knowledge, has never been tried before. It may even let people who wear corrective lenses comfortably use VR without their glasses.

The researchers are employing a spatial light modulator (SLM) which appears able to selectively bend light, changing the focus for different parts of the image.
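
As the paper’s conclusion below notes, the work involves jointly optimizing the focal surfaces and the color image sent to the display. As a rough intuition for the “surface” half of that idea only, here’s a toy sketch (not the authors’ algorithm; the function names and parameters are illustrative) that approximates a scene depth map with a handful of best-fitting planes, the simplest possible stand-in for a small set of focal surfaces:

```python
# Toy illustration of the focal-surface idea (not Oculus' method): approximate a
# depth map with a few planes by alternately assigning each pixel to its
# best-fitting plane and refitting the planes (k-planes clustering).
import numpy as np

def fit_focal_planes(depth, k=3, iters=20):
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)], axis=1)  # [x, y, 1]
    z = depth.ravel()

    # Start each plane flat, at evenly spaced depth quantiles of the scene.
    planes = np.zeros((k, 3))
    planes[:, 2] = np.quantile(z, np.linspace(0.1, 0.9, k))

    for _ in range(iters):
        # Assign every pixel to the plane that predicts its depth best.
        residuals = np.abs(A @ planes.T - z[:, None])        # shape (h*w, k)
        labels = residuals.argmin(axis=1)
        # Refit each plane to its assigned pixels by least squares.
        for j in range(k):
            mask = labels == j
            if mask.sum() >= 3:
                planes[j], *_ = np.linalg.lstsq(A[mask], z[mask], rcond=None)
    return planes, labels.reshape(h, w)

if __name__ == "__main__":
    # Synthetic scene: a near, tilted tabletop in front of a far background.
    depth = np.full((64, 64), 5.0)              # background ~5 m away
    depth[40:, :] = 0.5 + 0.01 * np.arange(64)  # near surface, 0.5 m to ~1.1 m
    planes, labels = fit_focal_planes(depth, k=2)
    print("fitted planes (a*x + b*y + c):\n", planes)
```

The real system instead optimizes smooth phase patterns for the SLM together with the displayed color image; planes are used above only because they make the “decompose the scene into a few focal shapes” idea visible in a couple dozen lines.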

The research paper, authored by Oculus Research scientists Nathan Matsuda, Alexander Fix, and Douglas Lanman, concludes with the following.

Focal surface displays continue down the path set by varifocal and multifocal concepts, further customizing virtual images to scene content. We have demonstrated that emerging phase-modulation SLMs are well-prepared to realize this concept, having benefited from decades of research into closely-related adaptive imaging applications. We have demonstrated high-resolution focal stack reproductions with a proof-of-concept prototype, as well as presented a complete optimization framework addressing the joint focal surface and color image decomposition problems. By unifying concepts in goal-based caustics, retinal scanning displays, and other accommodation-supporting HMDs, we hope to inspire other researchers to leverage emerging display technologies that may address vergence-accommodation conflict in HMDs.

While not a perfect fix for the vergence-accommodation conflict, Oculus is pitching the display tech as a “middle ground” between today’s VR displays and an ideal display that would fully resolve the conflict.

“While we’re a long way out from seeing results in a finished consumer product, this emerging work opens up an exciting and valuable new direction for future research to explore,” Oculus writes in their blog post. “We’re committed to publishing research results that stand to benefit the VR/AR industry as a whole.”

So-called ‘varifocal’ displays are a hot research topic right now because they stand to make the light emitted by a VR headset much closer to the light we see in the real world, allowing our eyes to focus more naturally and comfortably on the virtual scene. The same technology could also be used to eliminate the need for glasses while using a VR headset.

The focal surface display approach does require eye-tracking, which itself is not a completely solved problem. The researchers are also quick to admit that the technique is difficult to achieve with a wide field of view, and they provide an assessment of the characteristics of a number of different techniques that have been devised to achieve a varifocal display.

Primer: Vergence-Accommodation Conflict

Accommodation is the bending of the eye’s lens to focus light from objects at different depths. | Photo courtesy Pearson Scott Foresman

In the real world, to focus on a near object, the lens of your eye bends to focus the light from that object onto your retina, giving you a sharp view of the object. For an object that’s further away, the light is traveling at different angles into your eye and the lens again must bend to ensure the light is focused onto your retina. This is why, if you close one eye and focus on your finger a few inches from your face, the world behind your finger is blurry. Conversely, if you focus on the world behind your finger, your finger becomes blurry. This is called accommodation.

Vergence is the rotation of each eye to overlap each individual view into one aligned image. | Photo courtesy Fred Hsu (CC BY-SA 3.0)

Then there’s vergence, which is when each of your eyes rotates inward to ‘converge’ the separate views from each eye into one overlapping image. For very distant objects, your eyes are nearly parallel, because the distance between them is so small in comparison to the distance to the object (meaning each eye sees a nearly identical portion of the object). For very near objects, your eyes must rotate sharply inward to converge the image. You can see this too with our little finger trick as above; this time, using both eyes, hold your finger a few inches from your face and look at it. Notice that you see double-images of objects far behind your finger. When you then look at those objects behind your finger, now you see a double finger image.

With precise enough instruments, you could use either vergence or accommodation to know exactly how far away an object is that a person is looking at. But the thing is, both accommodation and vergence happen together, automatically. And they don’t just happen at the same time; there’s a direct correlation between vergence and accommodation, such that for any given measurement of vergence, there’s a directly corresponding level of accommodation (and vice versa). Since you were a little baby, your brain and eyes have formed muscle memory to make these two things happen together, without thinking, any time you look at anything.
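
To put rough numbers on that coupling: accommodation demand is conventionally measured in diopters (the reciprocal of the focus distance in meters), while the vergence angle follows from simple geometry given the spacing between your eyes. The snippet below is just that arithmetic; the 64 mm interpupillary distance is an assumed, typical value used for illustration:

```python
# Back-of-the-envelope numbers for the vergence-accommodation coupling.
# The 64 mm interpupillary distance is an assumed, typical adult value.
import math

IPD_M = 0.064  # interpupillary distance in meters (illustrative)

def accommodation_diopters(distance_m):
    """Focus demand on the eye's lens: 1 / distance (0 D at optical infinity)."""
    return 1.0 / distance_m

def vergence_degrees(distance_m):
    """Angle between the two eyes' lines of sight when fixating at this distance."""
    return math.degrees(2.0 * math.atan(IPD_M / (2.0 * distance_m)))

for d in (0.1, 0.5, 2.0, 10.0, 1000.0):  # from a nearby finger to a distant peak
    print(f"{d:7.1f} m  ->  {accommodation_diopters(d):5.2f} D accommodation, "
          f"{vergence_degrees(d):5.2f} deg vergence")
```

The two columns move in lockstep: halving the distance doubles the accommodation demand and (for all but the closest distances) roughly doubles the vergence angle, which is exactly the pairing your visual system has learned to expect.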

But when it comes to most of today’s AR and VR headsets, vergence and accommodation are out of sync due to inherent limitations of the optical design.

In a basic AR or VR headset, there’s a display (which is, let’s say, 3″ away from your eye) which makes up the virtual image, and a lens which focuses the light from the display onto your eye (just like the lens in your eye would normally focus the light from the world onto your retina). But since the display is a static distance from your eye, the light coming from all objects shown on that display is coming from the same distance. So even if there’s a virtual mountain five miles away and a coffee cup on a table five inches away, the light from both objects enters the eye at the same angle (which means your accommodation—the bending of the lens in your eye—never changes).

That comes in conflict with vergence in such headsets which—because we can show a different image to each eye—is variable. Being able to adjust the image independently for each eye, such that our eyes need to converge on objects at different depths, is essentially what gives today’s AR and VR headsets stereoscopy. But the most realistic (and arguably, most comfortable) display we could create would eliminate the vergence-accommodation issue and let the two work in sync, just like we’re used to in the real world.
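
Extending those numbers to a headset: the lens places the display’s virtual image at one fixed optical distance (the 1.4 m below is an assumed figure, purely for illustration), so accommodation demand is pinned there no matter what is rendered, while vergence demand still tracks the rendered depth:

```python
# Toy numbers for the conflict itself: accommodation demand is fixed by the
# headset optics (1.4 m is an assumed, illustrative virtual image distance),
# while vergence demand follows the rendered object's depth.
import math

IPD_M = 0.064          # illustrative interpupillary distance
VIRTUAL_IMAGE_M = 1.4  # assumed fixed focal distance of the headset optics

def vergence_degrees(distance_m):
    return math.degrees(2.0 * math.atan(IPD_M / (2.0 * distance_m)))

fixed_accommodation = 1.0 / VIRTUAL_IMAGE_M  # never changes, whatever is shown

for rendered_d in (0.13, 1.4, 8000.0):  # coffee cup, the panel's own distance, mountain
    mismatch = abs(1.0 / rendered_d - fixed_accommodation)
    print(f"object at {rendered_d:8.2f} m: vergence {vergence_degrees(rendered_d):6.2f} deg, "
          f"accommodation stuck at {fixed_accommodation:.2f} D "
          f"(mismatch {mismatch:.2f} D)")
```

A varifocal or focal surface display aims to drive that mismatch toward zero by moving the focal distance, or shaping the focal surface, to match the content being viewed.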

  • Damien Wilson

    Throw this in with foveated eye rendering and it’s a winner.

    • Justos

      Gen2 is gonna be sick

      • Bundy

        My thoughts exactly. What a time to be alive.

      • elev8d

        Might be a gen3 thing.

    • ZenInsight

      And increase the FOV range. Gen 2 should be really great if they can get these improvements added. Also, integrated Leap Motion for full finger/hand recognition.

  • Xron

    hmz, I hope that it will be possible to add eye rendering to this, because this alone won’t be enough for a next gen device.

    • Lukimator

      According to the table, you NEED eye tracking for this

      • Jesus

        The table is mistaken; read the paper, section 6.2 Future Work. It would be better with eye tracking, but it doesn’t NEED eye tracking right now.

        • Joan Villora Jofré

          No, the table is correct; it will need eye tracking integrated, according to the Oculus article. Right now it does not use it, but that’s because the display is fixed, I think.

  • traschcanman

    “The optical path begins, as shown in Figure 10b, with an eMagin WUXGA 1920×1200 60 Hz color OLED display”

    eMagin’s 2K x 2K (per eye) is also now available.

    CEO Andrew Sculley – 1Q conference call:

    One company is funding a next-generation display that we are currently developing, another is actively involved in discussions and for whom we have done preliminary work and the third is keenly interested in our current 2Kx2K design. In addition several companies have come back to us for discussions that had been silent during the first quarter.

  • Joan Villora Jofré

    Just amazing!

  • Cool, haven’t seen this particular approach before.

  • Lucidfeuer

    That’s a completely overblown/unnecessary rationale but a nice experimentation nonetheless.

    • Joey G

      I was going to say, is this really needed? I’ve tried NVidia’s multi-plane focusing prototype, and while neat, it didn’t seem to improve my experience. In fact, as someone hitting his 40s, needing to physically focus closer would be detrimental, as I may need “reading glasses” for that soon.

      • Lucidfeuer

        It’s like they are trying to force-pull what lightfield screens will naturally do, but in 15-20 years.

        Right now, accurate eye-tracking, FOVRender, and per-pixel latency screens are the feasible and practical way to go.

      • Jack H

        It should be fine to modify the filter action to operate within certain depth bounds only.

  • So, the important point will be “How is the system supposed to do the focal calculation without eye tracking?”

    • Bundy

      I suspect they won’t ever need to. Eye tracking tech looks more mature than this. It’ll probably land first.

  • Ted Joseph

    I would stick with the current rift for the next 10 years if they increased the FOV to 180 degrees! This is my only beef at this point with VR. I am having a blast in VR, but the “blinders” take the immersion right out of the experience. Some tricks in VR sports while playing goalie are cool, but most games are not built this way. FOV is key in my opinion!

    • M Rob

      100% agree

      • Ted Joseph

        I fully understand why they didn’t. They are having a difficult time selling the Rift and Vive for the current price. Larger lenses would push it over the top, and sales would decrease – that is, until VR becomes mainstream.

        • Lucidfeuer

          Nothing to do with price. The first iPhone sold 6 million units at $700 even though nobody knew what it was. Many millions of people buy clothes or accessories at that price. Some dumb UHD TVs are at that price. Even the Virtual Boy sold 800K in 9 months, 20 years ago. So again and again, the price excuse is bullshit.

          As you originally said, amongst many other problems and missing compulsory components or specs, the FOV is WAY too low on the current headsets.

          The original Oculus DK1 had a 110° FOV… 4 years ago. The fact that we didn’t go significantly beyond that for the consumer version, despite the myriad of potential solutions that exist and their multi-million budgets, and in fact went backward 3 years later, tells you how much HTC/Oculus’s goal was to make the maximum amount of money, to the point where they forgot to sell first…

          • Ted Joseph

            I agree with your points minus the price discussion. Simple economics. The equilibrium price is the target for most financial champions within a company. This is the point at which supply and demand curves intersect. As demand decreases supply increases, and the price should come down. I think this is where Oculus and HTC are currently… The key is, how do they increase demand if the price reduction presents commercial issues? This is a tough task that takes a great deal of investigation through marketing, benchmarking, surveys, stronger games, etc…

          • Lucidfeuer

            Well, it’s not true anymore, if it ever was. Flagship smartphones and hardware don’t come down in price as demand goes down anymore, unless there’s… a new iteration. Hell, even digital content pricing is stagnating and barely being cut now, in fact quite the contrary. You posit that these companies still do business, hence they sell, and to sell they have to either meet expectations or lower prices.

            They don’t anymore, they only do speculative finance. Apple’s overvaluation is probably around 10 times their actual revenues, same for Google, Microsoft, Samsung, etc… they don’t do business anymore, they just provide a front for speculation way beyond the actual value produced or sold. And on the board of directors sit no executives anymore, but shareholders, because companies are not led by profit anymore but by how good a perceived front they provide for the stockholders and investors to gamble on.

            So not only is it not a matter of price for the users (proof being that no recent price cut significantly altered HMD sales numbers); because marketing rules come before market rules, the mass of the market isn’t any more interested in a product they haven’t already bought just because its price changes, and that’s why pricing policies have changed and why it doesn’t matter to corporate governance either.

          • El Pingüno

            Absolutely agree, the price isn’t an issue. There’s no big motivation as long as your competitors don’t do that.

    • Wildtz0r

      How hard is it to convince yourself you’re wearing a ski mask while doing whatever?

  • George Vieira IV

    I’m glad they are working on this. It doesn’t matter how hi-res the screen is if I can’t focus on things that are close to me.

  • MW

    Theory and dreams (for today). We heard about foveated rendering years ago, and it is still far, far away from the masses. So… great, but not so interesting.

  • Cooooool!

  • Just give me a 140 degree FOV, no godrays and no SDE and I’ll be a happy camper…

  • CazCore

    I’m far-sighted. I’m worried that this kind of display will extend my handicap to virtual reality as well.