Eye-tracking is a technology many would consider a ‘must have’ for the next generation of VR headsets, though it’s just one step toward resolving a long-standing issue that prevents users from seeing the virtual world the way they naturally see the physical one. It’s called the vergence-accommodation conflict (detailed below), and Singapore-based Lemnis Technologies’ latest varifocal prototype aims to provide headset manufacturers with the software and hardware to make it an issue of the past.

If you’re already well versed in the vergence-accommodation conflict, keep reading; if not, we’ve summarized what’s at stake with varifocal displays at the bottom of the article.

‘Verifocal’ VR Kit by Lemnis

Unlike Oculus’ recently teased varifocal headset prototype, which couples eye-tracking with a display that physically moves to match your eye’s focus, Lemnis’ latest prototype in its ‘Verifocal’ platform is based on an Alvarez lens design: an optical system invented in the ’60s by physicist Luis Alvarez which combines two adjustable lens elements that shift relative to each other to serve up a wide range of focal planes.
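
For the curious, here’s a rough numeric sketch of the principle in Python. The cubic-profile coefficient and the shift values are illustrative assumptions, not Lemnis’ actual design parameters; the point is simply that the pair’s optical power grows linearly with the lateral shift, sweeping a continuous focal range.

# Hypothetical Alvarez pair: two complementary cubic plates with surface
# profile t(x, y) = A*(x*y**2 + x**3/3), displaced laterally by +/- shift.
# Their combined profile is quadratic, i.e. a lens, with P = 4*A*(n-1)*shift.
def alvarez_power_diopters(shift_m, A=100.0, n=1.5):
    """Optical power (in diopters) for a relative lateral shift in meters."""
    return 4 * A * (n - 1) * shift_m

# Sweeping the shift covers a continuous focal range rather than a few
# discrete planes: 0 D is focus at infinity, 4 D is focus at 25 cm.
for shift_mm in (0.0, 5.0, 10.0, 20.0):
    power = alvarez_power_diopters(shift_mm / 1000)
    focus = "infinity" if power == 0 else f"{1 / power:.2f} m"
    print(f"shift {shift_mm:4.1f} mm -> {power:.1f} D (focus at {focus})")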

Image courtesy Lemnis Technologies, MIXED

This too requires eye-tracking, but shifts the onus from the display onto the lenses themselves, which dynamically adjust to the user’s focus. Lemnis has also developed prototypes featuring adjustable displays.

German VR publication MIXED sat down with Lemnis Technologies co-founder Pierre-Yves Laffont to learn more about the commercially focused platform, which aims to provide headset manufacturers with ready-made software and hardware solutions. The company’s ‘Verifocal’ VR kit prototype, built into a Windows VR headset, appears to fit into a pretty standard form factor.

Image courtesy Lemnis Technologies, MIXED

Laffont says the new Verifocal prototype can move the focal plane continuously depending on where the user is looking, covering a focal range from 25 cm (~10 inches) to infinity.

“There is no fixed number of focal planes, compared to Magic Leap for example who has only 2,” Laffont explains. “The result is a smooth change of focus that can cover the whole accommodation range.”

Image courtesy Lemnis Technologies, MIXED

To accomplish this, the company’s varifocal software engine analyzes the scene, then uses the eye tracker’s output along with the user’s eyeglasses prescription to estimate the optimal focus. It then instructs the adaptive optics to adjust the focus and correct distortions synchronously.
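
As a rough sketch of that loop in Python (the names and interfaces below are hypothetical; Lemnis hasn’t published its actual API), the per-frame logic might look something like this:

# Minimal sketch of a varifocal control loop under assumed interfaces.
def target_focus_diopters(gaze_depth_m, prescription_d=0.0,
                          min_d=0.0, max_d=4.0):
    """Map gaze depth to a focal demand in diopters (0 D = infinity,
    4 D = 25 cm, matching the range quoted for the prototype), offset by
    the user's eyeglasses prescription and clamped to the optics' range."""
    demand = 1.0 / max(gaze_depth_m, 1e-3) + prescription_d
    return min(max(demand, min_d), max_d)

class AdaptiveOptics:
    def set_focus(self, diopters):
        # Stand-in for actuating the lenses and applying the matching
        # distortion correction synchronously, as described above.
        print(f"optics -> {diopters:.2f} D")

optics = AdaptiveOptics()
for gaze_depth_m in (0.3, 1.0, 100.0):  # per frame: eye tracker + scene depth
    optics.set_focus(target_focus_diopters(gaze_depth_m))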

When asked why Lemnis is exploring Alvarez optics as the next area of development, Laffont told MIXED this:

“Moving the screen is conceptually simple, mechanically it can be put in place in the short-term. Other approaches such as Alvarez lenses can further improve the optical quality. For AR, where the vergence-accommodation conflict is even more critical, liquid lenses are promising and there is a lot of work going in this direction. Our strength at Lemnis is that our Verifocal platform is built to work with all of those approaches – and we have built the expertise and processes to integrate any of them into a partner’s headset.”


Lemnis is initially targeting the enterprise segment with their tech, “where quality matters most and price is less sensitive.” Laffont believes the VR market will eventually reach a critical scale that will allow manufacturing prices to come down, and varifocal displays will likely become commonplace in most VR headsets.

Lemnis Technologies was founded in 2017 by scientists and engineers from academic institutions including MIT, Brown, ETH Zurich, Inria, NUS, NTU, and KAIST, and from companies such as Disney Research, Philips, and NEC.

The ‘Verifocal’ prototype and the associated varifocal platform have been named a CES 2019 Innovation Award honoree.

Vergence-Accommodation Conflict

Outside of a headset, the muscles in your eyes automatically change the shape of each eye’s lens, bending incoming light onto the retina so the object you’re looking at appears in focus. Simultaneously, both eyes converge on whatever real-world object you’re focusing on so your brain perceives a single picture.

Images courtesy Pearson Scott Foresman, Fred Hsu

In short, the vergence-accommodation conflict occurs because the headset’s display presents light from a fixed distance from your eye, forcing your eyes to strain to resolve whatever virtual object you happen to be looking at.

In a headset with a fixed-focus display or optics, you’re basically left with an uncomfortable mismatch that goes against muscle memory ingrained over the course of your entire life: you see an object rendered a few feet away, your eyes converge on it, but your eyes’ lenses never change shape since the light is always coming from a fixed distance. You can read more about the vergence-accommodation conflict and other eye-tracking topics in our extensive editorial on why eye-tracking is a game changer for VR.
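
To put rough numbers on that mismatch, here’s a minimal worked example in Python; the 1.4 m fixed focal distance is an assumed typical value, not the spec of any particular headset.

import math

IPD_M = 0.063          # typical interpupillary distance
FIXED_FOCUS_M = 1.4    # assumed focal distance of a fixed-focus headset

def vergence_angle_deg(distance_m):
    """Angle the two eyes converge through to fuse an object at distance_m."""
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

for d in (0.3, 0.75, 2.0):
    demanded = 1 / d              # what the eye's lens *should* do (diopters)
    forced = 1 / FIXED_FOCUS_M    # what fixed-focus optics force (diopters)
    print(f"object at {d} m: vergence {vergence_angle_deg(d):.1f} deg, "
          f"accommodation demand {demanded:.2f} D vs. forced {forced:.2f} D")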




  • grindathotte

    I’m lucky in that I don’t suffer from the vergence-accommodation conflict, maybe because I am used to putting reading glasses on and off which changes the vergence vs accommodation relationship anyway. In fact, one of the delights in VR is that I can bring things close to my face to read small print, without having to don virtual glasses. Suddenly, my eyesight is perfect! Sadly, if they solve the vergence-accommodation conflict problem I will need virtual reading glasses. Let’s hope it’s optional.

    • Lucidfeuer

      The goal of solving it is that it’d not only be optional, but could also account for varied visual impairments when paired with eye-tracking.

    • impurekind

      Yeah, I actually think by them solving this they’re going to end up making VR more blurry for me at various distances, and likely make the screen door effect more pronounced too.

    • Jistuce

      I am extremely nearsighted, and VR requires me to focus too far to not need glasses. The focal plane is something like a foot and a half out, and I want it at six inches.
      Given that won’t happen, I dread these things not working with glasses, which they already don’t much care about. A more “advanced” system could spec itself out of my biological window.

      • mirak

        Of course you have it, because it’s physiological; it’s about the muscles around the eyes.
        It’s just that you’re used to forcing it because you have bad eyesight.

        • Jistuce

          Didn’t claim it doesn’t affect me. Figured that out two decades ago (Virtual Boy 4 lyfe!). It doesn’t affect me MUCH, though.

          I was just saying that the cure could be worse than the condition.

          • mirak

            Ok.
            I would put more hope in lightfield displays to solve that issue, because they would also give a more realistic image and would not require eye tracking, except if you need to improve performance.

    • dk

      https://youtu.be/GbRvkS3GDsw?t=727 (you have seen this, right?)

  • Freshboy

    Good to see VR is still a thing. I haven’t been on this site in 2 years; I’m still waiting for a lighter VR headset with a 200° FOV or more.

  • hubick

    Any interview like this where they don’t ask about LATENCY is a useless puff piece.

    • Just for you:

      “Latency is a key point – an optimal varifocal display adjusts the focus faster than the eye can process. Accommodation is a relatively slow process in the real world (it can take hundreds of milliseconds for your eye to refocus after an eye saccade), so the focus adjustment does not need to operate as fast as the refresh rate of the HMD display. But it needs to be fast enough so that your accommodation does not “lag behind” in dynamic scenes.

      We have a range of prototypes based on different types of adaptive optics and actuation technologies, with some of them already faster than the saccadic suppression mechanism of the eye (the time during which you are temporarily “blind” when you look in a different direction). It all depends on the tradeoffs our customers want to make, on FOV / accommodation range / weight / price.”
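
      A quick sanity check of that timing argument in Python (the budget numbers below are illustrative assumptions, not figures Lemnis has published):

      SACCADIC_SUPPRESSION_MS = 50   # rough window in which vision is suppressed
      EYE_REFOCUS_MS = 300           # eyes take ~hundreds of ms to accommodate

      def focus_change_hidden(tracker_ms, actuator_ms):
          """True if eye tracking plus optics settle inside the suppression
          window, so the user never sees the focal plane mid-move."""
          return tracker_ms + actuator_ms <= SACCADIC_SUPPRESSION_MS

      print(focus_change_hidden(tracker_ms=10, actuator_ms=30))  # True: hidden
      print(focus_change_hidden(tracker_ms=10, actuator_ms=80))  # False: visible,
      # though still acceptable if it beats EYE_REFOCUS_MS, the eye's own pace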

  • I would be curious to try both the screen-based and lens-based approaches to see which is better.

    • It’s not either/or; you can combine them. Screen-based is easier and cheaper to implement.

  • cataflic

    Alvarez reminds me of some implosion assembly designs of nuclear weapons…

  • Cool technology. It will be interesting to see what the advantages of a moving display are compared to moving lenses.

  • dsadas

    Can someone explain to me why we would need this? I mean, aren’t human eyes doing it automatically?

    • Miqa

      Yes, the eyes automatically adjust to accommodate different depths. The problem is that headsets only have a fixed physical focal distance.

  • oompah

    interesting

  • Harrison Ellers

    I wonder how their solution compares to DeepSee
    https://angel.co/deepseeinc