Oculus Research’s director of computational imaging, Douglas Lanman, is scheduled to give a keynote presentation at SID DisplayWeek in May, which will explore the concept of “reactive displays” and their role in unlocking “next-generation” visuals in AR and VR headsets.

Douglas Lanman, director of computational imaging at Oculus Research, will present one of the three keynotes to be held during SID DisplayWeek 2018; his session, Reactive Displays: Unlocking Next-Generation VR/AR Visuals with Eye Tracking, takes place on Tuesday, May 22nd.

The synopsis of the presentation reveals that Lanman will focus on eye-tracking technology and its potential for pushing VR and AR displays to the next level:

As personal viewing devices, head-mounted displays offer a unique means to rapidly deliver richer visual experiences than past direct-view displays occupying a shared environment. Viewing optics, display components, and sensing elements may all be tuned for a single user. It is the latter element that helps differentiate from the past, with individualized eye tracking playing an important role in unlocking higher resolutions, wider fields of view, and more comfortable visuals than past displays. This talk will explore the “reactive display” concept and how it may impact VR/AR devices in the coming years.

The first generation of VR headsets has made it clear that, while VR is already quite immersive, there’s a long way to go before the visual fidelity of the virtual world matches human visual capabilities. Simply packing displays with more pixels and rendering higher resolution imagery is the straightforward approach, but it isn’t as easy as it may seem.

An eye-tracking add-on for the HTC Vive. IR LEDs (seen surrounding the lens) illuminate the pupil while a camera watches for movement.

Over the last few years, the combination of eye-tracking and foveated rendering has been proposed as a smarter pathway to greater visual fidelity in VR. Precise eye-tracking can determine exactly where the user is looking, enabling foveated rendering: rendering at maximum fidelity only in the small central region of vision that resolves fine detail, while keeping the computational load in check by reducing rendering quality in the less detailed periphery. Foveated display hardware could even move the most pixel-dense part of the display to follow the user’s gaze, potentially reducing the challenge (and cost) of cramming ever more pixels onto a single panel.
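
To make the foveated rendering idea concrete, here is a minimal Python sketch of how a renderer might pick a per-tile shading rate from the current gaze point. The falloff radii and rate values are illustrative assumptions, not figures from Lanman’s talk or from any shipping eye tracker.

```python
import math

def shading_rate(tile_center, gaze, fovea_radius=0.10, mid_radius=0.25):
    """Pick a shading rate for a screen tile based on its distance from the
    gaze point (both in normalized [0, 1] screen coordinates).
    The radii here are illustrative placeholders."""
    dist = math.hypot(tile_center[0] - gaze[0], tile_center[1] - gaze[1])
    if dist < fovea_radius:
        return 1   # full resolution where the eye resolves fine detail
    elif dist < mid_radius:
        return 2   # half resolution in each axis for the near periphery
    else:
        return 4   # quarter resolution in the far periphery

# Example: the user's gaze is near normalized coordinates (0.7, 0.3)
gaze = (0.7, 0.3)
for tile in [(0.7, 0.3), (0.5, 0.5), (0.1, 0.9)]:
    print(tile, "->", shading_rate(tile, gaze))
```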

The same eye-tracking data could also be used to correct for lens distortions no matter which direction the user is looking, which could improve visual fidelity and potentially make larger fields of view more practical.
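
In the same vein, here is a hypothetical sketch of gaze-aware lens correction: a simple radial (barrel) pre-distortion whose strength is nudged as the pupil rotates away from the lens axis. The coefficients are made-up placeholders, not measured lens data, and a real pipeline would rely on a full per-lens calibration.

```python
import math

def corrected_uv(uv, gaze_offset, k1_center=0.22, k1_per_offset=0.05):
    """Apply a radial pre-distortion to a texture coordinate `uv` (relative to
    the lens center, roughly in [-1, 1]), scaling the distortion coefficient
    with how far the gaze has moved off the lens axis (illustrative values)."""
    x, y = uv
    r2 = x * x + y * y
    k1 = k1_center + k1_per_offset * math.hypot(gaze_offset[0], gaze_offset[1])
    scale = 1.0 + k1 * r2
    return (x * scale, y * scale)

# Example: a point near the edge of the lens while the eye looks 15% off-axis
print(corrected_uv((0.8, 0.0), (0.15, 0.0)))
```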

Lanman’s “reactive displays” concept sounds, at first blush, a lot like the approach NVIDIA Research calls “computational displays,” which the company detailed in depth in a recent guest article on Road to VR. The idea is to make the display system itself aware of the state of the viewer, and to move key parts of display processing into the headset itself in order to achieve the highest quality and lowest latency.
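
To illustrate the architecture (not any particular company’s implementation), the hypothetical per-frame loop below shows what a “reactive” headset-side pipeline might look like: sense the viewer, render with that state in mind, and apply gaze-aware correction right before scan-out to keep sensing-to-photons latency low. All names and structure here are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # normalized screen coordinates, 0..1
    y: float

class ReactiveDisplayPipeline:
    """Illustrative headset-side loop: sense the viewer's state each frame,
    then adapt rendering and display correction to it."""

    def read_eye_tracker(self) -> GazeSample:
        # Placeholder: a real device would return fresh sensor data here.
        return GazeSample(0.5, 0.5)

    def render_foveated(self, gaze: GazeSample) -> str:
        # Placeholder: render full detail near `gaze`, coarser elsewhere.
        return f"frame rendered around ({gaze.x:.2f}, {gaze.y:.2f})"

    def correct_and_scan_out(self, frame: str, gaze: GazeSample) -> None:
        # Placeholder: apply gaze-aware lens correction just before scan-out,
        # keeping the delay between sensing and display as short as possible.
        print("scan-out:", frame)

    def run_frame(self) -> None:
        gaze = self.read_eye_tracker()
        frame = self.render_foveated(gaze)
        self.correct_and_scan_out(frame, gaze)

ReactiveDisplayPipeline().run_frame()
```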

Despite the benefits of eye-tracking and foveated rendering, and some very compelling demonstrations, the combination remains an area of active research, with no commercially available VR headset yet offering a seamless hardware/software solution. It will be interesting to hear Lanman’s assessment of the state of these technologies and their applicability to the AR and VR headsets of the future.

  • Lou Wallace

    Kent Bye thinks eye tracking may be used to further enslave us. https://www.roadtovr.com/decentralizing-identity-in-vr-with-holonet-self-sovereign-identity/

  • bud

    What is nice is that we have a clear road map towards what everyone knows is eventually coming…

    Elon Musk’s neuro-interface is maybe going to provide some steps towards it?

    A device which you place around your head and switch on, and your senses are picked up or otherwise captured, with a rich alternative delivered directly to the mind via the brain.

    A great expression or saying I heard is this.

    We are beings which are electrical first, then we are chemical….

    Just to post something which is actually interesting: I used to work with a graphic designer, a very loving, radiant guy. He was over from Europe with his girlfriend; they eventually moved back and had a child.

    I asked him a question about his colour preference for a car, as we were close to a famous car brand’s showroom. He was on one side of a desk and I was on the other (a customer site; we were decommissioning an old businessman’s office).

    Just then, as he started to form or picture his preferred car colour, it was like the lights switched off and I was standing to the front left of the car. I couldn’t believe it; it was as if my brain had been closed down and rebooted in front of the car, and I was standing there looking at it. Unbelievable.

    After about 3 seconds it stopped. I said, hey… I bet I can tell which is your favourite colour for the car… (it was yellow).

    Telepathy between living brains is possible; I think this is known with regard to feelings of mood. Even thoughts have a shape, an energy.

    I can confirm you can inject a highly detailed image into a living person’s mind.

    There’s a lot we don’t know or are still working on… consciousness is a pretty magical thing.

  • Eugene Panich

    It’s great to see that people are finally beginning to realize that picture quality is what makes VR look real.
    Meanwhile, the quality improvement technologies are already there:

    “..First objective quality measurements performed with a demo based on HTC Vive showed up to 2.7 times effective resolution improvement and suppression of chromatic aberrations even at the edges of field of view.”

    https://www.linkedin.com/feed/update/urn:li:activity:6380302473424306176