Better Social Avatars

Most social VR applications today give avatars seemingly realistic eye movements, including blinking, saccades, and object focus, but all of it is faked using animations and programmed logic. The illusion helps avatars appear less robotic, but of course the actual nonverbal information that would be conveyed when truly face-to-face with someone is lost.

Accurate eye-tracking data can readily be applied to VR avatars to actually show when a user is blinking and where they’re looking. It can also unlock both conscious and unconscious nonverbal communication like winking, squinting, and pupil dilation, and could even be used to infer some emotions like sadness or surprise, which could be reflected on an avatar’s face.

Meta has been pushing the boundary on social avatars with its Quest Pro headset which features both eye-tracking and mouth-tracking, bringing much greater authentic expression to virtual avatars.

Intent & Analytics

A heat map shows the parts of the scene viewed most often by users. | Image courtesy SMI

Eye-tracking can also be very useful for passively understanding player intent and focus. Consider a developer who is making a horror game where a player wanders through a haunted house. Traditionally the developer might spend a long time crafting a scripted sequence where a monster pops out of a closet as the player enters a certain area, but if the player isn’t looking directly at the closet then they might miss the scare. Eye-tracking input could be used to trigger the event only at the precise moment that the user is looking in the right direction for the maximum scare. Or it could be used to make a shadowy figure pass perfectly by the player but only in their peripheral vision, and make the figure disappear when the user attempts to look directly at it.
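
At its core, the closet-scare trigger described above is just an angular test between the gaze ray and the direction to the target. Here's a minimal sketch; the function name, vectors, and the 10° threshold are illustrative assumptions, not taken from any particular engine:

```python
import math

def is_looking_at(gaze_origin, gaze_dir, target, max_angle_deg=10.0):
    """Return True when the (normalized) gaze ray points within
    max_angle_deg of the target position."""
    to_target = [t - o for t, o in zip(target, gaze_origin)]
    dist = math.sqrt(sum(c * c for c in to_target))
    if dist == 0:
        return True  # gaze origin is at the target
    to_target = [c / dist for c in to_target]
    # Cosine of the angle between gaze direction and target direction
    cos_angle = sum(g * t for g, t in zip(gaze_dir, to_target))
    return cos_angle >= math.cos(math.radians(max_angle_deg))

# Fire the closet scare only when the player is actually looking at it
if is_looking_at((0.0, 1.7, 0.0), (0.0, 0.0, 1.0), (0.0, 1.5, 5.0)):
    pass  # play the monster reveal here
```

The inverse test (cosine below the threshold) would drive the peripheral-vision case: keep the shadowy figure active only while the gaze ray points *away* from it.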

Switchback VR does something even more creative with eye-tracking and horror thanks to PSVR 2—in certain areas of the game there are haunted mannequins that only move when you blink…

Beyond just using eye-tracking to maximize scares, such passive input can be used to help players achieve greater precision in their virtual environment. In Horizon Call of the Mountain on PSVR 2, for instance, the user’s gaze is used as a sort of ‘auto aim’ to help make long-distance bow shots more accurate.

Tobii, a maker of eye-tracking hardware and software, shows how the same concept can be used to improve the accuracy of throwing in VR. By inferring where the user intends to throw an object based on their gaze, the system adjusts the trajectory of the thrown object to produce a perfectly accurate throw. While the clip below shows the actual vs. the corrected trajectory for demonstration purposes, in actual usage the correction is completely invisible to the user and feels very natural.
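
Under the hood, this kind of assist can be as simple as blending the user's release velocity toward the ballistic velocity that would land the object exactly on the gaze target. A rough sketch under simplified physics (constant gravity, a fixed flight time, and hypothetical names—not Tobii's actual implementation):

```python
def assisted_throw_velocity(origin, target, raw_velocity, flight_time=1.0,
                            gravity=(0.0, -9.81, 0.0), assist=0.5):
    """Blend the user's raw release velocity toward the ballistic
    velocity that would land the object on the gaze target.
    assist=0.0 leaves the throw untouched; assist=1.0 fully corrects it."""
    # Solve p_target = p_origin + v*t + 0.5*g*t^2 for v, per component
    ideal = [
        (t - o - 0.5 * g * flight_time ** 2) / flight_time
        for o, t, g in zip(origin, target, gravity)
    ]
    # Linear blend between what the hand did and what the gaze implies
    return [r + assist * (i - r) for r, i in zip(raw_velocity, ideal)]
```

In practice the assist weight could scale with how confidently the gaze target is known, so wild off-target throws aren't silently "fixed" into something the user didn't intend.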

Beyond this sort of real-time intent understanding, eye-tracking can also be very useful for analytics. By collecting data about what users are looking at and when, developers can achieve a much deeper understanding of how their applications are being used. For example, eye-tracking data could indicate whether users are discovering an important button or visual cue, whether their attention is being caught by some unintended part of the environment, whether an interface element is going unused, and much more.
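
At its simplest, this kind of gaze analytics is just dwell-time accumulation: each frame, sample what the gaze ray currently hits and total the time per object. A hypothetical sketch (class and method names are made up for illustration):

```python
from collections import Counter

class GazeAnalytics:
    """Accumulate per-object gaze dwell time from periodic samples."""

    def __init__(self, sample_interval=1 / 60):
        self.dwell = Counter()        # object name -> seconds looked at
        self.dt = sample_interval     # time between gaze samples

    def record(self, focused_object):
        # focused_object: name of whatever the gaze ray hits, or None
        if focused_object is not None:
            self.dwell[focused_object] += self.dt

    def top(self, n=3):
        """Most-viewed objects, e.g. for building a heat map report."""
        return self.dwell.most_common(n)
```

The same per-object totals are what a heat map like the one pictured above visualizes, just projected back onto the scene.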

Active Input

Image courtesy Tobii

Eye-tracking can also be useful for active input, allowing users to consciously take advantage of their gaze to make tasks faster and easier. While many XR applications today allow users to ‘force pull’ objects at a distance by pointing at them and initiating a grab, eye-tracking could make that quicker and more accurate, allowing users to simply look and grab. Using eye-tracking for this task can actually be much more accurate, because our eyes are far better at pointing at distant objects than a laser pointer held in our hands, whose natural shakiness is amplified over distance.
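
A look-and-grab selector can be approximated by picking whichever candidate object lies closest to the gaze ray within a small cone. A minimal sketch—the 5° cone, the dictionary-of-positions interface, and the names are assumptions, not from any real SDK:

```python
import math

def gaze_pick(gaze_origin, gaze_dir, objects, max_angle_deg=5.0):
    """Return the name of the object whose center lies closest to the
    (normalized) gaze ray, or None if nothing is within the cone."""
    best, best_cos = None, math.cos(math.radians(max_angle_deg))
    for name, pos in objects.items():
        to_obj = [p - o for p, o in zip(pos, gaze_origin)]
        dist = math.sqrt(sum(c * c for c in to_obj)) or 1e-9
        # Cosine of angle between gaze and the direction to this object
        cos_a = sum(g * c / dist for g, c in zip(gaze_dir, to_obj))
        if cos_a > best_cos:
            best, best_cos = name, cos_a
    return best
```

A real implementation would also weight by angular size and recent gaze history, since fixations jitter even on a steady target.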

Similar to grabbing objects, eye-tracking input is likely to be helpful for making XR fast and productive, allowing users to press buttons and do other actions much more quickly than if they had to move their body or hands to achieve the same. You can bet that when it comes to XR as a truly productive general computing platform, eye-tracking input will play a major role.

Healthcare & Research

Image courtesy Tobii

And then there’s a broad range of use-cases for eye-tracking in healthcare and research. Companies like SyncThink are using headsets equipped with eye-tracking to detect concussions, purportedly increasing the efficacy of on-field diagnosis.


Researchers too can use eye-tracking for data collection and input, like getting a look at what role gaze plays in the performance of a professional pianist, better understanding autism’s influence on social eye contact, or bringing accessibility to more people.

– – — – –

Given the range of potential improvements, it’s clear why eye-tracking will be a game changer for AR and VR. While eye-tracking is today available only in premium headsets, eventually the tech is likely to trickle down to become an industry-standard feature.




Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Raphael

    Yes, eventually games will have a whole new dimension of realism/interaction. Natural voice input, AI powered characters, characters will track your eyes. The games we play now are very limited in terms of input/interaction realism. Facial expression detection will also make its way into games eventually.

    • Lucio Lima

      Yes, the future of VR is very promising!

    • impurekind

      And most of it is going to be realised because of VR specifically.

    • NooYawker

      I hope this all happens before I die!

      • Raphael

        December… can you hang around until then?

        • NooYawker

          I’ll try!

    • You have a keen eye for these things, Donatello :)

    • Zantetsu

      Ha ha the article was re-posted. I saw the Raphael baby icon which I haven’t seen in years and was like “wow who woke up the baby”. But now I see that the comment is 5 years old :)

      • XRC

        Where has everyone gone? Real quiet these days…

  • Lucio Lima

    Very interesting!

  • impurekind

    Well this all sounds like great stuff so hopefully they can get a properly reliable and effective solution working in the not too distant future.

  • moogaloo

    As someone who is cross-eyed I am a bit concerned by stuff that uses both eye positions in concert like the focal depth stuff. I hope that they build something in to not do this if one eye feels like doing its own thing? If not it could potentially ruin VR for me and millions of others.

    • Lucidfeuer

      I think that’s quite the opposite: eye-tracking enables all sorts of adjustments for astigmatic, cross-eyed, visually impaired people etc…that are not possible with just the screen.

      • kontis

        Exactly, but there are also some pessimistic hypotheses that it could convince the brain that everything is okay and to stop trying to correct it, which would be undesirable.

        • Lucidfeuer

          Strictly talking optics, I don’t see how it could trick the brain without actually correcting vision.

          • Konchu

            I remember at least one thread with a person with monocular vision not getting depth info from VR like they do in the real world. And I bet this variable depth focus will help simulate it for those people.

            But I do have a friend who has never been able to see 3D in movies, stereograms, the old Virtual Boy etc., but VR is amazing for them. So I can somewhat understand the fear that some immersion tech could ruin something for some people. I still think it will do more good than bad, and I imagine it will be fairly easy to disable some things as long as it’s not detrimental to game performance etc. AKA if they start using variable focus for a culling boost it might make those experiences harder to render without it.

      • Guest

        It cannot track saccading and adjust for individual variants. It’s just a marketing wet-dream to collect VC money!

      • Coffs

        Only if the eye tracking does each eye individually. If its basing the calcs on one eye, then the other eye gets screwed……

  • Lucidfeuer

    Oh so here’s something that I’m pretty sure they’ll do, but this has nothing to do with functionality or rendering: “Intent & Analytics” is the only reason why they invest the extra cents to have eye-trackers.

    • Raphael

      Very impressed with your record for 2018 flappy. All but one of your statements is either half, three-quarters or fully dumb. You are consistent.

      • Graham J ⭐️

        And you call everyone “flappy”. Takes dumb to know dumb.

        • Raphael

          No flappy, I don’t call anyone else flappy.

      • Lucidfeuer

        And you’re still an eloquent genius with convincing counter-arguments.

    • brubble

      Well then seeing that you probably won’t buy one, why are you here? Welcome to the internet where your precious f*cking “privacy” doesn’t exist. Give it a rest man.

      • Sandy Wich

        Telling people to give privacy a rest is pathetic. If you want to give up caring about your own basic self respect and human rights then do it. But don’t shame people for caring about sociopaths spying on you and getting away with it cus they have money/influence/word it nicely.

        • brubble

          Oh no, the big bad companies might know what you’ve been looking at and hit you with ads and marketing?! Once these evil sociopaths compile enough info on you they’ll confer in their secret deep underground lair to decide if you’re important enough to blast you with their top secret information brainwashing raygun to force you into buying Charmin over Royale.
          Basic self respect??? Human rights??? Pffft Really? Please do explain your preposterous, misguided hyperbole. You couldn’t be any more melodramatic? Watch out man! The unmarked black van is circling your neighbourhood. Gimme a break. Bahahaha. Tool.

    • kontis

      Not true. Eye-tracking is crucial in improving the quality of the experience (which is currently insufficient). Without these kinds of improvements many people will not want to use HMDs.

      They have to do the eye-tracking because they have no other choice. Analytics is more like a super enticing bonus for them and maybe a reason to give it a higher priority, but that’s all.

      • Lucidfeuer

        Yes right, these companies have no agendas and don’t care about data and money…eye-tracking is as crucial as pass-through AR, untethered wireless, inside-out tracking, hand tracking etc…yet I’m ready to bet we’ll see the priority being put on eye-tracking even though foveated rendering is not usable yet…

        • Raphael

          There is no counter-argument to stupidity flappy. People without logic or reasoning are “always right”.

          Once again you have an entirely negative/cynical opinion on the motivations of VR developers.

          “yet I’m ready to bet we’ll see the priority being put on eye-tracking even though foveated rendering is not usable yet…”

          Your bets are worth less than the shit from your botty flappy.

          Nvidia and Oculus along with other companies are developing EYE-TRACKING. PRIMARY USE is for foveated rendering. Your idiotic paranoia about eye-tracking being used for NSA surveillance or advertising just goes to show how utterly stupid you are.

    • Sandy Wich

      It’s not the only reason, but it’s a big one.

      People who don’t see what this is really going to be used for 5-10 years down the line… Idk about em.

  • Doctor Bambi

    Eye tracking can also help with redirected walking which I think will become a more important area of interest when full 6DOF standalone gets here.

    It’s amazing to me how much promise eye tracking holds for VR and AR. And it’s why I personally think Gen 2 headsets won’t launch without it. Even if it’s not quite accurate enough for foveated rendering, there are still plenty of benefits to be had in simpler use cases.

  • Jistuce

    Am I the only one concerned at how low the windshield ranks on that car cabin heatmap?

    • doug

      Car was still, in a lab.

      • Jistuce

        See, I assumed the windshield was just omitted from the data set so that all the other stuff would be visible. But that was boring, so I took the image at face value instead. (Also why are people checking the rearview mirror but not the front windshield?)

  • bud

    Great content road to vr team!!, not overlooked as just another article imo..

    Much appreciated, good job, nice.

    thanks,

  • Alexander Grobe

    Good article. However I was missing the usage of eye tracking for redirected walking in VR using saccades and eye blinks.

  • NooYawker

    I remember watching an episode of 20/20. John Stossel was doing a story about advertisers using eye tracking technology to see what people find interesting. They had him watch a Tab commercial with a girl on a beach and yea.. wasn’t hard to predict what he was looking at. But this was close to 40 years ago and I was amazed they could do such a thing. And after 40 years they finally found a consumer use for eye tracking.
    For the young folks:
    20/20 was a news program similar to 60 minutes
    Tab: the first diet soda
    John Stossel: a promising young reporter before he went insane and became a libertarian.

    • brandon9271

      What’s wrong with libertarians? :)

      • Who knows…

        • Zpfunk

          I’m glad someone made that reference. Eye Tracking for advertisement is the main inspiration for the push into the Next Lvl of virtual reality. Good article, but the author must have overlooked that use case. I believe it will be used in much the same way, although with our current internet based consumer culture, here in the United States, that information will most definitely be leveraged against the consumer. In my opinion. Eventually it may be impossible to look the other way.

      • Robert Gordon

        They don’t believe in the government as God, and giving all their money to the powerful god-complex trolls in power to create more regulations to benefit the few billionaires that buy these regulations and limit competition and salaries; oppressing everyone else, sending jobs to the countries with the lowest regulations/wages is best for them?

  • doug

    If Google is empowered knowing what ads people will click on, just wait until a company knows what you looked at.

  • Psuedonymous

    Missing is THE most important application of 3D pupil tracking due VR: real time lens correction. Currently, lens correction assumes a single fixed pupil position, while in reality the rotation of your eyeball causes your pupil to physically translate side-to-side, up and down, and even forward and back slightly. Even if it remains within the eyebox, the distortion correction shader will only provide the correct view for one pupil position. By tracking the pupil, the correct distortion correction can be used all the time.

    • Kev

      I wonder if that could be used in some way to help people with low vision where they use a headset with cameras and the display tries to make all the appropriate corrections for them.

    • Zantetsu

      I believe that StarVR did this with the StarVR One. I was super excited about that tech oh about 5 years ago but they never released it in a consumer level headset, never made the tech available for wide review, and basically fell off the face of the Earth. Like a lot of VR since then unfortunately :(

    • Sven Viking

      This also becomes increasingly important with larger FOVs.

  • nipple_pinchy

    Eye-tracking/foveated rendering is going to allow for those lower power standalone, wireless headsets to come to market and be affordable for more people.

  • MarquisDeSang

    Yet PC Looser race will never be interested in VR, because it does not look and feel like tv games.

  • oompah

    wonderful
    this is the future & the right path
    to pursue in VR tech
    Combine it with Cloud VR streaming
    and then MAGIC

  • Sandy Wich

    I can’t wait to see the ad placements that track what I’m looking at and then they’ll, “cater the experience to my liking! <3", by forcibly filling my FacebookTM games with them.

  • dk

    can I get 8 reasons why it’s not in consumer headsets yet….also hand tracking although the unreasonably expensive vive pro might have that

  • Great editorial

  • Damien King-Acevedo

    > if you keep your eyes focused on this word and try to read just two sentences below

    Uh… I think there might be something wrong with your eyes; I couldn’t read two paragraphs below, but two sentences was fine.

  • VR4EVER

    Hands-off gaze input will be groundbreaking for all the handicapped persons out there. A friend of mine loves vTime just for that. Should generally be a mandatory input scheme, IMHO.

  • Scot Fagerland

    Ben, thanks for the great research and analysis. I am a patent attorney doing prior art research for a client in the field of enhanced vision, and this article was just the starting point I needed.

    • benz145

      We’re here to be a resource, glad this was helpful!

  • Byaaface

    Awesome article!

  • “Active Input” is not always the best thing to do. We never use eyes to activate objects in real life, so it’s weird when we can do that in VR. At maximum, it can support the activation of objects. E.g. you look at menu items to highlight them, but then you need a click on your controllers to confirm the selection

    • Ben Lang

      I’m in agreement generally, but also willing to be convinced otherwise.

      It’s a potentially useful capability, but I’m waiting to see someone figure it out in a way that feels good. There’s a real chance that it’s the kind of thing that we’ll become completely accustomed to after a few hundred hours of use (like a mouse or keyboard), but that’ll require a smart and consistent application of the feature.

      • Tabp

        Controllerless active input allows you to use the headset without controllers, which is a big deal in many circumstances, especially productivity. Meanwhile, in games input modes would tend to be purpose-specific. Yes, consistent “interaction language” is important and we’ll see the best implementation spread until everyone thinks it’s obvious to do it that way. Agreed with the speed benefits mentioned in the article.

        Even if you’re like “use the controllers because dev says so” users will be like “but I need to play with a drink in one hand and something else in the other hand.”