At GDC this year, SensoMotoric Instruments (SMI) showed a couple of new eye tracking demos at Valve’s booth. They added eye tracking to avatars in the social VR experiences of Pluto VR and Rec Room, which provided an amazing boost to the social presence within these experiences.
There are so many subtle body language cues that are communicated non-verbally through someone else’s eye contact, gaze position, or even blinking. Since it’s difficult to see your own eye movements due to saccadic suppression, it’s best to experience eye tracking in a social VR context. Without a recording of your eyes in social VR, you have to rely upon looking at a virtual mirror as you look to the extremes of your periphery, observing your vestibulo–ocular reflex as your eyes lock gaze while you turn your head, or winking at yourself.
I had a chance to catch up with SMI’s Head of the OEM business Christian Villwock at GDC to talk about the social presence multiplier of eye tracking, the anatomy of the eye, and some of the 2x performance boosts they’re seeing with foveated rendering on NVIDIA GPUs.
LISTEN TO THE VOICES OF VR PODCAST
It’s likely that the next generation of VR headsets will have integrated eye tracking, and both SMI and Tobii aim to be the primary providers, but neither Tobii nor SMI is commenting on any specific licensing agreements they may have reached with any of the major VR HMD manufacturers. I will say that SMI had some of the more robust social VR eye tracking demos at GDC, but Tobii had more nuanced user interaction examples and more involvement with the OpenXR standardization process in collaboration with the other major VR hardware vendors. You can read more about their integration with Valve’s OpenVR SDK in SMI’s GDC press release.