Apple was recently granted a patent for ‘EyeSight’, the external display on Vision Pro which shows the wearer’s eyes. The patent was filed way back in 2017 and envisioned the feature being used to show stylized eyes like those of anime and furry characters.

Image courtesy Apple

EyeSight is perhaps the most unique bit of hardware on Vision Pro. Apple has been sitting on the idea for the better part of six years, having first filed a provisional application for the idea in June of 2017. After filing the full application a year later, Apple was just granted the patent this month, formally titled Wearable Device for Facilitating Enhanced Interaction.

When a company files a patent, it generally aims to make the patent as broad as possible to maximize protection of its intellectual property. To that end, the patent envisions a wide range of use-cases that go beyond what we see on Vision Pro today.

At launch, Vision Pro’s EyeSight display shows a virtual representation of the wearer’s eyes, but Apple also envisioned stylized representations of the wearer, including anime eyes and the eyes of a furry avatar, and even imagined augmenting the wearer’s eyes with various graphics.

Apple imagined an even wider range of uses beyond representing the person inside the headset, like showing the weather or a representation of the content the wearer is viewing.

The patent also covers Vision Pro’s ‘breakthrough’ feature, in which the headset detects people nearby and fades them into the wearer’s view so they can be seen. While the feature currently shows the other person with a soft, faded-in look, Apple also envisions showing them in a window with hard edges, or even placing them seamlessly into the virtual environment.

The patent also imagines a hilarious-looking ‘FaceSight’ version of EyeSight which would use a display large enough to show the wearer’s entire face.

Patents like this can be interesting because some of these unrealized concepts may make their way into future versions of the product. One bit that stood out to me describes the system detecting “whether the observer matches a known contact of the wearer” and using that information to decide what to show on the display. As of now the headset detects when people are near you, but doesn’t analyze who they are or whether you know them.


Thanks to Collin B. on Twitter for the tip




Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Christian Schildwaechter

    I’ll give Apple that their implementation on AVP is impressive: it doesn’t just show the wearer’s eyes, it shows several perspectives of those eyes on the integrated display, separated by lenticular lenses like those on 3D postcards integrated into the glass front, making sure that others standing at different positions in the room will always see the eyes as they’d naturally look from their perspective.

    But showing eyes or anything indicating a mood or status on the outside of a headset doesn’t seem like something that would be non-obvious to people familiar with the technology, which is usually a requirement for a patent to be granted, esp. a broad one. So I think all the people who pasted large googly eyes onto the front of their DK1 in 2013 should be allowed to claim prior art for this concept.

  • Arno van Wingerde

    Many pictures I see of the user’s eyes are mostly a blue background where you can only just make out the user’s eyes. Nothing even close to realism yet.

    • Christian Schildwaechter

      The “blue shade” is an indicator of whether the user is looking at you and the actual room, is inside a virtual environment with you pasted in as a ghost, or is looking at something else and not seeing you. But even in the “clearest” version, the eyes don’t seem to be anywhere close to realism; it mostly shows a status to help with communication, not a proper replacement for looking at someone.