Meta’s Reported Plan to Add Facial Recognition to Smart Glasses Slammed by ACLU-led Coalition


An ACLU-led coalition representing more than 70 civil liberties advocacy groups is pushing back against Meta’s reported plans to bring facial recognition to its smart glasses.

The New York Times initially reported in February that Meta is currently exploring who should be recognizable through its smart glasses, as the company ostensibly hopes to bring some form of facial recognition to Ray-Ban and Oakley smart glasses.

According to the NYT report, possible options include “recognizing people a user knows because they are connected on a Meta platform, and identifying people whom the user may not know but who have a public account on a Meta site like Instagram.”

Now, as reported by Wired, an ACLU-led coalition hopes to oppose those plans, which the group says could turn Meta’s smart glasses into ad hoc “surveillance glasses,” capable of endangering consumers and vulnerable communities, and broadly undermining civil rights and civil liberties.

Ray-Ban Meta ‘Scriber’ model | Image courtesy Meta, EssilorLuxottica

The group, which also includes the Electronic Privacy Information Center (EPIC), Fight for the Future, Access Now, and the Leadership Conference on Civil and Human Rights, issued an open letter to Meta CEO Mark Zuckerberg on Monday urging the company to stop and publicly disavow its plans.


“People should be able to move through their daily lives without fear that stalkers, scammers, abusers, federal agents, and activists across the political spectrum are silently and invisibly verifying their identities and potentially matching their names to a wealth of readily available data about their habits, hobbies, relationships, health, and behaviors,” the letter reads.

Meta Ray-Ban Display Glasses & Neural Band | Image courtesy Meta

“It isn’t hard to see how easily this technology could be abused by corporations, private individuals, and the government to target immigrants, LGBTQIA+ people, and other vulnerable groups,” an ACLU petition adds. “It also puts domestic violence and stalking survivors at risk and could even be used to go after protestors or people who criticize the government.”

Meta has bowed to public pressure before, albeit after years of costly litigation. As mentioned by Wired, in November 2021 the company ended Facebook’s photo-tagging system and said it would delete the facial recognition templates of more than a billion users, which at the time was called “a company-wide move to limit the use of facial recognition in our products.”

Neither Meta nor its hardware partner EssilorLuxottica responded to Wired’s request for comment.

This follows news in February that Meta’s smart glasses partner EssilorLuxottica sold over seven million smart glasses in 2025 alone; that year the companies not only shipped a hardware refresh of Ray-Ban Meta, but also Oakley Meta HSTN, Oakley Meta Vanguard, and the $800 Meta Ray-Ban Display glasses—the company’s first smart glasses to include a heads-up display.

It’s not just Meta making smart glasses, though; a rash of competitors are currently preparing their own smart glasses for consumer release. Google, Samsung, and Amazon have all announced their own devices, while Apple is also reportedly developing multiple pairs.


Well before the first modern XR products hit the market, Scott recognized the potential of the technology and set out to understand and document its growth. He has been professionally reporting on the space for nearly a decade as Editor at Road to VR, authoring more than 4,000 articles on the topic. Scott brings that seasoned insight to his reporting from major industry events across the globe.
  • fcpw

    No one is forcing people to buy them. That said, you would be quite dumb to do so.

    • Yeah, but the problem here is that even if you don't buy them, you're seen through the glasses of all the people around you who did buy them.

    • Oxi

      The entire problem is not that you might be forced to wear them, but that someone else will wear them and be able to pull up your name and info at will!

  • STL

    Let’s be very clear: the ability to recognize people based on photos or brief encounters is a genuine talent. Many people can identify almost everyone they’ve ever met. For those who lack this ability, an AI face-recognition tool would function as a prosthetic replacement—just like glasses, hearing aids, or other assistive technologies.

    I am one of the few people who struggles severely with this. I couldn’t even recognize my own mother (!) in person at times. This condition cost me my job as a highly paid consultant because I failed to recognize clients and coworkers when I saw them on the street or outside the office.

    Please keep this in perspective: people who wear glasses are not told they must manage without them just because most people have good natural vision. We don’t withhold artificial limbs from amputees or screen readers from blind people simply because others don’t need them. Face-recognition assistance for those with prosopagnosia (face blindness) deserves the same acceptance and support.

    • Herbert Werters

      Yeah, but not from Meta in a consumer device. Are you serious?

    • Oxi

      I don't think that's a fair comparison, sorry. The whole issue is not that I might ping someone and ask for permission for my device to be able to recognize them; it's that this gives any person the same ability to recognize and identify someone that Facebook's algorithms have. Glasses don't let me see you a mile away through a security camera. Similar systems are already getting people seriously harmed: government agencies, building off the work of Facebook and leading AI firms, are buying software to pluck people out of crowds for arrest, or to document their presence so they can be harassed later for repeated civil disobedience.

    • Christian Schildwaechter

      Now imagine a world where everybody can buy these and not only use them to put a name on faces, but also get extra background information from Meta's AI. You meet a potential new client, he looks at you with his Meta smartglasses and gets "This is STL, he suffers from prosopagnosia, so meetings with him and team socializing might be a problem." How many jobs as a highly paid consultant do you expect to still get in that world?

      The issue here is that this won't be limited to those who need it, and that it will connect faces to Meta's vast pool of personal data with reports generated by their AI. Based on past behavior, Meta absolutely cannot be trusted not to abuse collected data, and they didn't even bother to address concerns. They could at least attempt something, for example offering a phone app that sends out a Bluetooth beacon that disables facial recognition within a 10m diameter, to allow people to opt out IRL. Instead they assume that everybody automatically accepts video surveillance with facial recognition.

      Meta will have to geofence this feature, as using their smartglasses this way in Europe would straight up break lots of laws about data privacy and surveillance, getting the wearer into serious trouble. In Europe you aren't allowed to record people in a recognizable way, not even the police are allowed to use facial recognition wherever they like, and private security cameras are only allowed to cover a max of 1m of public space, for example in front of a shop. And there are lots of laws and supreme court rulings explaining why this is important.

      • STL

        While I do appreciate your considerations, the subject is a more philosophical one. Do I want to live in a society based on deceit or on honesty? In my case the solution was to switch to a job with less social contact and a higher technology impact. At least I do remember numbers pretty well!
        Since I'm lazy, I don't want to dive deeper into it, but I'm sure you got my point. And it's still up to me what I want to share and which information I give freely to the public, so why would I bother if this is just easier to find?

      • ichigo

        I think most people are being hyperbolic and looking at the worst-case scenario with this emerging technology. But I do think safeguards need to be implemented and transparent here. I don't think laws/regulation from redundant overpaid bureaucrats, made to prevent other bureaucrats from doing something, is the solution. In most cases it's just applied as a burden against the average Joe who points the camera from his small business into the street a little to catch 'repeat' shoplifters.

        Ironically, Europe already has some of the world's densest surveillance networks despite, or alongside, its strict privacy laws… very protected from government thanks to those regulations, I'm sure. (Rules on paper =/= real-world surveillance.) But I'm far more concerned about what a Silicon Valley megacorp is doing with its conformist collective staff and their monoculture ideology. (Have a look at what most of the staff donate to and support.)

        We can have the benefits of helpful AR tools (including for people with prosopagnosia) without handing over unchecked power. Let's demand better engineering and transparency, and not lean on overpaid bureaucrats looking to suck money out of it with backhand deals and threats of regulation. The same undemocratic bureaucrats often retire from EU roles straight into high-paying positions at the very corporations they "regulated".

        • Christian Schildwaechter

          Whether bureaucrats are overpaid, redundant or not, legislation has been the only thing putting a limit on endless data grabbing so far, and the GDPR is one of the EU's success stories alongside a lot of other consumer protection laws that ended up benefiting the whole world.

          Of course all regulation adds more friction and has some negative consequences for businesses. Companies like OpenAI wouldn't work in the EU for numerous reasons, from finances to regulatory oversight and very strict data protection laws. And of course it would be convenient to just use whatever (camera) data you can get in whatever way you like, and there will be numerous applications that simply won't work if you have to always ask for permission first.

          But the philosophy behind this is that the limit of your personal freedom is the personal freedom of others. So just because someone lives on the cutting edge and can make use of advanced technology like AR and facial recognition, this still doesn't mean they are automatically allowed to do so in a way that restricts the personal rights of others who may not even be aware of what is already possible. Laws ideally allow people to better live together, which also requires protecting some of them from others with more know-how or tech or money, even if it means all those fancy new toys cannot always be used to their full potential. Simply because every time figuring out data privacy has been left to the market, it has ended with a few large companies hoarding (and selling) data without any acceptable restraint.

          And regarding existing surveillance, one has to differentiate between countries. While EU laws are very strict and mostly limit surveillance, the UK, for example, has long been a champion of large-scale public camera installations, and as one of the Five Eyes countries is also very heavily involved in global internet surveillance.

          Which is still comparably harmless considering that the FBI recently admitted they were buying tracking data from US companies to work around having to first get court orders as required by law. The whole discussion about Chinese-owned TikTok being a national security threat to the US was a big farce, since TikTok went out of its way to ensure user data was secure and not accessible by the Chinese government, while that government didn't even need TikTok, because it could simply buy US citizen profile data from lots of large US internet companies, precisely because the US lacks a data protection law like the GDPR.

  • Tech

    NSA/CIA dream – you wear these and work as a free spy for these agencies.

  • marco

    Probably no one remembers that the first-gen Google Glass received a similar slam years back.

  • From the title, I thought there was some legal constraint blocking Meta. Instead it is just some people sending Zuck an email. Well, good luck with that.

  • Oxi

    Just straight up using the chaos right now as a way to get this done without backlash.

    • Herbert Werters

      It won't do any good if there's an even bigger blow later on. This isn't some minor issue that society could really accept or get over. I think “move fast and break things” really isn't a good idea in this case. For anyone.

  • ichigo

    None of this makes sense. While I understand individual concerns (and let's ignore the surge in donations to the ACLU from Silicon Valley), this comes across more as a shakedown by an ideologically captured group that targets and weaponizes.

    Anyone with common sense can have a good chuckle at the idea that modern Western governments and a packed civil service would target the "minorities" they proclaim most affected here. Some of us live in the real world… And I'm not even sure why they would think these people would be most affected; it seems oddly specific and telling.

    Also, I think they are trying to blur the definitions again when they use "immigrant", because I see no reason why immigrants would be targeted unless they did something illegal that most countries have laws against.

    Meta should be transparent about safeguards but the ACLU's approach is less about liberty and more about control.

    TL;DR: It either affects everyone, or this is a BS shakedown and divisive nonsense…

    (In regards to protesting: the same groups covering their faces in black to "protest" all the time should be safe from any IDing. And there are plenty of cameras at these things anyway; we all see what they do and get away with.)