Snap Plans to Launch New Consumer ‘Specs’ AR Glasses Next Year


Snap, the company behind Snapchat, today announced it’s working on the next iteration of its Spectacles AR glasses (aka ‘Specs’), which are slated to release publicly sometime next year.

Snap first released its fifth generation of Specs (Spectacles ’24) exclusively to developers in late 2024, later opening up sales to students and teachers in January 2025 through an educational discount program.

Today at AWE 2025, Snap announced it’s launching an updated version of the AR glasses for public release next year, which Snap co-founder and CEO Evan Spiegel teases will be “a much smaller form factor, at a fraction of the weight, with a ton more capability.”

There’s no pricing or availability yet beyond the 2026 launch window. To boot, we haven’t even seen the device in question, although we’re betting it won’t be as chunky as these:

Snap Spectacles ’24 | Image courtesy Snap Inc

Spiegel additionally noted that Snap’s four million-strong library of Lenses, which add 3D effects, objects, characters, and transformations in AR, will be compatible with the forthcoming version of Specs.

While the company isn’t talking specs (pun intended) right now, the version introduced in 2024 packs in a 46° field of view via stereo waveguide displays, which include automatic tint, and dual liquid crystal on silicon (LCoS) miniature projectors boasting 37 pixels per degree.

As a standalone unit, the device features dual Snapdragon processors, stereo speakers for spatial audio, six microphones for voice recognition, as well as two high-resolution color cameras and two infrared computer vision cameras for 6DOF spatial awareness and hand tracking.

There’s no telling how these specs will change on the next version, although we’re certainly hoping for more than the current generation’s 45-minute battery life.

Snap Spectacles ’24 | Image courtesy Snap Inc

As the company gears up to release its first publicly available AR glasses, Snap also announced major updates coming to Snap OS. Key enhancements include new integrations with OpenAI and Google Cloud’s Gemini, allowing developers to create multimodal AI-powered Lenses for Specs. These include things like real-time translation, currency conversion, recipe suggestions, and interactive adventures.

Additionally, new APIs are said to expand spatial and audio capabilities, including a Depth Module API, which anchors AR content in 3D space, and an Automated Speech Recognition API, which supports 40+ languages. The company’s Snap3D API is also said to enable real-time 3D object generation within Lenses.


For developers building location-based experiences, Snap says it’s also introducing a Fleet Management app, Guided Mode for seamless Lens launching, and Guided Navigation for AR tours. Upcoming features include Niantic Spatial VPS integration and WebXR browser support, enabling a shared, AI-assisted map of the world and expanded access to WebXR content.

Releasing Specs to consumers could put Snap in a unique position as a first mover; companies including Apple, Meta, and Google still haven’t released their own AR glasses, although consumers should expect the race to heat up this decade. The overall consensus is that these companies are looking to own a significant piece of AR, as many hope the device class will unseat smartphones as the dominant computing paradigm in the future.


Well before the first modern XR products hit the market, Scott recognized the potential of the technology and set out to understand and document its growth. He has been professionally reporting on the space for nearly a decade as Editor at Road to VR, authoring more than 4,000 articles on the topic. Scott brings that seasoned insight to his reporting from major industry events across the globe.
  • Stephen Bard

    It would be ironic if the sleazy Snap company was the first to release transparent stereo waveguide AR glasses while Meta, Google and others had tiny monocular displays.

  • Christian Schildwaechter

TL;DR: in contrast to Meta/Google/Apple, it is not really clear (to me) why Snap even bothers developing smart glasses, as they don't really fit their existing products, and Snap probably won't be able to establish them as a separate platform.

I still struggle to understand WHY Snap is developing XR glasses. I'm not a Snapchat user, so maybe I'm missing something, but AFAIK their business model is mostly selling lenses/filters for short videos and allowing companies to sponsor geo-location-specific lenses. The closest thing they have to a use case for XR glasses is "Stories", a sort of mini vlog consisting of a series of very short video snippets documenting the user's last 24h, shared with others to keep up with what everyone is doing in video form. Which would obviously work well with a pair of glasses capable of recording video.

But it is a stretch to go from something that works fine with a phone camera to funding the very expensive development of what is now five generations of smart glasses just for that. I understand why Meta, Google, and Apple are throwing billions at XR, as they have long-term strategic goals as platform owners, expecting to make money by becoming the unavoidable middleman in all transactions. Snap hasn't shown any such ambition.

Their Spectacles smart glasses are based on Epiphany Eyewear, which in 2011 was probably the first pair of smart glasses to look like regular glasses. Those were a product of Vergence Labs, who had the very long-term goal of releasing proper AR glasses, stating these would someday "give people what would previously be called superpowers". Snap acquired Vergence Labs in 2014, but unless they were so impressed by the idea of AR glasses that they were willing to embark on a decade-long journey just to make it a reality, I have no clue how this fits into their product portfolio.

Sure, Spectacles could be used to create Snapchat Stories and to sell location-based ads/lenses, but so far Snapchat runs just fine on iOS and Android. They could simply wait for Meta/Google/Apple to release smart glasses with cameras that can run 3rd-party software and use those, with no need to invest tons of money in what will very likely be only a niche product, as they lack any of the content that the other XR players have, basically relying only on selling add-ons for user-generated content. So why are they doing it?