Snapchat CEO’s Open Letter Ties Spectacles AR Glasses to the Survival of the Company at Large


According to Snap CEO Evan Spiegel, the company behind Snapchat has reached a “crucible moment” as it heads into 2026, one he says rests on the growth and performance of Spectacles, the company’s AR glasses, as well as AI, advertising, and direct revenue streams.

Snap announced in June it was working on the next iteration of its Spectacles AR glasses (aka ‘Specs’), which are expected to release to consumers sometime next year. Snap hasn’t revealed them yet, although the company says the new Specs will be smaller and lighter, feature see-through AR optics and include a built-in AI assistant.

Snap Spectacles (gen 5) | Image courtesy Snap Inc

Following the 2024 release of the fifth-gen Specs to developers, next year will be “the most consequential year yet” in Snap’s 14-year history, Spiegel says, putting the forthcoming generation of Specs in the spotlight.

“After starting the year with considerable momentum, we stumbled in Q2, with ad revenue growth slowing to just 4% year-over-year,” Spiegel admits in his recent open letter. “Fortunately, the year isn’t over yet. We have an enormous opportunity to re-establish momentum and enter 2026 prepared for the most consequential year yet in the life of Snap Inc.”

SEE ALSO
Lynx Teases Next Mixed Reality Headset for Enterprise & Professionals

Not only are Specs a key focus of the company’s growth, but Spiegel also thinks AR glasses, combined with AI, will drastically change the way people work, learn and play.

“The need for Specs has become urgent,” Spiegel says. “People spend over seven hours a day staring at screens. AI is transforming the way we work, shifting us from micromanaging files and apps to supervising agents. And the costs of manufacturing physical goods are skyrocketing.”

Image courtesy Snap Inc, Niantic

Those physical goods can be replaced with “photons, reducing waste while opening a vast new economy of digital goods,” Spiegel says, something the company hopes to tap into with Specs. And instead of replicating the smartphone experience in AR, Spiegel maintains the core of the device will rely on AI.

“Specs are not about cramming today’s phone apps into a pair of glasses. They represent a shift away from the app paradigm to an AI-first experience — personalized, contextual, and shared. Imagine pulling up last week’s document just by asking, streaming a movie on a giant, see-through, and private display that only you can see, or reviewing a 3D prototype at scale with your teammate standing next to you. Imagine your kids learning biology from a virtual cadaver, or your friends playing chess around a real table with a virtual board.”

Like many of its competitors, Snap sees its AR glasses as “an enormous business opportunity,” with Spiegel noting the device can not only replace multiple physical screens, but that the operating system itself will be “personalized with context and memory,” which he says will compound in value over time.

SEE ALSO
Google's XR Studio Releases 'Job Simulator' Style MR Experience Exclusive to Android XR

Meanwhile, Snap competitors Meta, Google, Samsung, and Apple are jockeying for position as they develop their own XR devices. XR is the umbrella term for everything from mixed reality headsets, like Meta Quest 3 or Apple Vision Pro, to smart glasses like Ray-Ban Meta or Google’s forthcoming Android XR glasses, to full-AR glasses, such as Meta’s Orion prototype, which notably aims to deliver many of the same features promised by the sixth-gen Specs.

And as the company enters 2026, Spiegel says Snap is looking to organize differently, calling for “startup energy at Snap scale” by setting up a sort of internal accelerator of five to seven teams composed of 10-to-15-person squads. As Spiegel puts it, “weekly demo days, 90-day mission cycles, and a culture of fast failure will keep us moving.”

It’s a bold strategy, especially as the company looks to straddle the anticipated ‘smartphone-to-AR’ computing paradigm shift, with Spiegel noting that “Specs are how we move beyond the limits of smartphones, beyond red-ocean competition, and into a once-in-a-generation transformation towards human-centered computing.”


You can read Snap CEO Evan Spiegel’s full open letter here, which includes more on AI and the company’s strategies for growth, engagement and ultimately how it’s seeking to generate more revenue.

This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. See here for more information.

Well before the first modern XR products hit the market, Scott recognized the potential of the technology and set out to understand and document its growth. He has been professionally reporting on the space for nearly a decade as Editor at Road to VR, authoring more than 4,000 articles on the topic. Scott brings that seasoned insight to his reporting from major industry events across the globe.
  • XRC

    Interesting conversation today with my optometrist about the growing trend of smartphone induced myopia; quicker we can get away from staring at little screens the better…

    • Christian Schildwaechter

      In theory shortsightedness should only depend on genes; in reality it depends a lot on the environment you are raised in. I seriously doubt that smartphones had a lot of impact, as most of this happens during early childhood. But many studies have shown that over the last few decades more children started suffering from myopia, and it is mostly due to being raised in cities, where there simply aren't that many objects far away your eyes might have to focus on during development. And due to not exactly healthy trends in parenting, kids are kept a lot more indoors. It's basically a lack of training for the eyes during the early development years, and adapting to everything always being close.

      The problem with VR HMDs is of course that they all use a fixed focus depth of around 1.5m, so at least the current generation of passthrough HMDs without varifocal lenses would be a lot worse than smartphones, because there would be no training for different depth at all, the muscles stretching the eye's lenses would basically always remain in the same position.

      Current see-through smartglasses with displays also place the virtual display at a similar depth, so you'd still not train the eye to focus on things further away. But the main issue would probably be that most users will wear them in cities where you simply cannot see the horizon due to all the buildings. You'd have to go hiking with smartglasses or use them in the countryside to somewhat compensate for their fixed focus depth.

  • STL

    These glasses look funny. Who is going to buy them??

    • We don't know what the glasses look like :-)

  • fcpw

    Good luck, no one with a social life will wear those.