Meta’s Reported $800 Smart Glasses with Display Won’t Shoot for the Stars, Claims Respected Analyst

Meta’s smart glasses with display, codenamed ‘Hypernova’, are reportedly slated to cost less than initially expected, with Meta allegedly slashing price expectations from the rumored $1,000 – $1,400 range to $800. Now, respected supply chain analyst Ming-Chi Kuo says Meta is nearly ready to begin mass production, although sales expectations aren’t very high.

Bloomberg’s Mark Gurman reported last week that Hypernova will be cheaper than initially expected, delivering a pair of smart glasses with a single display and a wrist-worn electromyography (EMG) controller for input at “about $800.”

Notably, a number of recent leaks provided by data miner ‘Luna’ have also seemingly unveiled the glasses in full, suggesting not only is Hypernova (also referred to as ‘Celeste’) real, but it may be a Meta solo launch—i.e. not a partnership with Ray-Ban and Oakley parent company EssilorLuxottica.

Image courtesy Luna

Kuo, known for releasing insider info on Apple products, recently posted on X (machine-translated here from Traditional Chinese) that Hypernova is expected to enter mass production in Q3 2025.

Ostensibly sourcing supply chain info, Kuo says Hypernova will have a two-year product cycle, with shipments over the next two years estimated at around 150,000 to 200,000 units in total—significantly fewer than the more than two million Ray-Ban Meta units sold since their release in 2023.

“Based on Qualcomm chip shipment forecasts, global smart glasses shipments in 2026 are estimated at about 13 to 15 million units, which shows that Hypernova’s market share is negligible, hence it seems more like Meta’s experimental product,” Kuo maintains.

Continuing:

AI will be the most important selling point of Hypernova, but the exploration of applications integrating AI and AR is still in the early stages, and with a selling price of about $800, this should be the main reason Meta is conservatively viewing Hypernova’s shipment volumes. Additionally, to pursue mass production feasibility, it adopts LCoS, but this also brings hardware design challenges such as appearance design, brightness, response time, and battery life.
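
For rough scale, a back-of-envelope using the midpoints of the figures cited above:
175,000 units over two years ≈ 87,500 units per year
87,500 / 14,000,000 ≈ 0.6% of Kuo’s estimated 2026 smart glasses shipments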

Kuo posits that Hypernova holds a few strategic implications for Meta: to preempt Apple’s release and build brand image, accumulate ecosystem experience as early as possible, and understand user behavior.

Truly, the addition of a ‘simple’ display to its smart glasses platform changes things from both a user and platform holder perspective. As with other early entrants in the ‘smart glasses with display’ segment, such as Rokid’s recently pitched Glasses, users won’t just be snapping photos and video, taking calls, listening to music, or talking with LLMs.

Rokid Glasses | Image courtesy Rokid

People will expect display-clad smart glasses to do smart things like turn-by-turn directions, live text and audio capture, real-time translation, and deeper interaction with apps, given Hypernova is supposed to launch with more articulated input than simple swipes, button presses, and voice commands can provide. Getting that right is no small feat, and Kuo suggests Meta may simply not be ready for the sort of wider adoption Ray-Ban Meta has driven.

Meta sees smart glasses as a stepping stone to all-day AR, which likely makes some hesitancy the right move. The company not only needs to feed those learnings into a bigger and better AR platform down the line, but also needs to avoid frustrating consumers with half-baked experiences or hardware limitations that could tarnish the segment before it even gets off the ground.

After all, Meta is banking on owning a sizeable piece of AR as it hopes to eventually generate a return on the multiple billions of dollars it spends each year on Reality Labs, its XR research and product division, so rashly jumping into the coming wave of smart glasses may do more harm than good.

This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. See here for more information.

Well before the first modern XR products hit the market, Scott recognized the potential of the technology and set out to understand and document its growth. He has been professionally reporting on the space for nearly a decade as Editor at Road to VR, authoring more than 4,000 articles on the topic. Scott brings that seasoned insight to his reporting from major industry events across the globe.
  • Christian Schildwaechter

    Based on Qualcomm chip shipment forecasts, global smart glasses shipments in 2026 are estimated at about 13 to 15 million units …

    15M in one year is a lot, considering the Meta Ray-Ban smartglasses cost as much as a Quest 2/3S, and the Quest 2 sold a little more than 20M units in four years. With 2M Meta Ray-Bans sold between 2023-09 and 2024-12, I wonder who is going to sell the extra 11-13M in 2026. EssilorLuxottica said that sales had increased a lot in 2024, but I seriously doubt sales will increase sixfold just two years later.

    I recently learned (thanks to @Stephen Bard) that Rollme is offering their AirView Smart Glasses for just USD 80, featuring an 8MP camera, microphone and speaker, used with your smartphone, which also drives real-time translation, an AI voice assistant and object recognition via ChatGPT and others. The product page pretty much claims their glasses do basically what the Meta Ray-Bans do, which should be taken with a grain of salt, and some important information like the weight is nowhere to be found.

    Digging a bit further, I found that there are a number of unknown brands all offering very similar smartglasses at surprisingly low prices, for example the Rogbid GS380 for USD 100. Similarly impressive on paper, and also with no mention of weight. My favorite feature is the "Stereoscopic frame".

    Most of the 13-15M Qualcomm AR1 expected to ship in 2026 will probably end up in similar glasses. So Meta's Hypernova at USD 800 isn't only competing with Meta's own display-less smartglasses at USD 300 (Ray-Ban) or USD 400 (Oakley), but also with a large number of much cheaper smartglasses supposedly doing the same things. Which might explain why Meta expects to sell less than 200K over two years.
    https://uploads.disquscdn.com/images/5e0aefcca77de39e68918c7b3d3b12c051bfa1b6ce213b4bd88527ad52b1dc35.jpg

  • Foreign Devil

    Glad we don't have to pay an extra two or three hundred dollars for the "privilege" of wearing a RayBan logo on our AR glasses.

  • gothicvillas

    I haven't seen a single white person in these ads... a bit weird. Maybe I'm not their target market.

    • Nevets

      I think you can safely bugger off. Cya!

      • gothicvillas

        Lol somebody is triggered

    • Christian Schildwaechter

      Let's see. Below are the first (static) images from Meta's Ray-Ban smartglasses page. I count four people with very light skin, two with dark skin, and two in the background who are too small to be sure about. So if you aren't seeing a single white person in these ads, that's a problem with your eyes, not with Meta's promo material.
      https://uploads.disquscdn.com/images/3844bfd8823c1077de19dab8a611c355a7d30c617bcafa128e6ee233638e2f08.jpg

      • Bradley Jefferies

        If they are white, they're either trans, fat, ginger, or disabled lol! Stay triggered!

        • Christian Schildwaechter

          Was I triggered? All I did was fact-check whether there are no "white persons" in Meta's smartglasses promo material. I went to the most obvious place, Meta's homepage, clicked on "AI glasses > Ray-Ban Meta", and immediately got the above image clearly contradicting it, of which I took a screenshot. The whole ordeal took maybe 30 seconds, so this check would have been trivial for anyone before making incorrect assumptions/claims about Meta's ads.

          And now that there is clear visual evidence, you just move the goalposts to implied characteristics that aren't even determinable just by watching Meta ads, and therefore can be neither proven nor disproven. I assume you did this because, instead of having to deal with verifiable facts, you'd prefer to hang on to whatever you believe. So maybe check whether it isn't you who is on auto-trigger.

          • NL_VR

            Some people don't understand common sense.

  • Josef

    Why do journalists ignore the Even Realities glasses? RoadtoVR is a specialist source and I expect the team to know about popular products in this space.

    The Even Realities are existing heads-up display glasses with microphones and AI in a frame that weighs the same as reading glasses. Titanium materials – not plastic. The displays are monochrome, but they are likely better than what Meta is intending to bring to market.
    The main problem is that Even has an annoying app and is a China-based company, but that isn't worse than any Meta app, which we should expect to have invasive user data tracking.

    • psuedonymous

      There are a bunch of HUDs using the same JBD green monochrome optical module. 'Even Realities' do not appear to have anything to differentiate them from the pack.
      These are HUDs after all, not AR devices, so there's no functional difference from the HUDs that have been available for decades from various vendors (small FoV, small eyebox, fixed focal length, no tracking).

      • Christian Schildwaechter

        The Even Realities G1 have several features that differentiate them from the pack: they actually look like regular glasses, they are available with custom prescription lenses, they were among the first with a dual-eye display, and they come with a very useful teleprompter function that makes them rather popular for a few specific use cases. They also lack features most other smartglasses have.
        https://uploads.disquscdn.com/images/ceaae9ab5edd9675dbb5ed461125c77da96ba47e312b5c4a6ebd3a8964f584ab.jpg

    • Christian Schildwaechter

      TL;DR: The Even Realities G1 deliberately lack some important features of other smartglasses, and are positioned more as display glasses for a few very specific use cases.

      The Even Realities G1 offer a 20° FoV display and microphones, and look more like regular glasses than any other available smartglasses. But they lack both cameras and speakers, and are overall a lot more limited than other smartglasses. They offer a fixed feature set, driven by their smartphone app: mostly notifications, navigation, auto-transcription and auto-translation of spoken language, and a teleprompter.

      The latter seems to be the most popular use, with a lot of TED speakers wearing the G1, which shows a 640*200 pixel green image floating about 2m in front of the wearer's eyes. Like everyone else, they now offer piping whatever the user says into ChatGPT, but the lack of speakers makes them less usable as smartglasses than, for example, the Meta Ray-Bans, as any feedback has to be read, which is more distracting while moving around. And lacking cameras, they cannot do things like translate posters or recognize objects, while the main use of smartglasses so far seems to be taking pictures and recording videos.
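
      Back-of-envelope, assuming the 20° figure is the horizontal FoV and the pixels are square:
      apparent width at 2m ≈ 2 × 2m × tan(10°) ≈ 0.70m
      angular resolution ≈ 640 px / 20° ≈ 32 pixels per degree
      vertical FoV ≈ 20° × 200/640 ≈ 6°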

      The lack of speakers and cameras is IMHO a much bigger problem for their use as smartglasses than the annoying app and the China-based company. The Even Realities G1 are actually an interesting product that focuses on a few core functions which it does very well, and the first "reality enhancing" glasses available with regular prescription lenses. They are more of a niche device, so I understand why they aren't covered in the same way as smartglasses.

      I'd expect further separation in the future based on feature sets. Most current smartglasses also offer only a limited feature set. Rokid is the first that has announced an SDK (software development kit) to allow developers to come up with more functions, and Meta is "considering" offering one for its smartglasses. This is similar to smartwatches, where the ones running Apple's watchOS or Google's Android-based Wear OS allow third-party apps to extend their functionality, while a lot of cheap smartwatches only offer fixed feature sets like notifications, navigation, wireless calling, weather, and some fitness features when paired with a smartphone. The G1 is currently closer to these "dumb" smartwatches.

  • dz11

    Sounds like Meta are purposely estimating low sales numbers so they can later claim to their shareholders that they sold twice or three times more than they thought they would. Not falling for their crap.

    • NL_VR

      How do you know it's crap?

  • Mike

    They need to implement autofocus so people would never need prescription eyeglasses again, and you could have "eagle vision", with far vision as good as an eagle's.

    • Christian Schildwaechter

      This is actually way easier to implement in passthrough HMDs than in see-through smartglasses. Current HMDs have lenses with a fixed focus, and in most HMDs like the Quest 3, the image appears at about 1.5m/5ft distance, regardless of whether a virtual object is closer or further away. So if you can see sharply at 1.5m, you can also see sharply at 1500m or 0.15m in VR. If you pair this with gaze-driven autofocus passthrough cameras like in the Varjo XR-4 Focal Edition, you get exactly what you are looking for. At least if you don't get motion sick from the inherent vergence-accommodation conflict (VAC) caused by the eyes' vergence not matching the fixed focal distance of the display.
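
      For a sense of scale (my own illustrative numbers, using focal demand ≈ 1/distance in meters):
      Quest 3 focal plane at ~1.5m → about 0.67 D
      virtual object rendered at 0.5m → vergence demand of 2 D, i.e. a mismatch of roughly 1.3 D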

      It is much more complicated with see-through glasses, as you first need very thin pancake lenses that can adjust focus on demand. You then need to adjust the focal depth of the projected virtual objects to move with the new focal depth of the glasses. And the aging of the human eye causes a lot of problems with determining what the user is trying to look at. Over time the muscles that stretch the lens lose tension, so while a 17-year-old can change the eyes' focus over a range of about 17 diopters, this goes down steadily until people above 70 usually only have a range of about 1 diopter left.
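
      To put those diopter numbers in perspective (assuming distance vision is otherwise fully corrected, so the far point is at infinity), the near point is roughly the reciprocal of the accommodation range:
      17 D → near point ≈ 1/17 m ≈ 6 cm
      1 D → near point ≈ 1/1 m = 1 m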

      So the required magnification would depend on the remaining focus capabilities of the eye; you'd basically need to apply a much larger zoom factor for older users, which will cause a number of problems, and there are a number of other issues. I'd expect this to one day become possible, but we will get very light/thin/small passthrough HMDs that can be used as zooming glasses much earlier, which could also do things like night or IR vision or highlighting objects. And a number of people will replace their regular glasses with these much superior visors not even for the extra XR features, but just to see better (again).

  • Donald Rull

    Meta has done a great job at killing off this tech. I have the Ray-Ban Meta Wayfarers and have gone as far as to talk with eye care professionals in my area to get them to stop selling these products, because they are completely unreliable as daily wear.

    • Christian Schildwaechter

      Could you elaborate on that? We currently lack proper reports on how people actually use the smart part of the Meta Ray-Bans, so we also don't know what doesn't work. We just know that they are selling a lot, and most people might use them just as regular sunglasses, a function I assume Meta couldn't possibly screw up, because those parts are still done by Ray-Ban.

  • Ray Finney

    What ads? The product has not even launched. The pictures in this article are of Meta's smart glasses that have been out for four years. Millions of people have them. The only picture with an individual is of their latest generation, and last I checked those are geared towards athletes and adventure seekers. There is a picture of their Luna project, but that is NOT the new project and is only a prototype from internal AR development. Get your facts straight before you find out that you are being out-earned by a model in a national print campaign for a product you can't identify.

  • John

    Too bad Meta AI is restrictive and stops acting like an AI when a tough question comes up, at least the Meta Ray-Bans' AI does.