Speaking at the IEDM conference late last year, Meta Reality Labs’ Chief Scientist Michael Abrash laid out the company’s analysis of how contemporary compute architectures will need to evolve to make possible the AR glasses of our sci-fi conceptualizations.

While there are some AR ‘glasses’ on the market today, none of them are truly the size of a normal pair of glasses (even a bulky pair). The best AR headsets available today—the likes of HoloLens 2 and Magic Leap 2—are still closer to goggles than glasses and are too heavy to be worn all day (not to mention the looks you’d get from the crowd).

If we’re going to build AR glasses that are truly glasses-sized, with all-day battery life and the features needed for compelling AR experiences, it’s going to require a “range of radical improvements—and in some cases paradigm shifts—in both hardware […] and software,” says Michael Abrash, Chief Scientist at Reality Labs, Meta’s XR organization.

That is to say: Meta doesn’t believe that its current technology—or anyone’s for that matter—is capable of delivering those sci-fi glasses that every AR concept video envisions.

But, the company thinks it knows where things need to head in order for that to happen.

At IEDM 2021, Abrash laid out the case for a new compute architecture that could meet the needs of truly glasses-sized AR devices.

Follow the Power

The core reason to rethink how computing should be handled on these devices comes from a need to drastically reduce power consumption to meet battery life and heat requirements.

“How can we improve the power efficiency [of mobile computing devices] radically by a factor of 100 or even 1,000?” he asks. “That will require a deep system-level rethinking of the full stack, with end-to-end co-design of hardware and software. And the place to start that rethinking is by looking at where power is going today.”

To that end, Abrash laid out a graph comparing the power consumption of low-level computing operations.

Image courtesy Meta

As the chart highlights, the most energy-intensive computing operations are those that move data. That doesn’t mean just wireless data transfer, but even transferring data from one chip inside the device to another. What’s more, the chart uses a logarithmic scale: according to it, transferring data to RAM uses 12,000 times the power of the base unit (which in this case is adding two numbers together).

Bringing it all together, the circular graphs on the right show that techniques essential to AR—SLAM and hand-tracking—use most of their power simply moving data to and from RAM.
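
To get a feel for that imbalance, here’s a rough back-of-envelope sketch in Python. The only figure taken from Abrash’s chart is the roughly 12,000x cost of a RAM transfer relative to an add; the per-frame operation counts below are invented purely for illustration and are not Meta’s numbers.

```python
# Back-of-envelope: why data movement dominates the power budget.
# Energy is normalized so that adding two numbers costs 1 unit.

ADD_COST = 1                 # base unit: one addition
RAM_TRANSFER_COST = 12_000   # per-word transfer to/from RAM (from the chart)

# Hypothetical per-frame workload for a tracking task (illustrative only):
adds_per_frame = 5_000_000        # arithmetic operations
ram_words_per_frame = 1_000_000   # words shuttled to/from RAM

compute = adds_per_frame * ADD_COST
movement = ram_words_per_frame * RAM_TRANSFER_COST

print(f"compute energy:       {compute:,} units")
print(f"data-movement energy: {movement:,} units")
print(f"movement share:       {movement / (compute + movement):.1%}")
```

Even with a generous allowance for arithmetic, nearly the entire budget in this toy example goes to moving data, which is the same pattern the SLAM and hand-tracking pie charts show.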


“Clearly, for low power applications [such as in lightweight AR glasses], it is critical to reduce the amount of data transfer as much as possible,” says Abrash.

To make that happen, he says, a new compute architecture will be required, one which—rather than shuffling large quantities of data between centralized computing hubs—distributes computing operations more broadly across the system to minimize wasteful data transfer.

Compute Where You Least Expect It

A starting point for a distributed computing architecture, Abrash says, could be the many cameras that AR glasses need for sensing the world around the user. This would involve doing some preliminary computation on the camera sensor itself, then sending only the most vital data across power-hungry data transfer lanes.

Image courtesy Meta

To make that possible, Abrash says it’ll take co-designed hardware and software, with the hardware built around a specific algorithm that is essentially hardwired into the camera sensor itself—allowing some operations to be handled before any data even leaves the sensor.

Image courtesy Meta

“The combination of requirements for lowest power, best performance, and smallest possible form factor make XR sensors the new frontier in the image sensor industry,” Abrash says.
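
As a concrete (and purely hypothetical) illustration of the on-sensor compute idea, the sketch below compares shipping a full raw frame off the sensor with shipping only a small set of features extracted next to the pixels. The resolution, feature budget, and crude gradient-based detector are all assumptions for illustration, not Meta’s design.

```python
import numpy as np

FRAME_SHAPE = (480, 640)   # assumed sensor resolution (rows, cols)
BYTES_PER_PIXEL = 1        # 8-bit monochrome
BYTES_PER_FEATURE = 8      # e.g. x, y, response, descriptor id (assumed)

def on_sensor_feature_extraction(frame: np.ndarray, max_features: int = 256) -> np.ndarray:
    """Stand-in for a detector hardwired next to the pixel array."""
    # Crude gradient-magnitude response as a placeholder for a real corner detector.
    gy, gx = np.gradient(frame.astype(np.float32))
    response = np.abs(gx) + np.abs(gy)
    idx = np.argsort(response.ravel())[-max_features:]
    ys, xs = np.unravel_index(idx, frame.shape)
    return np.stack([xs, ys], axis=1)  # only sparse coordinates leave the sensor

frame = np.random.randint(0, 256, FRAME_SHAPE, dtype=np.uint8)
features = on_sensor_feature_extraction(frame)

raw_bytes = frame.size * BYTES_PER_PIXEL
feature_bytes = len(features) * BYTES_PER_FEATURE
print(f"raw frame off-sensor: {raw_bytes:,} bytes")
print(f"features off-sensor:  {feature_bytes:,} bytes")
print(f"reduction:            {raw_bytes / feature_bytes:.0f}x less data moved")
```

Of course, the savings only hold if the downstream algorithms can work from those sparse outputs, which is exactly why Abrash frames this as hardware-software co-design rather than a drop-in sensor swap.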

Continue on Page 2: Domain Specific Sensors »




  • Very fascinating in theory, not always applicable in practice. Sensor data fusion exists for a purpose… if you could avoid it and do most of the computation directly on the sensors, we would already be doing it today. So I guess this approach may help, but it is not the only thing to consider.


  • xyzs

    As he described, in a modern computer chip most of the energy consumption is not the computation itself but carrying data between transistors. I think memory stacked directly on top of the compute transistors, along with new architectures, is the way to go here.
    Easier said than done though. If that is achieved, we could see a new era of power efficiency, now that node-size reduction is approaching atomic limits.
    Happy to see that this problem is starting to be tackled.

  • All I know is I have faith in this guy and his team. They’ve impressed me at every turn with pretty much everything they’ve done in the VR space, both inside and outside the headsets.


  • blue5peed

    Facebook blocked me from watching this video because I was seeking through it too fast. Like wtf?! Why is that even a thing.

    • benz145

      I had a similar issue while referencing it for the report. Very weird.

      • Guest

        Mad scientist has got a captive audience and no peer-reviewed citations. Have faith when in his company!

        • guest

          Yeah, like Tesla but with a quarter trillion dollars for the next couple decades. He could be the president of America by then!

  • Jose Ferrer

    Michael, if you are reading this, where are the 140-degree FOV and 30 ppd you predicted within 5 years at Oculus Connect 3 (Oct 2016)???
    And the 4K per eye??

    Cambria will be released at the end of 2022 with just half of 4K (i.e. 2160×2160 per eye) and a miserable FOV.

    Please meet your predictions before talking about the future again.

    • Kraut

      hmm… did he say that ALL these features would be seen at once in one headset?

      we have 140-degree FOV
      we have 4K per eye
      we have 37 ppd

      but not all in one headset so far

      • Jose Ferrer

        Yeah, he was getting us hyped in 2016, predicting that ALL of that would be in future Oculus headsets. Unfortunately most of the Oculus founders left when Facebook didn’t like PC-VR.

        • brandon9271

          He wasn’t wrong about the pace of technology. Those things are here. It’s not really his fault the market or corporations have different ideas. Such a headset could come out tomorrow but it would be expensive as hell and be for “enterprise.”

          • Jose Ferrer

            Big corporations can be very wrong as well. One has to be brave enough to state their ideas even if they are different from the CEO’s. A successor to the Rift like Half Dome, with 4K per eye and 140° FOV (without the varifocal thing) and a DP cable, would sell like candy at the gates of a school.

          • kontis

            140 deg FOV is a huge problem to efficiently render in a rasterizer while preserving resolution in the center.

            And purely path traced games won’t be a thing any time soon.

            So, this isn’t only a HMD hardware challenge. It was already a well known problem in the DK1 era.

    • dk

      All of that running on an XR2 is tricky… that, plus price and production, are the limiting factors.

      Looks like Cambria might be as high as $1,200, and it was delayed by a year and still doesn’t have everything… and that is still subsidized.

  • Alexander Grobe

    Hardwired algorithms cannot be updated. With the current pace of progress in computer vision and pattern recognition, such hardware will become obsolete very fast.

  • Chris Leathco

    It’s a cool idea, but I’m not sure our current tech level can build something at a consumer price point that would be profitable. Batteries are heavy, even tiny ones, and that weight only increases with the ability to store more charge. That’s not even considering the projection system to place the AR images on the lens, the logic board to drive it, memory storage, etc. I want to see research continue, but I think we’re going to need incredible advances before a consumer model becomes reality.