At CES 2017, Lumus is demonstrating its latest waveguide optics which achieve a 55 degree field of view from optics less than 2mm thick, potentially enabling a truly glasses-sized augmented reality headset.

Israel-based Lumus has been working on transparent optics since 2000. The company has developed a unique form of waveguide technology which allows images to be projected through and from incredibly thin glass. The tech has been sold in various forms for military and other non-consumer applications for years.

But, riding the wave of interest in consumer adoption of virtual and augmented reality, Lumus recently announced $45 million in Series C venture capital to propel the company’s technology into the consumer landscape.

“Lumus is determined to deliver on the promise of the consumer AR market by offering a range of optical displays to several key segments,” Lumus CEO Ben Weinberger says.

This week at CES 2017, Lumus was showing off what they’re calling the Maximus, a new optical engine from the company with an impressive 55 degree field of view. For those of us used to the world of 90+ degree VR headsets, 55 degrees may sound small, but it’s actually been difficult to achieve that level of field of view in a highly compact optical system. Meta has a class-leading 90 degree field of view, but requires sizeable optics. Lumus’ 55 degree field of view comes from a sliver of glass less than 2mm thick. Crucially, you can also get your eyes very close to the Maximus optics, potentially enabling truly glasses-sized augmented reality headsets.

Looking Through the Lens

Unlike some of the company’s other optical engines, which were shown integrated into development kit products, the Maximus was mounted in place and offered no chance to see any sort of tracking (though Lumus deals primarily in the optical engine itself, not entire AR headsets).

Stepping up to the rig and looking inside, I saw an animated dragon flying through the air above the convention floor. The view was very sharp and, for an AR headset, felt like it had some immersive potential. However, the contrast didn’t seem great, with bright white areas appearing blown out, and the image had a silvery, holographic quality to it. This may indicate a lack of dynamic range, or simply that the display wasn’t adjusted for the ambient light in this demonstration. Brightness appears to be one of the Maximus optical engine’s strong qualities: even without adding any dimming lenses to cut back on ambient light, the image was bright and clear.

Ultimately I was very impressed by the capabilities of the Maximus optical engine. Assuming there are no major flaws in the display system, this waveguide technology could be the foundation for extremely compact AR glasses similar in size to regular spectacles, something the AR industry has been attempting to achieve for some time now.

The image I saw in the Maximus was 1080p, quite sharp at the 55 degree field of view, though Dr. Eli Glikman said that the resolution is limited only by the microdisplay that feeds the image to the optics. With a higher resolution microdisplay (such as Kopin’s new 2k x 2k model perhaps), there’s great opportunity to scale image fidelity here.
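For a rough sense of what those numbers mean for angular resolution, here’s a quick back-of-the-envelope sketch. It assumes the microdisplay’s full horizontal pixel count is spread evenly across the quoted 55 degree field of view, which is a simplification of any real optical mapping, so treat the results as ballpark figures only.

```python
# Back-of-the-envelope angular resolution estimate (ballpark only).
# Assumes the display's full horizontal pixel count maps evenly onto the
# quoted horizontal field of view, which real optics only approximate.

def pixels_per_degree(horizontal_pixels: int, fov_degrees: float) -> float:
    return horizontal_pixels / fov_degrees

# 1080p microdisplay across the Maximus' 55 degree field of view
print(f"1920 px over 55 deg: {pixels_per_degree(1920, 55):.1f} px/deg")  # ~34.9

# A hypothetical 2k x 2k microdisplay feeding the same optics
print(f"2048 px over 55 deg: {pixels_per_degree(2048, 55):.1f} px/deg")  # ~37.2
```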

Glikman said that the Lumus Maximus still has about a year of R&D left before it’s ready to be productized, but expects that partner companies will introduce product prototypes based on the Maximus this year.

Sleek Prototype

To prove that the company’s optical engines are capable of enabling glasses-sized AR headsets, Lumus was also showing a prototype headset they called ‘Sleek’. It uses some of the company’s other optical engines and has a smaller field of view, but it’s made to show the impressively small form factor that these optics make possible.

How it Works

It’s actually a pretty awesome feat of physics to channel light down a slim piece of glass and then get it to pop out of that glass when and where you need it.

The Maximus optical engine, as seen at CES 2017, relied on the bulky electronics above the optics. There, a pair of microdisplays, which function as the light source of the optics, is housed. The image from each display is stretched and compressed to be emitted along the top of the lenses. From there it cascades down the optics and, from our understanding of Lumus’ proprietary technology, uses an array of prism-like structures in the glass to bounce certain sections of the injected light out toward the user’s eye. During that process, the image that originated on the microdisplay is reconstructed (somewhat like pre-warping visuals to cancel out the distortion of a headset’s lenses).
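Lumus’ exact design is proprietary, so the following is only a toy illustration of the general principle behind reflective waveguides, not the company’s actual implementation: as light propagates down the glass by total internal reflection, each partially reflective structure it crosses couples a fraction of the remaining light out toward the eye. The facet count and reflectivity values below are made-up numbers, chosen only to show how the out-coupled image can be kept roughly uniform across the eyebox.

```python
# Toy model of out-coupling in a reflective waveguide (illustrative only,
# not Lumus' proprietary design). Light carries intensity down the guide;
# each partially reflective facet sends a fraction out toward the eye and
# passes the remainder along to the next facet.

def outcoupled_intensities(facet_reflectivities, input_intensity=1.0):
    """Intensity emitted at each facet along the waveguide."""
    remaining = input_intensity
    emitted = []
    for reflectivity in facet_reflectivities:
        emitted.append(remaining * reflectivity)  # fraction bounced toward the eye
        remaining *= 1.0 - reflectivity           # the rest continues down the glass
    return emitted

# Increasing reflectivity along the guide compensates for light already
# extracted, so each exit point stays roughly as bright as the last
# (made-up values for illustration).
facets = [0.20, 0.25, 0.33, 0.50, 1.00]
for i, intensity in enumerate(outcoupled_intensities(facets), start=1):
    print(f"facet {i}: {intensity:.2f} of the injected light reaches the eye")
```

The point of the toy model is simply that the thin glass doesn’t bend the whole image at once; it leaks it out in carefully balanced slices, which is part of why the optics can stay so flat.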

With Lumus’ advances in waveguide optics, coupled with other impressive microdisplay advances seen at CES this year, it seems that practical everyday solutions for lightweight augmented reality hardware are rapidly approaching. CES 2018 may prove to be a fascinating milestone for augmented reality.




Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Foreign Devil

    With everyone jumping on board, VR/AR tech is advancing faster than I would have assumed a couple of years ago.

  • John G

    All you’d have to do is silver the front to make a really light HMD too. This looks fantastic

  • psuedonymous

    “There, a pair of microdisplays, which function as the light source of the optics”

    Looks like Lumus and ODG (with the R-9) are using the same solution to sidestep the fundamental refractive index FoV limitation for diffraction optics: place two separate modules side-by-side on the same piece of glass. Watch out for any AR display with a similarly weird super-wide aspect ratio; they’re probably using the same trick. The limitation is that you still need one ‘edge’ occupied by the microdisplay for each optical module, so you cannot stack them more than two across in at least one axis.

    • OgreTactics

      This is, for now, the best way to go. Microdisplay diffractive optics means you still get to have see-through glasses rather than occluding lenses-on-microdisplay glasses. True transparent lightfield displays with a reasonable FOV are like… 10 years away, maybe.

      • You know, what they need to do is print the waveguides into the LCD chip itself on a pixel-by-pixel basis, so you bypass the need for external lenses altogether. Bugs’ eyes work *kinda* like this, with each receptor only allowing a very narrow point of light in. When you merge all of these tiny points, you get an image that focuses in a very thin space.

        A pair of glasses like this would appear to produce a blob of light right until you get them into that sweet-spot, and then the image would come perfectly together.

      • Dagottfr

        Watch out for IMMY Inc. – been in stealth for a while.

        They’ve created a Natural Eye Optic (N.E.O) that emulates human vision. The only lens in the entire optical engine is your own eye; it utilizes direct retinal projection. It also has a 60 degree field of view as well as true AR/VR capabilities.

        Also solved the accommodation/convergence conflict….

        I was lucky enough to try their engineering sample on myself, it’s real.

        • OgreTactics

          Hum…looks like you’re advertising for something that doesn’t make sense and sounds like that Magic Leap vaporware…

          • Dagottfr

            I agree that Magic Leap has caused a ripple effect in the form of a “too good to be true” offering.

            This is simply not the case with IMMY, be on the lookout over the next few weeks/months.

  • Odd Arne Roll

    Hehe, just wait until you see Tesla’s upcoming windshield.
    I bet this technology will be implemented as an option in the upcoming Model S, 3, X, and Y.

  • OgreTactics

    Why is AR tech advancing faster than VR tech? 50°, then 55° FOV already?

    Also, when reporting on lightfield displays, it’d be good not to forget that critical spec, the opacity factor… is it still 80%?

  • Tim

    The tradeoff seems to involve bulky hardware feeding it…

  • Jack H

    The FoV is often more limited in the plane of the direction the light travels inside the waveguide. I believe the Lumus waveguide’s anisotropic mirrors sample light rays at different angles instead of just intensity. We can expect many future waveguides to similarly have vertically injected display sources. Another option to increase FoV is to have multiple exit pupils, which are basically side-by-side copies of the image exiting whilst overlapping. Such a method can work better for active waveguides like SBG Labs/DigiLens.

    • Jerome Lacote

      Hi Jack, you sound really knowledgeable. I would love to chat with you about a project I’m working on!

  • NooYawker

    How much power is needed to run them? Why are they attached to that huge bar?

  • Their “tech” for “compressing the image” is about as interesting as the arrows painted on a street. It’s a simple trick with mirrors, stretching and unstretching the image. I’m not saying their tech has no value; any advancements in HMDs for VR and AR would be great, but bending an image is nothing new.

  • Tommy

    AHAHAHA Microsoft is doomed

    • NIkolaus

      This technology doesn’t seem to be much of a step forward from what Microsoft has done with HoloLens. The actual lenses used inside HoloLens are about the same size as these, probably even smaller. And the sleekness of the prototype glasses shown by this company is due to the lack of tracking functionality. In other words, if you want these glasses with tracking and with the computing power (and the image quality) of HoloLens, you’ll end up with a piece of hardware that looks a lot like HoloLens.

    • Greet

      I imagine HoloLens’s main function will be to assist them in developing and popularising Windows Holographic. A device like this could then use that as its OS.