Outside of a few demonstrations at trade shows, Meta has only given glimpses of the content running on its impressive AR headset dev kit in short clips. During a recent visit to the company’s Silicon Valley headquarters to check out the latest improvements to the headset, I got to see a selection of demo content which Meta has allowed Road to VR to capture and share in full for the first time.

Having initially taken pre-orders for the $3,000 Meta Pro glasses back in 2013, the company rebooted the headset and revealed it in early 2016 as the Meta 2, this time positioned as a more affordable development kit (at $949) with a much wider and more immersive 90-degree field of view (compared to 36 degrees on the original). After taking pre-orders throughout 2016, Meta tells Road to VR that the Meta 2 dev kit is now shipping in limited quantities, and the company expects to ramp up shipments in the months ahead.

Photo by Road to VR

We went hands-on with the Meta 2 back at its reveal in early 2016 and concluded that while AR in general is still early in development compared to VR (a sentiment shared by Facebook), the headset could play a significant role in the development of the eventual consumer technology:

Many people right now think that the VR and AR development timelines are right on top of each other—and it’s hard to blame them because superficially the two are conceptually similar—but the reality is that AR is, at best, where VR was in 2013 (the year the DK1 launched). That is to say, it will likely be three more years until we see the ‘Rifts’ and ‘Vives’ of the AR world shipping to consumers.

Without such recognition, you might look at Meta’s teaser video and call it overhyped. But with it, you can see that it might just be spot on, albeit on a delayed timescale compared to where many people think AR is today.

Like Rift DK1 in 2013, Meta 2 isn’t perfect in 2016, but it could play an equally important role in the development of consumer augmented reality.

At the time we weren’t able to share an in-depth look at what it was actually like to use the headset, save for a few short clips taken from promotional videos released by the company. Now, running on the shipping version of the development kit, Meta has exclusively allowed us to capture and share the full length of several of the headset’s demo experiences, blemishes and all.

Photo by Road to VR

A few notes to make sense of the videos below:

  • The footage is captured with the Meta 2’s front-facing camera, and then overlaid with the AR content, so this isn’t precisely what it looks like through the headset’s lens itself, but it’s pretty darn close.
  • The view through the headset actually looks a lot sharper than what you’re seeing here; all the text in the demos was quite readable. For comparison, the captures are done at 1280×720 (and not at a very good bit-rate), while the Meta 2 renders at 1280×1440 per eye.
  • The AR content is rendered to be correct for my perspective, not the camera’s. That means that when I reach out to touch things it sometimes looks like my hand is way off from the camera’s vantage point, but to me it actually looks like I’m touching the objects directly (which feels surprisingly cool when you reach out and it’s right where you think it is).
  • Occlusion is not rendered in these videos, though it is present in the view through the headset. So while it looks like I’m reaching behind some objects (also a consequence of the perspective issue above), there’s some (rough) occlusion happening in my view which, combined with stereoscopy, maintains the illusion that the objects are actually out in the real world behind my hands.
  • The tracking fidelity and latency seen in the videos are a fair representation of what’s currently seen through the lens itself.
  • You will see some moments when I clip through the AR content, though the clipping doesn’t align with my own view through the headset due to the camera’s different perspective.
  • The picture-in-picture video from the outside view is hand-synchronized and isn’t representative of actual latency.
  • The field of view seen here is roughly representative of what it looks like in the headset, except that the human eye’s field of view is much wider than what the camera captures; thus, while the AR content can be seen here stretching across the entire width of the captured frame, it doesn’t stretch across the entire human field of view. That said, it’s quite wide and much more compelling than many other AR headsets out there.
  • There’s some occasional jumpiness in the framerate of the capture (especially during the brain scene), likely due to the capture process struggling to keep up; through the headset the scenes were perfectly smooth.
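The perspective mismatch in the notes above (AR rendered for the wearer’s eyes but composited onto the front camera’s feed) is ordinary parallax: the same virtual point projects to different image positions from two vantage points a few centimeters apart. A quick sketch using a pinhole camera model and made-up offsets (not Meta’s actual camera geometry):

```python
import numpy as np

def project(point, camera_pos, focal=1.0):
    """Pinhole projection of a 3D point into normalized image
    coordinates (camera looks down -z, no rotation)."""
    rel = point - camera_pos
    return focal * rel[:2] / -rel[2]

# A virtual object 40 cm in front of the wearer, 10 cm to the right.
obj = np.array([0.10, 0.0, -0.40])

eye = np.array([0.0, 0.0, 0.0])      # wearer's eye at the origin
cam = np.array([0.0, 0.06, -0.03])   # front camera ~6 cm above, 3 cm forward

print(project(obj, eye))  # where the wearer sees the object
print(project(obj, cam))  # where the camera sees it: visibly offset
```

The closer the virtual object is to the headset (like a hand-distance cube), the larger this offset grows, which is why reaching out to "touch" something looks wrong in the capture while lining up correctly in the wearer’s view.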

Meta 2 Evolving Brain Demo

Meta 2 Cubes, Earth, and Car Demo

Takeaway

Meta 2’s biggest strength is its wide field of view and its biggest weakness is tracking fidelity (both latency and jitter). I’ve often said that if you mashed together HoloLens’ tracking and Meta’s field of view, you’d have something truly compelling. Meta said they would ship with “excellent” inside-out tracking, but as of today they aren’t there yet.

Photo by Road to VR

That said, this is a development kit. The company tells me that there’s still optimization to be done along the render pipeline, including the use of timewarp—an important latency-reducing technique—which is not currently implemented.
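Timewarp, for context, resamples an already-rendered frame using the newest head pose just before the frame is displayed, hiding rotational latency without re-rendering the scene. A minimal rotation-only sketch (illustrative only; Meta’s actual render pipeline is not public, and the function names here are my own):

```python
import numpy as np

def yaw(theta):
    # Camera-to-world rotation about the vertical axis, theta in radians.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def timewarp_direction(d_display, R_render, R_display):
    # Map a view direction in the *display* camera's space into the
    # *render* camera's space, so the already-rendered frame can be
    # resampled against the newest head pose.
    world = R_display @ d_display   # display-space -> world
    return R_render.T @ world      # world -> render-space

# If the head hasn't moved since the frame was rendered, timewarp
# is the identity: the frame is shown as-is.
d = np.array([0.0, 0.0, -1.0])      # looking straight down -z
R = yaw(0.1)
assert np.allclose(timewarp_direction(d, R, R), d)

# If the head turned 2 degrees between render and scan-out, the
# sample direction shifts by the same 2 degrees in the old frame.
shifted = timewarp_direction(d, yaw(0.0), yaw(np.radians(2.0)))
```

In a real renderer this remap is applied per-pixel on the GPU (effectively a homography for pure rotation), which is why timewarp is cheap relative to re-rendering the whole scene and so effective at reducing perceived latency.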

Like the Rift DK1 back in 2013, the Meta 2 doesn’t have perfect head tracking or low enough latency, but it could still be enough to stir the imaginations of developers and get the ball rolling on AR app development. Meta knows that their tracking isn’t consumer-ready yet, but the company intends to get there (and they seem committed to doing it in-house). Meanwhile, the Meta 2 could serve as that kick in the imagination that gets a core community of developers excited about the possibilities of AR and actually starting to build for the platform.

For a deeper dive into our hands-on time with Meta 2, be sure to see our prior writeup.



Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Sponge Bob

    Where are the controllers ???

    Controllers are a MUST for VR, therefore for AR too

    Leap Motion-style bare-hand interaction does not work for the majority of productivity apps. Period.

    Plus, does it work outside in direct sunlight ?

    Me thinks not so well…

    • NooYawker

      Well they did say they’re a few years behind VR so they’re just trying to get the visuals working first.

      • Sponge Bob

        its nice to be able to burn through millions of dollars of funding for many years without fully functional products on the market… sign me up… sigh…

        • NooYawker

          Well that depends on the company. HoloLens has developer products out there already. This is new tech; it’ll cost 10s of millions to get a commercial product ready.

        • Shawn MacDonell

          AR is where VR was in 2013 in the best-case scenario; it will likely be at least 3-5 years until it gets to where VR is now. Leap Motion and other computer vision-based hand tracking will be far more important for AR than introducing physical controllers; in all honesty, physical controllers are not needed nor desired for AR interactions, even though they make sense for VR, due to the many use-case scenarios AR is targeting.

          • Sponge Bob

            VR or AR – leapmotion does not do it. period.
            I played a lot with it.
            Controller is a must.
            Plus infrared-based comp vision is dead in direct sunlight – that will limit AR to strictly indoors and would suck big time.

          • Shawn MacDonell

            As I said, other computer vision-based hand tracking as well, which may or may not be on the market. A controller is not the future for AR; it may be for early development kits potentially, but it’s not the end-game for hand-based interaction in AR. Remember, AR will be primarily indoors for these early years as computer vision continues to develop (though HoloLens operates outside to a degree from what I’ve seen); AR is meant to replace the smartphone and other daily devices we utilize, creating a hands-on experience with the only hardware being an HMD/glasses. You should look towards the future if you want consumer hardware, and a controller is not and should not be in that future for consumer AR.

            AR is very early in its life cycle at this time and we can not expect the current technology to do it service except for development purposes. As Road to VR has stated in its opinion, one that I believe, we will not see AR hardware to the degree of consumer readiness as Rift or Vive for another 3-5 years at minimum.

          • Sponge Bob

            I say: VR and AR will eventually merge, and at least one small ergonomic hand-held or hand-attached (ring?) controller is a must for both
            Also, indoor-only AR sucks

          • Shawn MacDonell

            You need to look at the reality of the technology situation with AR; it’s still early days and at best, as stated by Road to VR also, is at DK1-level hardware and capabilities. And again, I’m discussing the future needs of AR rather than the current limitations, which are fine limitations for the present given that the technology has not yet progressed enough for high accuracy in hand-based AR interactions; hence I stated motion controllers MAY be utilized for early developer kits if desired, though I personally oppose this as software optimization improves tracking (as we’ve seen with Meta 2’s progression, though it still under-performs in positional tracking against HoloLens).

            AR and VR will eventually merge but not for the foreseeable future (i.e. ~10 years).

  • ARwar

    It really comes down to: will Microsoft fix the fov first or Meta fix the tracking?

    • Sponge Bob

      before that happens some other company (from China most likely) will fix them all

    • MS is doing consumer now in 2019 so expect a wider fov. The fov for AR isn’t as important as in VR. Way too much is made of it, which you soon realize when using one every day. The device isn’t everything either, but I won’t buy any tethered future device and give up the freedom. One other important factor is the ecosystem and I like what MS has already published so far.

  • David Barry

    I did the demos at Unite 2016 and I was wondering if they had cleaned up the artifacts of the interaction between the hands and the digital objects. This is not visible in the video, but I am wondering if that is because the AR was added in post like you said. I really noticed it on the cube demo. Also, the chosen demos I saw were all very pastel feeling (not washed out) but not at the brightness or occlusion levels of, say, the Land of Dinosaurs experience on the HoloLens; has that improved?
    Biggest drawback for me though: it has a cord.

    • Sponge Bob


      If you played with some basic Leap Motion demos, then the crap they show in the video is just laughable

      Try the Leap Motion cubes demo – it is VR but would work just as well with AR with very minor software tweaks – there is nothing VR- or AR-specific in detecting your hands via projected infrared light

      This said, controller is a must – VR or AR regardless

      • Shawn MacDonell

        A hand controller may be a must for developer kits as computer vision develops (though I believe it’s good enough for early development purposes, remember this is at best DK1-level hardware at this stage for AR), but for the end-game consumer AR product a controller is not the future for AR systems.

        • Sponge Bob

          I beg to disagree

          how are you gonna select objects and areas to focus on if you don’t have reliable click ???
          quasi-pinching your fingers (without even touching) sucks big time
          not even talking about haptic feedback – much desired for VR and AR alike

          • Shawn MacDonell

            And again you’re looking at currently-available options for interaction whilst I’m talking specifically of the end-game consumer AR goals; later development kits will not require the use of motion controllers and will have superior computer vision-based tracking compared to what is currently available on Meta 2 or HoloLens.

            Look to the future, not the present, when discussing your wants for AR as a whole. Motion controllers or even rings would not be the ideal future for AR interaction, computer vision-based hand tracking will be the end-game; as I stated before, motion controllers may be used for early AR development in its early years but it is not the desire nor the end-game for interaction. The ideal is the only hardware for AR we need is the HMD itself, nothing else, with everything built into the HMD; now this is a good 5+ years off most likely but that is the end-game and goal of AR for ease-of-use sake. Haptic feedback will be a hurdle to conquer with additional hardware early on as well most likely, but will also turn to BCI-based manipulation once that technology becomes available far down the pipeline.

          • Sponge Bob

            you can’t have a reliable and identifiable “click” with computer vision based technology. period.
            controller is a must (a small one, not like current monstrosity Touch or Vive)
            BTW, touchscreen never replaced mouse and keyboard, not for major productivity applications

          • Graham J ⭐️

            Where are you going to put the cameras? The HMD can’t see your hand movements at all times no matter where you put cameras on it. There’ll have to be something external to detect hand movements, and even if you do that our hands are not great at doing precise manipulations without feedback.

    • benz145

      Occlusion rendering is still coarse and has a lot of latency; it didn’t feel improved from when I had tried it previously.

  • Sponge Bob

    I would put my money on Vrvana instead of Meta


    camera pass-through is the way to go in the near future

    • Lucidfeuer

      Agreed with that.

  • metanurb

    Maybe Intel’s RealSense tracking would improve things, or a Vive puck.

  • Guest

    And what genius chose that font?

  • AirHairLair

    How does it compare to Avegant?

  • Christian Vogelgesang

    I’m still waiting for an AR headset that uses stable outside-in tracking and lets me use a workstation to render everything. Special AR work spaces are totally fine… Just let me throw as much rendering power at CAD data as possible.

  • Wow, it’s awesome to see videos recorded with the Meta 2! Super interesting article!
    Anyway, the other big problem with Meta is that it is tethered, while the HoloLens is a completely standalone device, so you can move freely inside the room.

  • Lucidfeuer

    90 degrees of AR FOV? How is that?