Epic’s HoloLens 2 Demo ‘Apollo 11: Mission AR’ Showcases Impressive PC-quality Graphics


Epic Games today released a new video featuring a demo for HoloLens 2 that aims to show off just what sort of graphics can be achieved on Microsoft’s latest standalone AR headset. Called Apollo 11: Mission AR, the interactive demo is streamed wirelessly in real time from networked PCs running the company’s game engine, Unreal Engine.

Unveiled earlier this summer at Microsoft Build 2019, Apollo 11: Mission AR is a recreation of the historic 1969 Apollo 11 mission and lunar landing, showing off the Saturn V’s launch, a reenactment of the lunar landing, and Neil Armstrong’s first steps on the Moon, which Epic says was reconstructed based on data and footage from the actual mission.

Epic says the demo features 7 million polygons in a physically-based rendering environment, and includes fully dynamic lighting and shadows, multi-layered materials, and volumetric effects.

Image courtesy Epic Games

That isn’t done on-device though. To achieve this level of detail, Epic says the experience’s “holographic elements” are actually streamed wirelessly in real time from networked PCs running UE 4.23, the current version of Unreal Engine.

According to Epic’s HoloLens 2 streaming guide, the headset sends eye tracking, gesture, voice, current device pose, and spatial mapping input to your PC, which then streams rendered frames back to HoloLens 2. This, the company says, is designed to boost app performance and make development easier, since devs won’t need to package and deploy the app on-device before running it. However, it’s clear the approach also allows HoloLens 2 to play host to more graphically involved experiences than were originally intended for the standalone device’s on-board processors.
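The round-trip Epic’s guide describes can be sketched conceptually as follows. This is an illustrative model only, not Epic’s or Microsoft’s actual remoting API: every name, type, and value here (including the frame resolution) is an assumption made for the sake of the example.

```python
# Conceptual sketch of the remoting round-trip described above:
# the headset sends per-frame input to a PC, which renders the
# scene and streams a finished frame back. All names and values
# are illustrative, not part of any real remoting API.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class HeadsetInput:
    pose: Tuple[float, float, float]   # current device pose (position only, for brevity)
    gesture: Optional[str] = None      # e.g. a recognized hand gesture
    voice: Optional[str] = None        # recognized voice command, if any

@dataclass
class RenderedFrame:
    width: int
    height: int
    pose_used: Tuple[float, float, float]  # pose the frame was rendered against

def pc_render(inp: HeadsetInput) -> RenderedFrame:
    # Stand-in for the PC-side renderer (UE 4.23 in Epic's demo);
    # a real implementation would rasterize the scene for this pose.
    return RenderedFrame(width=1280, height=720, pose_used=inp.pose)  # illustrative resolution

def headset_frame_loop(inputs: List[HeadsetInput]) -> List[RenderedFrame]:
    # Headset side: each frame, send the latest input upstream and
    # display the rendered frame that comes back. Here the call is
    # in-process; in reality this round trip crosses the local network,
    # which is why latency and bandwidth matter so much (see Epic's
    # statement at the end of this article).
    return [pc_render(inp) for inp in inputs]

frames = headset_frame_loop([
    HeadsetInput(pose=(0.0, 1.6, 0.0)),
    HeadsetInput(pose=(0.1, 1.6, 0.0), gesture="air_tap"),
])
print(len(frames))  # one rendered frame per input sent
```

The key design point the sketch captures is that the headset contributes only lightweight input data upstream, while all heavy rendering happens on the PC downstream.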

Image courtesy Epic Games

We reached out to Epic to see whether this could also be achieved via cloud streaming, or if it’s a local machine-only implementation. We’ll update this article as soon as we hear back (see update below).

Released in early September, Unreal Engine 4.23 is the first iteration of the company’s game engine to feature production-ready support for HoloLens 2, which includes tools such as streaming and native deployment, emulator support, finger tracking, gesture recognition, meshing, voice input, and spatial anchor pinning.

Outside of the demo’s visual polish, Epic says Apollo 11: Mission AR also shows support for UE4 Composure, color temperature, and post-processing, plus OCIO LUTs, I/O for AJA video systems, and additional features that streamline mixed reality media production.

Update (2:00 PM ET): An Epic Games spokesperson has provided us with the following statement regarding cloud rendering for remote PC-to-HoloLens connections:

“While it is technically possible to use the HoloLens 2 Remoting over the Internet, we would strongly recommend against it due to significant latency and uncontrollable network conditions. When using HoloLens 2 Remoting, you should always aim to use a local network to minimize the latency and ensure there are minimal other devices connected to it to maximize the bandwidth available for the HoloLens 2.”

Well before the first modern XR products hit the market, Scott recognized the potential of the technology and set out to understand and document its growth. He has been professionally reporting on the space for nearly a decade as Editor at Road to VR, authoring more than 3,500 articles on the topic. Scott brings that seasoned insight to his reporting from major industry events across the globe.
  • Xron

    Hard to believe they can squeeze this kind of performance.

    • Nejham Mosquera

      Did you even read the article?

    • Smokey_the_Bear

      “the interactive demo is streamed wirelessly in real-time from networked PCs running the company’s game engine”


  • Adrian Meredith

    The video is missing the fact that you can only see a tiny square in the middle

    • Immersive Computing

      Is the tiny square less tiny for HL2?

      • Smokey_the_Bear

        yeah, a little. It went from roughly 35° to 50°.

  • Behram Patel

    It’s amazing that this is now an option. @scotHayden can you ask Epic to publish the hardware specs for this demo ?
    Would be great if this can run off an Intel Nuc8 or the putative NUC 9.


  • Richard Servello

Afaic, hololens isn’t actually a product… It’s somewhere between vapor, concept, and supercomputer. Anytime I see an article about it I feel like it might as well be talking about a Cray in the 80s.

    • Ardra Diva

      You realize you can buy them? https://www.microsoft.com/en-us/hololens/buy

      • Richard Servello

Of course you can…but for $3000 with no application support. It’s a dev kit for a product that will never exist.

  • nodryland1

looks cool, but AR headset pricing needs to drop for people to see and try this experience

  • I would like to know what the latency of this streaming is

  • paleion

    But it doesn’t actually look like this – you don’t pop it on and suddenly see this massive 4K rocket on your table at full size. What you actually see is a small box with a TV screen playing an image that’s semi-transparent and a bit juddery if you move too fast.