Hands-on: Intel’s Project Alloy ‘Merged Reality’ Roomscale Multiplayer Demo


At CES 2017, Intel was showing off more of its Project Alloy VR headset, including a roomscale multiplayer demo featuring ‘merged reality’, where the game environment adapts to your physical surroundings.

Project Alloy

Project Alloy is Intel’s VR headset reference design. The company expects device makers to use it as a starting point to create their own VR headsets (based on Intel chips, of course). Alloy is an all-in-one headset which does everything—game rendering, computer vision, processing, and more—directly on the headset. It’s literally an x86 computer running Windows 10 that you wear on your head.

That means the headset is completely untethered, and with inside-out positional tracking, it can track your movement without the use of external sensors.

Merged Reality Multiplayer

At CES 2017, Intel was showing off Project Alloy with a roomscale ‘merged reality’ multiplayer demo. The idea behind merged reality (AKA virtual augmented reality) is to make the virtual world account for the physical environment around you. In the demo, that meant turning the physical objects in the room into virtual objects in the game world that could be used for cover. You can see the experience in action in the video at the top of this article.

After putting on a pair of the fairly bulky prototype headsets, a colleague and I saw a virtual version of the same chair, desk, bookshelf, couch, and coffee table that were in the room we had been standing in. The room was scanned ahead of time using the sensors on the Alloy headset. We were able to physically navigate around the virtual version of the room thanks to the headset’s inside-out positional tracking.


After a few minutes of walking around and inspecting the virtual version of the room, the walls faded away to reveal a vast skybox of distant mountains and clouds. It really did feel like the walls had opened up before us and we had been transported to another realm. Before we knew it, the objects in the room had transformed into geometry that thematically matched the game world; the couch became a big rectangular metal storage bin, the desk and chair became a futuristic metal chair and computer terminal, the bookshelf turned into a door frame, and the circular coffee table turned into a futuristic metal… circular thing. The digital versions were not inch-for-inch facsimiles of the real furniture, but the assets were at least as big as the footprint of the real furniture. There was more virtual geometry added too which didn’t exist at all in the real world, like a computer monitor on a tall mount and a crane-like mechanism perched overhead.

Using 3DOF controllers which were parented to the location of our head, we were able to aim and fire a rifle. The shooting mechanics worked fine, but the lack of positional tracking on the controllers meant it was a simple point-and-shoot affair with no ability to actually raise the weapon to your shoulder and look down the scope to aim properly (as we’ve seen on other VR systems with more advanced VR controllers).

Waves of flying drones came at us and were easily dispatched with one shot each. After clearing a swarm we would advance to the next wave which had a few more enemies. Thanks to the headset’s positional tracking, we could walk around the entire space as we played and duck behind the virtual/real cover. But it wasn’t exactly a high-action experience as the drones weren’t quite competent enough to make us really feel pressured into cover.

After running out of ammo, we needed to find an ammo pickup to replenish the weapon’s clip. I remember inching my way toward the pickups because I just didn’t feel quite confident in the mapping and tracking. Impressive as it was, I wasn’t able to achieve a sense of totally forgetting about the physical objects around me. Inside the demo, it felt as if the scale of the virtual environment was slightly off; when I took a step forward, it didn’t quite feel like I’d traveled the same distance in the virtual world as my bodily sensors said I’d traveled in the real world. The result was that I took careful, deliberate steps.

Just a Demo, but Promising


As a demo and a concept, it was pretty cool to see this working. But there’s still a lot of work to be done to bring this sort of experience to everyone’s home. For one, the objects in the environment were not automatically identified and replaced with virtual objects. The demo appeared to be made for this specific room size and this specific arrangement of this particular furniture. Adapting a VR game to any room and any furniture automatically will require some smart game design thinking, especially for anything beyond a wave-shooter where your couch turns into a virtual metal box for cover.

There’s also work to be done in building user confidence, so players can trust they aren’t going to bump into the real environment. The limited field of view makes knee-high objects like coffee tables and chairs a notable threat, because you’re much less likely to see them out of the corner of your eye. This is compounded not only by the scale issue I described above, but also because it’s hard to tell exactly where your legs are when you can’t see them in VR (like most VR experiences, this one didn’t give me a virtual body beyond my head and gun). With a VR headset like the HTC Vive, the chaperone system is so competent that I can almost completely forget about the real world around me, because I know the system will alert me every time I’m in danger of running into the physical world. That sort of “freedom to forget” is essential for immersive virtual reality.

There’s also the added complication that a virtual asset may not perfectly align with the real one. This was demonstrated quite clearly by the coffee table in the middle of the room which—to the dismay of my shins—I bumped into more than once, even when I felt like I was well clear of the virtual counterpart. One of the developers running us through the experience also gave his knee a good whack on the table, at which point he figured out that the table had probably been moved after the scan. This is the sort of thing that needs to be solved if this tech is going to take off in people’s homes.

But of course that’s why all of this is just a demo, and a pretty exciting one at that. In fact, it was the first time I’d played VR multiplayer with both players inhabiting the same physical space. There are kinks to work out before this sort of merged reality experience can work well in a wide range of environments, but the vision is promising and could be very compelling with the right execution.


  • OgreTactics

    Project Alloy is a conceptual miss backed by great, promising tech (RealSense). This is not an untethered Virtual Headset; this is a headset that directly integrates, and is limited to, mobile hardware that nobody wants to be using, save for the convenience of an actual mobile device.

    And then, implementing VSlam reconstruction instead of see-through camera overlays is a mystery….

    • Sponge Bob

      see-through camera would show you real objects – not their virtual reincarnations.
      fully integrated (and thus truly mobile) computing in the headset is the way to go.

      for mobile VR to succeed they need to ditch inside-out tracking for a while – to free computing resources to do what VR is all about – hi-res rendering with enough FpS

      • OgreTactics

        1. But isn’t that the point of AR? Seeing your real world but overlaying or customising it however you want? In fact, if these virtual incarnations are a retranscription of where you are, there’s absolutely no point in reconstructing them, especially given how limited and ugly VSlam 3D reconstruction is.

        2. You made two contradictory statements: first you say integrated computing is the way to go (which I clearly disagree with, except if we are talking about the convenience of a smartphone that can be unplugged), and thus that it needs to ditch inside-out tracking? Then you say, a point with which I completely agree, that VR needs to do what it’s made for, which is not a computing or gaming device, but just a new technology interface that replaces not just screens, but also keyboard/mouse, with natural FOV and hand/body/voice interaction (which means inside-out tracking is yet again an absolute necessity of any true Virtual Headset).

        • Sponge Bob

          inside-out head position tracking eats precious computing resources (on smartphone-based mobile VR) while providing inferior head tracking data.
          Unless you want to move from one room to another while fully immersed in VR (highly NOT recommended for your safety), there is no substantial benefit it can provide to justify the expense.

          • OgreTactics

            One component, a dual-camera with infrared sensor, does simultaneously for a smartphone first and VR second: sudden double resolution, double the color and light samples to ponder, mechanical-digital zoom, warp and focus, environment tracking thus distance and object measurements, tracking, 3D reconstruction, analysis and identification (of objects, images, 3D shapes, color and typography), thus body/head/motion tracking, see-through smartphone screen or VR headset depth AR with the dual camera, thus infinite intangible virtual objects, screens, interfaces, landscapes, scenes, games, buttons, tools, people, fucking pokemons overlaid on the real world (head/motion tracked), thus real physical places, information, devices, controllers or keyboards, decorations etc… alteration and overlay (which means any object, even unplugged like a keyboard, can be used and keys changed to different languages or functions, then tracked), and hand interactions.

            There is HIGHLY substantial benefit it can provide to justify the expense. Then the question of computing resources is not really an argument for anything; processing just evolves. It’s rather a matter of battery, which is why I think, since it should be wireless, mobile/PC VR headsets should integrate a supplemental battery.

          • Sponge Bob

            slow down, dude

            ToF IR depth cameras give you crappy low-res depth maps with about 1 cm depth accuracy

            battery just runs out, but phone can explode because of heat dissipation or rather lack of it…
            smartphone form factor is wrong for mobile VR

            your fantasies are many years away

          • OgreTactics

            Tango is fantasy? We have to start somewhere and this is a great start; let’s leave the rest, like optimisation, processing and heat, to the engineers and devs whose job it is.

  • This confirms what Ian of UploadVR already said after CES: Alloy is very promising, but it still lacks precision… and once you bump into a table, you don’t trust the system anymore and lose all the presence.
    We’ll see if they’ll fix this

  • iUserProfile

    I don’t get the appeal of AR at all.

  • MosBen

    If it doesn’t update the room scan while you’re playing I could see a lot of bumping into family members.