‘Project Alice’ Tracks Real Objects to Enhance Multi-user VR Experiences


Noitom, a leading mo-cap company, is leveraging its expertise in capturing reality by fusing it with virtual reality to create a multi-user mixed reality experience that lets users interact with each other and with real-world objects in VR. Noitom calls this combination of systems and modalities Project Alice, and the company was at SVVR 2016 to show it off. Road to VR’s Executive Editor Ben Lang went hands-on.

Project Alice is a mashup of several different systems that together create a unique VR experience highlighting the power of social VR and mixed reality. The system, which Noitom is positioning as a B2B product priced around $100,000, comprises a high-end optical mocap system combined with Noitom’s own IMU tracking sensors. The company says it also created the backend software which ties these systems together and runs VR experiences on top of them; the result is an immersive multi-user VR experience which allows users to interact with both virtual and real objects.

Virtual reality is the most seamless and natural way of connecting with someone else through a computer. Matching real-world props to virtual objects takes this to the next level. As part of the Project Alice experience, we got to see this first hand (see video at the very top of the page). In one scene there were a number of real-life objects that were exactly matched to their virtual counterparts. That meant that when I saw a virtual stool sitting in front of me and reached out to touch it, it was actually there in real life. When I picked up the stool, it moved correctly in VR, and I was even able to hand it to the person standing next to me. Basically, it felt just like interacting with a stool in the real world, and the closer we can make the virtual world to the real world, the more immersive it becomes.

See also: OptiTrack’s Precise ‘Void’ Style Tracking Lets You Play Real Basketball in VR

Mixing real items into a VR environment makes human-to-human interaction in VR even easier and more natural than a purely virtual modality; if I want to hand you an object, I can simply hand it to you, and you can reach out knowing that you will be able to physically grab it. When I was holding a foam block, for instance, it was incredibly easy to rotate and manipulate, because everyone knows how to rotate and manipulate a foam block in real life.

The exciting part comes when you append digital information onto that block that couldn’t exist in real life (maybe the front has a virtual screen on it showing some important data from the web, while the back has virtual buttons to change what’s displayed on the front). Now instead of sending someone a link with the info I’m seeing, I could literally toss the object containing the info to them. It’s a more human and natural way of communicating from one person to another inside of a computer, and large, multi-user VR systems like these are likely to take real-time collaboration to the next level.

This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. See here for more information.

Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Sorry to be a party pooper, but you’re putting a lot of emphasis on a very unimportant story. High-end motion capture is nothing new. If they were finally reducing it to home-level cost, this would be GREAT, but right off the bat you say it’s $100,000. This isn’t a story on telepresence, because these objects have to all be in the same room, as do the people. It’s not even a theme park any of us could conceivably visit and enjoy, like The Void. It’s just a handful of journalists spending an afternoon toying around with Hollywood-level tracking.

    You want to see something AMAZING, spend some time in AltVR with a Leap Motion attached to your HMD. Now *THAT* is impressive. Project Orion has finally made the Leap reasonably solid for use in VR. I’ve been using it in the Unreal Engine and it’s FREAK’N AMAZING! And it’s hella cheap, available right now, no $100,000 studio equipment required. $30 that blows away $100,000. Now THAT’S a story!

    • Bryan Ischo

      I have the Leap Motion with Orion and enjoyed the Geometry and Blocks demos with my DK2. However, the tracking was nowhere near good enough, even with Orion; while it was fun during those moments when the tracking was spot-on, a significant percentage of time was spent flinging your hand/fingers around trying to regain tracking or get the tracking to recognize your actual gesture/pose.

      Is the experience better in AltVR? Maybe they’re using the Leap Motion for sensing with lesser fidelity than was required by the Geometry and Blocks demos?

      In my opinion, unless the tracking matches the quality of the Vive motion controller tracking, I don’t think I’d really want to use it for anything other than playing around.

      • Marc

        Agreed. I have a CV1 and was disappointed when I tried my Leap Motion with Orion.

    • Tommy

      Fully agree with you on the first paragraph. Seriously. These systems have been around for almost as long as CGI has been available. Hollywood uses them all the time!

  • Random thought, when I read Noitom I was like… ha, that feels as if I just read something backwards! Yeah it did, funny, I wonder why that is… wait a minute… AHA!

  • Gaspar Ferreiro

    I wonder if any of the posters below have actually tried the hardware mentioned in the article with the respective demo. I had a chance to try it, and to be fair, the integration of the hardware/software and trackers opens a lot of possibilities. For starters, not many demos out there can have 5 people in the same environment at the same time (which is very important for enterprise training applications). In addition, the tracking is super precise and can do full BODY tracking… not just hands like Leap Motion, but full body joints and limbs. Then add the feature of being able to STICK a tracker on ANY REAL WORLD OBJECT, and now you are bringing reality into the virtual world. This might seem trivial to some, but as an active developer, this is huge… because now, if you want to put a REAL golf club into a game, all you need to do is place a tracker on it, and then align it to a model in your VR scene (you could use your favorite photogrammetry solution for this). Think military training with real weapons or devices, or medical training with real objects.
    Now, I also see that this implementation needs a little polish, and some of the solutions for mobility need to be a bit more elegant (like the backpack they show to carry PCs on your back). So I can assure you that, despite it not being a perfect solution, it is impressive in its own right (there is a reason why their booth was constantly being approached by military, government, and large enterprises).
    And yes, I would love for this to be available at a price point for the consumer market… but this is not targeted at that market. Yet (alluding to a comment below mentioning The Void), the fact is that this allows people to create VOID-LIKE experiences for a FRACTION of the cost.

  • metanurb

    Still rocking the DK2, I see. I am not really a fan of those Fresnel lenses either. Maybe you could interview OptiTrack and see what their thoughts are on VR? Then there are the others: Vicon, PhaseSpace, etc.