Project Holodeck Seeks to Create a Low Cost Full-motion VR Gaming Environment with the Help of the Oculus Rift HMD

A 25-member team at the University of Southern California is building a full-motion virtual reality gaming environment they're calling Project Holodeck. By ‘full-motion’ I mean that the play space allows the user to actually walk or run around the environment; their movement is tracked and used to control their navigation within the game. Players don the Oculus Rift head-mounted display (HMD), which has head tracking and a high field of view, to give them the sense of actually being inside a virtual world. I spoke with James Iliff, producer of Project Holodeck, about this exciting project.

The Precursor: Shayd

Iliff is a senior at USC and well-versed in the world of virtual reality (you can find his thoughts on the topic on his blog). He has prior experience creating a Holodeck-like virtual reality rig thanks to his position as Lead Designer on a prior project called Shayd. That project created a virtual reality installation which utilized a head-mounted display, motion tracking hardware, and multiple Kinects to achieve the experience of wandering around an alien world in first person (see the Hardware page on the official site for details on the setup). Motion tracking of the body and head-tracking of the VR headset allowed the user to experience full avatar embodiment (a full virtual body) within the game world. Here’s a trailer for Shayd which shows the world that the players experienced:

Shayd was a great VR demo and apparently a predecessor to Project Holodeck in many ways. The Shayd rig utilized the PR4 head-mounted display, which was developed by Palmer Luckey (the man behind the forthcoming Oculus Rift), who served as the hardware engineer for the project. The PR4 has an impressive 130-degree field of view and attached LEDs which are used for head-tracking. As you can see, it’s quite bulky!

Shayd was shown at the Spring 2012 Graduate Thesis Showcase Other Worlds on May 5-10, TEDxUSC on May 4, and the First Look Festival on April 26.

Unfortunately, expense made the Shayd rig impractical for any sort of commercial adoption. The key contributor to cost is the PhaseSpace motion capture system which was used for body-tracking and head-tracking; Iliff says that this alone cost $80,000.

The New Rig: Project Holodeck

Enter Project Holodeck:

a virtual reality platform built with consumer facing technology, DIY off-the-shelf components, cutting-edge custom software, and creatively integrated peripherals. The goal of Project Holodeck is to bring 360-degree 6-DOF full-body virtual reality out of the research lab and into a fun, accessible consumer gaming platform.

Project Holodeck seeks to do what the Shayd rig did but better and cheaper. Elimination of the PhaseSpace system and a more streamlined HMD bring the price down significantly and enhance the experience.

The PhaseSpace motion capture system is being replaced with four Kinects; at $150 each, using Kinects for body-tracking is far less expensive.
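The article doesn’t describe how the team combines data from the four Kinects, but a minimal sketch of one plausible approach, averaging each joint’s position across whichever sensors can see it, might look like this (the joint names and per-joint averaging are illustrative assumptions, not the team’s actual algorithm):

```python
# Hypothetical sketch: fusing skeleton data from several Kinects by
# averaging each joint's 3D position across the sensors that saw it.

def fuse_skeletons(skeletons):
    """Merge per-sensor skeletons into one averaged skeleton.

    `skeletons` is a list of dicts mapping joint name -> (x, y, z),
    one dict per Kinect; a Kinect omits joints it cannot see.
    """
    observations = {}
    for skeleton in skeletons:
        for joint, pos in skeleton.items():
            observations.setdefault(joint, []).append(pos)
    # Average the observations for each joint, axis by axis.
    return {
        joint: tuple(sum(axis) / len(positions) for axis in zip(*positions))
        for joint, positions in observations.items()
    }

# Two sensors report slightly different estimates for the right hand;
# only one of them can see the right elbow.
views = [
    {"hand_right": (1.0, 1.2, 0.5)},
    {"hand_right": (1.2, 1.0, 0.5), "elbow_right": (0.9, 0.8, 0.4)},
]
print(fuse_skeletons(views))
```

With more sensors contributing observations of the same joint, a single occluded or noisy view matters less, which is consistent with Iliff’s later comment that the extra data from four Kinects "enhances accuracy by a large margin."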

Update (9/16/12): New Introduction Video from the team of Project Holodeck:

As with Shayd, Palmer Luckey is the lead hardware engineer on Project Holodeck. As such, the HMD being used to display the game world is the upcoming Oculus Rift VR headset that the VR community is buzzing about and will soon see a Kickstarter for funding. The Oculus Rift is lighter and more streamlined than the PR4 that was used with Shayd.

The PR4 and Shayd rig relied on PhaseSpace for head-tracking. Project Holodeck uses the PlayStation Move (which I assume will be mounted to the Oculus Rift) for head-tracking. The Oculus Rift will be launching with integrated head-tracking, which could replace the Move.

The ultimate goal is to create a 1:1 virtual play space with 1:1 avatar movement. What this means is that if you move one foot in the real world, you’ll move one foot in the Holodeck virtual reality game environment. Currently the play space is 20′ by 20′, which gives the user a large area to roam. As for 1:1 avatar movement, this involves high-speed skeletal tracking (tracking of joints, limbs, etc.) and then projection of that data into the game world to show the player their own virtual limbs and the avatars of other players.
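A 1:1 mapping is conceptually simple: the tracked real-world position becomes the avatar position directly, with no scaling. A minimal sketch, assuming tracked coordinates in feet relative to a calibrated origin and clamping to the 20′ × 20′ play space (the function name and clamping behavior are illustrative assumptions):

```python
# Minimal sketch of a 1:1 play-space mapping: the tracked floor position
# maps directly to avatar coordinates, clamped to a 20' x 20' play space.

PLAY_SPACE = 20.0  # feet per side

def real_to_virtual(x, y, origin=(0.0, 0.0)):
    """Map a tracked real-world floor position to avatar coordinates
    at 1:1 scale, keeping the avatar inside the play-space bounds."""
    vx = min(max(x - origin[0], 0.0), PLAY_SPACE)
    vy = min(max(y - origin[1], 0.0), PLAY_SPACE)
    return vx, vy

# Moving one foot in the real world moves the avatar exactly one foot.
print(real_to_virtual(5.0, 5.0))  # (5.0, 5.0)
print(real_to_virtual(6.0, 5.0))  # (6.0, 5.0)
```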

The Game: Wild Skies

Iliff and the Project Holodeck team are currently working on a game called Wild Skies for use with the setup. The game is being built with the Unity game engine. Wild Skies puts two players on the deck of a fantasy airship and asks them to steer, fire cannons, and fend off would-be hijackers using guns and swords. This is a clever choice because it makes narrative sense that the players will be restricted to the deck of the ship, which puts a limit on their play space (so that they don’t run into walls) but still allows for a game of exploration and adventure by traversing the open skies. According to Nathan Burba, project lead and creative director for Project Holodeck, it was Palmer Luckey’s idea to set the game on an airship.

This is how the team describes the game:

Wild Skies Character Concepts. Serai (left) and Zendair (right).

Two players play co-operatively as the son and daughter of a famed aviator. This is their first time on their parents’ ship. The ship is suddenly attacked by air pirates and, in the confusion, their parents are kidnapped along with the rest of the ship’s small crew. Our heroes must journey across the skies in search of their parents. Both players must work together to control a complicated airship vessel while also firing turrets and cannons to fend off attacks from airborne enemies.

Along the journey they visit exotic floating islands and encounter strange ships from small speeders to armadas. Each port they visit installs a new upgrade on the ship from more powerful sails to a battering ram. Eventually they confront the pirates who abducted their parents.

Here’s a concept video for Wild Skies that shows what the team hopes to achieve:

The game will make use of some interesting haptic feedback, including a fan (shown in the concept video) to simulate wind speed, and even a scent synthesizer so players can experience the smell of cannon smoke, among other scents; just be sure to steer clear of the poop deck.

Here is a concept for the ship’s deck, the virtual playspace that the players will interact with while using Project Holodeck:

Here’s how the same virtual space looks to the game engine:

When I asked him about keeping players within the Project Holodeck playspace, Iliff told me that the physical space may be built to resemble the deck of the ship:

We might even try constructing a physical set for Wild Skies, much like we did for Shayd, so that the railings of the ship in game actually match up with railings in the real world, and the players get a sense of walking on a wooden deck, etc.  This mixed reality experience can add a lot of tactile immersion, in addition to the visual and auditory immersion.

I also asked Iliff about the tracking capabilities and full avatar embodiment of a four Kinect setup compared to the $80,000 PhaseSpace system that was used with Shayd:

In terms of Kinect tracking, it is indeed fast enough to track a player’s avatar in real-time – the hard part is keeping it accurate.  Using four Kinects provides a lot of extra data, even when two players are sharing a space together, so this enhances accuracy by a large margin depending on the type of algorithm we are implementing.  So to use your specific example, if the player reaches out and pulls a cannon (or lights it, etc.), then the player will see his virtual body do the exact same thing, and will be able to see if he is touching the cannon or not.  However, the accuracy of the avatar is not the biggest challenge to tackle in this example – what is really more important is the input that tells the game to make the cannon fire.  The multi-Kinect setup is doing two primary things at all times: using skeleton data points to animate a player’s avatar, and also receiving motion inputs for in-game actions.  And sometimes it’s difficult to get the Kinect to recognize what kind of motion you are doing, and you can always run the risk of accidentally firing a cannon just because you waved your arm near it.

In order to avoid these issues, one solution we are playing with is a simple bluetooth button that is strapped to your hand, resting in your palm. Every time you want to fire a cannon, for instance, you not only have to hold your hand near the cannon and pull back abruptly, but you have to hold down on the button and then hold your hand near the cannon and pull back.

Overall, the visual feedback of the multi-Kinect setup (avatar embodiment) is pretty damn good, and it’s the Kinect inputs that we are really focusing on because that is more challenging at the moment.
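The button-gated input Iliff describes can be sketched in a few lines: fire only when the button is held, the tracked hand is near the cannon, and a pull-back motion is detected. The thresholds and function names here are illustrative assumptions, not the team’s actual code:

```python
# Hedged sketch of button-gated gesture input: a cannon fires only when
# the hand-mounted Bluetooth button is held, the tracked hand is near the
# cannon, AND a pull-back motion is detected - so waving an arm near the
# cannon by itself does nothing.
import math

NEAR_THRESHOLD = 0.5   # feet: how close the hand must be to the cannon
PULL_THRESHOLD = -1.0  # feet/sec: velocity toward the player counts as a pull

def should_fire(button_held, hand_pos, cannon_pos, hand_velocity_z):
    """Return True only when all three conditions hold at once."""
    near = math.dist(hand_pos, cannon_pos) <= NEAR_THRESHOLD
    pulled_back = hand_velocity_z <= PULL_THRESHOLD
    return button_held and near and pulled_back

# Waving an arm near the cannon without the button does nothing...
print(should_fire(False, (1.0, 1.0, 0.2), (1.0, 1.0, 0.0), -2.0))  # False
# ...but button + proximity + pull-back fires.
print(should_fire(True, (1.0, 1.0, 0.2), (1.0, 1.0, 0.0), -2.0))   # True
```

Requiring an explicit button press alongside the gesture trades a little convenience for a large drop in accidental activations, which is exactly the problem Iliff identifies.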

Project Holodeck is still a work in progress and I will be watching it carefully!

The Experience: Where Will Project Holodeck Manifest?

The major goal of Project Holodeck is “to bring 360-degree full-body virtual reality out of the research lab and into a fun, accessible game installation”.  Accessible here is the key word. There have been plenty of virtual reality tech demos, but the actual reality is that people rarely get to experience them. Thanks to the relatively low-cost nature of Project Holodeck, it should be practical to create a number of these rigs and several games for players to experience.

Obviously the complexity and size of the Project Holodeck rig are too great for in-home installations. “Where then will the setup be used?”, I wondered. Iliff says the team is currently talking to a number of major companies about utilizing Project Holodeck, but the future of Holodeck could still be in your living room!

We see Project Holodeck as more of an arcade experience, because the space required is larger than the average space available in a consumer’s home.  We plan on taking this to expos and festivals like Indiecade, Maker Faire, IGF, and others, as well as our own local showcases in Los Angeles like First Move, Demo Day, Other Worlds, etc.  We could potentially license our software platform if we wanted to, and we can put together kits with our own custom hardware, but in the long run we also want to reach home users with a simpler consumer system that can fit comfortably in the living room.  We’ve been talking to several big names in the industry, including Microsoft and Disney Imagineering, and there’s a number of different directions we can go with it – all equally exciting!

Iliff further noted that, for the time being, the team is focused primarily on development.

Stay tuned for more on the project as it develops and be sure to check out the project’s website for official updates from the team.

Comments

  1. Outer says

    They should definitely be using a portable omni-directional treadmill, designed by Swedish company MSE Weibull. The only factor with that would again be cost, but seeing as how it frees you from a 20×20 room to infinite 360 movement, I can’t see how you couldn’t include this :)

  2. Mark2036 says

    About the issue with getting the Kinect to recognize inputs. You mention putting a bluetooth button on your hand.

    Wouldn’t it be better to situate actual buttons / levers / wheels / etc in real space, and have them wired up for direct input? Good for tactile feedback too :).

    Also would love to know the algorithm you use to get the 4 Kinects working together. Do the cameras need to be positioned at 90 deg angles from each other? Do the infrared signals from the Kinects interfere with each other, or do you sample them at interspaced frequencies so only one is on at a time?
