
VIVE – the Very Immersive Virtual Experience that Fuses Motion Capture with the Oculus Rift


Naturalistic input for users who want to take that extra step towards full-body immersion in virtual reality is still some way off. You can see the pieces forming and coming together, and things are evolving quickly, but we’re not there yet. For now, the only real way to get all your limbs tracked and modelled in VR is to grab yourself a fully fledged motion capture studio.

VIVE (short for Very Immersive Virtual Experience) is a project by a team based at the Emily Carr University of Art and Design in Vancouver. The goal of the project was to create an untethered VR experience that tracks your body and limbs as you move freely through a space. The project and its application source code are being freely distributed to share the team’s work.

The setup comprises the following:

  • A 40-camera Vicon motion capture system for real-time mocap
  • Custom data translation code to facilitate communication between Vicon and Unity
  • Unity Pro for real-time rendering
  • Autodesk Maya for scene creation
  • Dell workstations and NVIDIA graphics cards for data processing and rendering
  • Paralinx Arrow for wireless HDMI (untethered operator)

…so not exactly the kind of setup the average person is likely to have in their basement, but the results are undeniably cool. Motion data is captured at 120 FPS for an impressively fluid body-to-avatar mapping. That mocap data is fed into Unity, which translates your presence through the virtual environment in real time.
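
The write-up doesn’t include the bridge code itself, but conceptually it comes down to a tight loop that polls the mocap system for a fresh frame and forwards joint positions to the rendering machine. The sketch below is a minimal, hypothetical illustration in Python: `read_mocap_frame()` is a stand-in for whatever the Vicon client actually provides, and the UDP/JSON transport, host address, and 120 Hz pacing are assumptions rather than details from the team’s release.

```python
import json
import socket
import time

UNITY_HOST = ("192.168.1.50", 9000)  # hypothetical address of the Unity machine
FRAME_RATE = 120                     # capture rate quoted in the article
FRAME_PERIOD = 1.0 / FRAME_RATE


def read_mocap_frame():
    """Stand-in for the real Vicon client call (not part of the team's release).

    Returns a dict of joint name -> (x, y, z) position in millimetres,
    which is the kind of data an optical mocap system typically reports.
    """
    return {"head": (0.0, 0.0, 1700.0), "left_hand": (-400.0, 100.0, 1200.0)}


def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        start = time.monotonic()
        frame = read_mocap_frame()
        # Serialise the frame and push it to the Unity side as one UDP datagram.
        sock.sendto(json.dumps(frame).encode("utf-8"), UNITY_HOST)
        # Sleep off the remainder of the frame period to hold ~120 updates/sec.
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, FRAME_PERIOD - elapsed))


if __name__ == "__main__":
    main()
```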

The system uses a Paralinx Arrow wireless HDMI transmitter and receiver to beam images to the Oculus Rift directly, leaving the user free to wander through the space unencumbered by a back-mounted laptop. The team have developed a custom interface to read data from Vicon, an industry-standard motion capture system, and convert the positional data into usable coordinates for Unity, in this case driving a customised version of Oculus’ SDK demo, Tuscany.
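
One concrete job any such translation layer has to do, whatever its internals look like, is map between coordinate conventions: Vicon-style systems typically report positions in millimetres with a right-handed, Z-up axis convention, while Unity works in metres with a left-handed, Y-up convention. The snippet below is a hedged sketch of that conversion; the function names and the exact axis mapping are illustrative assumptions, not the team’s actual code.

```python
# Swapping the Y and Z axes flips handedness, so (x, y, z) -> (x, z, y) is one
# common way to go from a right-handed Z-up frame to Unity's left-handed Y-up
# frame. The exact mapping depends on how the capture volume is aligned with
# the virtual scene, so treat this as an illustrative assumption.

MM_TO_M = 0.001  # mocap positions in millimetres, Unity positions in metres


def vicon_to_unity(position_mm):
    """Convert an (x, y, z) position from the mocap convention to Unity's."""
    x, y, z = position_mm
    return (x * MM_TO_M, z * MM_TO_M, y * MM_TO_M)


def convert_frame(frame_mm):
    """Apply the conversion to every joint in a frame dict."""
    return {joint: vicon_to_unity(pos) for joint, pos in frame_mm.items()}


# Example: a head marker 1.7 m above the capture-volume origin.
print(convert_frame({"head": (0.0, 0.0, 1700.0)}))
```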

As stated, the team have released their software and have even included a rather straightforward-looking set of instructions. You know, just in case you really do have that killer MoCap system in your basement after all.

It’s another example of research that could one day inform systems available to us regular consumers. And before you scoff, cast your mind back just three years and ask yourself whether you thought VR would be where it is today.

For more information on VIVE, check out their dedicated web page here, and Emily Carr University’s site is here.