Naturalistic input for users wishing to take that extra step towards full body immersion for their virtual reality experience is still some way off. You can see the pieces forming and coming together, and things are evolving quickly, but we’re not there yet. For now, the only real way to get all your limbs tracked and modelled in VR is to grab yourself a fully fledged motion capture studio.

VIVE (short for Very Immersive Virtual Experience) is a project by a team based at the Emily Carr University of Art and Design in Vancouver. The goal of the project was to create an untethered VR experience which tracks your body and limbs as you move freely through a space. The project and its application source code are being freely distributed to share the team’s work.

The setup comprises the following:

  • A 40-camera Vicon motion capture system for realtime mocap
  • Custom data translation code to facilitate communication between Vicon and Unity
  • Unity Pro for realtime rendering
  • Autodesk Maya for scene creation
  • Dell workstations and NVIDIA graphics cards for data processing and rendering
  • Paralinx Arrow for wireless HDMI (untethered operator)

…so not exactly the kind of setup the average person is likely to have in their basement, but the results are undeniably cool. Motion data is captured at 120 FPS for impressively fluid body-to-avatar mapping. The mocap data is then fed into Unity, which renders your virtual presence moving through the environment in realtime.


The system uses a Paralinx Arrow wireless HDMI transmitter and receiver to beam images to the Oculus Rift directly, leaving the user to wander through the space unencumbered by a heavy backpack. The team have developed a custom interface to read data from Vicon, an industry-standard motion capture system, and convert the positional data into usable positions for Unity, in this case a custom version of Oculus’ SDK demo, Tuscany.
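The article doesn’t detail the team’s translation layer, but the core of any Vicon-to-Unity bridge is a coordinate conversion: Vicon reports positions in a right-handed, Z-up frame in millimetres, while Unity expects left-handed, Y-up coordinates in metres. A minimal sketch of that conversion (the function name and axis convention here are illustrative assumptions, not the project’s actual code):

```python
def vicon_to_unity(x_mm, y_mm, z_mm):
    """Convert a Vicon position (right-handed, Z-up, millimetres)
    into a Unity world position (left-handed, Y-up, metres).

    Illustrative sketch only -- not the VIVE project's actual code.
    """
    # Millimetres -> metres
    x, y, z = x_mm / 1000.0, y_mm / 1000.0, z_mm / 1000.0
    # Swapping the Y and Z axes maps Z-up to Y-up, and the same swap
    # converts between the right- and left-handed frames.
    return (x, z, y)

# A marker 1800 mm above the Vicon origin sits 1.8 m up in Unity.
print(vicon_to_unity(0.0, 0.0, 1800.0))  # (0.0, 1.8, 0.0)
```

In the real system, positions like this would be streamed per marker (or per skeleton segment) every frame, at the 120 FPS the Vicon rig captures, and applied to the avatar’s joint transforms inside Unity.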

As stated, the team have released their software and have even included a rather straightforward-looking set of instructions. You know, just in case you really do have that killer mocap system in your basement after all.

It’s another example of research that could one day inform systems available to us regular consumers. And before you scoff, cast your mind back just three years and ask yourself if you thought VR would be where it is today.

For more information on VIVE, check out their dedicated web page here, and Emily Carr University’s site is here.

This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. See here for more information.


  • snake0

    The Paralinx is far more interesting than any of the motion capture bollocks. Wonder what kind of latency they’re getting.

    • Dawiiz

Less than 2 ms latency. But at the “affordable” price of $869.95

  • marald

Kind of similar to what NuFormer posted last week on Vimeo,
    http://vimeo.com/98645954

  • DevinWeidinger

What’s wrong with PrioVR and the Sixense STEM system? This stuff seems really far from a consumer product.

    • marald

PrioVR doesn’t give you positional tracking, and STEM is not available yet and has a limited range, although it can be used with three base stations to enlarge the area. But I don’t know how accurate it will be. I will try as soon as they arrive :)

      • Ben Lang

To be clear, PrioVR does give positional tracking, it just isn’t ‘absolute’ position, but I think that’s what you were getting at : )