At CES 2014, SoftKinetic, the company that powers the Creative Depth Camera, showed off a pretty cool Oculus Rift demo that used the depth camera mounted to the headset. The demo enables the user to put their own hands into the virtual world and build basic structures. The camera mount can be 3D printed.

Getting hands into a virtual world is a huge step up for immersion over keyboard and mouse. The most widespread implementation of virtual hands that we’ve seen so far comes from Oculus Rift demos that use the Razer Hydra motion controller for hand input. And while this adds significantly to intuitive control and immersion, it is still limited to showing the user mere avatar hands, and it relies on unnatural button presses to articulate those hands. The next step up is showing the user their own hands and fingers, not those of an avatar, and allowing them to use their hands just as they would in real life to reach out and grasp objects within the world.

And that’s exactly what SoftKinetic’s technology is enabling. Take a Creative Depth Camera, which is powered by SoftKinetic’s sensors, mount it to the Oculus Rift, and you can get your own hands and fingers into the virtual world.

When I checked out SoftKinetic’s cube building demo, I noticed immediately an extra sense of presence when greeted by my own virtual hands moving like a mirror of the real world. While not perfect for this application, the Creative Depth Camera worked fairly well.

The latency of the Creative Depth Camera was relatively impressive, but there’s still room for improvement. As with most computer-vision-based approaches to tracking, there’s still some jumpiness. The camera’s field of view also doesn’t match that of the Oculus Rift, and the camera loses your hands entirely if you look away from them.

Still, the demo worked as a great proof of concept. Having used the Razer Hydra extensively, I can say that there is just something different about using your own hands and fingers to grab and manipulate objects, rather than holding a controller, which adds a layer of abstraction between you and your virtual input.

While this type of natural hand input will work great for casual VR experiences, where intuitive input that requires no training is important, controllers are unlikely to go away anytime soon. The responsiveness and accuracy that serious gamers crave can only be delivered by controllers for now, and on that front, Sixense’s STEM system is confidently paving the way.

Previously we’ve seen similar implementations with the Leap Motion sensor, though developers have expressed frustration about using Leap Motion for VR input.

Print Your Own Creative Depth Camera Mount for the Oculus Rift DK1

If you happen to own an Oculus Rift and a Creative Depth Camera, you can download and print your own mount for the Oculus Rift DK1 from Thingiverse.

The download also includes a Unity demo similar to what I saw at CES 2014, along with source files if you’re looking to get started with some development of your own.

At CES, SoftKinetic told me that the cube building demo was available for public download from the company’s website, but I haven’t been able to track it down. We’re in touch with the company to try to make that available to you; we’ll update this article with developments.

This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. See here for more information.