Tomáš “Frooxius” Mariančík, creator of the stunning VR experience SightLine: The Chair, is back! This time he’s demoing an educational prototype built around an intuitive virtual reality user interface that lets you reach into VR and control the environment with your hands.

Reaching into Virtual Spaces

One of the scenes from Sightline: The Chair

Not content with developing one of the best virtual reality concepts at last year’s VR Jam event with Sightline, then blowing mine and everyone else’s socks off with his follow-up tech demo SightLine: The Chair, developer Tomáš “Frooxius” Mariančík has come up with what looks to be one of the most intuitive and responsive VR user experiences I’ve yet seen.

See Also: Why ‘Sightline: The Chair’ on the DK2 is My New VR Reference Demo

His new project, still a prototype, combines an Oculus Rift DK2 and its positional head tracking with a Rift-mounted Leap Motion controller. The Leap detects your real-life hands and fingers, allowing Tomáš to translate any gestures or movements into equivalent actions in the virtual world. And to give your virtual hands something to do in virtual space, he’s also conceptualised and built a series of menus and gesture commands that let the user navigate and control the world around them.
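
For the technically curious, the hand data that drives an interface like this is exposed by Leap Motion’s V2 SDK as per-frame skeletal tracking. Below is a minimal polling sketch using the SDK’s Python bindings; it’s my own illustration of the API, not code from Tomáš’ project.

```python
import Leap  # Leap Motion V2 SDK Python bindings

controller = Leap.Controller()
# Tell the tracking service the sensor is mounted on an HMD, so it
# expects hands seen from head height rather than from a desk.
controller.set_policy(Leap.Controller.POLICY_OPTIMIZE_HMD)

def poll_hands():
    frame = controller.frame()  # latest tracking frame
    for hand in frame.hands:
        palm = hand.palm_position    # Leap.Vector, in millimetres
        grab = hand.grab_strength    # 0.0 (open hand) to 1.0 (fist)
        pinch = hand.pinch_strength  # 0.0 to 1.0 (thumb-index pinch)
        print("%s hand at (%.0f, %.0f, %.0f) grab=%.2f pinch=%.2f" % (
            "Left" if hand.is_left else "Right",
            palm.x, palm.y, palm.z, grab, pinch))
```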

The current prototype uses a human skeleton, complete with organs, to show how natural hand interaction can be used to manipulate and inspect objects in 3D space. The Oculus Rift DK2 brings its positional tracking to the party, allowing the user to grab, hold, and then glance around the virtual object in a natural way. The demo video above shows an extraordinary amount of precision and grace at play here, something that was difficult to achieve with earlier versions of the Leap SDK. But with Leap Motion’s skeletal tracking advancements, recently made available to developers in the V2 beta of the SDK, some incredible things are clearly achievable.
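
Grabbing and inspecting an object is typically built on top of that per-hand data. Here’s a rough sketch of one common approach; the thresholds, the nearest_object helper, and the object’s position attribute are illustrative assumptions on my part, not details from the prototype.

```python
PINCH_ON, PINCH_OFF = 0.8, 0.5  # hysteresis so the grab doesn't flicker

held = None  # the currently grabbed object, if any

def update_grab(hand, scene_objects):
    """Attach the nearest object to the hand while it pinches."""
    global held
    tip = hand.stabilized_palm_position  # smoothed palm point
    if held is None and hand.pinch_strength > PINCH_ON:
        held = nearest_object(scene_objects, tip)  # hypothetical helper
    elif held is not None and hand.pinch_strength < PINCH_OFF:
        held = None
    if held is not None:
        held.position = (tip.x, tip.y, tip.z)  # object follows the palm
```

Because the object is simply re-anchored to the palm each frame, the DK2’s positional head tracking lets you lean in and glance around whatever you’re holding without any extra code.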

See Also: Leap Motion’s Next-gen ‘Dragonfly’ Sensor is Designed for VR Headsets

The Leap Motion sensor attached using the dedicated mount

The project looks impressive on multiple counts, but it’s the precision of control enabled here that blew me away. Tomáš (featured in the video) taps delicately at menu sliders and scrolls through a test textbox deftly and with apparent ease. I also particularly liked the clenched-fist gesture, which lets the user drag the world’s position around them (or drag themselves through the world, depending on how you look at it).
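
That fist-drag gesture maps naturally onto the same hand data: while the grab strength reads as a closed fist, the palm’s frame-to-frame displacement is applied to the world. A rough sketch, with the fist threshold an assumption on my part:

```python
FIST = 0.9        # grab_strength at or above this counts as a clenched fist
prev_palm = None  # palm position on the previous frame while dragging

def update_world_drag(hand, world_offset):
    """Drag the world along with a clenched fist; returns the new offset."""
    global prev_palm
    if hand.grab_strength >= FIST:
        palm = hand.stabilized_palm_position
        if prev_palm is not None:
            # Move the world with the hand, as if gripping and pulling it;
            # equivalently, the user moves the opposite way through the world.
            world_offset = (world_offset[0] + (palm.x - prev_palm.x),
                            world_offset[1] + (palm.y - prev_palm.y),
                            world_offset[2] + (palm.z - prev_palm.z))
        prev_palm = palm
    else:
        prev_palm = None
    return world_offset
```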

Mount Your Leap

Leap Motion used in conjunction with the DK2’s positional tracking seems to be a potent mix, and the company has done everything it can to evangelise the pairing. It now offers a dedicated mount for your DK2 which lets you slot the Leap Motion sensor onto the front of your headset (cleverly avoiding most of the IR LEDs covering the DK2) and, presto, you have a sensor capable of spotting your waving arms within a claimed horizontal FOV of 135 degrees. It certainly makes a compelling case as an answer to naturalistic VR input, something that even Oculus hasn’t yet publicly addressed.
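
As a back-of-the-envelope check on that claim, you can work out where the Leap’s FOV boundary, starting a few centimetres in front of your eyes, crosses the DK2’s roughly 100-degree FOV boundary. The sensor offset below is an assumption for illustration, not a measured value.

```python
import math

LEAP_HALF_FOV = math.radians(67.5)  # half of the claimed 135-degree FOV
RIFT_HALF_FOV = math.radians(50.0)  # half of the DK2's ~100-degree FOV
OFFSET = 0.08  # metres from the eyes to the front-mounted sensor (assumed)

# Boundary rays in the horizontal plane, with depth z measured from the eyes:
#   eye ray:  x = z * tan(RIFT_HALF_FOV)
#   Leap ray: x = (z - OFFSET) * tan(LEAP_HALF_FOV)
# Setting them equal gives the depth at which the two FOVs cross.
z = (OFFSET * math.tan(LEAP_HALF_FOV)
     / (math.tan(LEAP_HALF_FOV) - math.tan(RIFT_HALF_FOV)))
x = z * math.tan(RIFT_HALF_FOV)
print("FOVs cross %.1f cm ahead, %.1f cm to the side" % (z * 100, x * 100))
```

With those assumed figures the boundaries cross roughly 16cm in front of the eyes and about 19cm to the side; beyond that, the sensor sees your hands before they enter your view, leaving only a narrow blind wedge close to the headset (a point picked over in the comments below).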

VR input, and ways to let humans interface with these new digital worlds, lags behind rapidly advancing VR headsets. Tomáš’ prototype gives us a glimpse of how we could be interacting with our digital worlds very soon indeed. What’s more, it seems to mark a new lease of life for the Leap Motion device, which had appeared to be searching for a fitting application.

We’ll be digging deeper into this project soon, with hands-on impressions and thoughts from Tomáš himself. In the meantime, you can find more on SightLine here and on the project’s Facebook page.

Based in the UK, Paul has been immersed in interactive entertainment for the best part of 27 years and has followed advances in gaming with a passionate fervour. His obsession with graphical fidelity over the years has had him branded a ‘graphics whore’ (which he views as the highest compliment) more than once and he holds a particular candle for the dream of the ultimate immersive gaming experience. Having followed and been disappointed by the original VR explosion of the 90s, he then founded RiftVR.com to follow the new and exciting prospect of the rebirth of VR in products like the Oculus Rift. Paul joined forces with Ben to help build the new Road to VR in preparation for what he sees as VR’s coming of age over the next few years.
  • Jacob Pederson

    Looks a bit laggy there in spots, but I don’t think lag in your hands would cause motion sickness in the same way lag in head tracking does. This looks very promising (and very Lawnmower Man). So promising that I hope Oculus is taking a look at integration or a partnership with Leap :)

    • Alex Colgan

      You’re right, there is some latency, which is the result of the software evolving really quickly; as a result, tracking for the VR beta hasn’t yet been fully optimized. We’ll be working to bring it back down in the coming weeks.

  • Walex

    Maybe I’m crazy, but this seems to be THE technique to solve a lot of problems.

    First, seeing your hands really adds to the immersion, and people seem to look for them when testing the Rift.

    Secondly, being able to interact with a world without using something like a hand controller must be the best way to go, right?

    • Bookoo

      Latency is a problem, and the lack of tactile feedback could also feel odd.

      I think it is going to be hard for a single controller to solve all the problems. For FPS-style games, for example, I would want a VR gun, but for exploration games something like this could be neat.

    • horizone

      I fully agree. Here is my opinion on the matter: every desktop OS uses a mouse. VR needs its own “mouse”, and there really is no alternative for presence other than seeing your hands and fingers fully tracked with fine precision. We add gamepads, HOTAS setups, and steering wheels in addition to our mice, and that will still be true for VR, but without hand tracking as a _default_ input, VR will have a hard time becoming mainstream. Compare it to putting someone in front of a desktop OS without a mouse.

  • francoislaberge

    I have tested this at home. Latency is a bit of a problem, but the real problem with the Leap Motion, even with the latest SDK, is that it loses track of your hands (and especially your fingers) so often that it’s unusable for anything but a very controlled demo with someone who understands the technology and the line of sight of its optical tracking.

    In this demo, for instance, the creator really knows what he’s doing: he’s moving his hands in ways that the Leap generally tracks well.

    Exoskeletal tracking still feels like the approach more likely to succeed. But maybe this approach just needs more work and more sensors, or at least gloves with markers on them to improve the tracking from any angle.

    • Alkapwn

      Would you be able to comment on the image I uploaded in my comment? The first time I saw the FOV comparison I thought it might not be accurate. Since the Leap Motion sits further forward than your eyes, it would have to have blind spots. Unless I’m not interpreting the data accurately.

  • Alkapwn

    From what I’ve seen in the few demos that use the Leap Motion to track hands, it seems to track best when the hands are directly in front of the headset. When you start spreading your hands out to your sides it doesn’t seem to track as well.

    I’m not sure who created the FOV comparison image, but I think it’s slightly off and perhaps misleading.

    The issue, I think, is that it shows the Rift FOV from the Leap’s location/POV, when it should in fact originate from where the user’s eyes would be inside the Rift. When you move the Rift FOV back, you’ll see that the two no longer overlap completely. That seems to leave a blind spot right where you’d expect to see your hands.

    Here’s an image I made showing my concerns. I haven’t used my Rift in conjunction with Leap Motion, but I’m assuming this is how it would work.

    http://imgur.com/fDH1LM5

    If anyone can provide more legitimate, calculated numbers, I’d be interested to see them.

    • Alex Colgan

      Because the Leap cameras become your eyes.

      It’s useful to think of it in terms of augmented reality applications, where the controller’s POV and your POV are one and the same — you see what it sees. However, since its FOV is wider than what the DK2 screens allow you to see, the device can still see your hands before you do.

      This works because the distance from your eyes to the sensor is only a few inches, so your sense of proprioception naturally adjusts once you see your hands onscreen. From the viewer’s perspective, it’s no different than if you had craned your head forward slightly; proprioception gives us humans only a vague idea of where our body parts lie in space, and visual cues fill in the rest.

      • Alkapwn

        It wasn’t so much the offset that I thought would be an issue. I figured that since the Leap is in front of your actual POV, there would be missing tracking data at the edges when your arms come into view.

        My question is about when you are looking straight ahead with your arms outstretched like a bird. As you bring them in to clasp in front of you, at about a 95° arm opening you should in theory already start seeing your arms in the virtual space if the Rift FOV is 100°. Does tracking start that soon, or only once your arms are closer together than 95°?

        Not sure if I’m explaining this correctly or not.

        • Alex Colgan

          It’s true that there would be blind spots in relation to where the Rift’s screens lie in true 3D space; I guess the issue is that we’re really talking about two quite different things.

          The image shows how an AR/VR experience using the offset enables tracking beyond the FOV that you actually experience. As far as the user is concerned, there is conceptually no difference between the Rift screens and the Leap FOV: they appear, and feel, one and the same.

          Even if our sense of proprioception were perfect, and the offset felt unnatural as a result, this would only have an effect within about 18cm of the device on either side. (FWIW, that’s my quick napkin math, based on your image and the DK2’s 145mm width.) Beyond 18cm the lines intersect, so even at 30cm away your hands would fall into the Leap FOV well before entering the ideal line-of-sight FOV.

          Naturally, we’d like the cameras and the screens to be closer, to allow for an even better match between what you would see in real life vs. what you see in VR/AR. The mounted setup is really just a first step into this space, which is why we’ve released it as a developer beta for what are currently just developer devices. We’re working on a prototype module, called Dragonfly, which is designed to be embedded by VR OEMs.

          • Alex Colgan

            Also, full disclosure: I’m the head writer at Leap Motion.