This project fuses the virtual and the real to produce real-time mixed reality footage with no green screen required, all using Microsoft’s Kinect depth camera.

Mixed reality is all the rage and has become one of the most effective methods of conveying the power of immersion afforded by new virtual reality technologies. Spearheaded most recently by the Fantastic Contraption team for their excellent teaser series, the approach uses the Vive’s room-scale positional tracking in conjunction with green screen backdrops to fuse the virtual and the real.

A new experimental project has come up with an alternative method, one that does away with the need to drape your demo area in green sheets and instead leverages Microsoft’s Kinect depth camera to achieve a similar effect in real time. The new technique lets potential exhibitors show off a user interacting with a virtual application, keep their existing demo space design (say, a stand at a conference venue) and still produce a compelling way to visualise what makes VR so special.

“We built a quick prototype using one of HTC Vive’s controllers for camera tracking, Microsoft Kinect v2, a Kinect v2 plugin and Unity running on 2x machines,” says the team, who have demonstrated their work via the above YouTube video. “The server ran the actual VR scene and the client extracted data from the Kinect, placed the point cloud into the scene, resulting in the mixed reality feed. The depth threshold was altered dynamically based on the position of the headset.”
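The team haven’t published code, but to make the idea concrete, here is a minimal Unity C# sketch of how that dynamic depth threshold step might look. The component and field names below are ours, not the project’s, and it assumes a Kinect plugin that already delivers points in the tracked camera’s space:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch only (not the project's code): culls Kinect point-cloud
// samples beyond a depth threshold that follows the headset, so the user is
// kept in the composite while the room behind them is discarded.
public class DynamicDepthThreshold : MonoBehaviour
{
    public Transform headset;          // the HMD's tracked transform
    public Transform physicalCamera;   // the tracked real-world camera (a Vive controller in the prototype)
    public float margin = 0.5f;        // extra depth kept behind the headset, in metres

    // Points delivered by a Kinect plugin, assumed to already be expressed in
    // the physical camera's space (z = depth along the camera's forward axis).
    public Vector3[] cameraSpacePoints;

    // Keep only points closer to the camera than the headset, plus a margin.
    public List<Vector3> FilterPoints()
    {
        float threshold = Vector3.Distance(physicalCamera.position, headset.position) + margin;
        var kept = new List<Vector3>();
        foreach (var p in cameraSpacePoints)
        {
            if (p.z > 0f && p.z < threshold)
                kept.Add(p);
        }
        return kept;
    }

    void Update()
    {
        var visible = FilterPoints();
        // Hand 'visible' to whatever draws the point cloud (mesh, particle system, etc.).
    }
}
```

The margin is simply there to keep a little depth behind the user so limbs and props aren’t clipped as the headset moves around the play space.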


Of course, the compositing is not as precise as a professionally produced green-screen equivalent, and there will be occasional pop-in from other objects that creep into the demo space, but it’s a neat, low-cost and potentially more practical approach to getting your VR app noticed at a venue.

However, the technique’s likely Achilles’ heel is the requirement for the target VR application to integrate display of the point cloud imagery alongside the view captured by the virtual camera. No mean feat.
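To give a sense of what that integration might involve (an assumption on our part, not the project’s actual implementation), the application would need something like a spectator camera that tracks the physical camera and renders a layer reserved for the Kinect point cloud alongside the normal scene:

```csharp
using UnityEngine;

// Illustrative sketch only: a spectator camera that follows the tracked
// real-world camera and renders the virtual scene plus a dedicated
// "PointCloud" layer, producing the combined mixed reality feed.
// Assumes a user-defined layer named "PointCloud" exists in the project.
public class MixedRealitySpectatorCamera : MonoBehaviour
{
    public Transform trackedPhysicalCamera;  // e.g. a spare Vive controller rigidly mounted to the Kinect
    public Camera spectatorCamera;           // a non-VR camera outputting the composite view

    void Start()
    {
        // Render everything the VR scene normally shows, plus the point cloud layer.
        spectatorCamera.cullingMask |= 1 << LayerMask.NameToLayer("PointCloud");
        spectatorCamera.targetDisplay = 1;   // send the composite to a second display
    }

    void LateUpdate()
    {
        // Keep the virtual spectator camera aligned with the real camera's pose
        // so virtual objects and the point cloud line up in the output.
        spectatorCamera.transform.SetPositionAndRotation(
            trackedPhysicalCamera.position, trackedPhysicalCamera.rotation);
    }
}
```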


Nevertheless, it’s an intriguing approach that once again reminds us how Microsoft’s gaming peripheral seems to have found a life far more productive than its original, ill-fated purpose.

You can read all about the project in detail over at the team’s YouTube channel here.




Based in the UK, Paul has been immersed in interactive entertainment for the best part of 27 years and has followed advances in gaming with a passionate fervour. His obsession with graphical fidelity over the years has had him branded a ‘graphics whore’ (which he views as the highest compliment) more than once and he holds a particular candle for the dream of the ultimate immersive gaming experience. Having followed and been disappointed by the original VR explosion of the 90s, he then founded RiftVR.com to follow the new and exciting prospect of the rebirth of VR in products like the Oculus Rift. Paul joined forces with Ben to help build the new Road to VR in preparation for what he sees as VR’s coming of age over the next few years.
  • Vinny

    I’ve done quite a bit of prototyping in this exact area and what I found is that the Kinect’s IR blasts interfere with the Vive tracking – was this something you ran into as well? If so, how did you overcome it?

    • We work with Kinect + VR. With the Vive you have to put the Kinect on a low desk… the problems arise if you put the Kinect at the same height as the headset or the controllers. Anyway yes, it’s a problematic setup… because if you crouch, interference arises

      • Vinny

        I’ve played around with the positioning of the Kinect – and while it did help, it did not solve all of the interference issues. In the video they posted, it looks like it’s pretty high, and it did not seem like they were getting much interference.

        • Yep, you’re right. There are two possibilities: one is that the video is partially faked (some post-processing… or they have only kept the good parts of their experiment); the second is that they’re using filters to filter out the Kinect’s rays from the Vive’s (but I’ve never tried this solution…)

          • Alex Anpilogov

            Hey guys, thanks for highlighting the tracking issue – I’ve added a note to the YouTube video too.
            We assumed such interference might happen, but we actually haven’t had any issues, though it may well be because of the angles we chose to experiment with – those angles, as well as the height and position of the Kinect, were chosen completely at random as we went along.

            One issue I can recall from the dev process was the user’s controller ‘slipping’ away, which normally occurs when the controller is not clearly visible to either base station. It only happened a couple of times overall, each time for no more than a second or so. As we’d had a similar issue before during other work, we didn’t associate it with IR interference.

            There are a few discussions about this on Reddit and the Steam forums, with some saying that tracking wasn’t an issue and some suggesting that the sync cable may help the situation. To identify the specifics of the interference, one could simply beam the Kinect onto the Vive scene in various configurations and isolate instances of interference. This will highlight most scenarios in which it happens, and then one can set up the Kinect accordingly.

            Another thing I’d mention is that I got an email from a guy who’s building a portable depth cam to be used with mobile devices – http://www.vicovr.com/ – not sure if it gives you access to the point cloud, but may well be worth experimenting with, alongside RealSense or Tango-based devices to see if any similar issues arise.

  • Had the same idea… the Kinect is cool because you can remove the green screen. Have to say that if lighting conditions are not that good, the silhouette retrieval gives bad results, but you can obtain a decent result in most conditions

  • As this sort of tech matures, it’s easy to see it becoming a new way to:

    – Watch players in their VR games via Twitch.

    – Film makers filming actors with the set visible in the background (though I think they were doing that for some of the last Transformers film).

    – Playing the old ’80s/’90s UK TV show Knightmare… with kids guiding the blindfolded player through the VR maze…

  • DiGiCT Ltd

    The Lighthouses interfere with other IR-based equipment.
    I had an issue where I thought my air conditioner was broken, as I could not power it on or off anymore.
    The real issue was that one of the Lighthouses was near it, interfering with the IR signal from my remote.
    Funny to figure that out that way – some equipment can end up in a non-working state when those Lighthouses are running. The same could happen with your living room TV remote.

  • Simon Conley

    Hi, I have a Vive and a Kinect but I have no idea how to set this up to give it a try. Can anybody post a how-to guide? Step by step if possible! Thank you