Catching up with Leap Motion at CES 2016 last week, CEO Michael Buckwald told us about the company’s latest work, including the Interaction Engine and newly reprojected hands. Buckwald also says that Leap Motion tech will ship in hardware from “major” OEMs this year.
Although Leap Motion was founded in 2010, some three years before Oculus struck a new spark in the VR industry with their initial VR headset development kit, the company has made a major shift toward virtual reality, believing there to be potential for their gesture camera technology as an input method for VR.
“We are 100% in AR and VR with the exception of the fact that… obviously there are hundreds of thousands of developers who use [the existing Leap Motion camera] to do all sorts of things,” Michael Buckwald, CEO of Leap Motion, told me. “But… we try to be very very focused as a company so we are 100% focused on VR and working with VR OEMs to embed the tech and making the software to work better for VR.”
While the company hasn’t given any hints as to who it’s working with, Buckwald said that we’ll see Leap Motion technology shipping in hardware from “major” OEMs in 2016. As Oculus, HTC, and Sony have already shown their preferred input method to come in the form of physical controllers, VR hardware from these companies seems an unlikely candidate for where Leap’s tech will end up, at least in 2016.
Instead, Leap may be aiming to play in the mobile arena where toting around physical controllers doesn’t make much sense. If that were the case, we could see the company’s tech either built into mobile VR headsets or even into a phone itself.
On the software side, the company has continued to refine its computer-vision stack to make hand-tracking as accurate as possible. Internally, the company has been developing what it calls the ‘Interaction Engine’, which it hopes will standardize and improve how users interact with objects in VR using the company’s hand-tracking tech. The goal is to let developers focus on the experience of their Leap-based applications rather than spend time fine-tuning the interactions between users’ hands and those experiences.
We got to try out the Interaction Engine for ourselves at CES and saw notable improvements in the reliability of object manipulation using Leap Motion. The code hasn’t been released to developers just yet, but the company tells us it plans to release it widely in the near future.
The latest versions of Leap Motion’s hand-tracking software now utilize a reprojected view when using the so-called ‘Image Hands’. Image Hands are a virtual view of the user’s actual hands as seen through the Leap Motion camera. Developers can opt to use the Image Hands or instead show computer-modeled hands in place of the user’s hands.
When using the Image Hands, Leap Motion’s software now ‘reprojects’ the view to be more accurate than before. Because the cameras on the Leap Motion are not as far apart as human eyes, and because the cameras are mounted several inches away from where the user’s eyes actually are, the camera’s view of the hands is not exactly the same as how the user would see their hands if looking at them without the headset, Buckwald explained. Reprojection is used to address this, fixing both the scale and distance of the hands so that they feel as natural to the user as possible.
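To picture the correction Buckwald describes, here is a minimal sketch of a camera-to-eye reprojection. All values are illustrative assumptions, not Leap Motion’s actual calibration, and the real pipeline is certainly more involved:

```python
import numpy as np

# Illustrative constants (assumptions, not Leap Motion's real values):
CAMERA_BASELINE = 0.040                        # metres between the device's two cameras
USER_IPD        = 0.063                        # metres between the user's eyes
CAMERA_OFFSET   = np.array([0.0, 0.0, 0.08])   # camera mounted ~8 cm in front of the eyes
                                               # (+z points forward, away from the face)

def reproject(point_camera):
    """Map a tracked hand point from camera space into eye space.

    Scale the point to compensate for the camera baseline being narrower
    than the user's eye spacing, then translate by the camera's mounting
    offset so the virtual hand lands at the distance the real hand would
    appear from the user's own viewpoint.
    """
    p = np.asarray(point_camera, dtype=float)
    p = p * (USER_IPD / CAMERA_BASELINE)   # correct perceived scale
    return p + CAMERA_OFFSET               # shift origin from camera to eyes

# A hand point 30 cm in front of the camera ends up farther from the eyes,
# since the camera itself sits in front of them.
print(reproject([0.0, 0.0, 0.30]))
```

The two steps mirror the two problems the article names: the scale factor accounts for the mismatched stereo baseline, and the translation accounts for the camera being mounted several inches from the eyes.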