Lots of companies have made this claim, but Eonite specifically says it has the “world’s most accurate, lowest latency, lowest power consuming software to democratize inside-out positional tracking for VR and AR.”
Inside-out positional tracking—the ability to precisely determine where an object is in space using only sensors mounted on the device itself—has been an obvious need but an elusive challenge for the VR and AR industries. AR in particular requires tracking so accurate and so low in latency that virtual objects can appear fixed to the real world.
We’ve seen very impressive inside-out positional tracking before on Microsoft’s HoloLens—a $3,000 full-blown Windows PC on your head—which houses a bevy of sensors. But today Eonite is announcing its inside-out tracking solution, which it claims supports low-latency, high-accuracy head tracking using just commodity depth sensors and tablet-class processing power.
That’s potentially revolutionary for the VR and AR industry if true. And while we haven’t gotten our hands on the tech just yet, Eonite has attracted the attention of Silicon Valley venture capitalists and angel investors who dropped $5.25 million on the company in a Seed investment in 2016. Among the investors are Presence Capital and The VR Fund, which specialize in VR and AR tech investing.
The company’s tech is not hardware, but general-purpose software for achieving high-performance inside-out tracking. “It’s not a future promise. It works,” Eonite CEO Youssri Helmy said, speaking with Road to VR. He says Eonite’s tracking is capable of sub-millimeter accuracy and just 15 milliseconds of motion-to-photon latency, both within the thresholds generally considered necessary for solid VR tracking.
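To put those numbers in context, a quick back-of-the-envelope sketch (our own illustrative arithmetic, not Eonite’s math) shows why latency matters: during the motion-to-photon interval, the head keeps moving, so any latency translates directly into positional error unless the system predicts ahead.

```python
# Back-of-the-envelope: how far the head travels during one
# motion-to-photon interval, at a few typical head speeds.
# (Illustrative arithmetic only -- not Eonite's internal math.)

def uncorrected_drift_mm(head_speed_m_s: float, latency_ms: float) -> float:
    """Distance (mm) the head moves before the display catches up."""
    return head_speed_m_s * (latency_ms / 1000.0) * 1000.0

latency_ms = 15.0  # Eonite's claimed motion-to-photon latency
for speed in (0.1, 0.5, 1.0):  # slow turn, normal motion, brisk movement (m/s)
    print(f"{speed} m/s -> {uncorrected_drift_mm(speed, latency_ms):.1f} mm")
# 0.1 m/s -> 1.5 mm
# 0.5 m/s -> 7.5 mm
# 1.0 m/s -> 15.0 mm
```

Even at 15 ms, brisk head motion covers centimeters, which is why tracking systems pair low latency with motion prediction.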
The company is calling the capabilities of the tracking ‘homescale’, to suggest that it can enable tracking across a multi-room, home-sized space, and is tuned to track well given the sort of objects you might find in a common home (furniture, shelves, doors, thin objects, etc.). Helmy says that the tracking tech integrates IMU and RGB data, and can work with “any depth sensing, from high def stereo, time-of-flight, rolling shutter, global shutter. Anything. The software doesn’t have much to do with the camera.” The software is also said to support both static and dynamic real-time obstacle detection for avoiding things like walls and pets.
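Eonite hasn’t published how its sensor fusion works, but the general pattern of combining a fast-but-drifting IMU with slower camera-derived pose corrections can be sketched as a simple complementary filter. The class and parameter names below are our own, and this is a generic 1-D illustration of the technique, not Eonite’s algorithm:

```python
# Generic complementary-filter sketch of IMU + camera pose fusion.
# (Illustrative only; Eonite has not published its algorithm.)

class PoseFuser:
    def __init__(self, blend: float = 0.02):
        self.position = 0.0  # 1-D position (meters) for simplicity
        self.velocity = 0.0
        self.blend = blend   # weight given to each camera fix

    def imu_step(self, accel: float, dt: float) -> None:
        """High-rate update: integrate acceleration (fast, but drifts)."""
        self.velocity += accel * dt
        self.position += self.velocity * dt

    def camera_fix(self, measured_position: float) -> None:
        """Low-rate update: nudge the drifting estimate toward the
        absolute pose recovered from camera/depth data."""
        self.position += self.blend * (measured_position - self.position)

fuser = PoseFuser()
for _ in range(100):                       # 100 IMU steps at 1 kHz
    fuser.imu_step(accel=1.0, dt=0.001)    # constant 1 m/s^2 acceleration
fuser.camera_fix(measured_position=0.005)  # camera places us at 5 mm
```

The IMU keeps the estimate responsive between camera frames, while each camera fix bleeds accumulated drift back out — the same division of labor Helmy describes, whatever filter Eonite actually uses.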
Helmy says the tracking software is built on years of work on artificial perception in robotic and consumer applications by co-founders Dr. Anna Petrovskaya and Peter Varvak. “It’s the same core technology for tracking robots as tracking headsets,” he said. “The tech they had blew me away [when I first saw it].”
But it isn’t just for tracking. Eonite is working on a Unity SDK which the company says will allow developers to bring real-time 3D scanned data from the user’s environment into the virtual world for mixed reality and AR applications, including support for persistent virtual content, shadows, and occlusion.
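Occlusion from scanned geometry boils down to a per-pixel depth comparison: a virtual object is hidden wherever the scanned real-world surface sits closer to the camera. The function below is a minimal sketch of that concept only — Eonite’s SDK API hasn’t been published, and none of these names come from it:

```python
# Depth-based occlusion sketch: a virtual pixel is hidden when the
# scanned real-world surface is closer to the camera than the virtual
# object at that pixel. (Generic concept illustration -- not Eonite's
# unreleased Unity SDK.)

def virtual_pixel_visible(real_depth_m: float, virtual_depth_m: float) -> bool:
    """True if the virtual object is in front of the scanned surface."""
    return virtual_depth_m < real_depth_m

# A virtual ball 1.2 m away, behind a real chair scanned at 0.9 m:
print(virtual_pixel_visible(real_depth_m=0.9, virtual_depth_m=1.2))  # False
```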
While the company is primarily pitching the tech for AR and VR tracking for now, it’s also said to be a solution for other industries like automotive, robotics, and manufacturing. The first product using the company’s tracking will launch in the first quarter of this year, Helmy says.
While Eonite’s technology sounds promising, 2016 saw demonstrations of major progress on inside-out tracking by a number of companies. First was the aforementioned HoloLens, followed by the impressive tracking of Qualcomm’s VR reference headset; then in October, Oculus showed off highly functional inside-out tracking on its ‘Santa Cruz’ Rift prototype. Inside-out positional tracking is likely to be a dominant theme of AR and VR in 2017, and if any of these players has truly solved it, that will mark a major next step for the industry.