HTC has announced a new set of tools allowing developers to build applications that take advantage of the Vive Pro’s stereo front-facing cameras, effectively turning the device into an AR headset dev kit. The tools expose depth sensing, spatial mapping data, and hand input from the cameras, and let applications shift seamlessly between VR and AR worlds.
While the original Vive launched with a front-facing camera, it went largely unused. This time around, with the Vive Pro, the company is offering the VIVE SRWorks SDK. Announced last week, the SDK includes three modules, HTC says: a depth module, a see-through module, and a 3D reconstruction module. Together they form a foundational set of tools that lets the headset sense the world through its front-facing cameras, and lets developers use that data to create experiences that are pure AR, pure VR, or a combination of both.
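To give a sense of what a depth module like this provides under the hood, here is a small conceptual sketch (not the SRWorks API, whose actual function names and signatures aren’t documented in this article) of the standard pinhole back-projection that turns a per-pixel depth map from stereo cameras into a 3D point cloud, the raw material for spatial mapping and 3D reconstruction. The function name and camera intrinsics below are illustrative:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into camera-space 3D points.

    Uses the standard pinhole camera model:
      X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth(u, v)
    where (fx, fy) are the focal lengths in pixels and (cx, cy) is
    the principal point.
    """
    h, w = depth.shape
    # Pixel coordinate grids: u varies along columns, v along rows.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Result shape: (h, w, 3) — one XYZ point per pixel.
    return np.stack([x, y, z], axis=-1)

# Example: a flat wall 2 m away, seen by a toy 4x4 camera.
depth = np.full((4, 4), 2.0)
points = depth_to_point_cloud(depth, fx=2.0, fy=2.0, cx=1.5, cy=1.5)
```

A reconstruction module would then fuse clouds like this across frames into the static and dynamic meshes listed below.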
This example video, captured with the Vive Pro and the SRWorks SDK, shows how the tools can be used to seamlessly link VR and AR worlds, creating new and interesting gameplay and application possibilities:
The company says that the SRWorks SDK supports native development as well as plugins for Unity and Unreal Engine, and that the modules enable the following:
- Spatial Mapping (static and dynamic meshes)
- Placing virtual objects in the foreground or background
- Live interactions with virtual objects and simple hand interactions
This example shows how the Spatial Mapping module can create a model of the room’s geometry for use in applications:
The Vive SRWorks SDK is available in beta through the company’s developer portal.
Thanks to the new tools, the Vive Pro’s front-facing stereo cameras, much like the ZED Mini depth camera add-on, effectively emulate the sort of experience that AR glasses will ideally achieve in the future: an immersive, wide field of view with precise tracking and environment mapping. That could make the Vive Pro a great dev kit for developers building toward the XR hardware of the future.