Ximmerse, a tech company based in China, is working on a suite of optically tracked input systems built around a proprietary stereo camera. The system is claimed to work with both PC and mobile virtual reality.

The three big players in the PC virtual reality market have now demonstrated their solutions for naturalistic motion input for virtual reality applications: Valve has Lighthouse, Oculus has Touch and Sony has Move.

Mobile virtual reality, however, is sorely lacking promising motion input technologies, with Gear VR’s peripherals currently limited to wireless gamepads.

Ximmerse’s stereo camera with a claimed 160 degree FOV

Ximmerse is a Chinese company that claims to have developed a cross-platform optical tracking system based around a high-FOV stereo camera. Ximmerse’s solution comprises a suite of controller peripherals, all with embedded IMUs and glowing orbs. The latter recalls Sony’s PlayStation Move, which uses a similar method to provide 6DoF (six degrees of freedom) tracking via the PS Eye camera.

The X-Cobra input controllers from Ximmerse

Ximmerse claims that its system can track up to 240 individual orbs, or ‘blobs’ as the team calls them, with no interference, all via a single stereo camera. An impressive claim.

The company has developed a suite of controllers built around its stereo camera. The X-Cobra, a pair of handheld motion controllers similar in design to the PlayStation Move or the Razer Hydra, uses optically tracked ‘blobs’ for 6DoF positional tracking and IMUs for rotational movement. The controllers are wireless and offer a neat way to bring your hands into VR – although it’s hard to judge just how effective they are from the video the team has released demonstrating the tech.
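To get a feel for how a system like this might combine its two data sources, here is a minimal sketch – not Ximmerse’s actual code, and the function names and blending factor are assumptions – of pairing an optical position fix with IMU-supplied orientation:

```python
import numpy as np

def fuse_pose(optical_pos, imu_orientation, prev_pos, alpha=0.9):
    """Return a crude 6DoF pose estimate.

    Position blends the fresh optical sample with the previous
    estimate (a simple low-pass filter to tame camera noise);
    orientation is taken directly from the IMU.  alpha close to 1
    trusts the optical sample more.
    """
    pos = alpha * np.asarray(optical_pos, dtype=float) \
        + (1 - alpha) * np.asarray(prev_pos, dtype=float)
    return pos, imu_orientation

# Example: blob seen at (0.2, 0.0, 1.5) m, previous estimate at origin,
# orientation given as an identity quaternion from the IMU
pos, rot = fuse_pose([0.2, 0.0, 1.5], (1.0, 0.0, 0.0, 0.0), [0.0, 0.0, 0.0])
```

Real trackers use far more sophisticated filters (Kalman or complementary filters with velocity states), but the division of labour – camera for position, IMU for rotation – is the same one the X-Cobra describes.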


Additionally, Ximmerse has a wireless, body-tracking IMU. Demonstrated in the video above, the team suggests it could be used as an alternative input device for mobile VR platforms, with the user moving their body to influence the experience.

The company also has a haptic glove, although details on the mechanics of its tracking solution are a little unclear. Ximmerse’s website seems to suggest the glove utilises IMUs from Perception Neuron, yet also claims the glove is tracked via Ximmerse’s stereo camera system.

Ximmerse’s ‘standalone’ IMU, used for body motion capture

All of these devices, it’s suggested, can be used with mobile virtual reality platforms such as Gear VR, though details on how this works are thin on the ground right now.

Ximmerse will be making the trip to Los Angeles this week to exhibit at SIGGRAPH 2015 at the LA Convention Center. If you’re attending the show, you can find the company’s booth at PD11. The team will also be attending VRLA later in the month.



  • SuperDre

    Looks promising, I wonder how much it’s going to cost.. Funny thing is, it seems to even track the orbs when he’s turned..
    But I hope Sony is smart and releases real PC drivers for their PS3/PS4 camera/Move controllers, as it’s a great tracking mechanism (and I already own the PS3 camera, a few Move controllers and a few sharpshooters)..

    • Simon

      There is a community written (userland) driver for the Move:
      http://thp.io/2010/psmove/

      I’m not sure whether there will be a ‘new move’; it seems that many stores are clearing their move/navigation accessories. What is sure is that Sony won’t be replacing the DS4, which contains appropriate Acc/Gyros/Mags/LEDs to make positional tracking work.

      The occlusion problem is not one which would be solved easily (perhaps multiple cameras), but remember that the optical detection/yaw compensation is relatively slow compared to the sensors themselves. The PS3 Eye runs at up to 120fps, the PS4 camera at up to 240fps (at low res). The sensors themselves can operate much faster, ~1000Hz (there was recently a patch to Linux to get this speed).

      Ultimately, whatever the optical setup, the sensors will have to ‘carry through’ positional data with little drift, and modern sensors are damn good. The positional solution can be further constrained to improve accuracy, with skeletal models and the like.
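      As a toy sketch of that ‘carry through’ idea (my own made-up numbers, not any real device): the high-rate accelerometer stream is double-integrated between camera frames, and each optical sample snaps the position back to an absolute value:

```python
def track(accel_samples, dt=0.001, fix_every=8, optical_pos=0.0):
    """1-D toy tracker: integrate ~1kHz accelerometer samples for
    position, and apply an absolute optical correction every
    `fix_every` samples (roughly a 120fps camera against a 1kHz IMU).
    """
    v = p = 0.0
    for i, a in enumerate(accel_samples):
        v += a * dt          # integrate acceleration -> velocity
        p += v * dt          # integrate velocity -> position (drifts!)
        if (i + 1) % fix_every == 0:
            p = optical_pos  # optical fix wipes out accumulated drift
    return p
```

      With no optical fixes the double integration drifts without bound; with a fix every ~8 samples the error is bounded by what accumulates within a single camera frame.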

      It’ll be interesting to see how well the XBone controller works with the PC/Oculus.

  • bteitler

    I’m not sure why people keep using this design of only one position-tracked orb plus IMU. It provides really terrible data (compared to what Vive and Oculus Touch have / will have). The drift problem simply cannot be corrected precisely with only one absolute position. Theoretically, you can use a magnetometer to correct yaw drift, but it never works in real (consumer) environments due to interference. Theoretically, you could also use fast movements of the single tracked point to detect incorrect yaw and align, but again this performs poorly in practice due to the need for double integration of IMU data.

    I keep waiting for Sony to announce new controllers for their Morpheus since the data quality discrepancy between move controllers and the other VR solutions will be really obvious to consumers once they’ve tried a real drift free solution. If you’ve ever played a Wii / Move game and had to frequently “recenter”, this is what you can continue to expect from these types of solutions.

    I’d really like someone who has designed one of these systems to explain why they don’t just put at least two orbs on the device.
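    A toy way to see the problem (geometry assumed purely for illustration): if the single orb sits on the controller’s yaw axis, yaw drift moves it nowhere in the camera’s view, while a hypothetical second, off-axis orb would reveal the error:

```python
import numpy as np

def yaw_rotate(p, yaw):
    """Rotate point p about the vertical (z) axis by `yaw` radians."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([c * p[0] - s * p[1], s * p[0] + c * p[1], p[2]])

on_axis  = np.array([0.0,  0.0, 0.1])   # orb on the yaw axis
off_axis = np.array([0.05, 0.0, 0.1])   # hypothetical second orb

drift = np.radians(30)                  # 30 degrees of yaw drift
# On-axis orb doesn't move: the yaw error is invisible to the camera
print(np.allclose(yaw_rotate(on_axis, drift), on_axis))    # True
# Off-axis orb does move: a second orb would expose the drift
print(np.allclose(yaw_rotate(off_axis, drift), off_axis))  # False
```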

    • brandon9271

      @bteitler. Perhaps having two cameras is the same as having double the orbs? :) I’m no expert on this by any means, but isn’t the PS Move only a mono camera? If so, then having a stereo camera should make a difference.

      • bteitler

        Stereo camera helps with tracking the single point, especially depth, which previously had to be inferred solely from the radius of the sphere (this is what the original PS Move does, and it isn’t very accurate). It does not really help with rotation of the device or drift correction.
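        Roughly, with illustrative numbers of my own (not either device’s real parameters): a mono camera infers depth from the orb’s apparent size, Z = f·R/r, while a stereo pair measures it from the disparity between the two views, Z = f·B/d:

```python
def depth_from_radius(focal_px, orb_radius_m, apparent_radius_px):
    """Mono, PS Move-style: depth inferred from the orb's image size."""
    return focal_px * orb_radius_m / apparent_radius_px

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Stereo: depth from the pixel shift between left and right views."""
    return focal_px * baseline_m / disparity_px

# Made-up camera parameters; both estimates put the orb at 2 m
print(depth_from_radius(800, 0.022, 8.8))     # 2.0
print(depth_from_disparity(800, 0.06, 24.0))  # 2.0
```

        The stereo estimate depends on a many-pixel disparity rather than measuring a small blob’s radius to sub-pixel precision, which is why it tends to be the more robust of the two – but as above, neither gives you the device’s rotation.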