Mobile VR motion controller manufacturer Finch Technologies today announced it’s working with Qualcomm and HTC to deliver its six degrees of freedom (6DOF) FinchShift controllers to a multitude of devices supporting Vive Wave.

Back in November 2018, HTC revealed that 15 hardware manufacturers had already adopted Vive Wave, the company’s open platform offering interoperability between several classes of mobile VR headsets and accessories—including HTC’s standalone VR headset Vive Focus.

An upcoming Vive Wave SDK release is said to include support for FinchShift, which will bring the controllers to a “wide range of devices and headsets and work with both iOS and Android operating systems,” Finch says in a press statement.

Image courtesy Finch Technologies

Qualcomm is also confirming FinchShift compatibility with its Snapdragon 845 VRDK, a standalone VR headset reference design that gained Vive Wave support shortly after it was revealed early last year. As Qualcomm’s latest VRDK, the headset essentially gives prospective manufacturers a basis for creating their own headsets, along with turn-key access to the mobile version of HTC’s Viveport app store.

FinchShift is a mobile 6DOF controller that lets users engage in room-scale experiences without the need for base stations or external sensors of any kind. A pair of FinchTracker armbands comes with the kit, providing additional points of tracking data.


Unlike other motion controllers, which rely on IR cameras or even ultrasonics to provide positional tracking, FinchShift infers positional data from the system’s inertial measurement unit (IMU) sensors, something akin to what you’d find in the 3DOF controllers that ship with Oculus Go or Mirage Solo. Finch’s own sensor fusion algorithms and inverse kinematics are said to create a 6DOF experience that can naturally track both your body and your hands.

Image courtesy Finch Technologies

The controller and armband are said to offer up to 18 hours of active use; the controller weighs less than 3.6 ounces, and the armband less than 1.9 ounces.

The company is releasing its FinchShift controllers (available in touchpad or thumbstick varieties) and FinchTracker armbands as a developer kit directly through its website; shipping is expected to begin this month. The entire kit and SDK costs $250, although the company is offering discounts on bulk purchases.

We haven’t had a chance to try out FinchShift yet, although the company will be at CES 2019, taking place January 8-12 in Las Vegas. We’ll have feet on the ground at CES this year, so check back soon for more coverage on all things AR/VR debuting at one of the world’s largest consumer electronics shows.


  • FireAndTheVoid

    If the controllers infer 6DOF movement from IMUs and inverse kinematics, why do the controllers have what appear to be tracking dots on them?

    • This is the real question

    • Christian Schildwaechter

      The keyword is “sensor fusion”. By itself, optical tracking is limited by the framerate of the camera, topping out at about 120 FPS, plus latency from the processing time required to interpret the data. An IMU can be queried several thousand times a second with minimal delay, so IMU data for rotation or translation is available far faster and with far lower latency.

      The problem with acceleration sensors and gyroscopes is that they give values relative to their previous position/rotation, which causes small measurement errors to accumulate into big differences after a short time. So you need an external, absolute reference to occasionally correct the current error and to measure the typical offset, so that a correction factor can be applied to all future data.

      Adding tracking dots achieves this. Every few seconds the controllers can be tracked with the regular camera sensors, and since this measurement is used only in retrospect (“The IMU was off by xyz 2 sec ago, so we have to correct by x1y1z1”), it doesn’t have to be processed in real time, so the processing cost is much lower. IMU data plus IK does the main work of tracking, and a lightweight optical pass fixes the accumulated errors.
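
      A toy 1D sketch of that scheme, purely illustrative (the class, rates, and update rule are assumptions, not Finch’s actual implementation): fast dead reckoning from the IMU, with occasional optical fixes that snap position back and refine a bias estimate.

```python
class FusedTracker:
    """Toy 1D fusion of fast-but-drifting IMU data with slow optical fixes.

    Purely illustrative: a real tracker fuses a full 6DOF state with a
    Kalman-style filter; this just estimates one accelerometer bias.
    """

    def __init__(self):
        self.pos = 0.0    # position estimate (m)
        self.vel = 0.0    # velocity estimate (m/s)
        self.bias = 0.0   # estimated accelerometer bias (m/s^2)

    def imu_step(self, accel, dt):
        """Dead-reckon one accelerometer sample (runs at ~1000 Hz)."""
        a = accel - self.bias   # remove the current bias estimate
        self.vel += a * dt
        self.pos += self.vel * dt

    def optical_fix(self, measured_pos, elapsed):
        """Occasional camera fix, applied in retrospect: "the IMU was off
        by `error` over the last `elapsed` seconds, so correct"."""
        error = self.pos - measured_pos
        # A constant bias b produces ~0.5*b*t^2 of drift; apply half of
        # the implied correction so a noisy fix doesn't overshoot.
        self.bias += error / (elapsed ** 2)
        self.pos = measured_pos
        self.vel = 0.0  # simplifying assumption: the fix happens at rest
```

      Feeding it a stationary-but-biased accelerometer stream, each optical fix shrinks the drift of the following interval as the bias estimate converges.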

  • Huh, that’s really interesting. From what I’ve researched online, positional tracking provided by IMUs alone is so noisy that the device will succumb to extreme drift within a matter of seconds. I wonder if the key lies in that armband; perhaps something is happening there that gives the controller enough of a reference point to account for drift.

    • FireAndTheVoid

      It sounds like they are using additional pieces of information in order to help eliminate drift – that a user’s arms don’t drift away from their bodies, a user is a human with the same arm bones and range of movement, and users will frequently exhibit the same general arm movements (e.g. place their arms by their side when resting). I would imagine that taking these into consideration could eliminate 95% of the drift issues.

      • Blaexe

        But the sensors inside the armbands drift too. I don’t see a way for this to work without an external reference. Curious about the reports from CES.

        • KuraIthys

          I realise this is an old comment, but… Gravity.

          No really, this is how Wii remotes did it with motionplus.
          An accelerometer doesn’t drift dramatically, but you can’t use it to predict position on its own, because that requires double integration, and gravity is always present in the signal.
          This is why the Wii remotes in practice were largely limited to 2-axis rotation, even though you’d think an accelerometer is a linear sensor.
          The MotionPlus gyroscopes did drift gradually, but the calibration process was to place the remote on a flat surface for a few seconds.
          This works because the accelerometer can tell you’re not moving, and thus recalibrate the gyros.
          Since the gyros give rotation rate, while the accelerometers give rotation angle relative to gravity, you can untangle the whole mess and get at least 3-axis tracking reliably.
          6-axis tracking is a problem due to the error induced by double integrating the acceleration (though thanks to the gyroscopes you can at least isolate the gravity component).
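
          To make that compounding concrete, here’s a toy sketch (the 0.01 m/s² residual bias is a made-up figure) of how a tiny constant accelerometer error double-integrates into position drift that grows with the square of time:

```python
def dead_reckon(accel_samples, dt):
    """Double-integrate acceleration samples to a final position."""
    vel = pos = 0.0
    for a in accel_samples:
        vel += a * dt      # first integration: acceleration -> velocity
        pos += vel * dt    # second integration: velocity -> position
    return pos

# A stationary sensor whose accelerometer carries a tiny constant error:
dt, bias = 0.001, 0.01                        # 1 kHz samples, 0.01 m/s^2 bias
drift_1s = dead_reckon([bias] * 1_000, dt)    # ~0.5*b*t^2 = about 0.5 cm
drift_10s = dead_reckon([bias] * 10_000, dt)  # ten times longer: ~50 cm
```

          Ten times the duration yields roughly a hundred times the drift, which is why uncorrected dead reckoning falls apart so quickly.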

          A modern IMU typically combines a 3-axis magnetometer, a 3-axis gyroscope, and a 3-axis accelerometer, so you have three sensors to compare against one another.

          Anyway, all you need to be able to do is determine with some degree of accuracy when any given controller is stationary.

          Since they include armbands, they have 4 points of tracking.
          That doesn’t sound like it matters, but take the constraints into account:
          All 4 of them are physically constrained to be within a specific distance of one another (assuming they’re actually being held/worn correctly).
          Given this fact, if one sensor says it’s moving while the other 3 do not, that sensor probably isn’t actually moving, and we can work from there to recalibrate.
          The arm trackers in particular are mostly going to produce readings that are physically very similar to one another.
          But in general the constraints mean that if the 4 sensors are giving wildly different readings, there’s either an error or some equally wild arm movement involved.

          Still, I can’t say if this is enough to resolve the problem, but you can recalibrate without an external reference as long as you can find a reliable way to determine when a given tracking sensor isn’t moving.

          Remember each sensor has nine axes of information to work with, one group of which (the magnetic field) is, to a point, an absolute external reference.
          This over-abundance of rotational information can then be used to identify and remove the gravity component of the accelerometers, so that what remains is the linear movement.
          However, this value is acceleration, which means you have to integrate it once to get velocity (introducing error) and again to get position (introducing even more error).
          So the drift you get, if you can’t find a way to determine when the controllers are stationary, is due to the massive compounding error in position…
          What an external tracking system provides is a way to know with certainty when a controller is stationary, and what position it’s actually in, which is very helpful for removing that error.
          But in theory it’s possible to do this over extended periods.
          It’s a matter of accuracy.
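
          As a minimal sketch, that stationary check (often called a zero-velocity update) can be as simple as the following; the thresholds here are made up:

```python
import math

GRAVITY = 9.81  # m/s^2

def is_stationary(accel_xyz, gyro_xyz, accel_tol=0.15, gyro_tol=0.05):
    """Zero-velocity detection: at rest the accelerometer magnitude is
    ~gravity and the gyro reads ~zero. Thresholds are illustrative."""
    a_mag = math.sqrt(sum(a * a for a in accel_xyz))
    w_mag = math.sqrt(sum(w * w for w in gyro_xyz))
    return abs(a_mag - GRAVITY) < accel_tol and w_mag < gyro_tol
```

          Whenever a window of samples passes a test like this, the tracker can clamp velocity to zero and fold the residual into its bias estimates, much like the flat-surface recalibration described above for MotionPlus.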

          Before GPS, long-distance navigation by aircraft was at times done using dead reckoning from gyroscope and airspeed data.
          This is the same basic technology, just with much more expensive components, and it allowed reliable tracking over many hours even in the 1970s…

          Still, only way to be sure what they’ve done here is to get hold of one and pull it apart…

      • I think I see what you mean. The elbow IMU will generally pivot around a static point at your shoulder, and the hand IMU will generally pivot around the elbow, so in a sense you’re limiting each IMU back to 3DOF, but in a compounded way that results in a sort of pseudo-6DOF. Factor in the physical limits on how the joints can rotate relative to each other, and there’s a lot of noise you can filter out.
        You’d probably still need a hot recalibration button, like Go or similar 3DOF headsets have, but this would be a novel solution to the occlusion issues of current ultrasonic and camera-based systems.
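
        One way to picture that compounded-3DOF idea: if each IMU supplies only orientation, the known segment lengths turn orientation into position. A toy 2D forward-kinematics sketch (the segment lengths and the fixed-shoulder assumption are mine, not Finch’s):

```python
import math

# Toy 2D arm: shoulder fixed at the origin, two rigid links.
# Segment lengths are illustrative, not Finch's values.
UPPER_ARM = 0.30  # metres, shoulder -> elbow
FOREARM = 0.28    # metres, elbow -> hand

def hand_position(shoulder_angle, elbow_angle):
    """Forward kinematics: orientation alone (which IMUs measure well)
    pins down position, because the segment lengths are fixed."""
    elbow_x = UPPER_ARM * math.cos(shoulder_angle)
    elbow_y = UPPER_ARM * math.sin(shoulder_angle)
    hand_x = elbow_x + FOREARM * math.cos(shoulder_angle + elbow_angle)
    hand_y = elbow_y + FOREARM * math.sin(shoulder_angle + elbow_angle)
    return hand_x, hand_y
```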

  • impurekind

    These are so close to my designs from before even the Wii was released that I really wish I’d been able to patent the design back then:

    Note: Despite my meh renders, the actual shape of the FinchShift controllers above is basically exactly what I was thinking of in terms of the overall ergonomic shape of them.

    Remember, I was thinking of these designs before even the original Wiimote was shown.


    • WyrdestGeek

      Did you get to try out building it?

      Because what I’m thinking is: if the big companies haven’t jumped on this technology yet, maybe the reason that they haven’t is that the tracking isn’t actually very accurate.

      • impurekind

        Nope, never had the resources to do so, unfortunately.

  • WyrdestGeek

    Yes, please test at CES. Because, as it stands, it sounds as though the company is claiming to have something working that probably can’t work.

    That either means they’re very clever (which is unlikely, but would be great), or else it means their 6DOF doesn’t actually work so well (which would be unfortunate, but if so, then the sooner we know the better).

    • Blaexe

      It’s not very accurate, and it’s slow. That’s what other news sites report.

      2mm to 25mm (!) accuracy and 27ms latency.

      • FireAndTheVoid

        A maximum (or one-standard-deviation?) positional error of 2.5 cm is not that bad compared to the current lack of positional tracking on standalone headsets. I don’t see that as much of a problem for casual games that don’t require much accuracy. The 27 ms of latency may be more of a problem, however.

        That being said, I don’t see much of a market for these once the Quest is released.

  • 6DOF from “internal sensors”? No way. Either it’s watching the world around it, or being watched. Gyroscopes and the best software in the world won’t keep track of a human body for very long. I bet those dots are visually tracked, and it only offers 6DOF on headsets that can see them.

  • They seem cool, but I can’t trust the claim of tracking using only IMUs… dead reckoning has never been a smart choice unless it’s used for very short periods of time.