Today Sixense is revealing the reason for the delay of their STEM motion input controller: the company revamped the board design to include an IMU, which augments the controller’s magnetic motion tracking. The IMU is used for calibration-free correction of magnetic distortion. At E3, Road to VR went hands-on with the latest STEM prototype and came away very impressed with the system’s performance.

Back in April, Sixense announced that the target date for the STEM VR motion controller had been delayed by three months. At the time, the company only noted that the delay was “for the sake of performance optimization.” Now, the full story is out. Delaying delivery to backers after a successful Kickstarter was a tough choice, but in the end the company wanted to make sure that STEM provides the performance that gamers and VR enthusiasts demand. Adding the IMUs meant redesigning the boards used in the controllers and STEM Packs. From what I’ve seen of the company’s latest prototype, which includes the added IMUs, they made the right call.

One of the key issues with the Razer Hydra, the popular VR motion controller which used Sixense technology, was its susceptibility to magnetic distortion. Such distortion made for inconsistent experiences depending upon the environment in which the Hydra was used. I, for one, have always seen great performance with the Hydra when using it with my Oculus Rift, but my colleague Reverend Kyle says he has had issues aplenty with distortion, causing odd behavior and improper input when the controllers were held in certain areas.

Both of us got to check out the latest prototype system at E3 last week and came away very impressed with what we saw.

It was explained that the IMU data augments the magnetic tracking data. If the system senses that the controller is starting to pitch in some direction, it checks the IMU’s reading of the downward (gravity) vector. If the two don’t agree, the system smartly adapts, using the IMU’s frame of downward reference to provide accurate motion input.
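
Sixense hasn’t published the details of its sensor fusion, but the general technique is well understood: compare the ‘down’ direction implied by the magnetic tracker’s orientation estimate against the gravity vector the accelerometer actually measures, and nudge the estimate toward agreement. Magnetic distortion bends the tracker’s estimate, but gravity is unaffected, which is what makes the correction calibration-free. The Python below is a minimal sketch of that idea, not Sixense’s implementation; the quaternion convention, function names, and fixed correction gain are all assumptions.

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_rotate(q, v):
    """Rotate 3-vector v from the sensor frame into the world frame by q."""
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mul(quat_mul(q, np.concatenate(([0.0], v))), q_conj)[1:]

def gravity_correction(q_tracker, accel, gain=0.05):
    """Nudge a magnetically-tracked orientation toward the IMU's gravity reference.

    q_tracker: orientation estimate (w, x, y, z) from the magnetic tracker
    accel:     accelerometer reading in the sensor frame (~ +1g 'up' at rest)
    gain:      fraction of the disagreement corrected per update (assumed value)
    """
    # Where the tracker currently thinks 'up' is, expressed in world coordinates
    up_meas = quat_rotate(q_tracker, accel / np.linalg.norm(accel))
    up_ref = np.array([0.0, 0.0, 1.0])  # true world 'up'

    # Axis and angle of the disagreement between the two references
    axis = np.cross(up_meas, up_ref)
    s = np.linalg.norm(axis)
    if s < 1e-9:
        return q_tracker  # already in agreement; nothing to correct

    angle = gain * np.arcsin(np.clip(s, 0.0, 1.0))
    half = 0.5 * angle
    q_corr = np.concatenate(([np.cos(half)], np.sin(half) * axis / s))

    q_new = quat_mul(q_corr, q_tracker)  # apply small world-frame correction
    return q_new / np.linalg.norm(q_new)
```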

Sixense had me try the latest STEM prototype with and without the added IMU data. With it enabled, everything seemed normal; my arms moved where and how I expected them to. With the IMU disabled, my arms would veer strangely downward as I went from holding them out in front of me to holding them out to my sides in a T-pose. In addition to the impressive automatic distortion correction, I was blown away by the latency of the system. Even when I vigorously shook my hands, my virtual hands kept up without issue.

One of the things that impresses me most about the STEM system is its absolute positioning. Even after shaking my hands like crazy and moving them all over the place in different poses, bringing them back together so that my virtual knuckles were touching was dead-on from start to finish. This sort of one-to-one absolute positioning is vital for consistent interaction with items on your virtual body, be it a holstered gun, a quiver of arrows, etc.

The team challenged Reverend Kyle to try an SDK shooting-range demo with and without the IMU’s contribution to tracking. The results were clear in both how it felt and how it looked:

“Prior to E3 I had a chance to try the STEM system at the SVVR Expo, and I was fairly happy with the progress. When I tried their latest prototype, I was completely blown away by how much better it was. I appreciate when a company decides to delay shipment a bit to provide me with a better product. It’s just good business. I think backers will be very impressed with what they end up receiving,” Kyle told me.

Sixense is taking pre-orders for STEM at their store and expects newly ordered units to begin shipping in October of 2014.

Sixense SDK Designed to Support Other Motion Input Devices

Sixense has been working hard on their ‘SixenseVR SDK’, which the company says will be free and is designed to work with multiple input devices.

“The Sixense VR SDK was designed to be hardware agnostic, so that any device that can provide true position and orientation could be compatible,” said Danny Woodall, Creative Director at Sixense.

While I’m not a developer, from what I gather in discussions with the team working on the project, developers who integrate STEM using the Sixense SDK will have an easy time adapting to other motion input devices. In particular, it should let developers ship a single package that supports a whole range of motion devices, rather than separate builds specific to, say, STEM, PrioVR, or Control VR.
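
Sixense hasn’t published the SDK’s actual interfaces, but the hardware-agnostic idea is a familiar one: the game codes against a single abstract controller interface, and each device supplies a backend that implements it. Here is a hypothetical sketch in Python; none of these type or method names come from the real SixenseVR SDK.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple      # (x, y, z) in meters, relative to the base station
    orientation: tuple   # quaternion (w, x, y, z)

class MotionController(ABC):
    """The abstraction layer: games call this, never the hardware directly."""

    @abstractmethod
    def poll_pose(self) -> Pose:
        """Return the controller's current position and orientation."""

    @abstractmethod
    def buttons(self) -> dict:
        """Return current button states, e.g. {'trigger': True}."""

class StemController(MotionController):
    """One backend among many; a PrioVR or Control VR backend would slot in
    the same way, and the game code written against MotionController never
    has to change."""

    def poll_pose(self) -> Pose:
        # ...a real backend would read from the device driver here...
        return Pose(position=(0.0, 0.0, 0.0), orientation=(1.0, 0.0, 0.0, 0.0))

    def buttons(self) -> dict:
        return {"trigger": False}

def update_player_hand(controller: MotionController) -> Pose:
    """Game logic depends only on the interface, so any backend works."""
    pose = controller.poll_pose()
    # ...drive the avatar's hand from pose.position / pose.orientation...
    return pose

# Swapping in a different device means changing only this one line:
hand_pose = update_player_hand(StemController())
```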

As for updating games that already support the Razer Hydra, Sixense says they’ve designed the update process to be as simple as swapping a .dll file. Several developers have tried the upgrade and had their games up and running with STEM in minutes, according to the company.

The latest version of the shooting gallery demo, which will presumably ship with the SixenseVR SDK as an example for developers, now has a back room with some physics-based objects to play with. As I entered the room for the first time, I pushed the door open slowly with one hand while peeking through the crack with a gun in my other hand, yet another example of natural motion gameplay and just a little taste of what you simply cannot do without a true motion controller.

The SixenseVR SDK also uses a custom inverse kinematics (IK) solution which I found very impressive, even when I contorted into poses I expected to confuse the system. Even with my arm straight up and behind my head, pulling my hand straight down to my shoulder resulted in an accurate solution for my arm, and, impressively, no jumpiness whatsoever in the estimated pose.
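
For readers unfamiliar with IK: given only where the hand is, a solver works backward to plausible elbow and shoulder angles. Sixense hasn’t described their solver, which tackles the much harder full-body 3D case; as a flavor of the underlying math, here is the classic two-bone arm in 2D, solved analytically with the law of cosines. The segment lengths and the chosen elbow bend direction are assumptions.

```python
import math

def two_bone_ik(target_x, target_y, upper=0.30, forearm=0.28):
    """Analytic two-bone IK in 2D, with the shoulder at the origin.

    Given a wrist target, return (shoulder_angle, elbow_bend) in radians.
    Segment lengths are in meters and purely illustrative.
    """
    # Distance from shoulder to target, clamped to the reachable range
    d = math.hypot(target_x, target_y)
    d = max(1e-6, min(d, upper + forearm - 1e-6))

    # Law of cosines: interior angle at the elbow of the triangle formed by
    # (upper arm, forearm, shoulder-to-target distance)
    cos_interior = (upper**2 + forearm**2 - d**2) / (2 * upper * forearm)
    elbow_bend = math.pi - math.acos(max(-1.0, min(1.0, cos_interior)))

    # Shoulder angle: aim at the target, then rotate back by the offset
    # the bent elbow introduces
    cos_offset = (upper**2 + d**2 - forearm**2) / (2 * upper * d)
    shoulder_angle = math.atan2(target_y, target_x) - math.acos(
        max(-1.0, min(1.0, cos_offset)))

    return shoulder_angle, elbow_bend

# Reaching 40 cm forward and 20 cm up:
print(two_bone_ik(0.40, 0.20))
```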

  • sponge101

    Nice to see they’re making improvements to the STEM, but the biggest barrier to entry is, and has always been, the price. Until they somehow lower production costs and that is reflected in the retail price, I don’t see how they can achieve deep market penetration. This is a problem other VR feedback peripherals are challenged with too, and the solution, unfortunately, is waiting for competition to force prices down for consumers.

  • Why, this sounds absolutely fantastic. I have been wondering if optical is the only solution to good positional tracking, as Oculus went that path with the DK2, but this sounds very promising.

    Perhaps it will even be possible to augment the DK2 with a STEM-pack to liberate it from the camera :D Personally I have been very inspired by using the Hydra in Half-Life VR and Drash’s Bestiary XI for positional tracking, then going from standing to crouching and almost touching the floor with my head. It’s an awesome feeling, and I will miss that if I cannot do it with the DK2.

    Hopefully developers will keep adding support for using a Razer Hydra, or STEM, for positional tracking, for the people who want a larger volume to move through or who have a DK1.

    I did back the STEM, partly because they have promised backwards compatibility, which means I can enjoy old demos wirelessly; that almost motivates the purchase on its own :D I was a bit disappointed by the shipping delay, but I’d much rather have a better product than have it earlier. I mean, I’ve gotten used to thinking that way with Oculus ;) Even though I did get the DK1… and the DK2 first day.. uh… yeah.

  • Roy

    Sooooo no one is going to ask about the stuff on your head, which you don’t appear to mention in the article?

    I think I know quite a bit about the STEM (I followed the Kickstarter but couldn’t justify the cost), and having both a Hydra and a Rift (and a lot of fun with Zombies on the Holodeck and HL2 VR), I assume this is trying to solve the problem of associating the controller position with that of your head?

    I was under the impression that the STEM was going to use a fixed-place base station, like the Hydra, but this seems like the approach that a lot of the fully mobile VR solutions (like Project Holodeck) have been adopting by putting the base station on your body.

    Any insight as to what it’s doing or whether this is something they’re looking at? I.e. a wearable “VR” base station?

    • Alkapwn

      I believe the thing on his head is actually just a stripped-down version of the new STEM with IMU and EM. This would provide positional tracking for the DK1. And as an added benefit, they can perfectly coordinate the head and hand positions in relation to each other.

      • Roy

        Yeah, good call. That makes perfect sense :)

  • MaxSmalls

    Just wanted to clarify the significance of the SDK acting as a standard API. It’s actually closest to what DirectX does, or more specifically the DirectInput part of it. They are providing a standard set of methods/functions that sits between the drivers and the developer’s software (or, less accurately, think of it as a list of commands a game can call that doesn’t change no matter what device is behind it). So basically, it’s going to be the piece that stands between the driver and the software.

    The significance here is that if a game developer talks to the SDK rather than to the devices themselves, they automatically support any hardware that implements the SDK, or more likely, any driver that implements it. To use graphics cards as a reference: a game doesn’t support a specific video card like in the olden days; it uses either DirectX (Direct3D) or OpenGL. The developer doesn’t need to know anything about the video card as long as the card’s driver knows how to make the hardware do what the API (Direct3D or OpenGL) is asking of it. On the flip side, a video card company doesn’t need to know how other manufacturers make their cards display graphics, and they don’t have to maintain their own per-device API, or even worse, make developers talk directly to their hardware.

    • MaxSmalls

      Whoops, forgot to put the point. So the point is that if they open their SDK as an API anyone can use, I could take a Wii Remote, a Kinect, a Leap Motion, or a PS Move controller, implement their API for it, and suddenly those controllers would work without a developer needing to specifically add them to their software (obviously, as long as I can find a way to make each of those devices fully implement the API).