While the Oculus Touch controller is largely designed to tell the computer where a user’s hands are, the controller also has an idea of what a user’s fingers are doing. New documentation adds to our understanding of the controller’s hand-tracking capabilities, and reinforces the device’s deliberate name.
Released alongside the latest version of the Oculus SDK (v0.7), the updated Oculus Rift Developer Guide describes how devs can access data from the Touch controllers. Orientation and position of the controllers are reported in the same coordinate frame as the Rift headset itself, separate from the input state (button presses). “Having both hand and headset data reported together provides a consistent snapshot of the system state,” the document explains.
Touch Sensitive Buttons
The input state of the controller expectedly tells the developer when buttons are pressed, triggers are pulled, and joysticks are tilted. But it also reports something that most other controllers don’t: when a user’s fingers are touching (but not pressing) certain buttons.
This data isn’t particularly important for non-VR controllers but, inside VR, giving the controllers a way to sense finger position means the user’s hand and finger pose can be matched closely, leading to a greater sense of Presence. It also provides important feedback for users; when you can’t see your hands on the real controllers, it’s hard to tell which button your finger is on. But with capacitive buttons that can sense touch, the game world can show users where their fingers are located on a virtual representation of the controller. I imagine this will be especially helpful for in-game tutorials explaining the controls to first-time players.
Calling ovr_GetInputState checks the controller’s Button State, which includes Button Touch State, indicating which buttons are being touched (but not pressed). Every input on the controller, with the exception of the side ‘hand trigger’, can sense a user’s touch, including the index trigger and the joystick.
While both the index trigger and hand trigger report analogue values (to report partly-pressed states), the controller’s two face buttons and joystick button (clicking down on the joystick) are binary (not pressure sensitive), according to the documentation.
Hand gesture recognition is something we’ve known about since seeing Oculus Touch for ourselves at E3 2015, but the new documentation explains what data developers will have to work with.
Oculus told us that the sensing of hand positions is indeed analogue, but as far as the SDK documentation shows, the company has opted to bake in pre-defined gestures. It doesn’t appear that developers will have raw access to the hand gesture data for now.
There are only two supported gestures at this time: pointing with the index finger and thumbs up. These can be checked using ovrTouch_RIndexPointing and ovrTouch_RThumbUp respectively, switching out the R for an L to check the left hand’s state.
Although no middle finger gesture is documented, Oculus Founder Palmer Luckey confirmed to Road to VR that the only fingers the controller doesn’t detect are the pinky and ring finger, leaving the possibility of more gesture states being added to the SDK down the line. The current set of two is likely the result of testing which gestures could be identified with high consistency, so as not to have players see their fingers jumping around unnaturally.
Here’s to hoping for a ‘peace sign’ gesture so I can start work on Hippie-Sim 2016.
Haptic Feedback & Two Trackers
Haptic feedback is perhaps the area we know the least about on the Oculus Touch controller. The Oculus Rift Developer Guide describes the controller’s haptic feedback simply as “vibration”, which makes it sound like the same sort of ‘rumble’ you’d find from a gamepad.
But looking closer at the documentation, it’s possible that the Touch controller uses a linear actuator, rather than the usual ERM motor that produces the rumble in many gamepads. Linear actuators are capable of producing finer-grained haptic feedback events like clicks, and seem to be the haptic basis for the HTC Vive controller as well.
The hint comes in the way that developers are able to specify how the vibration should function. Using ovr_SetControllerVibration, devs set which controller to vibrate and independently set the vibration frequency and amplitude. That independent control of two parameters is the hint which may indicate a linear actuator over an ERM motor, which generally has just one variable for control.
Interestingly, Oculus warns against extended durations of vibration:
“Prolonged high levels of vibration may reduce positional tracking quality. Right now, we recommend turning on vibration only for short periods of time.”
The documentation also specifies that “at least” two positional cameras will be used with the controllers.
“For installations that have the Oculus Rift and Oculus Touch controllers, there will be at least two constellation trackers to improve tracking accuracy and help with occlusion issues.”
Oculus Touch controllers are expected to go on pre-order at the same time as the Oculus Rift, sometime in 2015, with an expected release date of H1 2016. Pricing has not yet been announced.