Oculus Details ‘Buffered Haptics’ for Advanced Haptics on Touch Controllers

Oculus has added new documentation to their developer knowledge base detailing the ‘buffered haptics’ feature of the Oculus SDK, a method for programming more advanced haptic feedback from the company’s Touch controllers.

Oculus Touch uses linear actuators to provide feedback, a haptic technology which has been increasingly replacing the simple ‘rumble’ feedback common in console gamepads. Linear actuators can move very quickly compared to the rotating mass motors of yore, allowing a broader variety of haptic effects, faster response time, and better control. The newly documented ‘Buffered Haptics’ feature gives developers fine-grained control of the controller’s haptic feedback.

The SDK supports two approaches to controller haptics, Buffered and Non-buffered. Oculus advises they not be used together to avoid unpredictable haptic behavior.

Non-buffered Haptics is simpler to conceptualize and control, and amounts to switching vibrations on and off at a specific frequency (160Hz or 320Hz) and amplitude (0 to 255). Oculus writes that Non-buffered Haptics is “designed for simple effects that don’t have tight latency requirements since the controller requires 33ms to respond to the API call that modifies the haptics settings.”
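For context, here is roughly what the non-buffered path looks like in code. This is a minimal sketch, not Oculus’ sample code: it assumes the LibOVR call ovr_SetControllerVibration, whose normalized float frequency and amplitude parameters map onto the 160Hz/320Hz and 0–255 ranges described above, and it omits session setup entirely.

```cpp
#include <OVR_CAPI.h>

// Minimal non-buffered sketch: switch a constant rumble on and then off.
// Assumes `session` was created elsewhere via ovr_Create.
void PulseRightController(ovrSession session)
{
    // Frequency is normalized: 0.5f selects the 160Hz mode, 1.0f the 320Hz mode.
    // Amplitude is normalized 0.0f-1.0f, which the SDK maps onto the 0-255 range.
    ovr_SetControllerVibration(session, ovrControllerType_RTouch, 1.0f, 0.5f);

    // ...wait at least 33ms for the controller to respond, then switch it off...
    ovr_SetControllerVibration(session, ovrControllerType_RTouch, 0.0f, 0.0f);
}
```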

Buffered Haptics not only responds faster (10ms), but also allows for a wider and more complex set of haptic effects, “such as patterning vibrational amplitudes around sine wave or tangent functions, panning the vibrations across controllers, generating a variety of low-frequency carrier waves, and more,” Oculus writes. The feature allows developers to queue up a string of bytes representing desired amplitudes which are then played back in sequence at 320Hz, letting them finely adjust the amplitude between 0 (min) and 255 (max) once every 3.125ms.
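As a rough illustration of what queueing that string of bytes looks like, the sketch below fills a 256-byte buffer with a sine-shaped amplitude envelope and submits it via ovr_SubmitControllerVibration. The ovrHapticsBuffer fields and the ovrHapticsBufferSubmit_Enqueue mode follow the LibOVR 1.x headers, but treat the code as an approximation rather than the SDK’s own sample.

```cpp
#include <OVR_CAPI.h>
#include <cmath>
#include <cstdint>
#include <vector>

// Fill one full 256-byte buffer (0.8 seconds at 320Hz) with a sine-shaped
// amplitude envelope and queue it on the right Touch controller.
ovrResult SubmitSineEnvelope(ovrSession session, float cyclesPerBuffer)
{
    std::vector<uint8_t> samples(256);
    for (size_t i = 0; i < samples.size(); ++i)
    {
        // Map the sample index to a 0..1 phase, then to a 0..255 amplitude.
        float phase = static_cast<float>(i) / samples.size();
        float s = 0.5f * (1.0f + std::sin(2.0f * 3.14159265f * cyclesPerBuffer * phase));
        samples[i] = static_cast<uint8_t>(s * 255.0f);
    }

    ovrHapticsBuffer buffer = {};
    buffer.Samples      = samples.data();
    buffer.SamplesCount = static_cast<int>(samples.size());
    buffer.SubmitMode   = ovrHapticsBufferSubmit_Enqueue;

    return ovr_SubmitControllerVibration(session, ovrControllerType_RTouch, &buffer);
}
```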

A haptics sample app is provided with the Oculus SDK and gives an example of some of the haptic effects that can be achieved with Buffered Haptics:

  • Smooth sine wave vibration with a “buzz down” effect at the end of each wave cycle
  • Vibrational panning across the left and right controllers, again with a “buzz down” effect at the end of the panning cycle (sketched in rough form after this list)
  • Ultra low-frequency buzz, essentially a series of ticks at 64Hz
  • A “messed up” low frequency vibration based on a chaotic formula that utilizes a trigonometric tangent wave function
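The panning effect in particular reduces to submitting complementary amplitude ramps to each hand. The sketch below is our own approximation of that idea rather than the sample app’s code: one controller fades out over a full buffer while the other fades in.

```cpp
#include <OVR_CAPI.h>
#include <cstdint>
#include <vector>

// Pan a vibration from the left controller to the right over one 256-sample
// buffer (0.8s): the left hand fades 255 -> 0 while the right fades 0 -> 255.
void SubmitPanLeftToRight(ovrSession session)
{
    std::vector<uint8_t> left(256), right(256);
    for (size_t i = 0; i < left.size(); ++i)
    {
        uint8_t up = static_cast<uint8_t>(i);      // ramps 0 .. 255
        left[i]  = static_cast<uint8_t>(255 - up); // fading out
        right[i] = up;                             // fading in
    }

    ovrHapticsBuffer lbuf = {};
    lbuf.Samples      = left.data();
    lbuf.SamplesCount = static_cast<int>(left.size());
    lbuf.SubmitMode   = ovrHapticsBufferSubmit_Enqueue;

    ovrHapticsBuffer rbuf = {};
    rbuf.Samples      = right.data();
    rbuf.SamplesCount = static_cast<int>(right.size());
    rbuf.SubmitMode   = ovrHapticsBufferSubmit_Enqueue;

    ovr_SubmitControllerVibration(session, ovrControllerType_LTouch, &lbuf);
    ovr_SubmitControllerVibration(session, ovrControllerType_RTouch, &rbuf);
}
```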

The documentation goes into additional detail about how the feature works, which involves queuing up the buffer with the desired haptic instructions before sending it to the controllers:

A buffer consists of a series of bytes with values from 0 to 255, where 0 represents no amplitude (i.e no vibration), and 255 represents the maximum amplitude (or intensity) of vibration that is allowed by the SDK. After your code fills in the values within a buffer, you send the buffer to one or both Touch controllers via ovr_SubmitControllerVibration. Each byte in the buffer is then “played” in sequence at a rate of 320Hz. The maximum buffer size (i.e. the maximum number of bytes that can be sent to a controller at one time, and also the maximum size of the controller’s internal buffer) is 256 bytes. The length of time that it takes to “play” a single 256 byte buffer is 0.8 seconds (256 bytes played at a rate of 320Hz). So, you have full control over the amplitude of the vibrational effects down to a resolution of 3.125ms (which equates to 320Hz). However, the frequency can only be 320Hz or some integral quotient of 320Hz, such as 320/2=160Hz, 320/3=106.7Hz, 320/4=80Hz, 320/5=64Hz, etc. You can achieve these lower frequencies by sending bytes that are zero filled, interspersed with bytes that have amplitude values that are greater than zero. Here are some examples:

  • 320Hz, full amplitude – [255, 255, 255, 255, …]
  • 160Hz, full amplitude – [255, 0, 255, 0, 255, 0, 255, 0, …]
  • 320Hz, half amplitude – [127, 127, 127, …, 127, …]
  • 160Hz, half amplitude – [127, 0, 127, 0, 127, 0, …, 127, 0, …]
  • Single sharp tick (320Hz) – [0, 0, 255, 255, 255, 0, 0] [delay x ms] [0, 0, 255, 255, 255, 0, 0]
  • Single blunt tick (160Hz) – [0, 255, 0, 255, 0, 255, 0] [delay x ms] [0, 255, 0, 255, 0, 255, 0]

In general, use the 320Hz resonant mode for lighter, sharper actions and the 160Hz mode for heavier, blunter actions.
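Those byte patterns are straightforward to generate programmatically. The helper below is a minimal sketch of our own (not from the documentation) that builds a buffer at any 320/n Hz frequency by emitting one pulse byte per cycle, padded with zero-filled bytes.

```cpp
#include <cstdint>
#include <vector>

// Build a 256-byte haptics buffer at 320/divisor Hz: one pulse byte per cycle,
// padded with (divisor - 1) silent bytes. divisor=1 -> 320Hz, 2 -> 160Hz, 5 -> 64Hz.
std::vector<uint8_t> MakePulseTrain(size_t divisor, uint8_t amplitude)
{
    std::vector<uint8_t> samples(256, 0);
    for (size_t i = 0; i < samples.size(); i += divisor)
        samples[i] = amplitude;
    return samples;
}

// Examples matching the patterns above:
//   MakePulseTrain(1, 255) -> [255, 255, 255, ...]         320Hz, full amplitude
//   MakePulseTrain(2, 127) -> [127, 0, 127, 0, ...]        160Hz, half amplitude
//   MakePulseTrain(5, 255) -> [255, 0, 0, 0, 0, 255, ...]  64Hz series of ticks
```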

Oculus also notes that developers can “vary the vibrational effects based on input streams, such as controller movement or position,” and pre-mix multiple input streams before passing the information to the buffer, which could allow for some interesting dynamic haptics that depend on what the player is doing in the virtual world.
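As a hedged illustration of that idea (again, not Oculus’ code), the sketch below scales a simple 160Hz pulse pattern by the right hand’s linear velocity, read from ovrTrackingState, before the bytes are written into the buffer. The velocity-to-amplitude mapping is an arbitrary choice made for the example.

```cpp
#include <OVR_CAPI.h>
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Scale a short 160Hz pulse pattern by how fast the right hand is moving,
// so faster swings produce stronger vibration. The 2 m/s gain ceiling is arbitrary.
ovrResult SubmitVelocityScaledBuzz(ovrSession session)
{
    ovrTrackingState ts = ovr_GetTrackingState(session, ovr_GetTimeInSeconds(), ovrTrue);
    ovrVector3f v = ts.HandPoses[ovrHand_Right].LinearVelocity;
    float speed = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); // meters per second

    // Map speed onto a 0..255 amplitude; 2 m/s and above is full strength.
    float gain = std::min(speed / 2.0f, 1.0f);
    uint8_t amplitude = static_cast<uint8_t>(gain * 255.0f);

    // Short 160Hz pattern (pulse, silence) at the computed amplitude.
    std::vector<uint8_t> samples(64, 0);
    for (size_t i = 0; i < samples.size(); i += 2)
        samples[i] = amplitude;

    ovrHapticsBuffer buffer = {};
    buffer.Samples      = samples.data();
    buffer.SamplesCount = static_cast<int>(samples.size());
    buffer.SubmitMode   = ovrHapticsBufferSubmit_Enqueue;

    return ovr_SubmitControllerVibration(session, ovrControllerType_RTouch, &buffer);
}
```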

As you can see, filling up the instruction buffer to achieve the type of haptic effect you want could be quite challenging if the interface for doing so is literally a string of numbers. That’s something the folks at Immersion Corp hope to fix in order to help developers more easily create better haptic effects.