Deep in the basement of the Sands Expo Hall at CES was an area of emerging technologies called Eureka Park, which had a number of VR start-ups hoping to connect with suppliers, manufacturers, investors, or media in order to launch a product or idea. One early-stage haptics start-up, Go Touch VR, was showing off a haptic ring that simulates the type of pressure your finger might feel when pressing a button. I’d say that their demo was still firmly within the uncanny valley of awkwardness, but CEO Eric Vezzoli has a Ph.D. in haptics and was able to articulate an ambitious vision and technical roadmap towards a low-cost, low-fidelity haptics solution.


Vezzoli quoted haptics guru Vincent Hayward as claiming that haptics is an ‘infinite degree of freedom problem’ that can never be 100% solved, but that the best approach to get as close as possible is to trick the brain. Go Touch VR is aiming to provide a minimum viable way to trick the brain starting with simulating user interactions like button presses.

I had a chance to catch up with Vezzoli at CES, where we talked about the future challenges of haptics in VR, including the 400–800 Hz frequency response of fingers, the mechanical limits of nanometer-accurate skin displacement, the ergonomic limitations of haptic suits, and the possibility of fusing touch and vibrational feedback with force-feedback haptic exoskeletons.


  • Lucidfer

    They (and I) beg to differ:

    • user

      I can’t imagine how arrays of small touchscreens would have to be constructed in order to make them work with exoskeletal gloves.

    • Eric Vezzoli

      Hi, one characteristic of haptics is that no single technology is viable for all applications. Tanvas’ technology, similarly to Hap2U’s technology, is based on modulating the friction of a finger sliding on a surface. This system is viable for touchscreens, and it will revolutionize smartphones and tablets as we know them. However, it is not suitable for rendering objects in empty space. For VR, the approach should be different, and we are pursuing a different way.

      • Lucidfer

        Well, I see Tanvas as the starting point, and whenever smartphone screen haptic feedback is ready (it is, but when those billion-dollar corporations are going to integrate it is another question), this will be the first generation of haptic interaction in trackpads, tablets, and smartphones.

        Then the future is for VR to be accompanied by some bracelet (because nobody is ever going to wear gloves) that can integrate haptic transducer arrays like this one. When they are reduced to the size of transistors, then anywhere our hand moves toward a virtual object or interface, the bracelet will “project” ultra-haptic sensations to the hand relevant to the position of virtual objects.

        I’d give it about 15 years before it’s there, around the same time VR glasses will replace headsets if they do pick up now, but I’m glad other people are looking into other solutions. Is through-skin haptic induction possible, for example?

        • Eric Vezzoli

          Thanks for the comment! I’ll try to give my answer to each point you addressed.
          I agree with you that gloves are not the answer; that’s why we are pursuing our approach. Looking ahead, we have some really interesting technology in development to address how the devices are worn.
          Regarding the ultrasound technology, the problem is not the array size but the amount of force it is possible to provide. Touch sensations are in the range of a newton, and I see it as extremely difficult to maintain a stable pressure field of 10 kPa over an area as big as a finger with an action range of meters. Moreover, there is the occlusion effect. I see it as a great technology for settings where you should not touch an actuator (hospitals, cars), but not necessarily for VR.
          Regarding the friction modulation technologies on which I did my PhD (Tanvas, Hap2U, Senseg, etc.), they are really dependent on sliding, so I can hardly imagine an application for manipulation, which is the benchmark for VR. I might be wrong, anyway.
          For direct stimulation, many have tried with little success over the past 40 years. The stimulation is not precise enough to avoid nociceptors, which are responsible for the pain sensation. Quite unpleasant (I tried it in a neuroscience experiment).

          • Lucidfer

            Thanks for the reply, interesting! In terms of prospects, I do think there are coherent scenarios for the mid term already: I don’t see any socially acceptable wearable haptic device other than a convenient bracelet. In terms of technology, all we have is these transducers, which, by being reduced in size, probably also become more precise in “micro-pressure” (the way friction modulation technologies are very precise now). However, I don’t think it has to project onto an area bigger than your hand: since this is a bracelet and not a fixed projector, it would not project the virtual haptic object in a specific place; it would just track your hand/fingers in space and project only the relevant haptic feedback to your hand/finger depending on its position. As for the occlusion problem, I think this is an issue only when your hands or fingers are oriented upward, which you rarely do except with surfaces like walls or objects placed higher up, but I’m sure there’s a solution.

            As for direct stimulation, I’d like to know more about the inner workings, but I didn’t realise it activates nociceptors. It’s funny, because a few years back when starting VR I realised that eventually the only physically possible way to track the hand in actual real time was not to do it after the fact, but to intercept and “read” the nerve signals sent to the hand. So I guess the topic, or incentive, of a haptic feedback bracelet also matches the challenge of real-time hand movement anticipation tracking. In fact, while not directly related, there exist bracelets that can “project” temperatures like cold or hot to the body, alongside the plethora of health or skin-makeup trackers. But this just made me realise that the incentive for smartwatches and further wearables (including the likes of VR headsets/glasses) is found in biomechanical/chemical interactions with the user, not just hardware and functions, which are starting to have limited incentive. Haptics is a big part of it, and I think it’s starting soon.

  • Maddy25

    I first read “Tracking the brain…” XD
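
Vezzoli’s force-versus-pressure argument in the thread above can be sanity-checked with a rough back-of-the-envelope calculation. The ~1 cm² fingertip contact area below is my assumption for illustration, not a figure from the discussion:

```python
# Rough sanity check: touch sensations require forces on the order of a
# newton, and mid-air ultrasonic haptics must deliver that force as
# radiation pressure over a small contact patch (force = pressure * area).

fingertip_area_m2 = 1e-4   # assumed fingertip contact patch: ~1 cm^2
pressure_pa = 10e3         # the 10 kPa pressure field cited in the comments

force_n = pressure_pa * fingertip_area_m2
print(f"Force on a fingertip: {force_n:.1f} N")  # prints "Force on a fingertip: 1.0 N"
```

This is consistent with Vezzoli’s point: even a sustained 10 kPa field over a whole fingertip yields only about one newton, right at the lower edge of everyday touch forces.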