Tony Bowren recently shared his rather ingenious solution to the problem of the Oculus Rift Development Kit’s lack of positional tracking using the Razer Hydra on YouTube. He talks to us about this project, a little of himself, and walks us through the code changes needed to achieve Oculus Rift positional tracking with the Razer Hydra yourself.

If you’d like to skip to the Code walkthrough, select ‘Page 2’ from the select box above or click here.

The Oculus Rift Dev Kit and Its Greatest Shortcoming

What’s missing from the Oculus Rift developer kit besides a high resolution display? Positional tracking, or the ability to detect the position of a user in physical space (we went into more detail a little while back if you’re interested).  Tony Bowren caused a stir online when he recently posted a video demonstrating his clever solution to this issue.

That solution was the Razer Hydra. Cunningly strapped to the back of the HMD, one of the Hydra’s controller units acted as a positional sensor, allowing Tony to feed that data into the Oculus Rift Tuscany demo, part of the Oculus SDK. One of the standout moments of the original video (see above) comes when Tony’s son leans out of one of the windows and looks around, something not currently possible with the Rift dev kit alone. We reached out to Tony, eager to learn more about his approach and to find out a little more about him. Happily, he not only agreed to an interview but also to share his annotated code with us (see the code walkthrough on the second page of this article).


Road to VR: Can you tell us a bit about yourself?

Tony: My education is a Mechanical Engineering (Robotics Controls) degree from UCI, but I quickly went into gaming instead of engineering.  My first graphics job was making a 3D intro for Interplay Productions back in 1994-ish.

I worked at Interplay and Acclaim before going into commercial and film work.  I was an FX artist on Warner Bros’ Osmosis Jones before coming back to games again. October will mark 10 years with NCsoft, 2 of those years working on Guild Wars cinematics and the other 8 making WildStar. I am most interested in working on human/computer interaction, especially as it relates to art creation.  My goals in getting more involved with VR programming are to reduce the barriers for people to create in VR and to build tools that make it fun and intuitive. I am interested in developing approaches that really disrupt the way people think about content creation.  In order to do this, work has to be done on better input methods, specifically better tracking of head and hands.

Road to VR: Would you describe yourself as a VR enthusiast? When did you become aware of the Oculus Rift?

Tony: Growing up in the 80’s I was always interested in the idea of VR, but there was never any way for me to really be “enthusiastic” about something I could never experience much.  I heard about the Rift from some friends who saw John Carmack’s new goggles at E3 last year.  I was immediately on Google, finding and educating myself on all the details.  I woke up every morning and checked Kickstarter for the Rift, and finally on August 1st I was able to be the 22nd backer.


Road to VR: What interests you about the Oculus Rift and where do you think it might lead the games industry? Is there one game in particular you’re interested in seeing ‘in’ the Rift?

Tony: What interests me about the Rift is the ability to put the user and the computer in the same “space”.  When I watch Iron Man and see Tony Stark physically interacting with all his data and models, I get very excited about being able to work like that.  I have always loved the Kinect and [PlayStation] Move, but to use them effectively I have to be 6 to 8 feet from the screen.  Bringing that screen up to my face eliminates all that, and suddenly all that technology becomes far more compelling.

Road to VR: Do you think there are significant implications for non-gaming interfaces and applications? Is there anything in particular you’d be interested in seeing?

Tony: Minecraft, Skyrim, and any good flight and driving simulators would be fantastic.  I would love to see an MMO in VR, but it would have to be designed with much less emphasis on UI and keyboard interaction than most currently are. I have thought a lot about these challenges and doing it well really does require significant design time.

Road to VR: The Razer Hydra has been out for quite some time (and some would say ahead of its time). Do you feel it might be about to enter a renaissance with the reinvigoration of virtual reality?

Tony: In terms of non-gaming interfaces, back in January I started playing with the Hydra in an attempt to track a head and create a virtual window into my computer.  You can see the video here:


Tony: Tracking with the Hydra was a pretty effective test.  I had tried using OpenCV to face track, but the processing was too slow.  The Hydra seemed to have no computational overhead, but one thing I did notice was that the aluminum frame of the MacBook Pro definitely warped the magnetic field. Different areas of the house or desk also affected it.  This is the primary weakness of the Hydra, but because typical head motions are limited in range and speed, and because there are not typically a lot of metal objects near your head, I feel it has potential.

Road to VR: What interests you about the Razer Hydra?

Tony: The Hydra interests me because it is a true 3-dimensional input device and it has buttons.  I have played with many types of gestural input schemes that read hands, but they don’t give reliable results. My mouse button ALWAYS clicks, it always drags, and it always stops moving when I remove my hand. Without physical buttons, I have never been able to make any other gestural scheme work as effectively.  If it’s frustrating and inconsistent then it will not replace the mouse.

Road to VR: What next for you and the Oculus Rift? Any special projects you’d like to share with us?

Tony: My next project is going to be with the Kinect and the Rift.  I have all the code now to merge the two devices so once I get some time that is what I plan to play with.

Head over to page 2, where Tony walks us through the code changes to add Hydra positional tracking to the Oculus SDK’s Tuscany Demo.



  • Daan Kortenbach

    Hi Tony, great work! Happy to see more people are working on the idea of using Sixense tech for positional tracking :)

  • WormSlayer

    Awesome, I remember that interplay logo! :D

    • Paul James

      That’s pretty much word for word what I said to Tony when he told me. :)

  • realbogart

    I really want to get this to work but there seems to be some class (GamepadState) and parameters in RenderParams missing. I am using version 0.2.3 of the oculus sdk. Thanks for this!

  • eyeandeye

    Does anyone else have a problem with “File Load Error” when trying to run it? I downloaded the compiled version, put the two missing sixense.dll files in with it, but can’t figure out which files from the tuscany demo to copy over. I’ve tried copying all of them, and putting them in various locations within the hydraheadtracker folder, no luck. Sorry if I’m asking in a totally random, inappropriate or outdated location.