If you haven’t heard of the Leap Motion, you’ve got some catching up to do. It’s an $80 motion sensing peripheral, designed to sit on your desk, that boasts ultra low latency and ultra high accuracy for tracking hands and other objects. It seems like a natural fit for the Oculus Rift and virtual reality input, but so far we’ve seen much more development happening with the Razer Hydra than the Leap. Here’s a video showing an early implementation of the Leap Motion unit combined with the Oculus Rift and positional webcam tracking.
Harley Witt is the author of this prototype, which he tells me is built in Unity with models from the Asset Store. The positional head tracking is done with the webcam, which is likely part of the reason that it’s so jumpy.
“I took a sample Unity project that had simple hand and finger tracking and with Unity Mecanim feature it has an inverse kinematic system for head, hands and feet. It was fairly simple to pick a relative pointable object (Finger) and map it to a hand position for the IK,” said Witt.
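Witt’s description maps cleanly onto Unity’s animation API. Below is a minimal, purely illustrative sketch of that idea — not his actual project code — in which a fingertip transform (assumed to be fed by whatever Leap tracking script you’re using) drives the avatar’s right hand through Mecanim’s IK pass:

```csharp
// Minimal sketch of the approach Witt describes: feed a tracked fingertip
// position into Mecanim's IK so the avatar's hand follows it. The
// trackedFingerTip field is a placeholder for whatever the Leap plugin
// provides; this is not code from Witt's prototype.
using UnityEngine;

[RequireComponent(typeof(Animator))]
public class LeapHandIK : MonoBehaviour
{
    public Transform trackedFingerTip;          // assigned by your Leap tracking script
    [Range(0f, 1f)] public float ikWeight = 1f;

    private Animator animator;

    void Awake()
    {
        animator = GetComponent<Animator>();
    }

    // Called by Mecanim once per frame for animator layers with "IK Pass" enabled.
    void OnAnimatorIK(int layerIndex)
    {
        if (trackedFingerTip == null) return;

        animator.SetIKPositionWeight(AvatarIKGoal.RightHand, ikWeight);
        animator.SetIKPosition(AvatarIKGoal.RightHand, trackedFingerTip.position);
    }
}
```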
It would seem that the Leap API is not yet well adapted for virtual reality input.
“…there are still big issues with the way Leap Motion implements its pointable objects. In a nutshell, there is no persistence of object order and you can’t rotate your hand beyond sideways. Thumb is missing most of the time. Unity only has tracking for the palm, not the fingers anyway. When the coding started getting more complicated I stopped (to hopefully let Unity and Leap Motion software catch up). There is much more one could do, but my priority will likely shift to the Hydra,” said Witt.
“…I seriously doubt it,” he said when I asked if the Leap was a viable VR input option for hands. “The dev team doesn’t seem interested in pushing their capability into adequate skeletal tracking. They have point cloud data but they aren’t sharing it through an API (a little sad about that).”
Sad to hear given the capabilities of the Leap, but it should be noted that the company shipped out developer kits several months ago and the consumer model is shipping early next month. Thus there are likely many improvements to come down the road. Or at least we hope!
Last time we checked in on Nathan Andrews, he was working on a Half Life 2 VR mod to support independent head and gun tracking. Andrews has continued his work on the mod, which now supports the Oculus Rift and brings the Razer Hydra controller into the mix. The early results look positively enjoyable and you can download an alpha version of the Half Life 2 virtual reality mod today!
Half Life 2 is one of the games that I’ve been dying to play in virtual reality and it looks like I’ll finally have the chance to do so thanks to the work of Andrews.
The mod, which is an early alpha release, adds independent head and gun tracking, as well as Razer Hydra support, to Half Life 2. The mod is used in conjunction with the Vireio Perception drivers, which add Oculus Rift support. Andrews has a guide which lists all of the files and procedures necessary to get the mod working:
Andrews also offers some pointers for first-timers:
Don’t expect to be amazing right off the bat (and play on easy). It’s easy to forget how much time we’ve spent perfecting our ability to aim with a mouse; aiming with an actual physical object in 3D space is challenging at first. It’s also really rewarding as you realize how much better you are after a few hours.
Try not to run and gun too much; take your time, use walk whenever you don’t need to explicitly run, take in the atmosphere, use cover and pick off the enemies, etc. Sprinting around firing wildly and spinning around are all pretty disorienting with the Rift; even though I don’t ever get motion sick it’s still just not an enjoyable experience.
This shouldn’t be surprising to anyone following along with the VR scene. Virtual reality is going to massively impact game design, and users are going to want to play differently than they would with a mouse and keyboard.
Take the Virtuix Omni VR treadmill for instance. In any one of the Halo games, the player’s character might traverse several miles on foot over the course of the game — sprinting the whole way. While there is a huge opportunity to naturally introduce exercise into the realm of videogames, players probably aren’t going to take well to games where they are expected to sprint constantly around a massive world. An example I’ve used before: a GTA style VR game might do better with a single detailed city block in which an intricate narrative plays out, rather than the city-roaming gameplay of current GTA games.
Seeing the reload animations up close in Half Life 2 is really cool and I’m looking forward to a game where someone introduces realistic gun mechanics (i.e. clip loading, bolt pulling, shotgun cocking, etc.), which you’ll actually be able to perform thanks to two virtual Hydra hands.
Below you can see some gameplay teasers of Half Life 2 with the Oculus Rift and Razer Hydra:
Christer-Andrew (aka Namielus) shares his 10-step illustrated guide on how to demo your Oculus Rift safely and effectively to your friends, family, colleagues or customers. What originally started as a post on the MTBS3D forums has been expanded and illustrated, and should ensure you’re well prepared for any possible Rift-demo scenario.
Today you can pick up the Razer Hydra for $40, which is 60% off of the retail price, from Woot.com. The Razer Hydra is a motion gaming controller created by our friends at Sixense which offers highly accurate 6 DOF magnetic tracking — many early Oculus Rift developers are using it for immersive virtual reality input.
Also to be noted, Razer has been running a ‘VR Promo’ since last month which will net you 50% off the Hydra ($50) or the Hydra Portal 2 Bundle ($70), which comes with a full copy of Portal 2 along with Hydra-exclusive levels (the latter is offered by Amazon for $91). The promo expires in a little over a week, on April 30th at midnight.
And for those of you in the awkward position of already owning Portal 2 but not wanting to shell out for the Hydra Portal 2 Bundle, Sixense will kindly provide you with the Hydra-exclusive DLC for free when you send them the serial number of your standalone Razer Hydra. Just email them with your request and serial number here.
Hat tip to r/oculus user ‘Hangonasecond’ for pointing out this deal!
Razer Hydra Tuscany Demo
At GDC 2013 I got to play around inside the Oculus Rift Tuscany Demo which Sixense had fitted for full Razer Hydra support:
At GDC 2013, Oculus VR Inc. gave a series of presentations on virtual reality. The company has now posted Palmer Luckey’s talk, ‘Virtual Reality: The Holy Grail of Gaming’ online for your viewing pleasure.
Palmer Luckey’s Oculus Rift GDC 2013 Presentation, ‘Virtual Reality: The Holy Grail of Gaming’
“For years, developers have strived to make immersive virtual worlds, and gamers have spent countless billions on the systems that play them best. Software, hardware, and input devices have all leapt forward, but the connection between the player and the virtual world has remained limited. We’ve dreamed of stepping inside of our games, but the best we’ve been able to do is puppet characters through a tiny window! Technological progress in a variety of fields has finally brought immersive virtual reality within reach of gamers. We’ll discuss VR’s false starts, what’s different this time, and why virtual reality is poised to revolutionize the way we play games,” reads the official description of the presentation.
Luckey talks fast. I think it’s a combination of both nervousness and excitement. His enthusiasm for the topic pours out. He’s an impressively humble guy; his presentation claims that virtual reality is “the holy grail of gaming,” not the Rift itself. He readily admits that we’re still a long way from a perfect VR experience.
Luckey uses an interesting analogy that positions books and movies as mature media, while he says that video games are still young.
The crux of this analogy is that books and movies are done making advances that “fundamentally change the experience”. Advancements in these areas, he says, are changing how you read books, or how you watch movies, but the experience is still largely the same as it has been for some time.
Video games, on the other hand, are still a young medium and the leap to virtual reality will fundamentally change the experience of playing games, according to Luckey. It’s now possible to immerse people in ways that couldn’t be done before.
Google Glass videos are starting to pop up as the AR glasses reach early adopters. One video shows a cool perspective on a go kart race. Google has published the Glass manual online which reveals some official Google Glass specs and features.
The Google Glass manual gives a fairly detailed breakdown of what can be done with the unit. There’s a more graphical ‘Getting to Know Glass’ page available here.
The manual gives us the first official confirmation of some Google Glass specs:
Fit
Adjustable nosepads and durable frame fits any face.
Extra nosepads in two sizes.
Display
High resolution display is the equivalent of a 25 inch high definition screen from eight feet away.
Camera
Photos – 5 MP
Videos – 720p
Audio
Bone Conduction Transducer
Connectivity
Wifi – 802.11b/g
Bluetooth
Storage
12 GB of usable memory, synced with Google cloud storage. 16 GB Flash total.
Battery
One full day of typical use. Some features, like Hangouts and video recording, are more battery intensive.
Charger
Included Micro USB cable and charger.
While there are thousands of Micro USB chargers out there, Glass is designed and tested with the included charger in mind. Use it and preserve long and prosperous Glass use.
Compatibility
Any Bluetooth-capable phone.
The MyGlass companion app requires Android 4.0.3 (Ice Cream Sandwich) or higher. MyGlass enables GPS and SMS messaging.
The display section tells us that the display is ‘the equivalent of a 25 inch high definition screen from eight feet away’ which is a very long way of saying that Google Glass has a field of view of roughly 14.7 degrees (diagonal).
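If you want to sanity check that figure, the conversion is one line of trigonometry: the field of view is twice the arctangent of half the screen diagonal divided by the viewing distance. A quick sketch using the spec’s own numbers:

```csharp
// Back-of-the-envelope check of the "25-inch screen from eight feet away" spec,
// converted into a diagonal field of view.
using System;

class GlassFovEstimate
{
    static void Main()
    {
        double screenDiagonalIn = 25.0;           // inches
        double viewingDistanceIn = 8.0 * 12.0;    // eight feet, in inches

        double fovRad = 2.0 * Math.Atan((screenDiagonalIn / 2.0) / viewingDistanceIn);
        double fovDeg = fovRad * 180.0 / Math.PI;

        // Prints ~14.8 degrees, in line with the ~14.7 degree figure quoted above.
        Console.WriteLine($"Diagonal FOV ~ {fovDeg:F1} degrees");
    }
}
```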
In the Google Glass FAQ section, it is mentioned that Glass shouldn’t be used by children under 13, and that those who have had Lasik surgery should ask their doctor before using it:
Like when wearing glasses, some people may feel eye strain or get a headache. If you’ve had Lasik surgery, ask your doctor about risks of eye impact damage before using Glass. Don’t let children under 13 use Glass as it could harm developing vision. Also, kids might break Glass or hurt themselves, and Google’s terms of service don’t permit those under 13 to register a Google account.
MyGlass App for Android
Google has launched a Glass companion app on Android called MyGlass.
MyGlass allows you to configure and manage your Glass device.
If you don’t have Glass, then downloading this will be a waste of time. Sorry about that. But if you swipe the screenshots to the right you’ll see there’s a picture of a puppy in pajamas. So not a total waste of time after all.
MyGlass also adds a few other features like identifying your location for turn-by-turn directions, sending and receiving SMS messages, ‘screencasting’ your Glass display (which lets you stream your Glass camera view to your device), and more. The app requires Android 4.0.3 (Ice Cream Sandwich) or higher.
As for iOS, BlackBerry, and Windows Phone, Google says that “all of the functionality of Glass is available via these devices’ Bluetooth connection, with the exception of SMS and directions.” I’m not entirely sure how screencasting could work without the MyGlass app, but I’d be happy if it did!
In partnership with Road To VR, I present Cymatic Bruce’s Video Breakdowns! For those that do not know me, I am a VR Vlogger that has been producing Oculus Rift gameplay videos and VR Development observations. You can check out what I have been up to on my YouTube channel. Nice to meet you!
These segments will be a space for me to dive a little deeper and expand upon my experiences with the Oculus Rift Development Kit. I almost always find something useful after I post a video, or one of my viewers may pass on information to make an experience better. Unfortunately, these nuggets of knowledge often get lost in the comments section never to be seen again. That all changes here! Let’s dive right in with a closer look at my latest experience, Portal 2.
Vireio and Head Tracking
Portal 2 is on the short list of games that work with Vireio Perception, an open source program that forces a game to recognize the Rift’s headtracking, play in stereoscopic 3D, and warp the screen properly for Rift viewing. Perception is an exciting tool, because the games we already love to play can be “retro-fitted” for VR. However, some tweaking will be needed to achieve a comfortable experience in the Rift. One of the easiest tweaks is head tracking.
Perception assigns pitch (nodding your head up and down) and yaw (turning your head side to side) to the in-game mouse control. By adjusting the mouse sensitivity, you can control the relationship between the distance you turn your head and the distance the in-game camera moves. In the case of Portal 2, turning the mouse sensitivity all the way down results in an approximate 2:1 relationship: turn your head 90 degrees and the in-game camera turns about 45. Turn the sensitivity all the way up for an approximate 1:7 ratio: turn your head 90 degrees and the game turns 630!
The desired setting will depend on personal preference. Some will prefer tracking close to 1:1, as in real life. I tried out the lowest setting and found it very comfortable. I was able to use my head for precise aiming very easily. I will say that higher settings seem to increase the chance of mouse drift.
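For the curious, from the wearer’s point of view the sensitivity slider boils down to a single multiplier between head yaw and in-game yaw. A tiny illustrative Unity-style sketch (this is not how Perception is implemented internally — it injects head rotation as mouse input — but the end result is the same scaling):

```csharp
// Illustrative only: the relationship between head yaw and in-game camera yaw
// is a simple scale factor. 0.5 reproduces the ~2:1 behaviour at minimum mouse
// sensitivity, 1.0 would be real-life 1:1, and 7.0 the ~1:7 behaviour at maximum.
using UnityEngine;

public class ScaledHeadYaw : MonoBehaviour
{
    public Transform inGameCamera;            // camera to drive
    public float sensitivityScale = 0.5f;     // 0.5 => 2:1, 1.0 => 1:1, 7.0 => 1:7

    // headYawDegrees would come from the HMD tracker each frame.
    public void ApplyHeadYaw(float headYawDegrees)
    {
        float cameraYaw = headYawDegrees * sensitivityScale;
        inGameCamera.localRotation = Quaternion.Euler(0f, cameraYaw, 0f);
    }
}
```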
A strange side effect of having head tracking in a game that was not designed for it is “shadow crawl”, where real-time shadows and reflections shift around when you roll your head. I did the quick and dirty solution: disable the real-time shadows. To do so, type the following into the console:
r_shadows 0
FOV is Key
One of the most important adjustments to make in Portal 2 is to change the field of view, or FOV. This is accomplished by typing the following commands into the console:
sv_cheats 1
cl_fov 110
The FOV of Portal 2 will not go beyond 90 degrees unless cheats are activated.
Changing the FOV is a necessity in almost every game compatible with Perception. Most games have a standard FOV of 90 degrees, which is actually the same as the horizontal FOV of the Rift. However, you have to consider how little of the screen you see. Below is a rough estimate of what I see on screen, using the C lens cups. As you can see, my in-game horizontal vision is rather limited.
If you leave this alone, the entire game will look entirely too close to your face. It may make your eyes feel funny too – at least it did for me. Setting the FOV to 110 works in most games for me, give or take a few degrees. Depending on the lens cups you use and your level of comfort, this number may vary.
Why No Hydra?
Portal 2 is one of the few AAA titles that feature full Razer Hydra integration. Given how well the Hydra and the Rift work together, it is natural to assume that Hydra + Portal 2 + Rift is going to be the ultimate experience. Unfortunately, it’s not quite there yet. When the Hydra is in use, the mouse is deactivated along with the head tracking. You suddenly find yourself riding the movement of your right hand, which is a recipe for wooziness! Perhaps down the line we will see a proper implementation of VR in Portal 2, and be able to enjoy this stellar combination of peripherals!
Motion Sickness and Eye Strain
Portal is a franchise famous for its acrobatics and death-defying drops. Experiencing these feats firsthand in VR is not recommended for Rift rookies! I have been playing with the Rift for at least a few hours every day for the past 2 weeks, and even I almost fell out of my chair. Fortunately, I never felt queasy during gameplay; I just had a pleasant tingly feeling that one might have after riding a really awesome roller coaster.
After 2 weeks of daily Rift use, I have no issues with eye strain or discomfort. When the IPD (interpupillary distance, or the distance between your virtual eyes) and convergence settings are right, your eyes will focus in the Rift similar to how they focus in real life. Thankfully, the latest build of Vireio Perception includes a calibration tool that will ensure a pleasant viewing experience!
Thank you so much for reading and watching my humble content! If you have any questions, please feel free to post them to the VR Forum, or in the comments below.
Google Glass Explorer Dev Kits are being delivered to developers – and following suit are the first unboxing videos that show the device’s great production quality.
Last year at Google’s I/O conference, the web behemoth offered an exclusive look into its vision of the future of head-mounted devices: Google Glass. The augmented reality glasses created lots of buzz when Google released an interesting video of the technology in action. Interested developers could sign up for the Google Glass ‘Explorer’ program at the conference, offering them early access to the high-tech glasses. Participation in the program didn’t come for free though – the development kit comes at a hefty price of $1,500.
It seems that Ben was spot on with his prediction more than 9 months ago about the number of pre-orders taken for Google Glass at I/O 2012:
“There’s no official word on how many pre-orders were placed. There were around 6000 people in attendance at Google I/O and pre-orders were open to US-based attendees only which probably cuts that number down to 4500. My best guess for how many pre-orders were placed is around 2000,” he wrote in an article in July of 2012.
As reported by Engadget, Google now says that “around 2,000” developers pre-ordered Google Glass through the ‘Explorer’ program. 8,000 additional pre-orders came through Google’s #ifihadglass campaign which ran from February 20th to February 28th, 2013.
Google Glass Unboxing
Google Glass uses a small prism to display information to the wearer in a non-intrusive way, overlaying their view of reality and accepting speech commands for easy control of the device. Developers will be able to program apps for the glasses, so early participation of the developer crowd is one of Google’s main goals. The company announced a few days ago that production of the first dev kit units has finished, and the first developers have now received their devices.
One of them is Dan McLaughlin, who has uploaded his Google Glass unboxing video to YouTube. And, like a true Google Glass wearer would do, he took his unboxing video through Google Glass itself!
According to his video, Google Glass comes in a very sophisticated package with great industrial design and high production value. Included in the sleek bag are the device itself, a charger, a carrying bag, extra nosepads, as well as a shaded and a clear version of the lens. The video quality offered by the 720p camera is pretty good; his comments through the microphone are also easily understandable.
Unfortunately, sound farther away from the device isn’t recorded quite as well. You can hear this in an interview taken through Google Glass, which was done by Robert Scoble – who was pre-order number 107 at Google’s I/O 2012 conference.
Google says that Glass is shipping out in waves and that they’ll notify pre-orderers as units become available.
A new video of the Virtuix Omni omnidirectional treadmill shows the latest prototype unit being used to play TF2 with the Oculus Rift. As expected, using your entire body to move inside of the game looks significantly more immersive than using a keyboard.
Ladies and gentlemen, you’re looking at the future of gaming.
Virtuix CEO Jan Goetgeluk told me about the TF2 experience: “The action feels like real running. The immersion is intense. I had a former Marine try HL2 yesterday, and he was slightly shaking.”
At the moment the Virtuix Omni is using Kinect for tracking, but Goetgeluk says that the company is working on an integrated tracking solution that will be part of the Omni.
Actual running speed is not yet tied to in-game speed, according to Goetgeluk, but that will come in due time.
CloudHead Games Creative Director, Denny Unger, basks in a successful campaign.
CloudHead Games, developers of the forthcoming Oculus Rift game The Gallery: Six Elements, have just closed a successful Kickstarter funding round with $82,937 of their $65,000 goal. A new video reveals some early gameplay which makes use of the Oculus Rift and Razer Hydra — the results already look extremely promising.
In the last few weeks of the Gallery Kickstarter it wasn’t clear if they were going to make it. But as the deadline approached, it became apparent that the VR community was not about to see this project go unfunded. The campaign raised an average of $2,675 per day and ended with 127% of the $65,000 goal. This figure hit the $70,000 and $80,000 stretch goals which were Razer Hydra support and a female avatar, respectively.
The folks at CloudHead were clearly pretty excited about the end of the campaign:
We can’t thank everyone enough, though we’ll try! Truly, we’ve never seen such a heartfelt effort made by backers of any campaign to help a game succeed on Kickstarter. You are all simply incredible and your drive, your passion for where we can take things is such an inspiration to all of us here. We can’t wait to work with you!
The developers say that they’ll soon continue the funding drive at the official website to raise more money and make the game as good as it can be. Furthermore, CloudHead now needs your help to achieve ‘Steam Greenlight’ status so that the game will be available on Steam once it launches. This only requires a Steam account and a click, so go check it out!
As I wrote after my hands-on with the Oculus Rift Razer Hydra Tuscany demo,
The Tuscany Razer Hydra demo is absolutely the most fun I’ve had yet with virtual reality. Soon, game developers will be wrapping these new natural interactions in compelling narratives and enticing gameplay and I can’t wait to step into those experiences. This is an extremely exciting time to be a gamer!
The Gallery looks to be doing just this — and damn am I excited! I can’t wait to step into the virtual world and reach into games to be immersed in ways never before possible with consumer hardware.
Combine this game with the Virtuix Omni and we’ll be in gaming heaven.
The Oculus Rift developer kit is driven by an external power source – but a resourceful modder has found a way to power the unit through USB, making the wall plug obsolete.
The Virtuix Omni is a passive omnidirectional treadmill that looks like it could fill one of the last major missing pieces of the VR puzzle. The Omni, which is soon to hit Kickstarter, allows players to walk, jump, and literally sprint inside of their favorite games. Virtuix CEO Jan Goetgeluk tells me he thinks his company has “cracked the formula” for a consumer omnidirectional treadmill that will have players more immersed than ever before.
I will admit, I was absolutely impressed when I saw that the Omni will allow players to actually sprint inside of their favorite games:
For a long time I’ve been saying that an omni-directional VR treadmill will have major implications for games. It’s one thing to hold a thumb-stick or mouse button and have your character sprint at 20 MPH for hours on end. But when you are the one who has to do the sprinting, things change fast — everything from gameplay to game pacing is impacted by how quickly your character moves.
In a game like GTA IV (Rockstar North, 2008), you constantly run from one point to the next in a huge city. With a system where you actually need to run in order to run, game developers had better expect a lot more walking. Suddenly those blocky pedestrians will need high quality assets to stand up to the scrutiny of a player strolling down the sidewalk. In a virtual world where the player really has to walk and run, maybe an entire city isn’t the best environment. Perhaps a single, highly detailed city block would be better suited to the medium.
If the Omni succeeds in its mission it will take VR gaming to a new level of immersion.
Imagine a terrifying game like Slender: The Arrival (Parsec Productions, 2013), wherein you are pursued in a dark forest by a terrifying daemon and the only way to survive is to run for your life. With a keyboard, you simply hold the ‘W’ key to sprint away from that nightmare. With a VR treadmill like the Virtuix Omni, you won’t just sprint at one set speed — you’ll have to actually run for your life. As I imagine this scenario in my head (playing Slender with the Omni, Oculus Rift, and Razer Hydra) I can almost feel the terror coursing through me. Mark my words, people are going to be screaming and sprinting for their lives, anxiously peering behind them to see if they’ve gotten away. I’ll be the first in line.
Coming to Kickstarter, Endorsed by Palmer Luckey
As reported by 3D Focus, Oculus VR Inc founder Palmer Luckey will be officially endorsing the Virtuix Omni in the forthcoming Omni Kickstarter campaign which is expected in May.
“Palmer and others (Chris Roberts, Paul Bettner) tried the Omni at SXSW in Austin this past March and greatly enjoyed it. We were allowed to film our demo night for Kickstarter, so we’ll have some fun footage to share. Palmer is endorsing the Omni for our Kickstarter campaign,” Goetgeluk told 3D Focus.
Virtuix is in the process of filming and editing their Kickstarter video materials. The Omni price has not been announced, but the obvious aim is to make it affordable for the everyday gamer.
Virtuix Omni and the Oculus Rift
Palmer Luckey (Oculus VR) and Jan Goetgeluk (Virtuix Omni)
Virtuix CEO Jan Goetgeluk recently picked up an Oculus Rift developer kit and tested it with the Omni for the first time.
“I tried the Rift with the Omni this morning, a magical experience… Walking around the Tuscany villa with the Omni must have been my strongest VR moment so far. My brain started to believe I was in Italy… VR users will want and need a natural interface to experience VR. I am now more convinced than ever that the Omni will become a crucial part of VR,” he told me.
Thanks to the built-in headtracking and wide FoV, the Oculus Rift makes a natural companion for the Omni. Together they take care of two huge components of the VR puzzle. Along with the Razer Hydra or a similar system for 6DOF hand-input, the trifecta will comprise a highly immersive virtual reality system at a price that consumers can actually afford — the first time this has ever happened.
The developers of MechWarrior Online, the latest title in the long-standing mech franchise, tried out the Oculus Rift at GDC 2013 and seem very impressed. Their report from GDC teases heavily that Oculus Rift support for MWO could be in the works.
MechWarrior Online is one of those titles that come to mind when you think about games that would work great with Oculus Rift support or in virtual reality in general: first off, you’re seated within a huge Mech – and who hasn’t dreamt of steering 50 tons of hardened steel through a futuristic battlefield? But more importantly, many reviewers have noted that games that put you in a seated position would be the best fit for the Rift.
Tony Bowren recently shared his rather ingenious solution to the problem of the Oculus Rift Development Kit’s lack of positional tracking using the Razer Hydra on YouTube. He talks to us about this project, a little of himself, and walks us through the code changes needed to achieve Oculus Rift positional tracking with the Razer Hydra yourself.
If you’d like to skip to the Code walkthrough, select ‘Page 2’ from the select box above or click here.
The Oculus Rift Dev Kit and Its Greatest Shortcoming
What’s missing from the Oculus Rift developer kit besides a high resolution display? Positional tracking, or the ability to detect the position of a user in physical space (we went into more detail a little while back if you’re interested). Tony Bowren caused a stir online when he recently posted a video demonstrating his clever solution to this issue.
That solution was the Razer Hydra. Cunningly strapped to the back of the HMD, one of the Hydra’s controller units acted as a positional sensor and allowed Tony to use that data to customise the Oculus Rift Tuscany Demo, part of the Oculus SDK. One of the standout moments of the original video (see above) is when Tony’s son demonstrates leaning out of one of the windows and looking around, something not currently possible with the Rift dev kit alone. We reached out to Tony, eager to learn more about his approach to this issue and to find out a little more about him. Happily, he not only agreed to an interview but also to share his annotated code with us (see the code walkthrough on the second page of this article).
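The annotated walkthrough is on page 2, but the core idea is simple enough to sketch here: treat the Hydra controller strapped to the back of the headset as a position sensor, and every frame add its displacement from a calibrated neutral pose to the camera that the Rift’s orientation tracking already drives. The Unity-style snippet below is purely conceptual — it is not Tony’s code (his changes live in the Oculus SDK’s C++ Tuscany demo), and ReadHydraPosition() stands in for whatever call your Sixense plugin exposes:

```csharp
// Conceptual sketch only (not Tony's actual code): use a Razer Hydra controller
// mounted on the HMD as a positional tracker and apply its displacement from a
// neutral pose to the eye camera, on top of the Rift's rotational tracking.
using UnityEngine;

public class HydraPositionalTracking : MonoBehaviour
{
    public Transform eyeCamera;          // camera already rotated by the Rift's tracker
    public float worldScale = 1.0f;      // metres of camera motion per metre of Hydra motion

    private Vector3 neutralPosition;     // Hydra position when the player sits centred

    void Start()
    {
        // Calibrate: assume the player is sitting in their neutral pose on startup.
        neutralPosition = ReadHydraPosition();
    }

    void Update()
    {
        Vector3 headDelta = ReadHydraPosition() - neutralPosition;
        eyeCamera.localPosition = headDelta * worldScale;
        // A fuller implementation would also compensate for the controller being
        // mounted behind the eyes (an offset that rotates with head orientation).
    }

    // Stand-in for the Sixense/Hydra plugin call that returns the tracked
    // controller's position relative to the base station, in metres.
    private Vector3 ReadHydraPosition()
    {
        return Vector3.zero; // replace with the real plugin call
    }
}
```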
Road to VR: Can you tell us a bit about yourself?
Tony: My education is a Mechanical Engineering (Robotics Controls) Degree from UCI but I quickly went into gaming instead of engineering. My first graphics job was to make a 3D intro for Interplay Productions back in 1994ish.
I have worked at Interplay and Acclaim before going into commercial and film work. I was an FX artist on Warner Bros’ Osmosis Jones before coming back to games again. October will mark 10 years with NCsoft, 2 of that working on GuildWars cinematics and the other 8 making Wildstar. I am most interested in working on human / computer interaction especially as it relates to art creation. My goals in getting more involved with VR programming are to reduce the barriers for people to create in VR and create tools that make it fun and intuitive. I am interested in developing approaches that really disrupt the way people think about content creation. In order to do this, work has to be done on better input methods, specifically better tracking of head and hands.
Road to VR: Would you describe yourself as a VR enthusiast? When did you become aware of the Oculus Rift?
Tony: Growing up in the 80’s I was always interested in the idea of VR but there was never any way for me to really be “enthusiastic” about something I could never experience much. I heard about the Rift from some friends who saw “John Carmack’s” new goggles at E3 last year. I was immediately on Google, finding MTBS3D.com and educating myself on all the details. I woke up every morning and checked Kickstarter for the Rift, and finally on August 1st I was able to be the 22nd backer.
Road to VR: What interests you about the Oculus Rift and where do you think it might lead the games industry? Is there one game in particular you’re interested in seeing ‘in’ the Rift?
Tony: What interests me about the Rift is the ability to put the user and the computer in the same “space”. When I watch IronMan, and see Tony Stark physically interacting with all his data and models I get very excited to be able to work like that. I have always loved the Kinect and [PlayStation] Move, but to effectively use them I have to be 6 to 8 feet from the screen. Bringing that screen up to my face eliminates all that and suddenly all that technology becomes incredibly more compelling.
Road to VR: Do you think there are significant implications for non-gaming interfaces and applications? Is there anything in particular you’d be interested in seeing?
Tony: Minecraft, Skyrim, and any good flight and driving simulators would be fantastic. I would love to see an MMO in VR, but it would have to be designed with much less emphasis on UI and keyboard interaction than most currently are. I have thought a lot about these challenges and doing it well really does require significant design time.
Road to VR: The Razer Hydra has been out quite some time (and some would say ahead of its time) — do you feel it might be about to enter a renaissance with the reinvigoration of virtual reality?
Tony: In terms of non-gaming interfaces, back in January I started playing with the Hydra in an attempt to track a head and create a virtual window into my computer. You can see the video here:
Tony: Tracking with the Hydra was a pretty effective test. I had tried using OpenCV to face track, but the processing was too slow. The Hydra seemed to have no computational overhead, but one thing I did notice was that the aluminum frame of the Macbook Pro definitely warped the magnetic field. Different areas of the house or desk also affected it. This is the primary weakness of the Hydra, but because typical head motions are limited in range and speed, and because there are not typically a lot of metal objects near your head, I feel it has potential.
Road to VR: What interests you about the Razer Hydra?
Tony: The Hydra interests me because it is a true three-dimensional input device and it has buttons. I have played with many types of gestural input schemes that read hands, but they don’t give reliable results. My mouse button ALWAYS clicks, it always drags and it always stops moving when I remove my hand. Without physical buttons, I have never been able to make any other gestural scheme work as effectively. If it’s frustrating and inconsistent then it will not replace the mouse.
Road to VR: What next for you and the Oculus Rift? Any special projects you’d like to share with us?
Tony: My next project is going to be with the Kinect and the Rift. I have all the code now to merge the two devices so once I get some time that is what I plan to play with.
Head over to page 2, where Tony walks us through the code changes to add Hydra positional tracking to the Oculus SDK’s Tuscany Demo.
Ever wondered what the inside of the Oculus Rift dev kit actually looks like, or how it scores when it comes to repairability? If you have, be sure to check out iFixit’s extensive Oculus Rift disassembly. The writers take apart the device step by step and show what kind of choices Oculus VR has made during the design of the 7-inch developer kits with the help of great, descriptive pictures. What they find bodes well for the future of the Oculus Rift mod community.
Apparently, all that’s needed to separate the different parts of the Oculus Rift dev kit is a Phillips screwdriver, a plastic opening tool, and a quarter dollar. According to the iFixit teardown, the Rift is also very easy to reassemble and can be taken apart in less than 10 minutes. Because of this accessibility, the writers give the Rift a repairability score of 9 out of 10 points.
Among the components identified in the teardown is a chip with “A983 2206” printed on top – iFixit suspects that “this is a three-axis magnetometer, used in conjunction with the accelerometer to correct for gyroscope drift”
Control Box (feeds the HMD with HDMI, DVI, USB and power input): Realtek RTD2486AD display interface controller, 256 KB Winbond W25X20CL serial flash, and a Techcode TD1484A synchronous rectified step-down converter
So, if you should ever have problems with your Oculus Rift dev kit or want to upgrade its components, the good news is that you can grab your soldering iron and tweak the device yourself – which gives hope that future consumer versions of the Rift might be just as accessible.
Oculus Rift Mod Potential
The ease of disassembly offers great prospects for Oculus Rift modding: since the unit is so accessible and can be taken apart within minutes, adding features like positional tracking or Ambilight-like peripheral lighting is pretty easy… in fact, these additions have already been made! Caleb Kraft of hackaday.com has used an LED strip to illuminate the peripheral vision just like Ambilight does on your TV screen, while Tony Bowren has used one of his Razer Hydra controllers to add positional tracking. The results are stunning, especially considering that we’re only in the early stages of Rift hacking.
There are even more features that could be added to the Rift relatively easily: an LED system for optical positional tracking via a camera (akin to what PlayStation Move and PlayStation Eye do), dual cameras on the front of the Rift for video pass-through and AR applications, or even fancier technologies like small-scale eye-trackers that would make foveated rendering a possibility.
The possibilities are endless and all of us at Road to VR are excited to see what other features the hacking community will come up with. Do you have an idea for a Rift feature that the hacking crowd should tackle? What would you like to see as an addition to your dev kit? Leave us a comment with your ideas!