The Gallery community Q&A with Denny Unger from CloudHead Games

We put your questions to Denny Unger from CloudHead Games, the development studio behind the Oculus Rift integrated game The Gallery: Six Elements, a game that seeks not only to revive memories of classic adventure games such as Myst, but to wrap it all in a virtual reality showcase.

Early Oculus Rift Reviews from Two Experts

Since the first reported arrival of an Oculus Rift developer kit on March 27th, units have been trickling out to developers around the world. Two folks that I follow closely got their hands on a Rift to examine, and both happen to be experts in their respective fields. Each has posted an early review of the dev kit, and both accounts are well worth a read.

Virtual Reality Researcher, Oliver Kreylos’ Early Oculus Rift Review

Oliver Kreylos is a PhD virtual reality researcher who works at the Institute for Data Analysis and Visualization, and the W.M. Keck Center for Active Visualization at the University of California, Davis. He maintains a blog on his VR research at Doc-Ok.org, where a few months back he showed us what it’s like to be inside of a CAVE.

Kreylos approaches the Oculus Rift as someone who has spent significant time with several head mounted displays and other virtual reality systems. He got to play with the Rift through a friend and says that he’ll “hold back a thorough evaluation until I get the Rift supported natively in my own VR software, so that I can run a direct head-to-head comparison with my other HMDs, and also my screen-based holographic displays systems using the same applications.” That said, he still delivers a detailed initial Oculus Rift review.

For Kreylos, praise comes right out of the gate:

…I’m very relieved that the Oculus Rift is as good as I had hoped. It’s surprisingly light, and the “ski goggle” design, which had me slightly worried, actually works. One unexpected benefit of the design is that it’s possible to put on and take off the unit without having to deal with the head straps, just by holding it up to one’s face, and still get the optimal view.

Kreylos compares the Rift to his other HMDs, the Sony HMZ-T1 and eMagin Z800 (see our HMD comparison chart):

I am utterly impressed by the optical properties of the lenses, especially considering how strong they are. Once the display sits properly (and it’s easy to seat), the entire screen area is in focus and clear. This is very different from my Z800, where I’ve yet to find a position where the screens are entirely in focus, and even from the HMZ-T1 with its better optics. There is very little chromatic aberration; I only saw some color fringes when I started looking for them. Given that the Rift’s field of view is more than twice that of the Z800 and HMZ-T1, it’s an amazing feat.

But, as we’ve all heard, the resolution of the dev kit leaves much to be desired. Oculus was well aware of this long before the kit even reached Kickstarter, and they say that the consumer model will have a significantly improved screen. Interestingly, Kreylos seemed more concerned about the ‘screen door effect’, which comes from the black gaps in a display’s subpixel structure, than about the low resolution:

Now, the Rift has significantly more solid angle real estate over which these pixels are spread, so it is comparatively low-res, but that didn’t really bother me. No, the Rift’s problem is that there are small black borders around each pixel, which feels like looking through a screen door attached to one’s face all the time. I found that quite distracting and annoying, and I hope it will get fixed.

He also mentions ghosting (the blurring of moving on-screen content), which is especially noticeable when turning your head, because the entire scene moves (and blurs) around you.

When I tried the Oculus Rift at GDC the other week, ghosting felt more prevalent in some games than others, but it’s possible that I was simply getting used to the effect. Ghosting can be reduced by using a screen with a faster pixel response time (the time it takes pixels to change color) and is expected to be greatly improved in the consumer version.

Kreylos moves on to the software, and while he says he’ll need to get his own programs working first to check how good the calibration is, when it comes to the distortion correction, he says that “the developers did a bang-up job.”

The Rift (or rather its SDK) does lens correction via post-processing. First, the virtual world is rendered into a “virtual” camera image, which is then resampled using a simple radial undistortion formula based on a quadratic polynomial. The fundamental problem with this approach is that it has to resample a 1280×800 pixel image into another 1280×800 pixel image, which requires very good reconstruction filters to pull off. The SDK’s fragment shader simply uses bilinear filtering, which leads to a distinct blurriness in the image, and doesn’t seem to play well with mipmapping either (evidenced by “sparkliness” and visible seams in oblique textures). The SDK code shows that there are plans to increase the virtual camera’s resolution for poor-man’s full-scene antialiasing, but all related code is commented out at the moment.
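To make the resampling step Kreylos describes a bit more concrete, here is a minimal sketch of that kind of radial undistortion lookup. The coefficients and function names are illustrative only (they are not taken from the Oculus SDK), and it uses nearest-neighbour sampling on the CPU where the SDK uses a bilinear fragment shader on the GPU:

```python
import numpy as np

# Illustrative coefficients for a radial scale factor of the form
# 1 + k1*r^2 + k2*r^4 -- not the SDK's actual values.
K = (1.0, 0.22, 0.24)

def undistort_coord(u, v, cx=0.5, cy=0.5, k=K):
    """Map an output pixel coordinate (u, v in [0, 1]) back to the
    coordinate in the pre-rendered image that should be sampled."""
    du, dv = u - cx, v - cy
    r2 = du * du + dv * dv
    scale = k[0] + k[1] * r2 + k[2] * r2 * r2
    return cx + du * scale, cy + dv * scale

def resample(src):
    """Nearest-neighbour resampling of an (H, W, 3) image. The SDK uses
    bilinear filtering at this step, which is the source of the blurriness
    Kreylos points out; a better reconstruction filter (or a higher-resolution
    intermediate render) would reduce it."""
    h, w = src.shape[:2]
    out = np.zeros_like(src)
    for y in range(h):
        for x in range(w):
            u, v = undistort_coord((x + 0.5) / w, (y + 0.5) / h)
            sx, sy = int(u * w), int(v * h)
            if 0 <= sx < w and 0 <= sy < h:
                out[y, x] = src[sy, sx]
    return out
```

The key point of his critique is that a 1280×800 image is being resampled into another 1280×800 image, so any shortcoming in that reconstruction filter shows up directly as blur.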

He goes on to mention that the Tuscany demo gave him a “pronounced feeling of dizziness from walking,” though he couldn’t put his finger on what was causing it — he hasn’t gotten dizzy in other VR environments.

Interestingly, motion sickness seems to be highly variable from one person to the next, and it even seems to be related to the demo being used. I spent at least 30 minutes in the Tuscany demo with the Razer Hydra and had no issues with motion sickness. My first experience with the Oculus Rift and Hawken, on the other hand, gave me a bit of nausea after about 10 minutes.

Kreylos says he’ll be looking into the cause of motion sickness once he has his own software running on the Oculus Rift. I’m sure others will be researching the problem as well as dev kits continue to arrive. I’m looking forward to what conclusions Kreylos reaches after a full Oculus Rift review.

There’s more to read in Kreylos’ early Oculus Rift review; go check out his full article!

Consumer 3D Expert, Anton Belev’s Early Oculus Rift Review

Anton Belev has been running the 3dvision-blog for several years. He focuses on consumer stereoscopic 3D, such as 3D monitors and HDTVs. He approaches his early Oculus Rift review as someone who has experience with a number of consumer stereo systems and software, and who regularly plays games in 3D.

Belev starts out with some considerations for those who want to use the Oculus Rift with glasses:

Since I do wear prescription glasses as I’m a bit nearsighted, with -1.25 diopters what seemed to work best with the Rift was the middle B set of lenses as the A set produces a blurry image for me and the C set is a bit too much. I’ve also tested trying to fit my prescription glasses inside the Rift as they are pretty compact in size (the do fit inside), the effect I get with them inside using the A set is pretty much the same as when using the B set without the glasses. I prefer to use the B set of lenses as it is more comfortable than to try to wear my glasses inside the rift and if you wear larger prescription glasses you may have trouble fitting them inside.

Like many who have tried the Oculus Rift, Belev notes the low resolution.

“…looking at the Oculus Rift LCD display without the lenses it looks great in terms of detail, but since the lenses zoom it you can clearly see the pixels,” he wrote. Like many others have experienced, myself included, he mentions that resolution concerns somewhat fade when you focus on the experience. “…if you stop paying too much attention to the pixels you can still enjoy what you get,” he continued.

Like others, Belev found that motion sickness is not universal; it happens in one demo but not in others.

“Strangely enough I get nausea fairly quickly only in the Oculus Tuscany Demo and not in any other of the demos I’ve tried or in TF2 (or at least not as fast as in the Tuscany Demo),” he wrote.

Belev has an interesting take on the 3D effect of the Oculus Rift. He acknowledges that, “the focus of the stereoscopic 3D support with the Rift is making things seem realistic,” but goes on to mention that the effect may seem subdued to 3D PC gamers who are used to games that provide exaggerated 3D:

And while [the stereoscopic 3D] works quite well in the demos, they may seem a bit flat for people used to playing games in stereoscopic 3D mode with a lot of depth – not intended to provide realistic proportions, but just to have a lot of depth. So if realism is your goal, it works quite well even now, though the lower resolution is a bit of a drawback here as well, but virtual reality does not need to always be true to real things, it can be used to provide “unreal” experiences as well. I suppose it can take some time for developers to pick up on stereoscopic 3D support for the Rift to be able to use it as best as possible and also to give adequate user control over the depth levels. From the currently available supported software I cannot say I’m impressed by the stereoscopic 3D support as much as by the VR experience, though both work well together.

I think he’s right about developers eventually picking up on and controlling the 3D effect. I can already think of some interesting “unreal” experiences that could be achieved by playing with depth — something like Hitchcock’s signature dolly zoom immediately comes to mind, but it’ll have to go out the window if it causes motion sickness.

There’s lots more detail to be gleaned from Belev’s early Oculus Rift review; go check it out!

See All Oculus Rift News

Reactive Grip Brings Tactile Feedback to the Razer Hydra, Other Motion Input Devices [video]

At GDC 2013, a company called Tactical Haptics showed off a tactile-feedback system, called Reactive Grip, for motion-controlled input devices. The prototype I got to use consisted of a hacked up Razer Hydra built into a 3D printed housing with four sliders that move up and down in your hand as you grip the unit. For certain situations, like swinging a sword or flail, the system creates an impressively convincing sensation that could bring us one step closer to immersive virtual reality.

A Sensational Sensation

Palmer Luckey, creator of the Oculus Rift, trying the Reactive Grip prototype at GDC 2013 – Photo credit: Make

With the Reactive Grip system, translational motions and forces can be portrayed by moving the sliders in unison with the direction of force, while moving the sliders in opposite directions can create the feeling of the device wrenching in the user’s grasp.
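As a rough illustration of that mapping, here is a hypothetical sketch of how a push/pull force along the grip and a wrenching torque about it could be turned into per-slider displacements. The function, gains, and travel limits are my own invention for illustration, not Tactical Haptics’ actual control scheme:

```python
def slider_offsets(axial_force, twist_torque, num_sliders=4,
                   force_gain=1.0, torque_gain=1.0, max_travel_mm=5.0):
    """Hypothetical mapping from a grip force/torque to slider displacements.

    axial_force  -- push or pull along the handle; all sliders move in unison.
    twist_torque -- wrenching about the handle; alternating sliders move in
                    opposite directions so the grip feels like it is twisting
                    in the user's grasp.
    """
    offsets = []
    for i in range(num_sliders):
        unison = force_gain * axial_force
        opposed = torque_gain * twist_torque * (1.0 if i % 2 == 0 else -1.0)
        raw = unison + opposed
        # respect the sliders' physical travel limits
        offsets.append(max(-max_travel_mm, min(max_travel_mm, raw)))
    return offsets

# A sword striking a target: a sharp translational pulse, no twist.
print(slider_offsets(axial_force=3.0, twist_torque=0.0))
# The handle wrenching in the hand: sliders move in opposite directions.
print(slider_offsets(axial_force=0.5, twist_torque=2.0))
```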

When I asked Palmer Luckey what he thought about the system, he told me it was “totally badass.”

The sword and flail demos (see the video above) stole the spotlight for me. While the slicing motion felt good, the stabbing motion felt immediately natural. As you stab into the material, the handle of the unit seems to push back against you, as though there really is some resistance at the end of the virtual sword. Sliding the sword under the arm of the dummy and letting it slowly slide off the blade felt very convincing.

The flail demo was even better. Imagine swinging a flail over your head; think about the way that the grip of the flail would pull in a circular motion around your hand as the mass above you swings about. That sensation was very convincing with Reactive Grip. It almost felt like there really was a weight flying around above my head attached to the handle.

An Extra Layer of Immersion

Reactive Grip could provide another layer of immersion when combined with an experience like the Oculus Rift Razer Hydra ‘Tuscany’ demo. With a head mounted display, reaching out to grab objects with your own virtual hands is already highly immersive — adding realistic tactile sensation would take things to the next step.

Immediately I imagined that this would be incredible for a game like Chivalry: Medieval Warfare (PC, 2012), wherein you engage in combat using any number of medieval weapons — including a flail. Such a setup would take advantage of both motion sensing for attacks and tactile feedback when striking opponents or parrying weapons.

Founding and Future of Tactical Haptics and Reactive Grip

Tactical Haptics and the Reactive Grip technology come out of the University of Utah’s Haptics & Embedded Mechatronics Laboratory. The company is a recent startup, founded by Dr. William Provancher, which hopes to commercialize the Reactive Grip technology.

 

In addition to the handle-based unit that I tried at GDC, the technology can be applied to a number of other applications, such as finger-based systems that could be built into gamepads, and small tool-like implements that could provide tactile feedback for virtual surgery training.

Provancher, who holds a Ph.D. in Mechanical Engineering and is currently an Associate Professor at the University of Utah, told me that Reactive Grip is not restricted to the Razer Hydra. It could be implemented with a Wiimote, a PlayStation Move, or even an entirely new motion tracking peripheral.

The company intends to launch a Kickstarter this summer to fund developer kits, expected to cost $150, though the final implementation of the technology might happen on a license basis rather than a product directly from Tactical Haptics.

Excitingly, Provancher thinks that a system incorporating Reactive Grip would only add about $50 to the price tag of an in-production motion tracking device, which I think many looking for an immersive virtual reality experience would be willing to pay for.

FaceX Real-time Facial Animation Could Add Immersion to Multiplayer Games [video]

At GDC 2013 I saw a cool demo at the AMD booth called FaceX. Created by a company called Mixamo, FaceX tracks a player’s face using a standard webcam and can use that data to animate a 3D model in real-time. While this tech could be used as cheap motion capture to animate computer-controlled characters, the much more exciting implication would be real-time facial animation for multiplayer games where teammates communicate with voice chat.

FaceX could be an elegant solution to increasing immersion in current multiplayer games that aren’t yet making the leap to full blown VR.

In theory, the system is perfect for a PC gamer. Most likely they’ve already got a microphone/headset to communicate with teammates. If they’re on a laptop, they already have a camera, or if they’re using a desktop, a cam can be picked up on the cheap — either way, they will already be close to the camera for accurate tracking. Developers wouldn’t even need to sync the audio to the facial animation as they are both being delivered in real time. There’s a huge audience of PC gamers to which this technology could be deployed quickly and inexpensively.
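To see why syncing is not much of a problem, consider how little data per-frame facial animation actually is. Here is a hedged sketch of the kind of compact packet a game could send alongside its voice stream; the blendshape names and packet layout are hypothetical, not Mixamo’s actual format:

```python
import struct, time

# Hypothetical blendshape weights a face tracker might output each frame.
BLENDSHAPES = ["jaw_open", "smile_l", "smile_r", "brow_up_l", "brow_up_r",
               "cheek_puff", "eye_blink_l", "eye_blink_r"]

def pack_face_frame(weights, timestamp=None):
    """Pack one frame of facial animation (one float per blendshape, 0..1)
    into a small binary payload. Eight weights plus a timestamp is 40 bytes;
    at 30 fps that is roughly 1 kB/s, negligible next to a voice stream."""
    if timestamp is None:
        timestamp = time.time()
    assert len(weights) == len(BLENDSHAPES)
    return struct.pack("<d%df" % len(weights), timestamp, *weights)

def unpack_face_frame(payload):
    """Inverse of pack_face_frame: recover the timestamp and named weights."""
    values = struct.unpack("<d%df" % len(BLENDSHAPES), payload)
    return values[0], dict(zip(BLENDSHAPES, values[1:]))

# Example: a half-open jaw and a slight smile.
frame = pack_face_frame([0.5, 0.2, 0.2, 0.0, 0.0, 0.0, 0.0, 0.0])
print(unpack_face_frame(frame))
```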

Using this tech with console gamers would be a bit more complicated, but not impossible. Current console cameras like the Kinect and PlayStation Eye probably don’t have the necessary resolution to accurately track gamers’ faces from as far away as the couch. Higher resolution cameras would be an easy fix, and we’ll probably see that from the Kinect 2 and the PS4 Eye.

FaceX tracks more than just mouth and eye movement. It uses the incoming data to convincingly animate cheeks and other subtle face muscles thanks to machine-learning analysis of over 30 test subjects, Mixamo told me.

You can see that it’s a bit jumpy in the video as I turn to the side to ask the rep questions and look down at my camera, but front-on, where PC gamers are looking most of the time, it works quite well, and I would expect it to get better over time.

I can imagine it now: hunkered down in Battlefield 3 with bullets whizzing over my head. The M-COM Station objective is just a few yards away, but the defensive line has me and a teammate suppressed. My teammate, who is communicating with me over voice chat, turns to me and says, “Drop some smoke and head straight for the station, I’ll flank left and take them out,” and all the while I can actually see his mouth moving in real time — and hear the audio emanating from his avatar. This would be a significant step forward over simply hearing a disembodied voice coming from nowhere.

Another great place for FaceX would be slower-paced MMO games like Second Life or World of Warcraft, where players frequently have extended face-to-face interaction through trading, meetings, and general banter. Second Life with FaceX would be fun, but the novelty of seeing a non-human avatar (like a Troll or Orc in World of Warcraft) animated as though you are talking through it would be very cool indeed.

While this tech would be great for virtual reality as well, there are some issues. The near-term future of virtual reality will likely revolve around head mounted displays like the Oculus Rift, which means the eyes are always going to be obscured from a normal webcam. There are ways around this, like small eye-tracking cameras inside of the HMD, but that adds an extra hardware requirement which quickly begins to shave down the potential audience. There’s also the issue of head-tracking: players using head mounted displays will not always be looking directly at a computer monitor in front of them, so their faces will sometimes be obscured when they turn away from the camera and tracking would be lost.

But mainstream virtual reality gaming is still a little ways off. Meanwhile, there are tons of games that could benefit from this tech; Mixamo tells me that FaceX will be launched in the next 6 months.

Oculus Rift Team Fortress 2 Hats Now Available

A few weeks back we learned that the ever-popular Team Fortress 2 (TF2) would get Oculus Rift support; it was also revealed that Oculus Rift owners would get a unique TF2 hat (a popular in-game item type). Today the TF2 Oculus Rift hat is officially redeemable by those who were part of the Oculus Rift Kickstarter, and apparently by anyone who ordered through the Oculus website before April 1st.

The hat, officially known as the TF2VRH (Team Fortress 2 Very Rectangular Hat), can be redeemed by going to the Oculus Account Info page and entering the email address associated with your order. Oculus will email you a link to your order status page, where you’ll find a ‘claim code’ button toward the top of the page.

Then you can follow these instructions to activate the hat inside of TF2. As some speculated, the TF2 Oculus Rift hat cannot be traded inside of the game. Here’s the official in-game description of the item:

Genuine TF2VRH

Level 1-100 Headset

So long, squares! Take a hike, triangles! Introducing the Team Fortress 2 Very Rectangular Headset, which brings hundreds of decades of rectangle research and technology to the absolute forefront of your face.

( Not Tradable )

Oculus Rift ‘Virtual Cinema’ Explored, Plus an Interview with Its Creator [video]

Ever wanted a full-size movie theater of your very own? A new Oculus Rift compatible project promises to grant your wish, and then some. We talk to Christer-Andre Fagerslett (aka namielus), the author of a new ‘Virtual Cinema’ project which leverages the immersive power of the Rift and aims to provide the ultimate movie viewing experience.

10 Oculus Rift Demo Reactions from GDC 2013

At GDC 2013, people waited upwards of two hours in line to try the Oculus Rift developer kit. Was it worth the wait? Did it live up to the hype? I interviewed 10 people right after they stepped out of the demo. I was expecting to hear people mention the lower than desirable resolution and perhaps the ghosting, but I came away very surprised to find out how much everyone — many of them non-developers — said they enjoyed the experience. To be clear, I didn’t leave out any interviews or cut any negative comments — this is the aggregate of what people told me. And yes, 10 is a small sample size, but the results certainly bode well for the Oculus Rift.

See All Oculus Rift News See All GDC 2013 News

For Those Who Can’t Wait, Here’s a 40 Minute Overview of the Oculus Rift Dev Kit

Tested.com has a great 40 minute test session of the Oculus Rift developer kit on video. They unbox the unit and test the various virtual reality control schemes baked into Team Fortress 2.

We have one of the first Oculus Rift development kits in house, and spend the day testing it in Team Fortress 2. Watch how this virtual reality head-mounted display works in-game with every available control setting, as Will practices rocket jumping and we discuss the promises and challenges of VR.

See All Oculus Rift News

Exclusive Hands-on: Sixense MakeVR Brings 3D Modeling to the Masses with Oculus Rift and Razer Hydra Support

Sixense, maker of the Razer Hydra, is about to storm into the software arena with an impressive user-friendly computer aided design program called MakeVR. The software is built from the ground up with Oculus Rift and Razer Hydra support. The combination enables even entirely inexperienced users to make complex 3D models with ease, while at the same time retaining powerful CAD functionality. In addition to trying MakeVR for myself at GDC 2013, I sat down with Sixense’s Simon Solotko and Paul Mlyniec to learn more about this ambitious project.

Sixense is launching a Kickstarter campaign this month and they hope to use the funds to cram as much user-friendly functionality into the software as possible. They intend the end result to be a collaborative CAD environment that functions on natural interaction and is easy enough for the novice, but powerful enough for the master.

The smooth, intuitive control that you see above is not concept footage; it’s exactly how MakeVR works.

I used an alpha build of MakeVR at GDC 2013 with the Razer Hydra and a 3D HDTV. After a 30 second training session, I was easily navigating the design space, manipulating objects, and scaling to my heart’s content. Having depth-perception and independent 1:1 hand controls makes for an instantly natural experience which relieves the need to even grasp an X/Y/Z coordinate system.

When I saw what was possible with MakeVR I knew that Sixense was onto something — for a younger audience this will be like unlimited virtual Legos on steroids, except with the ability to share your imaginative creations with a worldwide audience (and without the $50/kit price tag).

For someone like myself — who has zero CAD experience — MakeVR suddenly opens up to me the possibility of building complex 3D models that would have formerly required software that was too expensive and complex to manage.

And for the advanced user, Sixense says that MakeVR is ready to make both 3D printing models (thanks to .stl exporting) and detailed virtual goods which could be distributed in other games. MakeVR can import models from other software and uses the industry standard .sat file format.

Another exciting possibility is for those working on virtual reality games for the Oculus Rift. To make 3D models you’d currently need to work outside of the Rift in a normal modeling program, then drop your models into the game and put the Rift on to see how they look to scale and in 3D. With MakeVR you could work directly with the medium in which the model will eventually be used.

MakeVR Made for Head Mounted Displays

MakeVR Head of Development, Paul Mlyniec, told me that using MakeVR with a head mounted display like the Oculus Rift is the quintessential usage of the software. While it’s natural enough using the Razer Hydra to manipulate and build models on a monitor, adding a head mounted display makes it feel like you are reaching out with your own hands — not unlike the Oculus Rift ‘Tuscany’ Razer Hydra demo.

“[MakeVR is] designed very much with head mounted displays in mind… you see from the tool panel that you have all the control that you need — you’re never groping for a mouse and keyboard — everything is self contained, and expected that [the software] is going to take over all the senses of the user,” said Mlyniec.

GDC 2013: Sixense MakeVR Interview

I had the chance to sit down with two folks from Sixense who are leading the MakeVR project — Product Manager, Simon Solotko, and Lead Developer, Paul Mlyniec, to learn more about the project:

GDC 2013: Inside Half Life 2 with the $15,000 NVIS SX60 Head Mounted Display

At GDC 2013 I spent some time with the folks from Forth Dimension Displays, maker of high-end microdisplays. The company’s CEO, Greg Truman, told me in an interview that they were at GDC mainly to evangelize VR. To that end they brought with them the NVIS SX60, a $15,000 head mounted display which uses their microdisplays. Inside was Half Life 2 in full 3D — when I put on the SX60 my first thought was, ‘I can’t wait for the Oculus Rift to have this kind of resolution!’

Had I known how much the SX60 cost before putting it on, I probably wouldn’t have touched it for fear of the ‘break-it, buy-it’ policy. This is a hand-built head mounted display made for research, military, and other high-end uses — not for consumer VR. As they say, ignorance is bliss.

Attached to the NVIS SX60 was an InertiaCube IMU for head tracking. Inside was a retail copy of Half Life 2 with 3D drivers. When I put on the SX60 I was looking at the plaza scene from early in the game (before you get any weapons). The head tracking was tight and the 3D was great. I strolled around the scene with a controller. The guards in the plaza threatened me with their stun sticks as I approached — it’s a bit more menacing in an HMD than on a screen a few feet away.

The 1280×1024 (per eye) resolution was excellent; the scene was very sharp. The moment I put on the SX60, I was thinking that I can’t wait for the Rift to achieve a similar resolution.

Not only was the resolution good, but there was no ghosting at all (blurring during movement) thanks to a very fast switching rate. The display technology in use, Liquid Crystal on Silicon, doesn’t use subpixels, opting instead to stack short pulses of color on top of one another. This means there’s no ‘screen door effect’; however, I did see a bit of color fringing when turning my head — a more than fair price to pay for the elimination of ghosting.

The 60 degree field of view doesn’t stack up to the Oculus Rift, but it isn’t limited by the display; it comes down to the optics. For GDC, Forth Dimension Displays chose to bring an HMD that was quick to put on, like the NVIS SX60. HMDs with a wider field of view tend to be bulkier and have longer setup times, which isn’t suitable for a demonstration environment.

Stretching out the 1280×1024 resolution over a larger field of view would reduce the clarity of the screen. However, Forth Dimension Displays has a remedy for that — their latest microdisplay has a whopping 2048×1536 resolution. That’s over 3.1 million pixels crammed into a 0.83 inch diagonal, putting the DPI off the charts at 3084.34!
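For anyone who wants to check that figure, the pixel density works out directly from the resolution and the panel diagonal quoted above:

```python
import math

h_px, v_px = 2048, 1536        # microdisplay resolution
diagonal_in = 0.83             # panel diagonal, in inches

diagonal_px = math.hypot(h_px, v_px)   # 2560 pixels along the diagonal
ppi = diagonal_px / diagonal_in
print(f"{ppi:.2f} pixels per inch")    # -> 3084.34 pixels per inch
```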

The company told me that, at scale, they believe a VR headset could be sold with their latest display for $1500 — a price point which they think will represent the high-end consumer virtual reality enthusiast market if VR makes it mainstream.

See All GDC 2013 News

Interview With Forth Dimension Displays CEO: Now is “the best opportunity ever to get consumers wearing head mounted displays”

At GDC 2013, I sat down with Greg Truman, CEO of Forth Dimension Displays, for a chat about the future of VR. Truman believes that the time is right for virtual reality. He told me that right now is “the best opportunity ever to get consumers wearing head mounted displays.” Forth Dimension Displays works primarily in high-end microdisplays for military and research industries which come with a suitably high-end price. The company was not really at GDC 2013 to sell their product. Instead, they were there to help evangelize VR in the hope that, this time, it will break into the mainstream. Truman told me he loves what the Oculus Rift folks are doing and hopes they succeed because it has the potential to benefit everyone in the HMD market.

See All GDC 2013 News

GDC 2013: Michael Abrash on “Why VR is Hard (and where it might be going)”

At GDC 2013, the legendary Michael Abrash took to the stage to talk about the Oculus Rift and virtual reality. Abrash, now working at Valve, has been researching augmented and virtual reality technology for the company. When he began his talk I thought he was being discouraging about virtual reality because of the many problems that need to be solved for a truly perfect VR experience. However, as he continued, I realized that he was actually being encouraging — he sees the problems ahead as challenges ripe to be solved by eager developers; this is an opportunity to define the future of gaming. Keep an eye on his blog for more on VR from Abrash.

Updated (4/1/13): Added videos in middle of presentation.

Michael Abrash’s GDC 2013 Presentation: Why Virtual Reality is Hard (and where it might be going)

Good afternoon. I’m Michael Abrash, and I’m part of the group working on virtual reality at Valve. Today I’m going to share as much of what we’ve learned as I can cram into 25 minutes; I’m going to go fast and cover a lot of ground, so fasten your seat belts!

17 years ago, I gave a talk at GDC about the technology John Carmack and I had developed for Quake.

That was the most fun I ever had giving a talk, because for me Quake was SF made real, literally.

You see, around 1994, I read Neal Stephenson’s Snow Crash, and instantly realized a lot of the Metaverse was doable then – and I badly wanted to be part of making it happen.

The best way I could see to do that was to join Id Software to work with John on Quake, so I did, and what we created there actually lived up to the dream Snow Crash had put into my head.

While it didn’t quite lead to the Metaverse – at least it hasn’t yet – it did lead to a huge community built around realtime networked 3D gaming, which is pretty close.

Helping to bring a whole new type of entertainment and social interaction into existence was an amazing experience, and it was all hugely exciting – but it’s easy to forget that Quake actually looked like this:

[Slide: screenshot of Quake]

And it took 15 years to get to this:

[Slide: screenshot of a modern game, 15 years later]

See All GDC 2013 News

Exclusive: Hands-on With the Oculus Rift Tuscany Razer Hydra Demo — The Most Fun I’ve Had in VR Yet

At GDC 2013 I met with the good folks from Sixense, makers of the Razer Hydra controller. They put me into the Oculus Rift and loaded up the Tuscany demo (built in Unity). This particular version of the demo has full Razer Hydra support, thanks to Sixense. Scattered around the space is a myriad of physics-driven objects to interact with — and damn is it fun! The combination of the Oculus Rift and the Razer Hydra is potent and incredibly immersive. The Tuscany demo, infused with support for the Razer Hydra, is hands-down the most fun I’ve had in virtual reality yet.

The Oculus Rift and Razer Hydra Are a Winning Combo

If you intend to develop for the Rift and haven’t jumped on the Razer Hydra bandwagon yet, I highly advise that you do so (don’t miss their current sale).

Interacting with the Tuscany demo using the Razer Hydra was not only natural — it was fun! Reaching out and touching objects with your own virtual hands is miles more immersive than using a keyboard and mouse. You can do things with the Rift and the Hydra that you simply can’t do with a monitor and traditional input.

The moment that really sold me was when I tossed a basketball up into the air above my head (see 3:35 in the video). I threw it in a way that the trajectory of the ball would have it landing somewhere slightly behind me. My natural reaction was to look at the ball as it was coming down, lean back, and grab it — and that’s exactly what I did. I didn’t have to think to myself, ‘how do I need to move the controls in order to do what I want to do,’ I simply followed the ball as it flew through the air, reached up behind me, and plucked it out of the air.

I was blown away the moment I caught the ball and realized what had just happened. The immersion that comes from the Rift and the Hydra is extremely impressive and vividly promising. The barriers that separate humans and computers are falling away before our very eyes.

A Visit to Tuscany

There was more to the Oculus Rift Tuscany Razer Hydra demo (we’re going to need a shorter name) than the basketball. There were plenty of objects: books, chairs, logs, barrels, etc. Interacting with things felt incredibly natural.

Picking up a book and bringing it close to your face to see detail was really cool. With a book in my hands I couldn’t help but want to open it to see the pages inside — something Sixense says they would have added given more time. Also exciting was the ability to hold out your finger using one of the Hydra’s buttons and trace the lines of the cover art on one of the books; the accuracy is quite impressive (0:45 in the video).

Maybe it was that I was tossing heavy barrels hundreds of feet like a superhero, but throwing objects was immensely fun. At first I was releasing objects just a bit too soon, but eventually I got the hang of it and was launching objects clear off of a cliff into the sea below. I nailed the underhand volleyball serve on my first try which was quite satisfying (7:11).

You can easily grab two objects at the same time, or pass an object back and forth between your hands as you manipulate it. There are clear implications for gameplay here: being able to do something with one hand while doing something entirely different with the other is natural and useful. Examples that come to mind include holding a flashlight in one hand while pushing open doors with the other, holding a magnifying glass while inspecting an object, hooking two interlocking pieces together to form a key or other useful object, or loading shells into a shotgun.

The Tuscany Razer Hydra demo is absolutely the most fun I’ve had yet with virtual reality. Soon, game developers will be wrapping these new natural interactions in compelling narratives and enticing gameplay and I can’t wait to step into those experiences. This is an extremely exciting time to be a gamer!

Sixense tells me that they intend to release this demo to the public in due time — more on that as we hear it.

Discuss this over in our forums.

Oculus Rift Dev Kit first impressions from Jayoh

Joe’s daughter enjoys stepping through the Tuscany demo using the Oculus Rift.

Oculus Rift Developer Kits are finally arriving at people’s doors. The first 300 units are either on their way or have already landed. One of the earliest recipients of a kit, and seemingly the first to publicly announce his new arrival, has torn himself away from his new prize long enough to talk to us about himself and his first few hours with the Rift.

Oculus Rift Dev Kits begin arriving with early backers!

Jayoh's Dev Kit still boxed.

The first few early Kickstarter backers have begun receiving their Oculus Rift Development Kits. We collate initial impressions and early arrival photos. As if that weren’t enough, Oculus has finally opened the virtual doors to its online Developer’s Area; we take a peek.
