The HTC Vive came to London recently, courtesy of Valve and PlayHubs, to form the centrepiece of a dedicated VR Jam. Mike Alger charts his room-scale developer adventures for us in this guest post from inside the event.
Mike Alger is a virtual reality designer focused on creating volumetric digital interfaces. With a background in motion design and video production, Mike’s postgraduate work focused on the translation of two-dimensional paradigms to three-dimensional design workflows for head mounted displays. He is an active member of the virtual reality community, delivering talks and participating in hackathons.
Another guy in his twenties was looking at me apprehensively, not sure if he should introduce himself or if we already knew each other.
The truth is, we had already met and talked face-to-face on several occasions… first at a group meetup on a floating platform in space, next at a planning meeting in a mostly empty pub, and most recently in the engineering room of the Star Trek Enterprise-D. We had only ever seen each other as avatars in VR Chat, a social virtual reality application. It was there that he originally recruited me to join his team for the “game jam” that would be happening in London. Three of us got together a couple of times to plan a concept we could try to build in 36 hours for HTC’s upcoming virtual reality system, the Vive, developed with Valve.
These meetings were typically accompanied by such passers-by as Batman with an Australian accent or a murderous-looking Chucky doll, each listening to our ideas and offering their own. In our brainstorming sessions, we floated ideas like a Wii Fit-style physical rehabilitation tool or an ‘escape the room’ style game.
The idea we ultimately settled on was to have the Vive-equipped user play the part of a robot engineer tasked with travelling to and repairing different areas of a spaceship while fighting off enemies. A second player outside the headset would have a normal screen showing the ship’s diagnostics. The game would rely on communication (and, hopefully, the breakdown thereof) between the players “outside” and “inside.” That’s the concept that had us, each in our separate bedrooms, standing around the Enterprise’s warp core in VR Chat to see what it might feel like.
Now, however, we were in the basement of London’s Somerset House and this guy wasn’t sure if I looked enough like the avatar he had been talking to. “Are you…?”
We staked our territory at the venue, where our team of four would spend the next two days. The rule of this game jam was simple: Have a playable game for the Vive running at 90 frames per second in 36 hours. There was no competition or winner, as is usually the case for these hackathon-style gatherings. Just creativity and collaboration with a crunchtastic deadline.
But why work for free, forgoing sleep for an entire weekend? Where is the reciprocity? The opportunity to use the typically exclusive and rare Vive headset, controllers, and Lighthouse tracking system was more than enough for me, a long-time fan of both Valve and VR. Rubbing elbows with some of the most experienced VR developers around was definitely a bonus too, though it certainly triggers some impostor syndrome.
The Vive system itself is, as many have stated before me, fantastic! What was a “screen door effect” due to lower resolution in previous head mounted displays is reduced to looking like granules of illuminated sand. The weight of both the headset and controllers is so minimal that they even feel ‘Happy Meal’ cheap, but the experiences are significantly beyond anything I’ve ever gotten with fries.
Initially, I asked to try a few demos and was shown Job Simulator and Tilt Brush. They’re as good as everyone says, particularly Tilt Brush’s color and brush picking interface. But, as someone with companion cube fuzzy dice in my Aperture Science-labelled vehicle, hauling an orange test subject jumpsuit and handheld portal device, there was one demo in particular that I wanted to try… but was too sheepish to ask (again, impostor syndrome). Instead, I worked all night amongst the pizza boxes, caffeinated drinks, and various prototype VR hardware. When all was quiet and most people were asleep, I went into the testing room with a team member and we turned on the Aperture Science demo for ourselves.
I try to rein in people’s expectations for VR. I avoid overhyping it as a medium. But that demo was nothing less than absolutely delightful! As far as I know, Valve doesn’t like to do screen recordings of Vive demos because they misrepresent the actual experience, like watching a cell-phone video of a Disneyland ride. For the same reason, I don’t think it’s particularly useful to describe its narrative. What I would rather say is that the experience is just plain professional. Materials, lighting, animations, writing, pacing, models… everything was on point – like a Cirque du Soleil performance – and running at 90 FPS in Valve’s Source 2 engine.
Anyone who works in VR gives a looooot of demos. We watch mouths gape in disbelief as people’s sight is subverted, the way headphones subverted their hearing in the past. As they take off the headset, there is always a stream of ideas: “You could use it for…!” gaming, therapy, education, film, etc. Trying something so polished almost made me not want to show developer kit demos anymore, because I’d feel like I was misrepresenting VR. Yet at the same time it made me want to make everything! I want to build a model of a spider! I want to make a boxing game! I want to design a building while in VR! I want to play a platformer! I was just like anyone trying VR for the first time all over again.
By the jam’s close, everyone’s projects were at varying degrees of completion, as is typical. One experience, called GTA Vive, had you standing in a city square telekinetically throwing and shooting vehicles and humans. Another had you puppeting a marionette. “Penguin Hustle” had you feeding or squirting an increasing number of cute penguin characters on an iceberg.
The Hatton Garden Heist saw users crawl through air ducts and dodge security lasers. D.E.R.P had users sort objects on a conveyor belt by color, getting more with each round like the famous ‘I Love Lucy’ candy factory scene. In Cyberkuza Showdown you were descending the side of a building shooting at agents whilst being shot at. In another you had a few seconds to perform a task like flip a pancake, jump a gap, or give a bar of soap to a showering bear before the room would fly away and be replaced with another task. Save My Idiot Babies had you, as a giant, scoop up creatures in a valley to save them from an imminent volcano eruption… or throw them in the volcano, as some preferred.
At this point I grabbed a camera and haphazardly shot some of the developers talking about what they learned [above], to pass on to the rest of the VR community. Rather than repeating those here, I’ll share some of the takeaway opinions I gained about room-scale VR specifically as a consumer medium.
Yes, VR is amazing and interesting, and you don’t need me to tell you that again. But physically walking through a space and seeing the parallax of simulated objects makes for a far more compelling experience than I expected. The effect that has on your interpretation of animated characters is also surprisingly powerful.
VR is also kind of tiring. The subtle eye strain and the intensity of the experiences are genuinely exhausting over time, and that only increases with room-scale movement. All of the developers seemed to me to be really down-to-earth people without delusions of grandeur about the technology. All of this has led me to my own opinions about which content types will be most successful, but I’ve already gone on long enough.
What I will say is that using the Vive solidified my desire to work in virtual reality permanently and showed me that the next ten years are going to be interesting.
Our thanks to Mike Alger for taking the time to share his notes with us. If you’d like to get in touch with Mike, you can reach him via LinkedIn.