‘BUTTS’ Creator Working on New VR Project in Collaboration with WEVR
Tyler Hurd, creator of the infamous BUTTS: The VR Experience, is working on a new project in collaboration with VR production studio WEVR, Road to VR has learned.
Disney and Total Cinema 360 have released a 360-degree video documenting the opening five minutes (read: “The Circle of Life”) of The Lion King musical to VR platforms, including Littlstar’s iOS and Android apps, YouTube 360, Vrideo’s app and MilkVR for Samsung Gear VR users.
Samsung’s Gear VR officially launched in its first consumer-ready form last week. As the first commercial VR headset to launch in decades, it poses a question: how do you sell a technology that is famously difficult to demonstrate other than in person? Samsung’s new advert takes a very effective stab at the problem.
Ray Davis is the Studio Manager of Epic Games Seattle, and he talks about working on Bullet Train, Epic’s latest VR tech demo, which uses the Oculus Touch controllers and debuted at the Oculus Connect 2 gathering. I had a chance to catch up with Ray at the Seattle VR conference, where he told me about the iterative design process behind Bullet Train, the evolution of the teleportation approach to VR locomotion, and how they discovered the innovative bullet grab-and-throw game mechanic.
LISTEN TO THE VOICES OF VR PODCAST
Ray Davis talks about some of the goals and motivations behind Bullet Train. Epic wanted to create an immersive, interactive, and dynamic VR experience designed for anyone to go through, regardless of their level of gaming experience. Lead VR engineer Nick Whiting and Creative Director Nick Donaldson collaborated on creating Bullet Train, and they wanted to explore what it means to have hand presence within a VR experience.
Ray says that there’s an art to constructing a competitive death match environment in terms of the player flows and different pickups that encourage different pathways throughout the environment. It’s not just a matter of teleporting from location to location, and Nick Donaldson took a lot of that into consideration when creating Bullet Train.
Bullet Train has definitely been the most comfortable first-person shooter experience I’ve had in VR so far. This level of comfort is largely thanks to the teleportation mechanic used to move between waypoints set on a subway train and out in the station. After you teleport, a ghosting trail helps orient you to your new location. Ray says they thought a lot about ways to design the experience so that you have enough visual cues to maintain your orientation as you teleport between the various waypoints.
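The teleport-plus-ghost-trail idea can be sketched in a few lines. This is purely a hypothetical illustration of the mechanic as described, not Epic’s actual Unreal Engine code; every name and constant here is an assumption:

```python
# Hypothetical sketch of waypoint teleportation with a fading "ghost trail"
# orientation cue. Not Epic's implementation; names and numbers are assumed.
from dataclasses import dataclass


@dataclass
class Player:
    position: tuple
    trail_alpha: float = 0.0  # opacity of the ghost trail at the old location


def teleport(player: Player, waypoint: tuple) -> None:
    """Move instantly, then leave a fully opaque trail back to the old spot."""
    player.position = waypoint
    player.trail_alpha = 1.0


def update_trail(player: Player, dt: float, fade_seconds: float = 1.0) -> None:
    """Fade the trail out over roughly `fade_seconds` of game time."""
    player.trail_alpha = max(0.0, player.trail_alpha - dt / fade_seconds)


player = Player(position=(0.0, 0.0, 0.0))
teleport(player, (10.0, 0.0, 5.0))
for _ in range(30):              # half a second at 60 updates per second
    update_trail(player, 1 / 60)
print(round(player.trail_alpha, 2))  # 0.5
```

The key design choice is that the movement itself is instantaneous (avoiding the vection discomfort of smooth locomotion), while the fading trail supplies the orientation cue Ray describes.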
Ray says the game design process at Epic Games has always been very organic and iterative. His advice is to just make a VR experience, see what people try to do in it, and then implement those things if they haven’t been implemented yet. This is how they discovered the bullet grabbing and throwing mechanic: they noticed that people kept trying to catch the bullets, so they went ahead and added that feature. He says their ultimate goal is to create an experience so intuitive that people forget they’re controlling a game, and can get into a flow where they’re reacting with unconscious muscle memory.
Ray says it’s ultimately a lot of fun to develop for virtual reality when you’re the target audience, because you’re the best expert in what you find fun and engaging, especially when you can look to your favorite Hollywood action movies and see what you can start to recreate within a VR experience. There are still a number of design challenges in moving something like Bullet Train from a novel tech demo to a full-fledged game, and Ray didn’t mention any specific plans for the future of Bullet Train. But it wouldn’t be surprising if they continued to refine and develop the concept after giving more than 500 demos over the last couple of months.
A lot of these VR experiments also let ad hoc teams at Epic dogfood the Unreal Engine, and the resulting feedback drives improvements that make the engine better and better suited to creating different virtual reality experiences. Ray says that part of the culture at Epic Games is to make things and then give away as much of those innovations as possible.
Finally, Ray sees VR and AR converging and eventually replacing our screen-based interfaces on monitors, laptops, tablets, and phones. He believes VR and AR will continue to unlock real changes in how we gather and consume information, as well as in how we connect with each other.
Become a Patron! Support The Voices of VR Podcast Patreon
Theme music: “Fatality” by Tigoolio
WakingApp recently announced that it has completed a Series C funding round, securing $4.3M to realise its vision of an immersive content creation platform that everyone can use.
PresenZ is a system developed by VFX studio Nozon which combines some of the essential benefits of pre-rendered imagery with those of real-time interactivity. For the first time, the company is demonstrating a room-scale version of a PresenZ-enabled scene.
For the entire history of CGI, creators have had to choose between the high-fidelity visuals of pre-rendered CGI and the interactivity of real-time CGI. Then along comes PresenZ, promising to mash together these two formerly incompatible benefits into a single solution.
Nozon first revealed PresenZ in early 2015, demonstrating positional tracking parallax—the ability to look around objects as you move your head through 3D space—in a small area around the user’s head, in a scene with animated pre-rendered CGI visuals whose complexity would otherwise bring a real-time game engine to a halt.
I’ve seen it for myself and it’s everything they say it is: the visuals of pre-rendered CGI with the positional tracking parallax that’s normally only possible with a scene rendered in real time. The area in which you can move your head about the scene is only about a meter square. If you hit the edge of that area, the scene fades out, as the view beyond that space has not been pre-rendered. This is of course a limitation if users want to be able to crouch, jump, or walk around a larger scene.
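The fade-at-the-edge behavior described above is simple to sketch. The roughly one-meter zone size comes from the article; the fade margin, function names, and the exact fade curve are assumptions made for illustration:

```python
# Sketch of fading the scene out as the tracked head approaches the edge of
# the pre-rendered "zone of view". The ~1 m zone comes from the article; the
# fade margin and linear ramp are assumed for illustration.
ZONE_HALF_WIDTH = 0.5   # meters; a zone about one meter square
FADE_MARGIN = 0.1       # meters over which the fade-out happens (assumed)


def scene_opacity(head_x: float, head_z: float) -> float:
    """Return 1.0 inside the zone, ramping to 0.0 at and beyond its edge."""
    # How far the head has pushed into the fade band, on the worst axis.
    overshoot = max(abs(head_x), abs(head_z)) - (ZONE_HALF_WIDTH - FADE_MARGIN)
    if overshoot <= 0:
        return 1.0
    return max(0.0, 1.0 - overshoot / FADE_MARGIN)


print(round(scene_opacity(0.0, 0.0), 2))   # 1.0  (center of the zone)
print(round(scene_opacity(0.45, 0.0), 2))  # 0.5  (halfway through the fade band)
print(round(scene_opacity(0.6, 0.2), 2))   # 0.0  (outside the zone)
```

A gradual fade like this avoids a hard visual cutoff when the viewer leans past the boundary of the pre-rendered volume.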
For the first time, the company is now showing a room-scale PresenZ-enabled scene which is navigable with the HTC Vive and its Lighthouse tracking system. In the video above, we see a pre-rendered scene which can be navigated seamlessly from one end to the other, just like a real-time experience. Nozon calls the scene’s viewable area the ‘zone of view’, rather than the singular ‘point of view’ you would be stuck with using a traditional pre-rendering approach.
So what’s the downside to this seemingly magic solution to the pros and cons balance of pre-rendered vs. real-time CGI? Well for one, interactivity is currently limited. You may be able to navigate through the scene, but interaction in the traditional real-time sense is not possible as the scene is still pre-rendered. Nozon says that they’re developing the ability to add real-time interactive elements into their Presenz scenes, but so far they’ve only demonstrated support for pre-rendered animations.
Another downside to PresenZ is file size. Relatively simple scenes can quickly climb into the gigabytes (likely scaling with the size of the zone of view), though Nozon says they are working on compression schemes which “make it possible to reduce [the file size of] some scenes by a factor 10.”
Framerate is another downside. A PresenZ scene can currently only be animated at up to 25 FPS (though the headset still renders the scene at its own native refresh rate). It isn’t clear yet whether this is a technical limitation or a means of keeping file sizes down.
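One plausible way to decouple a 25 FPS scene animation from a headset’s native refresh rate (Nozon hasn’t described the actual mechanism, so this is only a guess at the behavior) is to display the most recent animation frame at each display refresh:

```python
# Hypothetical sketch: the scene animation advances at 25 FPS while the
# headset re-renders the view at its own refresh rate (e.g. 90 Hz). Each
# display refresh simply shows the latest available animation frame.
ANIM_FPS = 25


def animation_frame(render_time_s: float) -> int:
    """Index of the animation frame to show at a given render timestamp."""
    return int(render_time_s * ANIM_FPS)


# At 90 Hz, successive display frames often reuse the same animation frame,
# while head tracking (and thus parallax) still updates every refresh:
print([animation_frame(i / 90) for i in range(8)])  # [0, 0, 0, 0, 1, 1, 1, 1]
```

This is why a 25 FPS limit need not cause discomfort: the view responds to head motion at the full display rate even though the scene content updates more slowly.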
Those of you following along carefully will probably notice some commonalities between the PresenZ solution and lightfields. I certainly did, and so I queried Nozon about the differences between the two. The company insists that, despite the similarities, PresenZ is a patented solution which differs from lightfields. My efforts to understand the precise differences didn’t get very far, as the company is understandably careful not to dig into the specifics of its technology.
See Also: OTOY Shows Us Live-captured Light Fields in the HTC Vive
However, Nozon’s Matthieu Labeau provided me with a broad comparison between the two solutions (the skeptical reader will understand that this list is likely to lean in Nozon’s favor):
Benefits
Current Limitations
Computer Minimum Specs: Oculus recommended spec + RAID 0 SSD
Both technologies are still in development so the list here is likely to be in flux for a while to come. Either way, Presenz seems to be making good headway in combining the benefits of pre-rendered and real-time CGI, though there are still a number of limitations that will need sorting before broad application of the technology is possible.
Samsung today became the first company to market with a consumer-oriented virtual reality headset in decades, as its Samsung Galaxy smartphone-powered Gear VR launched in the US.
Ben Lang jokes that we’ve been at ‘Year Zero’ of VR for three years now, and the official release of the Gear VR today marks the first official launch of a consumer-ready virtual reality head-mounted display. Ben joins me on the podcast to talk about some of the technical details that allow the Gear VR to drive such a compelling virtual reality experience, as well as some analysis of the larger virtual reality market. Expectations are high that virtual reality will grow and evolve into the full potential we all hope it can reach, so we take a look at how the smartphone market evolved over time and what we can expect to see over the next year.
Pre-orders for the Gear VR, the mobile VR headset made in collaboration between Samsung and Oculus, opened up on November 10th. With the headset due to be released on the 20th, it looks like Best Buy has sold out of its initial stock, but other stores remain stocked.
Matt McIlwain is a managing director at the Madrona Venture Group, which recently announced its first investment in the virtual reality space with a $4 million Series A round of funding for Envelop VR. Matt talks about why Seattle is one of the top hotbeds for augmented and virtual reality: the region hosts a wide variety of hardware, software, gaming, and cloud computing companies, including HTC, Valve, Oculus, Facebook, Amazon (AWS & Twitch), Microsoft (Xbox & HoloLens), and Nintendo of America. Matt also discusses Madrona’s investment in Envelop VR, as well as its strategy of finding companies building horizontal software for AR and VR, along with vertical commercial opportunities in the new models of distribution that AR and VR enable.
EVE: Gunjack, the mobile counterpart of the hugely anticipated VR multiplayer space shooter EVE: Valkyrie, is released tomorrow on Samsung Gear VR. We go hands-on with the consumer release and find out that CCP Games have succeeded in setting a new benchmark for visual fidelity in mobile virtual reality.
Nokia has sent out invites for a forthcoming event in Los Angeles on November 30th asking guests to “join us for the exclusive unveiling” of Ozo, the company’s professional VR camera.
As developers continue to experiment with a range of VR navigation techniques, Virtuix’s Omni treadmill gives gamers a way to physically walk and run around virtual worlds in an otherwise limited space. The company’s latest developments bring compatibility with the HTC Vive and its Lighthouse tracking tech, enabling ‘decoupled’ manipulation of walking, looking, and aiming.