Next month, Samsung Developer Conference 2014 will include a virtual reality track for the first time. The track will feature eight 50-minute sessions dedicated to VR, two of them hosted by Oculus — including ‘Getting Started with the Oculus Mobile SDK’, which will be of particular interest to developers hoping to work with Samsung’s new Gear VR headset.
Gear VR, Samsung’s recently announced mobile VR headset, is hotly anticipated by developers and enthusiasts alike within the VR community. The new headset, which was built in conjunction with Oculus, is widely expected to set the bar for mobile VR performance and experience. The headset pairs with the Galaxy Note 4, which provides the display and processing power necessary to experience VR on the go.
While Gear VR is expected soon, it still hasn’t been given an official release date. Neither has the Oculus Mobile SDK, which will allow developers to create experiences for the new device.
In September, Oculus Head of Mobile, Max Cohen, told Road to VR that the company was aiming to release its mobile SDK in October. He noted that the goal was to release on the early side of the month; with no release yet, the company appears to still be hard at work readying the SDK for developers.
However, it would seem that Oculus intends to have the SDK ready at least by the start of Samsung Developer Conference 2014, which will be held from November 11th–13th at the Moscone Center in San Francisco, California.
There, Oculus will host two sessions on VR as part of the conference’s virtual reality track. One of them, ‘Getting Started with the Oculus Mobile SDK’, will introduce developers to the process of creating mobile virtual reality experiences for Gear VR using the SDK. The session will be hosted by Oculus Senior Engineers J.M.P. van Waveren and Jonathan Wright. We expect that the Oculus Mobile SDK will be available to developers at some point prior to the session, which puts its release some time in the next few weeks.
The other Oculus-hosted session at SDC 2014 is titled ‘The Human Perceptual System in Virtual Reality’ and will be presented by Richard Yao, a research scientist at Oculus. The session will focus on how the human perceptual system interfaces with VR devices like the Oculus Rift and Gear VR, with an eye toward creating comfortable experiences.
Below we’ve pulled the full VR track schedule for your perusal (all times PST):
Virtual Reality Track Sessions at Samsung Developer Conference 2014
Wednesday, November 12th
1:00 PM – VR Design: Transitioning from a 2D to 3D Design Paradigm
Alex Chu (Samsung Research America – Dallas)
The commercialization of virtual reality provides an exciting space for designing user experiences and interfaces. However, this mostly unexplored technology holds a different set of opportunities and challenges for traditional two-dimensional designers. In this session, we will review the design/development process for Samsung’s XVu player, discuss the switch from 2D to VR design, and look at tips for creating your own VR experiences.
2:00 PM – Creating Realistic VR-Animated Videos
Dan Ferguson (Reel FX), Brad Herman (Dreamworks), Mike Woods (Framestore)
Learn how animation studios Reel FX, Dreamworks, and Framestore created VR effects that make you feel like you are immersed in the animation. Join a lively discussion about some of their technical and creative challenges and see how they solved them.
Moderator: “Cymatic” Bruce Wooden
3:00 PM – Getting Started with the Oculus Mobile SDK
J.M.P. van Waveren (Oculus), Jonathan Wright (Oculus)
This session will focus on the Oculus SDK for Mobile developers. Topics will include tips for getting started with the SDK for both native and Unity application development, an overview of the TimeWarp architecture and its impact on development, tips for high-performance mobile VR applications, interface design in VR, planned improvements, and insights gained and challenges overcome during the development of Gear VR. Attendees will have an opportunity to ask questions of the engineers responsible for developing the Gear VR SDK.
4:00 PM – Virtual Reality and Telepresence: 50 Years from Dreams to Reality
Henry Fuchs (University of North Carolina at Chapel Hill)
VR may be at a historic inflection point not unlike the personal computer just before the IBM PC’s introduction, about to transition from a niche product to widespread adoption. What’s worrisome is that this transition was also predicted at least once before, in the early 1990s, when VR systems first became commercially available. What is different this time? This talk will review the history of VR, the development of the component technologies, and several representative applications. We’ll review the key technical problems to be solved, assess their current state of effectiveness, and note how the situation is different now than during the last promising era two decades ago. We’ll conclude with a tour of remaining technical challenges (such as merging real and virtual worlds), a look at some new application areas, and speculation on why a VR startup company might be worth $2 billion.
5:00 PM – Creating Realistic Sound for True Spatial Immersion in VR Applications
Anish Chandak (Impulsonic, Inc.), Dinesh Manocha (University of North Carolina at Chapel Hill)
Our experiences in the real world are formed using multiple senses and include audio cues from objects which may not even be visible. In virtual reality, one of the main challenges is recreating this true spatial immersion. In this session, you will learn about Phonon, a cross-platform audio physics engine that uses the physics of sound to recreate spatial immersion. Phonon analyzes the 3D model of the environment, and figures out direct sound paths, early reflections and occlusion, and reverberation in real-time. Phonon is available as a Unity plugin, and a C API that supports integration with any game engine and audio engine. Using advanced techniques such as Head-Related Transfer Functions (HRTFs), convolution reverb, and physically-based diffraction, Phonon can help you deliver true spatial immersion in VR applications.
Thursday, November 13th
10:00 AM – Interacting with VR Today and Tomorrow
Andrew Dickerson (Samsung Research America – Dallas), Jan Goetgeluk (Virtuix), Eric Romo (AltspaceVR), Amir Rubin (Sixense)
Since the Facebook acquisition of Oculus, VR has been rapidly advancing into the everyday consumer’s world. A panel of long-time VR interaction experts that have predicted this movement will share their thoughts on where VR is today, and where it will be in the next few years. They will also discuss mapping walking in the real world to your virtual character, picking up objects in VR, the sense of presence those generate, and the pros and cons of mobile vs. desktop VR.
Moderator: Ben Lang
1:00 PM – The Human Perceptual System in Virtual Reality
Richard Yao (Oculus)
This talk will focus on the human component of VR: your perceptual system. Virtual reality systems like the Oculus Rift and Gear VR replace the visual input to your brain in a way technology never has before; for better or worse, this raises unprecedented opportunities and challenges. I will give a brief, high-level overview of some key perceptual system functions, some interesting phenomena that occur in VR, and the importance of understanding them all for building comfortable content.
5:00 PM – Developing High-performance Apps for Gear VR
Learn how VR experts approached and overcame CPU/GPU bandwidth limitations to create compelling experiences for Gear VR. Join a lively discussion about some of their technical and creative challenges, and see how they solved them.