Alan Yates Says Rift’s Core Features Are a “Direct Copy” of Valve’s VR Research


In a revealing statement on Reddit, Alan Yates, a Valve employee who works closely on the company’s Lighthouse tracking system, claims that the Oculus Rift headset is largely derived from Valve’s research.

Google Daydream Leads on the Future of Mobile VR


Google’s mission is “to organize the world’s information and make it universally accessible and useful,” but what happens if there’s a shift from the Information Age to the Experience Age? Google’s Daydream mobile VR platform is part of that answer. It’s the next phase for Android, moving beyond the minimum viable Cardboard VR viewers and starting to really leverage the Android hardware and software ecosystem to help bring virtual reality to the world at scale.

Democratizing Neuroscience with OpenBCI & Adapting Content with Biofeedback


OpenBCI is an open-source brain-computer interface that gathers EEG data. It was designed for makers and DIY neuroengineers, and it has the potential to democratize neuroscience with a $100 price point. At the moment, neither OpenBCI nor other commercial off-the-shelf neural devices are compatible with any of the virtual reality HMDs, though companies like MindMaze have built VR headsets with fully integrated neural inputs. I had the chance to talk with OpenBCI founder Conor Russomanno about the future of VR and neural input on the day before the Experiential Technology and Neurogaming Expo, also known as XTech. Once neural inputs are integrated into VR headsets, VR experiences will be able to detect and react whenever something catches your attention, gauge your level of alertness and your degree of cognitive load and frustration, and even differentiate between emotional states.
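To make that concrete, here is a rough sketch of how a rudimentary alertness estimate could be derived from a window of raw EEG samples using band-power ratios. It is only an illustration: the sampling rate, frequency bands, and the beta / (alpha + theta) engagement index are common conventions from the literature, not part of OpenBCI’s own software.

```python
# Illustrative sketch: a crude "alertness" score from one EEG channel using
# band-power ratios. The sampling rate, bands, and the index itself are
# assumed conventions, not anything taken from OpenBCI's SDK.
import numpy as np
from scipy.signal import welch

FS = 250  # assumed sampling rate in Hz

def band_power(samples, fs, low, high):
    """Integrate the power spectral density between low and high Hz."""
    freqs, psd = welch(samples, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return np.trapz(psd[mask], freqs[mask])

def alertness_score(samples, fs=FS):
    """Beta / (alpha + theta) ratio, a common rough engagement index."""
    theta = band_power(samples, fs, 4, 8)
    alpha = band_power(samples, fs, 8, 12)
    beta = band_power(samples, fs, 12, 30)
    return beta / (alpha + theta + 1e-12)

# Example with synthetic data: four seconds of noise standing in for raw EEG.
window = np.random.randn(FS * 4)
print(f"engagement index: {alertness_score(window):.3f}")
```

A VR experience could poll a score like this once a second and, for example, slow the pacing of a training scenario whenever the index drops.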

LISTEN TO THE VOICES OF VR PODCAST

“Neurogaming” is undergoing a bit of a rebranding effort toward “Experiential Technology” to take some of the emphasis off of real-time interaction with brain waves. Right now the latency of EEG data is too high, and the signal is not consistent enough to be reliable. One indication of this was that all of the experiential technology applications I saw at XTech that integrated neural inputs were either medical or educational applications.
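Part of that latency comes from the smoothing required to make a jittery per-window score usable at all. Here is a minimal sketch of the trade-off, with made-up numbers.

```python
# Smoothing noisy per-window EEG scores with an exponential moving average.
# A smaller alpha gives a steadier output but a slower response, which is
# the kind of lag that currently keeps EEG out of fast real-time game mechanics.
def smooth_scores(scores, alpha=0.1):
    smoothed = []
    current = scores[0]
    for s in scores:
        current = alpha * s + (1 - alpha) * current
        smoothed.append(current)
    return smoothed

raw = [0.2, 0.9, 0.1, 0.8, 0.3, 0.7]   # jittery raw engagement estimates
print(smooth_scores(raw, alpha=0.1))    # steady, but reacts slowly
print(smooth_scores(raw, alpha=0.8))    # responsive, but noisy
```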

Conor says that there are electromyography (EMG) signals that are more reliable and consistent, including facial micro-expressions, jaw clenches, tongue movements, and eye clenches. He expects developers to start using some of these cues to drive drones or to build medical applications for quadriplegics and people who have limited mobility from ALS.
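Here is a hypothetical sketch of how one of those EMG cues could be turned into a discrete input event, say a jaw clench acting as a click. The threshold, refractory period, and sampling rate are illustrative guesses, not vendor specifications.

```python
# Hypothetical sketch: turning an EMG channel into discrete "jaw clench"
# events with a simple amplitude threshold and a refractory period.
# All constants are made-up illustration values.
import numpy as np

FS = 250            # assumed EMG sampling rate in Hz
THRESHOLD = 150.0   # amplitude (in microvolts) above which we call it a clench
REFRACTORY_S = 0.5  # ignore further triggers for half a second

def detect_clenches(samples, fs=FS, threshold=THRESHOLD, refractory_s=REFRACTORY_S):
    """Return the sample indices where a clench event is triggered."""
    events = []
    refractory_until = -1
    envelope = np.abs(samples)  # crude rectification as an amplitude envelope
    for i, value in enumerate(envelope):
        if i < refractory_until:
            continue
        if value > threshold:
            events.append(i)
            refractory_until = i + int(refractory_s * fs)
    return events

# Synthetic signal: a quiet baseline with two bursts standing in for clenches.
signal = np.zeros(FS * 2)
signal[100:110] = 300.0
signal[400:410] = 280.0
print(detect_clenches(signal))  # -> [100, 400]
```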

There are a lot of privacy implications once you start to gather this EEG data, and Conor is particularly sensitive to them. He says that recent research indicates EEG signals are unique to each person and represent a digital signature that could trace anonymously submitted data back to you. He says that companies of the future will need to adopt strict privacy policies and not use this data to exploit their users.

At the same time, there were a number of software-as-a-service companies at XTech taking EEG data and applying their own algorithms to extrapolate emotions and other higher-level insights. A lot of these algorithms use AI techniques like machine learning to capture a baseline signal of someone’s unique fingerprint and then train a model to make sense of the data. AI that interprets and extrapolates meaning out of a vast sea of data from dozens of biometric sensors is going to be a big part of the business models for experiential technology.
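As an illustration of that calibrate-a-baseline-then-classify pattern, here is a short sketch using scikit-learn on synthetic data; it shows the general shape of the approach rather than any company’s actual algorithm.

```python
# Illustrative sketch of per-user baseline calibration followed by a simple
# classifier. Features, labels, and data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Pretend each row is a vector of band-power features (theta, alpha, beta,
# gamma) from a calibration session, labeled 0 = "relaxed", 1 = "frustrated".
X_calibration = rng.normal(size=(200, 4))
y_calibration = (X_calibration[:, 2] + 0.5 * rng.normal(size=200) > 0).astype(int)

# The StandardScaler step is the per-user baseline: it normalizes features
# against that individual's calibration data before the classifier sees them.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_calibration, y_calibration)

new_window = rng.normal(size=(1, 4))
print("estimated state:", "frustrated" if model.predict(new_window)[0] else "relaxed")
```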

Once this biometric data starts to become available to VR developers, we’ll be able to go into a VR experience, see visualizations of which contextual inputs were affecting our brain activity, and start making decisions to optimize our lifestyles.

I could also imagine some pretty amazing social applications of these neural inputs. Imagine being able to see a visualization of someone’s physiological state as you interact with them. This could have huge implications in a medical context, where mental health counselors could gain additional insight from the physiological signals correlated with the content of a counseling session. I could also see experiments in social interaction among people who trust each other enough to be that intimate with their innermost unconscious reactions. And immersive theater actors could have very intimate interactions, while entertainers could read the mood of the crowd as they give a performance.

Finally, there are a lot of deep and important questions about protecting users from losing control of how their data is used and how it’s kept private, since it may prove impossible to completely anonymize it. VR enthusiasts will have to wait for better hardware integrations, but the sky is the limit for what’s possible once all of these inputs are integrated and made available to VR developers.

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Rev VR Podcast (Ep. 120): Onward to New Places, Real & Virtual


After an excellent partnership of two years, this is the final episode of the Rev VR Podcast that will be published through Road to VR. Ben Lang, Executive Editor of Road to VR joins me for a farewell episode.

Manus VR’s Gloves in Action Using Valve’s Lighthouse Tracking for Hands and Arms


We’ve been tracking Manus VR, the start-up dedicated to producing an intuitive VR glove input device, for a while now. The team were present at GDC in March, showing their latest prototype glove along with their in-house developed game Pillow’s Willow.

Google Wants Established YouTubers to Make VR Video With Jump 360 Rig

YouTube VR on Daydream has been built from the ground up for VR.

Google is delivering a dedicated VR YouTube app on its newly announced Daydream platform. Along with it, the company is making sure to give users a few reasons to jump headfirst into the app when it launches by encouraging more established YouTubers to get into 360 video.

Review: ‘Please, Don’t Touch Anything’ is a ‘Must Have’ VR Puzzle Game

Please, Don’t Touch Anything (PDTA), a puzzle game from BulkyPix, has recently received a new VR adaptation of the year-old 2D title. Now built from the ground up in 3D for the Oculus Rift and Gear VR headsets, PDTA offers the most immersive button-pushing simulator currently in VR. I wish I was making that up.

AWE 2016 Brings the Best of AR & VR to Santa Clara, 45% Off for Road to VR Readers


The Augmented World Expo, one of the longest running immersive technology events, is back for a 7th year and it’s bigger than ever. With a focus on augmented reality, virtual reality, and wearable technologies, the expo is packing 4,000 attendees, 200+ speakers, and 200+ exhibitors into the Santa Clara Convention Center in California from June 1st.

Google’s (Day)dream: ‘Hundreds of Millions of Users in a Couple of Years’


At I/O 2016 this week, Google laid out its ambitious goals for Daydream, the company’s new platform for native, high performance virtual reality experiences on Android.

Candy Crush Creator on Casual VR Gaming with Google’s Daydream


Candy Crush creator Tommy Palm has moved into making casual virtual reality games with Resolution Games. They’ve already released Bait! on the Gear VR, the first VR app to feature in-app purchases, and they just announced at Google I/O that they’re designing a launch title called Wonderglade for Google’s Daydream mobile VR platform. I had a chance to catch up with Tommy about developing casual VR games that are interruptible, how they’re designing in natural breaks so as not to create games that are too addictive, his thoughts on the future of free-to-play VR with in-app purchases, how VR games can be social without being multiplayer, developing a game with Daydream’s 3DOF controller, and how casual games may really start to blur the line between games and VR experiences.

LISTEN TO THE VOICES OF VR PODCAST

Here are some other videos and updates from Google I/O, including a new GoogleVR YouTube channel. The @GoogleCardboard Twitter account has been deprecated, and Google’s main VR Twitter account is now @GoogleVR.

Here are some of the relevant GoogleVR talks from Google I/O over the past couple of days (with more coming soon to the GoogleVR YouTube channel):

VR at Google Keynote where Daydream Labs was announced.

Daydream Labs: Lessons Learned from VR Prototyping. This is an absolute must-watch talk for any VR designer, since they condensed the lessons learned from rapidly prototyping 60 experiences in 30 weeks.

Daydream Labs: Drum Keyboard will revolutionize text input.

Monetization and Distribution on Daydream

Designing & Developing for the Daydream Controller – Google I/O 2016

VR Design Process – Google I/O 2016

Learn more about the Cloud Vision and Speech API – Google I/O 2016

Start Making Google Daydream VR Apps Today with a DIY Dev Kit


This week Google announced Daydream, the company’s initiative for high-end mobile virtual reality. While Google didn’t reveal a dedicated dev kit, the company says developers can cobble together their own using the Nexus 6P smartphone, Cardboard, and a spare phone.

IMAX to Use StarVR 210 Degree FOV Headset for Premium VR Film Experiences


Starbreeze, the company behind the 210 degree FOV VR headset StarVR, is to work with large-scale cinema specialist IMAX to create “premium location-based VR” experiences “worthy of the IMAX brand”.

Google is Working with IMAX on Cinema Grade VR Cameras


Google Jump, a 360 camera system and processing software platform announced at last year’s Google I/O, is adding IMAX and Yi Technology to its roster of camera manufacturers.

‘Portal Stories: VR’ Brings 10 Made-for-VR Puzzles to the World of Aperture Science


When I loaded up Portal Stories: VR, I was greeted with familiar sights and sounds from the Portal universe. A disembodied voice told me that I was about to take part in an Aperture Science Virtual Reality Experiment, words I was absolutely thrilled to hear.

Social Dynamics in AltSpaceVR After the Consumer Launch of VR


AltspaceVR is the first cross-platform social VR application to support the Gear VR, Oculus Rift, and HTC Vive. Now that all of these VR headsets have had their consumer launches, AltspaceVR has become one of the top places in VR for social interaction.

I had a chance to catch up with Bruce Wooden (aka “Cymatic Bruce”) at SVVR 2016 to talk about what types of events they’ve been holding, what emergent social behaviors he’s seeing, and how they’ve been teaching VR etiquette within their welcome spaces for new users. We also talk about some of the social dynamics that occur when they mix the full range of interaction fidelity, from tracked hands on the Vive and Leap Motion hands on the Rift to head rotation alone on the Gear VR. A bit of a power dynamic emerges where people with tracked hands can be more expressive and command more attention and power in a conversation, so we talk about the implications of that, including whether or not it makes Gear VR users feel more like consumers than creatives, producers, or equal participants.

