When I attended the Experiential Technology Conference in May 2016, I heard from a number of commercial off-the-shelf brain-control interface manufacturers that their systems would not natively work with VR headsets, because some of the most critical contact points on the head are occluded by VR headset straps. Companies like Mindmaze VR are building integrated EEG hardware primarily for high-end medical applications, and perhaps we’ll start to see more EEG hardware integrations in 2017.

Qneuro is an educational company that was exhibiting at the Experiential Technology Conference, and they had some early VR prototypes that used EEG as an input within a lab environment. Qneuro founder Dhiraj Jeyanandarajan is a clinical neurologist who works as a neurophysiologist, watching real-time electrophysiological signals to make corrections during brain and spinal surgeries. He’s also a father who got frustrated with the educational games that were available for his two kids, and so he started Qneuro to create educational games that integrate real-time EEG feedback.

Qneuro has been building 3D environments in Unity and launching them on the iPad, and they’re still waiting for a more integrated hardware solution before launching their virtual reality version. I had a chance to catch up with Jeyanandarajan at the XTech Conference to see what they’re able to do with real-time EEG feedback within a lab environment to improve the learning process within their educational game.

LISTEN TO THE VOICES OF VR PODCAST

Jeyanandarajan said that they’re using Cognitive Load Theory to improve the efficiency of learning. They’re using the EEG data to detect how hard users are thinking, and then they’re dynamically reducing distracting factors like visual and auditory complexity or increasing the frequency of hints that are provided. Here’s more from their blog as to how they’re using Cognitive Load Theory:

Our research facility and team continue to investigate key concepts within cognitive load theory such as efficiency in learning, cognitive load, multi-modality, schemas, automation, the split attention effect, guided instruction, and modifications to instructional design from novices to experts, through research data gathered in real time from our own experiments and primary research.
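Qneuro hasn’t published the details of how that feedback loop works, but to make the idea concrete, here is a minimal sketch of what an EEG-driven adaptation loop could look like. The load estimate, thresholds, and game parameters below are illustrative assumptions rather than Qneuro’s actual implementation.

```python
# Hypothetical sketch of an EEG-driven adaptation loop. Qneuro has not
# published their implementation; the load estimate, thresholds, and game
# parameters below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class GameState:
    visual_complexity: float = 1.0   # 0.0 (sparse scene) to 1.0 (full detail)
    audio_complexity: float = 1.0    # density of background music and effects
    hint_interval_s: float = 60.0    # seconds between hints

def adapt_to_cognitive_load(load: float, state: GameState) -> GameState:
    """Adjust distracting factors based on a normalized load estimate in [0, 1],
    e.g. an EEG band-power ratio mapped onto that range."""
    if load > 0.75:                       # learner appears overloaded
        state.visual_complexity = max(0.2, state.visual_complexity - 0.1)
        state.audio_complexity = max(0.2, state.audio_complexity - 0.1)
        state.hint_interval_s = max(15.0, state.hint_interval_s * 0.5)
    elif load < 0.35:                     # learner has spare capacity
        state.visual_complexity = min(1.0, state.visual_complexity + 0.1)
        state.audio_complexity = min(1.0, state.audio_complexity + 0.1)
        state.hint_interval_s = min(120.0, state.hint_interval_s * 1.5)
    return state
```

In a Unity-based game like Qneuro’s, a loop along these lines would presumably run on the game side, with the normalized load estimate streamed in from the EEG processing pipeline.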

It’s an open question as to how effective brain-control interfaces (BCI) will be in providing real-time interactions within VR environments. OpenBCI co-founder Conor Russomanno told me in May that the real power of EEG data from brain-control interfaces is not in real-time interactions; for real-time input, it’s the electromyography (EMG) signals that are much stronger and easier to detect:

Russomanno: I think it’s really important to be practical and realistic about the data that you can get from a low-cost, dry, portable EEG headset. A lot of people are very excited about brain-controlled robots and mind-controlled drones. In many cases, it’s just not a practical use of the technology. I’m not saying that it’s not cool, but it’s important to understand that while this technology is very valuable for the future of humanity, we need to distinguish between the things that are practical and the things that are just blowing smoke and getting people excited about the products.

With EEG, there’s tons of valuable data about your brain over time in the context of your environment. It’s not about looking at EEG or brain-computer interfaces for real-time interaction, but rather about taking this data and contextualizing it with other biometric information like eye-tracking, heart rate, heart rate variability, and respiration, and then integrating that with the way that we interact with technology: where you’re clicking on a screen, what you’re looking at, what application you’re using.

All of this combined creates a really rich data set of your brain and what you’re interacting with. I think that’s where EEG and BCI is really going to go, at least for non-invasive BCI.

That said, when it comes to muscle data and micro-expressions of the face, jaw gritting, and eye clenching, I think this is where systems like OpenBCI are actually going to be very practical for helping people who need new interactive systems, people with ALS, quadriplegics.

It doesn’t make sense to jump past all of this muscle data directly to brain data when we have this rich data set that’s really easy to control for real-time interaction. I’ve recently been preaching that BCI is great, it’s super exciting, but let’s use it for the right things. For the other things, let’s use these data sets that already exist, like EMG data.

Voices of VR: What are some of the right things to use BCI data then?

Russomanno: As I was alluding to, I think it’s looking at attention, looking at what your brain is interested in as you’re doing different things. Right now, there are a lot of medical applications: neurofeedback training for ADHD, depression, and anxiety. There are also new types of interactivity, such as someone who’s locked in being able to practically use a few binary inputs from a BCI controller. In many ways, I like to think that the neuro revolution goes way beyond BCI. EMG, muscle control, and all of these other data sets should be included in this revolution as well, because we’re not even coming close to making full use of these technologies currently.
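To make the EMG point concrete, here is a minimal sketch of treating a jaw clench as a binary input by thresholding a moving RMS envelope on a single raw channel. This is not OpenBCI’s own detection code; the 250 Hz sample rate matches OpenBCI’s Cyton board, but the channel choice and threshold are assumptions that would need per-user calibration.

```python
# Minimal sketch of using an EMG artifact (e.g. a jaw clench) as a binary
# input. The 250 Hz sample rate matches OpenBCI's Cyton board; the window
# length and threshold are illustrative assumptions.

import numpy as np

FS = 250                  # samples per second
WINDOW = FS // 4          # 250 ms analysis window
CLENCH_THRESHOLD_UV = 80  # RMS amplitude (microvolts) treated as a clench

def rms(window: np.ndarray) -> float:
    return float(np.sqrt(np.mean(window ** 2)))

def detect_clenches(channel_uv: np.ndarray) -> list:
    """Return sample indices where the RMS envelope first crosses the threshold."""
    events = []
    above = False
    for start in range(0, len(channel_uv) - WINDOW, WINDOW):
        level = rms(channel_uv[start:start + WINDOW])
        if level > CLENCH_THRESHOLD_UV and not above:
            events.append(start)   # rising edge -> one "button press"
            above = True
        elif level <= CLENCH_THRESHOLD_UV:
            above = False
    return events
```

Each rising edge could then be mapped to a discrete action in a VR experience, which is essentially the “few binary inputs” Russomanno describes for locked-in users.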

In the short term, it’s still an open question how much value EEG data will be able to provide within the context of a real-time game. The quality and fidelity of the data depend upon how many EEG sensors can make a direct connection to the skin on your scalp. More sensors provide better data, but they may also be more inconvenient to use. Since the most crucial contact points sit in the same places as the VR headset straps, using EEG as an input to a VR experience may require a custom integrated headset like Mindmaze’s.

The Neurogaming Conference rebranded itself last year to become the Experiential Technology Conference & Expo, perhaps as a de-emphasis on real-time interactions in games and more of a focus on other medical or educational applications. There were also a lot of companies at the Experiential Technology Conference who were using machine learning techniques to interpret the noisy and complicated EEG signals coming from BCI devices. These AI techniques could also be used to detect levels of attention as well as different emotional states.
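None of those vendors’ pipelines were published, but a common approach to this kind of attention or emotional-state detection is to compute band-power features over short EEG windows and feed them to an off-the-shelf classifier. The sketch below assumes that generic structure; the frequency bands, window length, and “focused” vs. “distracted” labels are illustrative, not any particular company’s method.

```python
# Generic sketch of a band-power + classifier pipeline for attention
# detection from EEG. Bands, window length, and labels are assumptions.

import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

FS = 250
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch: np.ndarray) -> np.ndarray:
    """epoch: (n_channels, n_samples) -> flat vector of per-band mean power."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS)
    feats = []
    for low, high in BANDS.values():
        mask = (freqs >= low) & (freqs < high)
        feats.append(psd[:, mask].mean(axis=1))
    return np.concatenate(feats)

def train_attention_model(epochs, labels):
    """epochs: list of (n_channels, n_samples) arrays; labels: 1 = focused, 0 = distracted."""
    X = np.stack([band_powers(e) for e in epochs])
    return LogisticRegression(max_iter=1000).fit(X, labels)
```

In practice, per-user calibration data and artifact rejection tend to matter far more than the choice of classifier.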

In the long term, virtual reality will likely be integrating more and more biometric data as feedback into VR experiences. The Intercept recently wrote about how VR could be used to gather the most detailed & intimate digital surveillance yet, and so there are a lot of unresolved privacy implications that come with using biometric data within VR experiences. The virtual reality community and privacy advocates will need to push companies to evolve their terms of service and privacy policies to spell out what types of data are collected and stored, and how they can and cannot be used.

There are currently a lot of challenges in using EEG or EMG data to control VR experiences, but there is also a lot of potential, ranging from individualized educational applications and medical applications to personalized narratives based upon your emotional reactions and biofeedback experiences that help deepen contemplative practices.


Support Voices of VR

Music: Fatality & Summer Trip




  • Albert Hartman

    maybe a better application would be a warring game where points are scored if you can upset the other player’s EEG?

  • Trung Ngo

    Does a Combination of Virtual Reality, Neuromodulation and Neuroimaging Provide a Comprehensive Platform for Neurorehabilitation? – A Narrative Review of the Literature.
    See http://journal.frontiersin.org/article/10.3389/fnhum.2016.00284/full and my comment on this article.