OpenBCI, the neurotech company behind the eponymous open-source brain-computer interface (BCI), is building a new hardware and software platform specifically for immersive headsets.

The new hardware, called Galea, is designed to attach to both AR and VR headsets, the company says, and arrives with multiple sensors that monitor biometric data streams in real time.

Galea is said to include a bevy of sensors, such as electroencephalography (EEG), electrooculography (EOG), electromyography (EMG), electrodermal activity (EDA), and photoplethysmography (PPG) sensors, which are intended to measure data from the brain, eyes, muscles, skin, and heart.

OpenBCI hardware | Image courtesy OpenBCI

The company says Galea will allow researchers and developers to measure “human emotions and facial expressions” including happiness, anxiety, depression, attention span, and interest level, and to use that data to create more immersive content tailored to the individual. The idea is very similar to what we heard from Valve’s resident experimental psychologist Dr. Mike Ambinder in his GDC talk last year on how BCI could change the nature of game design.


OpenBCI says it will provide researchers and developers with early access to Galea, which is intended for commercial and research applications. Galea will also include SDKs to bring its stream of biometric data into “common development environments,” the company says.

“I believe that head-mounted computers integrated with human consciousness will drive the next major technology paradigm shift,” said Conor Russomanno, CEO of OpenBCI. “Galea is the realization of six years of research and development. We are providing the world with a playground for experimentation and development using multi-modal biometric data in tandem with next generation wearable displays.”

Founded on the back of a successful Kickstarter campaign in late 2013, OpenBCI creates consumer-grade biosensing systems built with open-source software and hardware. Today, the company’s hardware is used by researchers, academic labs, artists, and developers across 89 countries.



  • Ad

    What could go wrong.

    • I highly recommend watching Kathryn Bigelow’s movie “Strange Days”. The brain recording/playback technology is awesome yet frightening.

      • Ron

        That was a great movie!

      • flamaest

        Don’t forget about the other movie “Brainscan”. Also amazing.

    • Anonymous

      Nothing can go wrong really.
      Non-invasive EEG/EMG/ECG, which OpenBCI is based on, picks up noisy signals that can at best be interpreted into simple measures such as concentration levels and crude emotional states.
      It is often said to be like “trying to figure out a football match from outside the stadium based on the noises that leaked out”.

      Until someone can miniaturize and combine multiple methods (EEG/MEG/fNIRS/fUS) into one headset so they complement each other’s shortcomings, we are decades away from a commercial non-invasive product like those depicted in sci-fi.

      That being said, even crude data still has applications. An example raised at Neurosymposium this year is having a game’s weather change based on the player’s emotions.

      • Ad

        Even emotions seem like too much. If it can be used for games, it can be used for ads or be lumped in with other data.

        • Anonymous

          To me it is sort of a necessary evil of the digital world. No way around it.

          Besides, thinking on the bright side, emotion data would allow for much, much better and less annoying ads that would ramp up consumption and contribute to the economy.

          How many times are you irritated by some dumb ads on TV or YouTube wasting your time and even worse, antagonizing the company that created it?

          Maybe it’s just me, but if I am forced to watch ads one way or the other, I would rather watch those that are neatly created or thought provoking.

          Emotional data is also relatively abstract no matter how much we try to quantify it. This data, while allowing companies to create more compelling ads, would be much safer than factual data (age, political affiliation, address, etc.) that could breach privacy in much worse ways.

          • Ad

            “Besides, thinking on the bright side, emotion data would allow for much,
            much better and less annoying ads that would ramp up consumption and
            contribute to the economy.”

            This has to be a joke; it’s one of the most nauseating and gross things I’ve read in some time.

  • kontis

    We don’t need that.
    These archaic non-invasive solutions won’t achieve the precision and reliability needed to avoid being frustrating.

    What would actually be very valuable is a single, simple 1-bit sensor. Just like Carmack suggested: a single “mouse click” with the brain. That would make eye tracking 10x better and more useful. Just look at something and think “click.”

    Simple and revolutionary. Even without VR!

    • Now *THAT* I can see a use for. Leave it to Carmack to cut through all of the bulls**t!

  • asdf

    This is awesome technology to get into the hands of VR users and devs. We will learn a lot about current capabilities that will help us prepare for things like Neuralink. As a dev, I know I’d love to get my hands on this and see what my brain does.

  • Wow, it seems amazing!

  • There’s more value in using those sensors to read the muscles in the face than in *ANY* other use. There’s no meaningful data that can be pulled from the brain using these things.

    You know even the guys making it are struggling to think of any use it might have when they resort to words like “paradigm shift”. The definition of that term might as well be, “Hell if we know what it’s good for!”