Valve co-founder Gabe Newell previously revealed that Valve is working on a brain-computer interface (BCI) with OpenBCI, the minds behind open-source BCI software and hardware solutions. Now Tobii, the eye-tracking firm, has announced it’s also a partner on the project, and that developer kits incorporating eye-tracking and “design elements” of Valve Index are expected to first ship in early 2022.

Update (February 5th, 2021): Newell’s interview, referenced below in the original article, didn’t reveal whether OpenBCI’s ‘Galea’ project was the subject of the partnership; now, however, Tobii has confirmed that it is, along with a few other details.

Tobii will be lending its eye-tracking technology to Galea, which it says will incorporate design elements from Valve Index. Developer kits for early beta access partners will ship in early 2022, the companies say.

Reading between the lines somewhat, it appears Galea will include not only the sensor-packed strap, but also an eye-tracking enabled, possibly modified Valve Index headset.

Original Article (January 28th, 2021): Newell hasn’t been secretive about his thoughts on BCI, and how it could be an “extinction-level event for every entertainment form.” His message to software developers: start thinking about how to use BCI now, because it’s going to be important to all aspects of the entertainment industry fairly soon.

How soon? Newell says in a talk with 1 News that by 2022, studios should have them in their test labs “simply because there’s too much useful data.”

Gabe Newell (right), psychologist Mike Ambinder (left) | Image courtesy Valve

Newell speaks about BCI through a patently consumer-tinted lens—understandable coming from a prominent mind behind Steam, the largest digital distribution platform for PC gaming, not to mention an ardent pioneer of consumer VR as we know it today.

To Newell, BCI will allow developers to one day create experiences that, in function, completely bypass the traditional “meat peripherals” of old—eyes, ears, arms and legs—giving users access to richer experiences than today’s reality is capable of providing.

“You’re used to experiencing the world through eyes, but eyes were created by this low-cost bidder that didn’t care about failure rates and RMAs, and if it got broken there was no way to repair anything effectively, which totally makes sense from an evolutionary perspective, but is not at all reflective of consumer preferences. So the visual experience, the visual fidelity we’ll be able to create — the real world will stop being the metric that we apply to the best possible visual fidelity.”

On the road to that more immersive, highly-adaptive future, Newell revealed Valve is taking some important first steps, namely its newly revealed partnership with OpenBCI, the neurotech company behind a fleet of open-source, non-invasive BCI devices.


Newell says the partnership is working to provide a way so “everybody can have high-resolution [brain signal] read technologies built into headsets, in a bunch of different modalities.”

Back in November, OpenBCI announced it was making a BCI specifically for VR/AR headsets, called Galea, which sounded very similar to what Valve’s Principal Experimental Psychologist Dr. Mike Ambinder described in his GDC 2019 vision for VR headsets fitted with electroencephalogram (EEG) sensors.

OpenBCI hardware | Image courtesy OpenBCI

Although Newell doesn’t go into detail about the partnership, he says that BCIs are set to play a fundamental role in game design in the very near future.

“If you’re a software developer in 2022 who doesn’t have one of these in your test lab, you’re making a silly mistake,” Newell tells 1 News. “Software developers for interactive experiences — you’ll be absolutely using one of these modified VR head straps to be doing that routinely — simply because there’s too much useful data.”

There’s a veritable laundry list of things BCI could do in the future by giving software developers access to the brain and letting them ‘edit’ the human experience. Newell has already talked about this at length; outside of the hypotheticals, Newell says near-term research in the field is so fast-paced that he’s hesitant to commercialize anything for fear of slowing down.


“The rate at which we’re learning stuff is so fast that you don’t want to prematurely say, ‘OK, let’s just lock everything down and build a product and go through all the approval processes,’ when six months from now, we’ll have something that would have enabled a bunch of other features.”

It’s not certain whether Galea is the subject of the partnership; however, its purported capabilities line up fairly well with what Newell says is coming down the road. Galea is reportedly packed with sensors, including not only EEG, but also sensors capable of electrooculography (EOG), electromyography (EMG), electrodermal activity (EDA), and photoplethysmography (PPG).

OpenBCI says Galea gives researchers and developers a way to measure “human emotions and facial expressions,” including happiness, anxiety, depression, attention span, and interest level—many of the data points that could inform game developers on how to create better, more immersive games.

Provided such a high-tech VR head strap could non-invasively ‘read’ emotional states, it would represent a big step in a new direction for gaming. And it’s one Valve clearly intends to leverage as it continues to create (and sell) the most immersive gaming experiences possible.


Interested in watching the whole interview? Catch the video directly on 1 News.

This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. See here for more information.


  • Xron

    Interesting, so in 2023 we might see the first gen of VR BCI?

    • Vegeta785

      That’ll be neat.

  • Blaexe

    Sounds too good to be true imo…I mean, Gabe also said “Wireless is a solved problem at this point” back in 2017 (!). And yet – here we are in 2021 with a wired Valve Index.

    • kontis

      He also said ~10 years ago that consoles were finished and PS4 wouldn’t stand a chance against Apple TV, which would steamroll the living room like Apple did in mobile.
      It was the only reason he tried to make the Steam Machines (some people and journalists misunderstood it as an answer to Windows 8, since Microsoft had Apple-style ambitions for an integrated store and a more locked-down experience, but it was really about Valve entering the living room – both failed)

    • Bob

      Unless he has a solid plan of some sort of capability for humans to jack into their own brains to experience virtual reality a la The Matrix, BCI isn’t the radical and revolutionary technology that will create a generational leap in presence within the audiovisual department. A tiny step towards it? Sure.

    • Charles

      Well, wireless VR is a solved problem if you limit your scope. There are wireless mods for VR, such as for the Vive Pro. It’s just a matter of resolution/framerate limits and price.

      • Blaexe

        “So, my expectation is that it will be an add-on in 2017, and it will be an integrated feature in 2018.”

        That’s the next part of that statement.

        • Charles

          Hmm. Well he was a year late on the first prediction. Still hasn’t been an integrated feature yet.

    • mirak

      Magic Leap was too good to be true, but this is Gabe Newell; he doesn’t have anything to sell, he is just here for the fun xD

  • FrankB

    Cave Johnson must surely be based on Gabe Newell.

  • namekuseijin

    > too much useful data

    hey, if Lord Gaben said so, so be it. must be cool seeing and feeling what they want us to, including pain and obedience.

    if Apple can scan your whole house and offices with widespread LiDAR, so be it.

    if Zucc puts a $300 VR console on your head to play games, god damn the sky is falling, they’re spying and mind-controlling us…

    • alboradasa

      It’s weird right? People are acting as if Facebook has repeatedly betrayed the trust and confidence of their users..

  • namekuseijin

    btw, just like with VR headsets, we sure have come a long way in design and comfort for these mind readers

    https://uploads.disquscdn.com/images/9a58a10edc6cc595cce52d45e7990b08058541d9d681e59698dc5263c04e0c29.jpg

  • kontis

    The first BCI in VR should possibly be the simplest solution that can only get 1-bit data from the brain and NOTHING more.

    Why?

    Because 1-bit data with eye tracking will be more revolutionary than all this “reading emotions” BS.

    This is John Carmack’s idea. A simple “mouse click” with the brain would solve eye tracking’s biggest UX problem and turn it into a next-gen, superhuman point-and-click device.

    This is incredibly low-hanging fruit. The first company to do that is going to rule the HMD industry for some time.

    • Azreal42

      It already exists, doesn’t it? The Epoc-X from Emotiv and NextMind’s product can already detect an “action”.
      But it is not that simple: you can’t expect EEG to work fast enough for mouse clicking. (We will probably need invasive sensors to do that.)
      That’s why people are switching to emotional stuff. It’s easier to read because a lot of emotion can be inferred from face muscles (which show up as interference in normal EEG) or skin (hence the bunch of other sensors in Galea, I think).

      • Anonymous

        Basically first half of this.
        EEG in its current state simply isn’t good enough to accurately interpret all action potentials – which is basically anything one does in daily life.
        There are talks about collecting more brainwave samples to develop an AI algorithm that might be able to predict user intentions better and to filter out noises from the skull, but I am not aware of any major breakthrough papers on it yet.

        Emotions are much easier to read because all you need to know is where and how fast the electrical activity is occurring in the brain. We already have enough neuroscience data to interpret most of it. Emotion data alone is powerful enough for developers to make a far more interactive and personal VR experience.
        To say emotion reading is BS is… lacking the most basic understanding of brain functions and how EEG works.

        As for your second half, I personally feel that a lot of it can be achieved by cardio and muscle sensors already, albeit too bulky for normal consumer uses.

        • Azreal42

          Hey, there are two posts and we are two people, you know; I didn’t say emotion reading is BS.
          It kinda works, in the lab. But now that I’ve said that, I’m not sure about the feasibility of emotion reading through EEG with current tech. From my understanding, EEG is tricky, since there is a lot of noise from muscles. And in VR you move a lot: you use your jaw when talking, you use your eyes, and I wonder what can be achieved with the current SOTA in these conditions.
          And from my perspective, emotion != meditation. When reading how fast the electrical activity is occurring, it seems to me we are measuring the “calmness” of the subject.
          Emotions are much easier to read with the skin, heart rate and other stuff.
          That’s why I said EEG in VR is still a long way off for those usages. But EMG can provide very meaningful data and is easier to interpret.

          As for the future of reading intention with ML, I think it might get better (with more data or with different algorithms), but to the best of my knowledge, right now, reading intention is done with motor imagery, which requires (a lot of) training. Even with more sensors and better classifiers, I don’t think we will see that for consumers or enthusiasts in the current decade.

          • Anonymous

            Ah sorry, you misunderstood me. I was merely trying to reply to two posts together, not referring to you specifically. Apologies.

            And you are right about your points too. Emotion does not equal attention level (meditation), which indeed is the best a lot of consumer-grade EEG can manage now. Nevertheless it is still possible to do some guesswork.

            I think the best scenario moving forward for consumer devices may involve combining multiple sensor types like MEG (5 years ago I read a journal article from NIST about research on significantly downsizing it), fNIRS, etc. to complement each other. I also believe first-generation BCI-VR will be rather crude and only able to do some fun, but rather gimmicky, tasks (which is ok… better than nothing)

  • Arthur

    “…gives researchers and developers a way to measure “human emotions and facial expressions” which includes happiness, anxiety, depression, attention span, and interest level—many of the data points that could inform game developers on how to create better, more immersive games.”

    Obligatory paranoid rant:
    It’s nice to imagine all the cool things this tech will enable devs to create for us. On the flip side, it’s scary af to imagine how publishers will try to exploit this for profit.

    These companies have no moral quams preying on the minority of people who are vulnerable to gambling addiction with loot boxes. Even if, in their games, there was no gambling dynamic and you could directly buy whatever you wanted – sure, maybe you don’t buy in-game things on the regular or at all, but, have you ever made an exception for something you saw and realized you just had to have? Couldn’t the data from this tech be used for a super deep understanding of a person, where the individual-specific conditions in an experience are created to provoke that response? Low-key enough that you don’t know it’s happening. Same for selling irl shit. I’d like to think I would easily recognize an attempt to influence me. But then again, my behaviour and emotions have never been observed from my own POV and analyzed to that end before. How much of our personality do we give away in our subconscious behaviour? Why wouldn’t they just skip ads and just subliminally suggest shit? A few wild thoughts here but some version of this scenario seems plausible to me in the future. Hopefully it’s avoidable.

    There should be some minimal ethical standard to hold the industry to for BCI data management, and the subject should be addressed beforehand, lest technophobic policy makers get ham-fisted with regs if there’s a lot of misinformation when it gets mainstream coverage. On that note, the anti-5G folks are really gonna lose their shit with this. HMD makers may have to up their security when the consumer product first starts rolling out.

    • vbscript2

      On the flip side, maybe there’s a marketing opportunity for the anti-5G folks here: Valve could sell BCI VR HMDs surrounded with tin foil to keep the 5G Covid rays out. :)

    • “Qualms”.

  • Jorge Gustavo

    OK… So, I know we are talking about Valve… But this means that at some point in the future, we will have a Facebook Quest with brain-reading functions? And that is a good thing?

  • Boz has said that we’ll see the first crappy implementations of CTRL-labs tech in Facebook products in 3 years. So this from Gaben seems a bold statement, mostly released just to build hype and attract talent. I’m experimenting with BCI too, and at the moment the information is too limited and noisy to actually be used in consumer products

    • Innovation Investor

      Have you used NextMind? There’s no telling what kind of research Valve has going on behind the scenes, but by the time it hits the market, it has to be better than NextMind, which is already pretty good. Valve is also partnering with the one and only OpenBCI, which I’m sure is miles ahead of Facebook on this front.

      With early beta access kits being released in 2022, it doesn’t seem too farfetched for a product to hit the markets by 2030. I’m hoping it all happens ASAP, but 2028-2030 seems reasonable.

  • MrSmileyFace

    Whyyyy does it use EEG instead of infrared

  • So what is Newell saying exactly …?
    1] Soon visuals & audio will be sent directly into your brain
    without the need for screens and headphones.

    2] Or is this some gimmicky bullshit where you’ll think “I am cold”,
    and a sensor will read this, send it to the SoC and then
    the software displays a snowy, Wintery scene on the screen

  • Tobii is the VILE COMPANY that’s standing in the way of Eye Tracking for all VR headsets with their outrageous, predatory pricing structures. They are stopping any and all development with their crackpot patents. Why is nobody talking about this???

    Cameras are around $1 each, if even that much. No “Tobii” powered headset is less than $1,000. What they’re charging for Eye Tracking must be SHAMEFUL!

    If you want to know why we don’t have Foveated Rendering, Tobii is the cause!

  • Eric Draven

    Sword Art Online, is that you?

    • Mohammad Al-Ali

      Wake up Samurai, we have a world to beat

  • JDawg

    This was like when people thought voice control would be cool. But when I can just push a button instead of waiting through a 2-second delay from voice or BCI, why would I use anything but a button? Voice control can be OK for immersion or when you run out of buttons, but it still doesn’t get used much today.

    BCI is over-hyped and will end up being cool but gimmicky.

  • Meekosmic

    Sword Art Online coming sooner than expected