The brain-computer interface revolution is just getting started

News Room

Whether it’s jacking into the Matrix or becoming a Na’vi in Avatar, connecting brains to computers is a science-fiction trope that I never thought I’d see become a reality. But increasingly, BCIs (brain-computer interfaces) have become a serious area of study, rapidly advancing from research labs to real human trials — perhaps most famously at Elon Musk’s company Neuralink.

While this promises individuals with disabilities a greater degree of freedom and control, along with potential applications in gaming and health care, significant technical, ethical, and regulatory challenges remain. But the more I dug into the topic, the more I found leaders and researchers rising to the occasion to guide us responsibly into the future of this groundbreaking technology.

What is a brain-computer interface?

Alvin Lucier: Music for Solo Performer (1965)

Let’s start at the beginning. In a sentence, BCIs are devices that bridge the gap between your brain’s analog electrical signals and external digital machines, essentially translating between the two.

“Bypassing the conventional communication channels for different tasks (e.g., vision, movement, and speech), BCI links the brain’s electrical activity and the external world to augment human capabilities in interacting with the physical environment,” a 2023 study in the journal Brain Informatics reads. “BCI provides a non-muscular communication channel and facilitates acquisition, manipulation, analysis, and translation of brain signals to control external devices or applications.”

Early BCI development actually began back in the 1920s with the advent of the electroencephalogram (EEG), a test that uses electrodes on the scalp to detect and record the brain’s electrical activity. However, modern BCIs took shape in the 1970s through the work of UCLA’s Dr. Jacques Vidal, with funding from the National Science Foundation and DARPA. Vidal also coined the term “brain-computer interface.”

Over the last half century, BCIs have found clinical use in a variety of applications, from mapping the inner workings of the brain to augmenting human cognition and motor skills. BCIs are even being used to restore physical mobility in patients suffering from injury and disease, such as ALS or brainstem stroke, or folks who are “locked-in” — cognitively intact but without useful muscle function.

The potential is incredibly exciting, but as you can likely imagine, there are some daunting challenges that the researchers behind this technology are having to face head-on.

Non-invasive BCIs

You might assume that all modern brain-computer interfaces involve brain surgery, but the technology actually comes in many forms, depending on how close to the user’s gray matter the device is situated. There are wholly non-invasive types that we’re all familiar with, such as EEGs and fMRI scans, which simply monitor and record brain activity. Then there are what are classified as “partially invasive” endovascular devices, which use a catheter to deliver electrodes into the brain’s blood vessels without requiring open brain surgery.

Non-invasive BCIs pick up the brain’s electrical impulses through the patient’s skull and scalp and transmit them directly to the external device. While this sounds appealing in that it doesn’t require brain surgery, the technology is rife with challenges.

One of the biggest problems with externally worn BCIs, for example, is their low signal-to-noise ratio. The electrical impulses they pick up are often muddled by interference from the skull and scalp, making it difficult to accurately decode brain signals. Decoding is further complicated by the brain’s intricate neural patterns, which require sophisticated algorithms and significant computational resources to interpret reliably.
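To make the decoding problem concrete, here is a minimal sketch of the kind of preprocessing a non-invasive system performs before any decoding can happen: band-pass filtering a noisy scalp recording down to a frequency band of interest. The signal below is synthetic, and the 8-30 Hz band is simply a common choice in motor-imagery research; real pipelines add artifact rejection, spatial filtering, and a trained classifier on top of this step.

```python
# Minimal, illustrative preprocessing for a noisy scalp recording (synthetic data).
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250                     # sampling rate in Hz, typical for EEG headsets
t = np.arange(0, 5, 1 / fs)  # five seconds of samples

# A weak 12 Hz rhythm standing in for the brain signal of interest,
# buried in broadband noise standing in for interference and artifacts.
clean = 2.0 * np.sin(2 * np.pi * 12 * t)
noisy = clean + 10.0 * np.random.randn(t.size)

# Band-pass to 8-30 Hz, a band commonly used in motor-imagery BCIs.
b, a = butter(4, [8, 30], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, noisy)

def snr_db(estimate, reference):
    """Rough signal-to-noise ratio in dB, measured against the known clean rhythm."""
    noise = estimate - reference
    return 10 * np.log10(np.sum(reference ** 2) / np.sum(noise ** 2))

print(f"SNR before filtering: {snr_db(noisy, clean):.1f} dB")
print(f"SNR after filtering:  {snr_db(filtered, clean):.1f} dB")
```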

I spoke with Dr. Jane Huggins, the director of the University of Michigan Direct Brain Interface Laboratory, to better understand the challenges facing today’s BCIs.

“Let’s make a list of the things that affect your brain activity … well, maybe let’s make a list of the things that don’t because that’s going to be a shorter list,” Huggins quipped. “Everything from what the patient is currently seeing to the amount of light in the room to what they just ate for lunch to their emotional state, all can affect the amplitude of the signals and the complexity of what’s going on. It’s hard to pick out the pieces that you need.”

Meanwhile, in terms of comfort and usability, non-invasive BCIs can be uncomfortable to wear for extended periods due to bulky electrodes and headsets.

It’s why invasive brain implants have come to represent where this technology is headed, a future that came barreling into the present in 2024 like never before.

Direct access to the brain

The Telepathy implant being held.

Implantable BCIs take impulses directly from the brain matter the chip sits on and translate them into commands. The implant then wirelessly relays those command signals to an external device, which carries them out.
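In rough terms, that loop works like the sketch below: sample a short window of neural activity, decode it into a command, and hand the command off to the external device. Everything here is illustrative rather than any vendor’s actual pipeline; the feature values, the nearest-template decoder, and the print call standing in for the wireless link are all assumptions made for the sake of the example.

```python
# Illustrative implant-side loop: acquire a window of features, decode, relay.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-command "templates" learned during a calibration session:
# average feature vectors for four cursor directions.
templates = {
    "left":  np.array([0.9, 0.1, 0.2, 0.1]),
    "right": np.array([0.1, 0.9, 0.1, 0.2]),
    "up":    np.array([0.2, 0.1, 0.9, 0.1]),
    "down":  np.array([0.1, 0.2, 0.1, 0.9]),
}

def decode(features):
    """Nearest-template decoding: pick the command whose template is closest."""
    return min(templates, key=lambda cmd: np.linalg.norm(features - templates[cmd]))

def relay(command):
    """Stand-in for the wireless link to the external device (e.g., a cursor)."""
    print(f"sending command: {command}")

# Simulate a few noisy windows of activity while the user intends "right".
for _ in range(3):
    window = templates["right"] + 0.1 * rng.standard_normal(4)
    relay(decode(window))
```

The hard engineering lives in the decode step, which in real systems is a trained model operating on many electrode channels rather than four hand-picked features.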

Dr. Huggins makes the case that although some people will always be uneasy about the idea of implanting a device in their brain, in the long run, it’s the most convenient option.

“People have a tendency to refer to implanted BCIs as ‘invasive,’” she told me. “Certainly there’s surgery involved if you’re implanting a BCI and it can be quite a dramatic surgery.” On the other hand, Huggins likens it to the artificial hip she received a couple of years back. Invasive surgery? Yes. But in daily life, it can be forgotten about entirely.

For one, implanted BCIs don’t require the 10- to 20-minute setup that external devices need before each day’s use. They also don’t require the charging and cleaning that external BCIs do. Huggins posited that future BCI devices could offer the benefits of both externally worn and implanted BCIs, similar to how today’s cochlear implants operate.

“If you could implant those EEG electrodes under the scalp, you wouldn’t have to put them on and take them off every day, and they would be invisible.”

Furthermore, the basics of the technology have been around longer than you might assume. It’s been decades since the first neuroprostheses were implanted in humans, and the field continues to expand at a rapid pace.

Elon Musk’s Neuralink implants brain chip in first human

That leads us to where we are today, with the first patients receiving these implanted chips. After a six-year study and FDA approval in 2023, Neuralink launched the clinical trial for its first implantable chip, completing surgery on its first patient in January 2024.

Within a couple of months, Neuralink had posted an update showing the patient controlling a laptop to play online chess using only his brain, which he compared to “using the Force on the cursor.”

By now, Neuralink’s BCI trial has found a second patient, while the first has gone from playing chess to playing Civilization VI.

A screenshot from a YouTube video showing a Neuralink patient playing Civilization VI with his mind.

Neuralink gets all the spotlight due to its high-profile founder, but it’s far from alone. Brooklyn, New York-based Synchron, which is developing a device that can be safely implanted into the brain’s blood vessels, began its six-patient clinical trial last year. BrainGate, a research consortium spanning universities across the U.S., implanted the world’s first wireless, high-bandwidth BCI in 2021. Blackrock Neurotech, headquartered in Salt Lake City, Utah, has been running human trials with its Utah array BCI for more than two decades, with zero FDA-reported “serious adverse events” in that time.

In these applications, the devices enable users to effectively bypass damaged and non-responsive limbs, controlling external devices directly with their thoughts and performing activities without depending on others, significantly improving their quality of life. The technology has already made its mark on a number of other research fields, including entertainment and gaming, industrial automation, education, and neuromarketing.

Continuing challenges

an EEG cap

While implanted BCIs feel like the future of the field, they certainly come with their own challenges. For example, even though implanted BCIs provide a higher-quality signal, long-term stability remains an issue. These devices can degrade over time due to biological tissue reactions or mechanical failures, limiting their usability and lifespan for continuous applications.

Implanted BCIs also don’t overcome the hurdle of the training and calibration required, which remains a significant challenge for BCI technology as a whole. Users often need extensive practice to gain effective control over these devices, making the process both time-consuming and sometimes frustrating, as Dr. Huggins explained.
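For a sense of what that calibration involves, here is a toy sketch of a pre-session routine: gather labeled practice trials, fit a simple decoder, and estimate how reliably it separates the user’s intentions. The data is synthetic and the linear discriminant decoder is only a common research baseline, not any particular product’s method; the point is the overhead that every session starts with.

```python
# Toy calibration session: labeled practice trials -> decoder -> accuracy estimate.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_trials_per_class, n_features = 40, 8

# Synthetic trials for two imagined actions whose feature means differ slightly,
# swamped by trial-to-trial noise, roughly the regime real sessions fight with.
left = rng.normal(loc=0.0, scale=1.0, size=(n_trials_per_class, n_features))
right = rng.normal(loc=0.6, scale=1.0, size=(n_trials_per_class, n_features))
X = np.vstack([left, right])
y = np.array([0] * n_trials_per_class + [1] * n_trials_per_class)

# Cross-validated accuracy tells the user whether more calibration
# (or a better setup) is needed before free use.
scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print(f"estimated decoding accuracy: {scores.mean():.0%} (+/- {scores.std():.0%})")
```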

Neuralink has an app in development, already tested with monkeys, that guides patients through training their minds to better control digital devices.

Beyond convenience and cost, the technology’s ethical and privacy implications pose significant challenges to BCI’s further development. The data generated by BCIs — our emotions, intentions, and thoughts — are intrinsically personal, raising the risk that such data could be unintentionally collected and misused.

BCI adoption also raises issues of autonomy, consent, and accessibility. What’s to stop someone from being forced to use a BCI against their will or without fully understanding its consequences?

“I can think of nothing scarier than having someone decide for you that you want an implanted BCI and give it to you,” Huggins said. “And you can’t ask any questions about what’s going on or express your opinion.”

The same is true for leveraging AI and machine learning systems to assist BCI patients. “We can combine a lot of the [functions of] artificial intelligence and BCIs, but that does start raising the same kind of questions you run into with any kind of shared control: Who’s deciding? What’s going to be said?”

“And that gets back to the ethical questions we were talking about earlier about self-determination. If you have someone whose abilities are deteriorating [such as ALS patients], is there a balance? Will that balance change over time? Or am I just going to give up and start letting the AI auto-complete my sentences?”

“Don’t worry about people reading your thoughts from satellites.”

These are serious concerns for the future, even if we’re still a long way off from really needing to face them.

“I’m only able to pick that up with, you know, 90% accuracy, maybe 95% accuracy on a good day. On a bad day, well, you can go as low as possible on a bad day. But that’s someone who is actively, voluntarily trying to communicate a message.”

Huggins makes a point of trying to quell one of the biggest obstacles to the future of BCI: fear and misconceptions.

“It scares a lot of people,” Huggins conceded. “I had somebody ask me once if the government could read their thoughts from satellites. And I was like, ‘Well, you know, I have trouble getting the correct answer when I have somebody sitting in my lab who let me put this headset on and is actively trying to pay attention to a key on the keyboard. Don’t worry about people reading your thoughts from satellites.’”

Looking ahead to a brighter BCI future

Certainly, BCI researchers face considerable challenges in bringing this revolutionary technology to the general public, perhaps none more so than managing expectations. Huggins notes that her colleagues who run implanted-device studies go through a very rigorous process to make sure that participants understand the plan and have a realistic understanding of the benefits and the risks.

Those same courtesies are rarely extended to the general public, who are bombarded with fantastical promises of telepathic communication, perfect memory and recall, and even a melding of human and robotic minds.

As for where BCI is heading in our lifetimes, Huggins conceded that expectations will have to come down before the real progress can be appreciated.

“I think there will be things available, I just don’t think it will live up to all of the hype. It’s gonna change expectations. One needs to have expectations that are realistic, and understand that this is new technology. We’re still learning how it works, why it works, when it works, when it doesn’t work, what kind of support it needs, and how many places are going to be able to deliver it.”

So no, we likely won’t be experiencing “Whoa, I know Kung Fu” moments in the foreseeable future. But that’s not to say that the next generation won’t. We might have a long way to go, but the foundation of those future experiences is being built today — and that’s reason to be excited.





