
New Brain Implant Decodes ‘Inner Monologue’ of People With Paralysis


We all talk to ourselves in our heads. It could be a pep talk before a wedding speech or a chaotic family reunion, or a nudge to stop procrastinating. This inner speech also hides secrets: what we say doesn't always reflect what we think.

A team led by scientists at Stanford University has now designed a system that can decode these conversations with ourselves. They hope it can help people with paralysis communicate with their loved ones—especially those who struggle with current brain-to-speech systems.

Instead of having participants actively try to make sounds and form words, as if they’re speaking out loud, the new AI decoder captures silent monologues and translates them into speech with up to 74 percent accuracy.

Of course, no one wants their thoughts continuously broadcast. So, as a brake, the team designed “neural passwords” the volunteers can mentally activate before the implant starts translating their thoughts.

“This is the first time we’ve managed to understand what brain activity looks like when you just think about speaking,” said study author Erin Kunz. “For people with severe speech and motor impairments…[an implant] capable of decoding inner speech could help them communicate much more easily and more naturally.”

Penny for Your Thoughts

The brain sparks with electrical activity before we attempt to speak. These signals control muscles in the throat, tongue, and lips to form different sounds and intonations. Brain implants listen to and decipher these signals, allowing people with paralysis to regain their voices.

A recent system translates speech in near real time. A 45-year-old participant who took part in a study featuring the system lost the ability to control his vocal cords due to amyotrophic lateral sclerosis (ALS). His AI-guided implant decoded brain activity—captured when he actively tried to speak—into coherent sentences with different intonations. Another similar trial gathered neural signals from a middle-aged woman who suffered a stroke. An AI model translated this data into words and sentences without notable delays, allowing normal conversation to flow.

These systems are life-changing, but they struggle to help people who can’t actively try to move the muscles involved in speech. An alternative is to go further upstream and interpret speech from brain signals alone, before participants try to speak aloud—in other words, to decode their inner thoughts.

Words to Sentences

Previous brain imaging studies have found that inner speech activates a similar, but not identical, neural network to the one used for physical speech. For example, electrodes placed on the surface of the brain have captured a unique electrical signal that spreads across a wide neural network, but scientists couldn’t home in on the specific regions contributing to inner speech.

The Stanford team recruited four people from the BrainGate2 trial, each with multiple 64-channel microelectrode arrays already implanted into their brains. One participant, a 68-year-old woman, had gradually lost her ability to speak nearly a decade ago due to ALS. She could still vocalize, but the words were unintelligible to untrained listeners.

Another volunteer, a 33-year-old man also with ALS, had incomplete locked-in syndrome. He relied on a ventilator to breathe and couldn’t control his muscles—except those around his eyes—but his mind was still sharp.

To decode inner speech, the team recorded electrical signals from participants’ motor cortexes as they tried to produce sounds (attempted speech) or simply thought about a single-syllable word like “kite” or “day” (inner speech). In other tests, the participants heard or silently read the words in their minds. By comparing the results from each of these scenarios, the team was able to map out the specific motor cortex regions that contribute to inner speech.

Maps in hand, the team next trained an AI decoder to decipher each participant’s thoughts.
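For readers curious what “training a decoder” means in practice, here is a minimal sketch of the general idea: summarize each trial’s neural activity as a feature vector, label it with the cued word, and fit a classifier that predicts words from brain signals. This is an illustrative toy, not the study’s actual pipeline; the feature choice, classifier, channel count, and random stand-in data are all assumptions, and only the 50-word vocabulary figure comes from the article.

```python
# Toy sketch of word decoding from neural features (not the study's code).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_trials, n_channels, vocab_size = 2000, 256, 50  # channel count is an assumed example
X = rng.normal(size=(n_trials, n_channels))       # stand-in neural features per trial
y = rng.integers(0, vocab_size, size=n_trials)    # cued-word labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

scaler = StandardScaler().fit(X_train)
clf = LogisticRegression(max_iter=1000).fit(scaler.transform(X_train), y_train)

# On real recordings, accuracy on held-out trials is the figure of merit;
# with random stand-in data it hovers near chance (1 / vocab_size).
print("held-out accuracy:", clf.score(scaler.transform(X_test), y_test))
```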

The system was far from perfect. Even with a limited 50-word vocabulary, the decoder messed up 14 to 33 percent of the translations depending on the participant. For two people it was able to decode sentences made using a 125,000-word vocabulary, but with an even higher error rate. A cued sentence like “I think it has the best flavor” turned into “I think it has the best player.” Other sentences, such as “I don’t know how long you’ve been here,” were accurately decoded.
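Error rates like these are typically reported as word error rate: the number of word substitutions, insertions, and deletions divided by the length of the intended sentence. The short sketch below computes it for the example from the article; the function is a standard edit-distance calculation, not code from the study.

```python
# Word error rate (WER) via word-level edit distance.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between first i reference words and first j hypothesis words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("i think it has the best flavor",
          "i think it has the best player"))  # 1 error in 7 words, about 0.14
```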

Errors aside, “If you just have to think about speech instead of actually trying to speak, it’s potentially easier and faster for people [to communicate],” said study author Benyamin Meschede-Krasa.

All in the Mind

These first inner speech tests were prompted. It’s a bit like someone saying “don’t think of an elephant” and you immediately think of an elephant. To see if the decoder could capture automatic inner speech, the team taught one participant a simple game in which she memorized a series of three arrows pointing in different directions, each with a visual cue.

The game could automatically trigger inner speech as a mnemonic, the team wrote. It’s like silently repeating a famous video game cheat code or the steps to solve a Rubik’s Cube. The decoder captured her thoughts, which mapped to her performance.

They also tested the system in scenarios when participants counted in their heads or thought about relatively private things, like their favorite movie or food. Although the system picked up more words than when participants were instructed to clear their minds, the sentences were largely gibberish and only occasionally contained plausible phrases, wrote the team.

In other words, the AI isn’t a mind reader, yet.

But with better sensors and algorithms, the system could one day leak unintentional inner speech (imagine the embarrassment). So the team built in multiple safeguards. One labels attempted speech—what you actually want to say out loud—differently from inner speech. This strategy only works for people who can still attempt to speak out loud.

They also tried creating a mental password. Here, the system only activates if the person thinks about the password first (“chittychittybangbang” was one). Real-time trials with the 68-year-old participant found the system correctly detected the password roughly 99 percent of the time, making it easy for her to protect her private thoughts.
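Conceptually, the password acts as a gate in front of the decoder: nothing gets translated until a detector is confident the user imagined the unlock phrase. The sketch below illustrates that gating idea only; the detector, confidence threshold, and streaming interface are assumptions for illustration, not details from the study.

```python
# Sketch of "neural password" gating: decode only after the imagined
# password is detected in the neural stream with high confidence.
from typing import Callable, Iterable, Iterator


def gated_decode(
    neural_stream: Iterable,                    # successive windows of neural features
    password_score: Callable[[object], float],  # confidence the window contains the imagined password
    decode_window: Callable[[object], str],     # inner-speech decoder for one window
    threshold: float = 0.95,                    # assumed confidence cutoff
) -> Iterator[str]:
    unlocked = False
    for window in neural_stream:
        if not unlocked:
            # Stay silent until the imagined password is detected.
            unlocked = password_score(window) >= threshold
            continue
        yield decode_window(window)
```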

As implants become more sophisticated, researchers and users are concerned about mental privacy, the team wrote, “specifically whether a speech BCI [brain-computer interface] would be able to read into thoughts or internal monologues of users when attempting to decode (motor) speech intentions.” The tests show it’s possible to prevent such “leakage.”

So far, implants to restore verbal communication have relied on attempted speech, which requires significant effort from the user. And for those with locked-in syndrome who can’t control their muscles, the implants don’t work. By capturing inner speech, the new decoder taps directly into the brain, requires less effort, and could speed up communication.

“The future of BCIs is bright,” said study author Frank Willett. “This work gives real hope that speech BCIs can one day restore communication that is as fluent, natural, and comfortable as conversational speech.”
