
A.I. system seeks to turn thoughts of people unable to talk into speech

For the first time, neuroengineers have developed a system capable of translating thoughts directly into recognizable speech, marking an important step toward more advanced brain-computer interfaces for people who lack the ability to speak.

The system, which was created by researchers at Columbia University, works by monitoring a person’s brain activity and reconstructing the words the individual hears from those neural signals. Powered by speech synthesizers and artificial intelligence, the technology lays the groundwork for helping individuals who are unable to speak due to disability regain their capacity to communicate verbally.


“Our ultimate goal is to develop technologies that can decode the internal voice of a patient who is unable to speak, such that it can be understood by any listener,” Nima Mesgarani, an electrical engineer at Columbia University who led the project, told Digital Trends by email.


Parts of the brain light up like a Christmas tree, with neurons firing left and right, when people speak or even simply think about speaking. Neuroscience researchers have long endeavored to decode the patterns that emerge in these signals, but it isn’t easy. For years, scientists like Mesgarani have tried to translate brain activity into intelligible speech, using tools like computer models that analyze visual representations of sound frequencies.

In their recent work, Mesgarani and his team used a computer algorithm called a vocoder, which can synthesize speech-like sounds after being trained on recordings of human speech. But to train the system on neural signals, Mesgarani needed brain data, so he partnered with Ashesh Dinesh Mehta, a neurosurgeon at Northwell Health Physician Partners Neuroscience Institute in New York who treats epilepsy patients.
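To make the analysis-and-resynthesis idea behind a vocoder concrete, here is a minimal Python sketch that reduces a waveform to a magnitude spectrogram and then resynthesizes audio from it. It uses Griffin-Lim phase reconstruction from the librosa library as a simple stand-in, and the synthetic tone is purely illustrative; the Columbia team’s actual vocoder is trained on recordings of human speech and is considerably more sophisticated.

# Minimal sketch of the analysis/resynthesis idea behind a vocoder.
# Griffin-Lim is a simple stand-in here, not the study's actual vocoder.
import numpy as np
import librosa

sr = 16000
t = np.linspace(0, 1.0, sr, endpoint=False)
# A synthetic "speech-like" signal: a few harmonics of a 120 Hz fundamental.
audio = sum(np.sin(2 * np.pi * 120 * k * t) / k for k in range(1, 6))

# Analysis: reduce the waveform to a magnitude spectrogram, the kind of
# compact representation a neural decoder could be trained to predict.
spec = np.abs(librosa.stft(audio, n_fft=512, hop_length=128))

# Synthesis: recover a listenable waveform from the magnitude spectrogram.
reconstructed = librosa.griffinlim(spec, n_iter=32, hop_length=128)
print(audio.shape, reconstructed.shape)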

Mesgarani and Mehta asked some of Mehta’s patients to listen to speech recordings while their brain activity was measured, and the patterns in that activity were used to train the vocoder. The researchers then recorded the patients’ brain activity as they listened to people count to nine, and the vocoder attempted to reproduce the spoken numbers by analyzing the neural signals alone.
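The paragraph above describes the core decoding step: learn a mapping from measured brain activity to a representation of the speech the patient heard, which the vocoder then turns into audio. The sketch below illustrates that idea on synthetic data with a simple ridge-regression decoder; the array shapes, the linear model, and every variable name are assumptions made for illustration, not the study’s actual pipeline.

# Hedged, synthetic-data sketch of decoding neural activity into
# spectrogram frames that a vocoder could turn into audible speech.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_frames, n_electrodes, n_freq_bins = 2000, 64, 32

# Pretend spectrogram of the speech the patients heard (decoding targets).
speech_spec = rng.random((n_frames, n_freq_bins))

# Pretend neural recordings: a noisy linear mixture of the spectrogram,
# standing in for electrode measurements from the patients.
mixing = rng.normal(size=(n_freq_bins, n_electrodes))
neural = speech_spec @ mixing + 0.5 * rng.normal(size=(n_frames, n_electrodes))

X_train, X_test, y_train, y_test = train_test_split(
    neural, speech_spec, test_size=0.25, random_state=0
)

# Fit a simple linear decoder from brain activity to spectrogram frames.
decoder = Ridge(alpha=1.0).fit(X_train, y_train)

# Predicted frames would then be handed to the vocoder (or the
# Griffin-Lim stand-in above) to produce audible speech.
predicted_spec = decoder.predict(X_test)
print("held-out R^2:", round(decoder.score(X_test, y_test), 3))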

The result isn’t perfect. The sounds the system produces are robotic and, even after an A.I. system was used to clean up the vocoder’s output, only vaguely recognizable. Still, the researchers found that listeners could understand and repeat the sounds about 75 percent of the time.

Moving forward, the researchers plan to test more complicated words before moving on to full sentences. Their end goal is to integrate the system into an implant that could translate thoughts directly into words.

A paper detailing the research was published last month in the journal Scientific Reports.

Dyllan Furness
Former Digital Trends Contributor