Computer scientists at the University of Bremen have realized a speech neuroprosthetic with which imagined speech can be made acoustically audible.
Source: © CSL / Universität Bremen

Making imagined speech audible

In an international project, computer scientists at the University of Bremen have succeeded in realizing a speech neuroprosthetic: with it, imagined speech can be made acoustically audible.

Great research successes require international collaboration: For several years, the Cognitive Systems Lab (CSL) at the University of Bremen, the Department of Neurosurgery at Maastricht University in the Netherlands, and the ASPEN Lab at Virginia Commonwealth University (USA) have been working on a speech neuroprosthetic. The goal: to translate speech-related neural processes in the brain directly into audible speech. This goal has now been achieved. "We have managed to make our test subjects hear themselves speak, even though they only imagine speaking," says a delighted Professor Tanja Schultz, head of the CSL. "Neural signals from volunteers who imagine speaking are directly translated into audible output by our speech neuroprosthetic, in real time and with no perceptible latency!"

The innovative speech neuroprosthetic is based on a closed-loop system that combines technologies from modern speech synthesis with brain-computer interfaces. This system was developed by Miguel Angrick at the CSL. As input, it receives the neural signals of users who imagine speaking. Using machine learning, it translates them into speech almost immediately and outputs audible feedback to its users. "This closes the loop for them from imagining speaking to hearing their speech," says Angrick.
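The closed-loop idea described above can be sketched in a few lines of code. This is purely illustrative: the CSL system's electrode counts, neural features, and machine-learning models are not detailed in the article, so the channel and frame sizes, the mean-amplitude feature, and the linear decoder `W` below are all stand-in assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 64      # assumed number of recording channels
N_AUDIO_FEATS = 32   # assumed size of one synthesized audio feature frame

# Stand-in for the decoder learned by the machine-learning stage.
W = rng.standard_normal((N_CHANNELS, N_AUDIO_FEATS))

def extract_features(raw_window):
    """Reduce a raw window (channels x samples) to one feature per channel.

    Real systems use richer features (e.g. high-gamma band power); mean
    absolute amplitude keeps the sketch self-contained.
    """
    return np.abs(raw_window).mean(axis=1)

def decode_frame(raw_window):
    """One pass of the loop: neural window in, audio feature frame out."""
    return extract_features(raw_window) @ W

# Simulate streaming: each incoming 50-sample window immediately yields one
# audio frame, which would be synthesized and played back as feedback.
frames = [decode_frame(rng.standard_normal((N_CHANNELS, 50)))
          for _ in range(10)]
print(len(frames), frames[0].shape)
```

The point of the sketch is the data flow, not the model: every incoming window is turned into audible output at once, which is what closes the loop from imagining speech to hearing it.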

The work is based on a study with a volunteer epilepsy patient who had been implanted with depth electrodes for medical examinations and was in hospital for clinical monitoring. In the first step, the patient read texts aloud, from which the closed-loop system learned the correspondence between speech and neural activity by means of machine learning. "In the second step, this learning process was repeated with whispered and imagined speech," explains Miguel Angrick. "In the process, the closed-loop system produced synthesised speech. Although the system had learned the correspondences exclusively on audible speech, audible output is also produced with whispered and imagined speech." This suggests that the underlying speech processes in the brain for audibly produced speech share, to some extent, a common neural substrate with those for whispered and imagined speech.
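The two-step procedure can also be sketched with synthetic data: fit a model on paired neural and audio features from audible speech, then apply the same model, unchanged, to imagined-speech features for which no audio target exists. The dimensions and the least-squares linear model below are illustrative assumptions, not the study's actual method.

```python
import numpy as np

rng = np.random.default_rng(1)
n_train, n_neural, n_audio = 200, 64, 32

# Hidden ground-truth relationship used only to generate synthetic data.
true_map = rng.standard_normal((n_neural, n_audio))

# Step 1: audible speech -- paired neural features and audio targets.
X_audible = rng.standard_normal((n_train, n_neural))
Y_audible = X_audible @ true_map + 0.1 * rng.standard_normal((n_train, n_audio))

# A least-squares fit stands in for the machine-learning step.
W, *_ = np.linalg.lstsq(X_audible, Y_audible, rcond=None)

# Step 2: imagined speech -- no audio target exists, but if imagined speech
# shares the neural substrate, the audible-speech model still yields output.
X_imagined = rng.standard_normal((5, n_neural))
synthesized = X_imagined @ W
print(synthesized.shape)
```

The transfer in step 2 is exactly the finding the article reports: a model trained only on audible speech still produces output for imagined speech, which is evidence of a shared neural substrate.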


"Speech neuroprosthetics focuses on providing a natural communication channel for people who are unable to speak due to physical or neurological impairments," says Professor Tanja Schultz, explaining the background for the intensive research activities in this field, in which the Cognitive Systems Lab at the University of Bremen plays a world-renowned role. "Real-time synthesis of acoustic speech directly from measured neural activity could enable natural conversations and significantly improve the quality of life of people whose communication capabilities are severely limited."

The research results have been published in the Nature Portfolio journal "Communications Biology".


Related articles

Neuroprosthesis decodes speech for paralyzed man
Researchers have developed a "speech neuroprosthesis" that has enabled a man with severe paralysis to communicate in sentences.

Brain-computer interface turns mental handwriting into text
Scientists have used an implanted sensor to record the brain signals associated with handwriting, and used those signals to create text on a computer in real time.

A wireless chip shines light on the brain
Researchers have developed a chip that is powered wirelessly and can be surgically implanted to read neural signals and stimulate the brain with both light and electrical current.

BCI training reduces phantom limb pain
Scientists used a brain-computer interface to train the brains of patients to reduce phantom-hand pain.

Prosthetics: Artificial pieces of brain communicate with real neurons
Researchers have developed a system for integrating artificial chip-based 'neurons' with real neurons, using QR-code-like patterns of light to facilitate communication.

Printed tattoo electrodes measure brain signal
A researcher has developed ultra-light tattoo electrodes that are hardly noticeable on the skin and make long-term measurements of brain activity cheaper and easier.

Implants: Next-generation brain interfaces
Next-generation brain implants with more than a thousand electrodes can survive for more than six years.

AI decodes conversations in brain’s motor cortex
Engineers use deep learning to decode the conversation between brain and arm by analyzing electrical patterns in the motor control areas of the brain.

BCI decodes movement from brain signals
The intention of a continuous movement could be read out from non-invasive brain signals.
