AI decodes conversations in brain’s motor cortex

How does your brain talk to your arm? The body doesn’t use English or any other spoken language. Biomedical engineers are developing methods for decoding the conversation by analyzing electrical patterns in the motor control areas of the brain.

Biomedical engineers are developing methods for decoding the conversation between brain and arm by analyzing electrical patterns in the motor control areas of the brain.
Source: Illustration by Bona Kim, Emory University

In this study, the researchers leveraged advances from the field of “deep learning”. These computing approaches, which use artificial neural networks, let researchers uncover patterns in complex data sets that had previously been overlooked, says lead author Chethan Pandarinath, PhD.

Pandarinath and colleagues developed an approach to allow their artificial neural networks to mimic the biological networks that make our everyday movements possible. In doing so, the researchers gained a much better understanding of what the biological networks were doing. Eventually, these techniques could help paralyzed people move their limbs, or improve the treatment of people with Parkinson’s, says Pandarinath, an assistant professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University.

For someone who has a spinal cord injury, the new technology could power “brain-machine interfaces” that discern the intent behind the brain’s signals and directly stimulate someone’s muscles. “In the past, brain-machine interfaces have mostly worked by trying to decode very high-level commands, such as ‘I want to move my arm to the right, or left’,” Pandarinath says. “With these new innovations, we believe we’ll actually be able to decode subtle signals related to the control of muscles, and make brain-machine interfaces that behave much more like a person’s own limbs.”
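The article does not include an implementation, but the decoding idea can be sketched very roughly. The Python snippet below (all names, shapes and data are invented for illustration, not taken from the study) fits a simple ridge-regression map from binned neural activity to hypothetical muscle activation levels, the kind of low-level signal that could in principle drive electrical stimulation of muscles:

```python
import numpy as np

# Hypothetical illustration: decode muscle activation from binned neural activity.
# All shapes, rates and names are made up for this example.
rng = np.random.default_rng(0)

n_time, n_neurons, n_muscles = 2000, 96, 8            # e.g. a 96-channel array, 8 muscles
true_map = rng.normal(size=(n_neurons, n_muscles))
neural = rng.poisson(lam=5.0, size=(n_time, n_neurons)).astype(float)
emg = neural @ true_map + rng.normal(scale=2.0, size=(n_time, n_muscles))

# Ridge-regression decoder: W = (X^T X + lambda * I)^-1 X^T Y
lam = 10.0
W = np.linalg.solve(neural.T @ neural + lam * np.eye(n_neurons), neural.T @ emg)

predicted = neural @ W                                 # estimated muscle commands
r2 = 1 - np.var(emg - predicted) / np.var(emg)
print("decoding R^2:", round(float(r2), 3))
```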

Network behavior ‘emergent’ from individual neurons

Previous research on how neurons control movement has revealed that it’s difficult to assign individual neurons distinct roles, the way we might for the parts of a simple machine. Individual neurons’ activity doesn’t correspond neatly to variables like arm speed, movement distance or angle. Rather, the rhythms of the entire network matter more than any individual neuron’s activity.
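As a toy illustration of that point (not an analysis from the study), the sketch below generates simulated population activity from a shared two-dimensional rhythm: no single simulated neuron cleanly reflects the rhythm, but a population-level readout recovers it.

```python
import numpy as np

# Toy illustration (not data from the study): population activity driven by a shared
# two-dimensional rotational rhythm. Each neuron mixes both dimensions plus noise,
# so no single neuron maps cleanly onto the rhythm, but the population does.
rng = np.random.default_rng(1)

n_time, n_neurons = 500, 100
t = np.linspace(0, 4 * np.pi, n_time)
latent = np.stack([np.cos(t), np.sin(t)], axis=1)      # the shared network rhythm
mixing = rng.normal(size=(2, n_neurons))
activity = latent @ mixing + 0.5 * rng.normal(size=(n_time, n_neurons))

# A single neuron is an arbitrary, noisy blend of both rhythm dimensions...
r_single = np.corrcoef(activity[:, 0], latent[:, 0])[0, 1]
print("one neuron vs. rhythm dimension 1:", round(float(r_single ** 2), 3))

# ...but the top two population components span the same plane as the rhythm.
centered = activity - activity.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pcs = centered @ vt[:2].T
coef, *_ = np.linalg.lstsq(pcs, latent, rcond=None)
r2_pop = 1 - np.var(latent - pcs @ coef) / np.var(latent)
print("rhythm recovered from the population:", round(float(r2_pop), 3))
```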

Pandarinath likens his team’s approach to ornithologists studying the flocking behavior of birds. To understand how the group holds together, one has to know how one bird responds to its neighbors, and to the flock’s movements as a whole. Flocking behavior is “emergent” from the interactions of the birds with each other, he says. Such emergent behaviors are challenging to characterize with standard methods, but are precisely the way artificial neural networks function.
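A heavily simplified flocking sketch, purely to illustrate the analogy rather than the team’s method, shows how group-level order can emerge from simple rules followed by each individual (here every simulated bird just nudges toward the group’s average heading and position):

```python
import numpy as np

# Simplified "flocking" sketch, only to illustrate the article's analogy:
# every simulated bird follows two rules (align with the average heading,
# drift toward the group centre), and a coherent flock emerges from the interactions.
rng = np.random.default_rng(2)

n_birds, n_steps = 50, 300
pos = rng.uniform(0.0, 10.0, size=(n_birds, 2))
vel = rng.normal(size=(n_birds, 2))

for _ in range(n_steps):
    vel += 0.05 * (vel.mean(axis=0) - vel)               # alignment rule
    vel += 0.01 * (pos.mean(axis=0) - pos)               # cohesion rule
    vel /= np.linalg.norm(vel, axis=1, keepdims=True)    # keep speed constant
    pos += 0.1 * vel

# Polarisation near 1.0 means the flock flies as one: an emergent, group-level order.
print("flock polarisation:", round(float(np.linalg.norm(vel.mean(axis=0))), 3))
```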

Pandarinath started investigating this approach, called LFADS (Latent Factor Analysis via Dynamical Systems), while working with electrical engineer Krishna Shenoy, PhD, and neurosurgeon Jaimie Henderson, MD, who co-direct the Neural Prosthetics Translational Lab at Stanford University. The researchers analyzed data from rhesus macaques and from human participants, both with electrodes implanted in the motor cortex. In some experiments, monkeys were trained to move their arms to follow an on-screen “maze,” and the researchers tested their ability to “decode” the monkeys’ arm movement trajectories based solely on the signals recorded from the implanted electrodes. Using their artificial neural network approach, the researchers were able to precisely uncover faint patterns representing the brain rhythms in the motor cortex. They also observed similar patterns in human patients who were paralyzed, one because of motor neuron degeneration (amyotrophic lateral sclerosis) and another because of spinal cord injury.
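LFADS itself is described in the researchers’ publications; the sketch below is only a loose, simplified illustration of the underlying idea of a sequential autoencoder, with made-up names (such as TinyLFADS), dimensions and data rather than the study’s actual architecture or recordings. It encodes a trial of spike counts into an initial condition, unrolls a generator RNN as a dynamical system, and reads out low-dimensional latent factors that drive Poisson firing rates:

```python
import torch
import torch.nn as nn

# Minimal sketch of an LFADS-style sequential autoencoder (much simpler than the
# published model, which adds variational inference, controllers, and more).
class TinyLFADS(nn.Module):
    def __init__(self, n_neurons, enc_dim=64, gen_dim=64, n_factors=8):
        super().__init__()
        self.encoder = nn.GRU(n_neurons, enc_dim, batch_first=True, bidirectional=True)
        self.to_ic = nn.Linear(2 * enc_dim, gen_dim)      # initial condition of the dynamics
        self.generator = nn.GRUCell(1, gen_dim)           # autonomous dynamical system
        self.to_factors = nn.Linear(gen_dim, n_factors)   # low-dimensional latent factors
        self.to_rates = nn.Linear(n_factors, n_neurons)   # per-neuron firing rates

    def forward(self, spikes):                            # spikes: (batch, time, neurons)
        batch, n_time, _ = spikes.shape
        _, h = self.encoder(spikes)
        ic = self.to_ic(torch.cat([h[0], h[1]], dim=-1))  # summarise the whole trial
        state, factors = ic, []
        dummy = spikes.new_zeros(batch, 1)                 # no external input: pure dynamics
        for _ in range(n_time):
            state = self.generator(dummy, state)
            factors.append(self.to_factors(state))
        factors = torch.stack(factors, dim=1)
        rates = nn.functional.softplus(self.to_rates(factors))
        return rates, factors

# Synthetic check with made-up data shaped like binned spike counts.
spikes = torch.poisson(torch.rand(4, 100, 96) * 3)        # 4 trials, 100 bins, 96 channels
model = TinyLFADS(n_neurons=96)
rates, factors = model(spikes)
loss = nn.PoissonNLLLoss(log_input=False)(rates, spikes)  # reconstruct spikes from rates
print(rates.shape, factors.shape, float(loss))
```

In this kind of model, the recovered latent factors, rather than any single channel, are what track the population-level rhythms described above.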

In addition to the motor cortex, Pandarinath believes the new approach could be used to analyze the activity of networks in other brain regions involved in spatial navigation or decision making.

Future plans for clinical applications include pairing the new technology with functional electrical stimulation of muscles for paralyzed patients, as well as refining deep brain stimulation technology for Parkinson’s disease. In addition, Pandarinath and colleagues have begun using these techniques to understand the activity of neurons at fundamentally different scales than was previously possible.

