Researchers have succeeded in controlling a robotic arm purely by thought, in real time and non-invasively, using an EEG cap. Shown: the experimental setup, with the robotic arm (in front) controlled by the test person (behind).
Source: Baustädter – TU Graz

BCI decodes movement from brain signals

For the first time, the intention of a continuous movement has been read out from non-invasive brain signals at TU Graz. This success enables more natural, real-time, non-invasive control of neuroprostheses.

So-called brain-computer interfaces (BCIs) are intended to give paraplegic people back some freedom of movement, and thus a better quality of life: they measure a person's brain activity and convert the electrical signals into control commands for neuroprostheses. "Controlling by thoughts," as Gernot Müller-Putz puts it in simplified terms. The head of the Institute of Neural Engineering at Graz University of Technology (TU Graz) is an "old hand" of BCI research and works intensively on non-invasive BCI systems.

Over the last ten years, he and his team have achieved initial positive results with EEG-based control of neuroprostheses and robotic arms in people with spinal cord injuries. Until now, however, the control was unnatural and cumbersome, because the thought patterns had to be imagined again and again. As part of his recently completed ERC Consolidator Grant project "Feel your Reach", Müller-Putz and his team have now achieved a breakthrough in the development of more natural, continuous BCI control systems.

It all comes down to seeing

The TU Graz researchers have succeeded for the first time in controlling a robotic arm purely by thought, in real time and in the usual non-invasive way using an EEG cap. This was made possible by decoding continuous movement intention from brain signals, something not achieved before. The researchers first examined a variety of movement parameters, such as position, speed and distance, and extracted their correlates from the neuronal activity. "The contribution of the eyes is essential here," says Müller-Putz. "It is important that users are allowed to use their eyes to follow the trajectory of the robotic arm."
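The article does not describe the decoding method itself. As an illustration only, continuous kinematics decoding from EEG is often approached with a regularised linear model that maps per-channel features to movement parameters such as velocity; the sketch below shows this idea on synthetic data (the variable names and the ridge-regression choice are assumptions, not the authors' pipeline).

```python
import numpy as np

# Illustrative sketch (not the authors' method): decode continuous hand
# velocity (vx, vy) from EEG channel features with ridge regression.

rng = np.random.default_rng(0)

n_samples, n_channels = 2000, 32
true_w = rng.normal(size=(n_channels, 2))          # hidden mapping to (vx, vy)

eeg = rng.normal(size=(n_samples, n_channels))     # stand-in EEG features
velocity = eeg @ true_w + 0.1 * rng.normal(size=(n_samples, 2))

def ridge_fit(X, Y, lam=1.0):
    """Closed-form ridge regression: W = (X'X + lam*I)^-1 X'Y."""
    n_feat = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_feat), X.T @ Y)

# Fit on the first 1500 samples, evaluate on the held-out remainder
W = ridge_fit(eeg[:1500], velocity[:1500])
pred = eeg[1500:] @ W

for d in range(2):
    r = np.corrcoef(pred[:, d], velocity[1500:, d])[0, 1]
    print(f"dim {d}: r = {r:.2f}")
```

On real EEG the features would be band-limited, time-lagged signals rather than raw noise, and the achievable correlations are far lower than in this toy setup.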

However, eye movements and blinks generate their own electrical signals, so-called ocular artefacts, in the EEG. "These artefacts distort the EEG signal and therefore have to be removed in real time. At the same time, it is essential that eye-hand coordination can take place and thus contribute to the decoding of movement intentions," Müller-Putz explains. In other words, the visual information helps to capture the intention to move, while the unwanted signals from the eyes themselves have to be filtered out of the electrical activity computationally.
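One standard way to remove ocular artefacts, shown here as a hedged sketch on synthetic signals (the article does not say which method the team used), is regression: estimate how strongly each EEG channel picks up the recorded eye (EOG) signal, then subtract that scaled contribution from every incoming sample.

```python
import numpy as np

# Illustrative regression-based ocular artefact removal (one common
# technique; not necessarily the one used in the study).

rng = np.random.default_rng(1)

n, n_eeg = 5000, 8
brain = rng.normal(size=(n, n_eeg))                # "true" neural activity
eog = rng.normal(size=(n, 1))                      # eye-movement signal
mixing = rng.uniform(0.5, 1.5, size=(1, n_eeg))    # per-channel EOG leakage

eeg = brain + eog @ mixing                         # contaminated recording

# Calibration: least-squares propagation coefficients from EOG to each channel
b = np.linalg.lstsq(eog, eeg, rcond=None)[0]       # shape (1, n_eeg)

# Correction step (applied sample-by-sample in a real-time system)
cleaned = eeg - eog @ b

residual = abs(np.corrcoef(cleaned[:, 0], eog[:, 0])[0, 1])
print(f"EOG correlation after cleaning: {residual:.3f}")
```

After cleaning, the channels are essentially uncorrelated with the eye signal while the underlying neural activity is preserved, which is exactly the property the decoding stage needs.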


BCI detects unwanted movement

It is also essential that the system recognizes whether a person wants to start a movement at all: one of the BCIs developed by the researchers detects the onset of a goal-directed movement. In addition, another of the team's BCIs detects and corrects errors, i.e. unwanted movements of the robotic arm; one more piece of the puzzle for more natural prosthetic control. "The brain's error response can be read from the EEG. The BCI recognizes that the movement performed does not correspond to the person's intention and stops the movement of the robotic arm or resets it to the beginning," says Müller-Putz. Within the project, this error detection was successfully tested several times with spinal-cord-injured participants.
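The control logic described here can be caricatured as follows: after each movement step, an EEG epoch is checked for an error-related response, and a detection resets the arm. The sketch below uses a simple template-correlation threshold as a stand-in for the trained classifier a real system would use; the template shape, threshold, and variable names are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of error-potential (ErrP) based stopping: a trained
# classifier is replaced here by correlation against a stylised template.

rng = np.random.default_rng(2)

t = np.linspace(0, 0.6, 150)                       # 600 ms post-movement epoch
errp_template = np.exp(-((t - 0.3) ** 2) / 0.002)  # stylised error deflection

def detect_error(epoch, threshold=0.5):
    """Flag an epoch as an error response if it resembles the template."""
    r = np.corrcoef(epoch, errp_template)[0, 1]
    return bool(r > threshold)

arm_position = 5                                   # steps moved so far

# One correct step (noise only), then one erroneous step (template + noise)
correct_epoch = 0.2 * rng.normal(size=t.size)
error_epoch = errp_template + 0.2 * rng.normal(size=t.size)

for epoch in (correct_epoch, error_epoch):
    if detect_error(epoch):
        arm_position = 0                           # error: reset to the start
    else:
        arm_position += 1                          # no error: keep moving

print(f"final arm position: {arm_position}")
```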

People can feel movements of the robotic arm

The TU Graz researchers were also successful with so-called kinaesthetic feedback. "The participants not only see the movements of the prosthesis, they also feel them," says a visibly pleased Müller-Putz. Technically, this was made possible with small vibration units stuck to the skin over the shoulder blade, which render the movements of the robotic arm as finely flowing vibrations. In principle, even completely paralyzed people could feel movements this way. "However, we have to consider an application in the area of the neck here," says Müller-Putz, alluding to future goals. First and foremost, the researchers want to improve the decoding of movement from visual, intentional and kinaesthetic information, detect errors in the process, and unite all four BCIs in a single "quadruple BCI system".
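The article does not specify how arm movement is translated into vibration. One plausible scheme, sketched below purely for illustration, maps the arm's speed linearly onto vibration amplitude, clipped to the device's range (function name, speed range, and the linear mapping are all assumptions).

```python
import numpy as np

# Illustrative sketch (not the study's actual mapping): render the robotic
# arm's speed as a vibration amplitude for tactors on the shoulder blade.

def speed_to_vibration(speed, max_speed=0.5, max_amp=1.0):
    """Scale arm speed [m/s] linearly to vibration amplitude, clipped to range."""
    return float(np.clip(speed / max_speed, 0.0, 1.0) * max_amp)

# A short simulated trajectory: amplitude rises and falls with the arm's speed
for speed in (0.0, 0.1, 0.25, 0.5, 0.7):
    print(f"speed {speed:.2f} m/s -> vibration {speed_to_vibration(speed):.2f}")
```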


