Artoni Fiorenzo tries out shared control with the robotic arm.
Source: EPFL / Alain Herzog.

Robotic hand merges amputee and robotic control

Scientists from the École Polytechnique Fédérale de Lausanne (EPFL) are developing new approaches to improve control of robotic hands, particularly for amputees, that combine individual finger control with automation for better grasping and manipulation.

This interdisciplinary proof of concept between neuroengineering and robotics was successfully tested on three amputees and seven able-bodied subjects. The technology merges two concepts from different fields; implementing them together had never been done before for robotic hand control, and it contributes to the emerging field of shared control in neuroprosthetics.

One concept, from neuroengineering, involves deciphering intended finger movement from muscular activity on the amputee’s stump, enabling individual finger control of the prosthetic hand for the first time. The other, from robotics, allows the robotic hand to help take hold of objects and maintain contact with them for robust grasping. “When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react,” explains Aude Billard, who leads EPFL’s Learning Algorithms and Systems Laboratory. “The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping.”
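The slip reflex described above can be sketched as a simple control loop: watch the fingertip pressure readings, and when pressure drops sharply, tighten the grip. This is an illustrative sketch, not EPFL's actual controller; the sensor sampling rate, thresholds, and function names are assumptions.

```python
# Illustrative slip-reflex loop (assumed names and thresholds, not the
# published EPFL controller). Pressure samples arrive every few ms; a
# sudden fractional drop between consecutive samples signals a slip,
# well inside the ~400 ms reaction window quoted in the article.

def detect_slip(pressure_history, drop_threshold=0.15):
    """Flag a slip when pressure falls by more than `drop_threshold`
    (as a fraction) between the two most recent samples."""
    if len(pressure_history) < 2:
        return False
    prev, curr = pressure_history[-2], pressure_history[-1]
    return prev > 0 and (prev - curr) / prev > drop_threshold

def slip_reflex(grip_force, pressure_history, force_step=0.5, max_force=10.0):
    """Increase grip force while a slip is detected; otherwise hold steady."""
    if detect_slip(pressure_history):
        return min(grip_force + force_step, max_force)
    return grip_force

# Example: pressure drops from 2.0 to 1.5 (a 25% drop), so the reflex
# raises the grip force from 3.0 to 3.5.
print(slip_reflex(3.0, [2.0, 2.0, 1.5]))
```

In a real prosthesis this loop would run at a fixed rate on the hand's embedded controller, with the force command sent to the finger actuators.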

How shared control works

The algorithm first learns to decode the user's intention and translate it into finger movement of the prosthetic hand. The amputee performs a series of hand movements to train the algorithm, which uses machine learning. Sensors placed on the amputee’s stump detect muscular activity, and the algorithm learns which hand movements correspond to which patterns of muscular activity. Once the user’s intended finger movements are understood, this information can be used to control the individual fingers of the prosthetic hand. “Because muscle signals can be noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements,” says Katie Zhuang, first author of the publication.
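The calibration step described above, learning which muscle-activity patterns correspond to which movements, can be sketched with a minimal classifier. The feature vectors, movement labels, and the nearest-centroid method below are all illustrative assumptions; the published system uses richer EMG features and a more capable machine learning model.

```python
# Minimal sketch of EMG decoding (assumed data and method, not the
# published pipeline): learn one mean feature vector per intended
# movement during calibration, then classify new samples by distance.
import numpy as np

def train_decoder(features, labels):
    """Learn one mean feature vector (centroid) per movement label."""
    return {lab: features[labels == lab].mean(axis=0)
            for lab in np.unique(labels)}

def decode(decoder, sample):
    """Predict the movement whose centroid is closest to the new sample."""
    return min(decoder, key=lambda lab: np.linalg.norm(sample - decoder[lab]))

# Synthetic calibration data: two EMG channels, two intended movements.
X = np.array([[0.9, 0.1], [1.0, 0.2],   # "index_flex" activity pattern
              [0.1, 0.8], [0.2, 1.0]])  # "thumb_flex" activity pattern
y = np.array(["index_flex", "index_flex", "thumb_flex", "thumb_flex"])

decoder = train_decoder(X, y)
print(decode(decoder, np.array([0.95, 0.15])))  # → index_flex
```

Zhuang's point about noisy signals is why real decoders go further than this: they filter and window the raw EMG before classification rather than trusting single samples.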

Next, the scientists engineered the algorithm so that robotic automation kicks in when the user tries to grasp an object: the algorithm tells the prosthetic hand to close its fingers when an object touches the sensors on the hand's surface. This automatic grasping adapts an earlier study on robotic arms designed to deduce the shape of objects and grasp them based on tactile information alone, without the help of visual signals.
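The hand-over of control just described can be sketched as a simple rule: the user's decoded intent drives the fingers until the contact sensors fire, at which point automation overrides with a full close. The sensor layout, command format, and threshold below are invented for illustration.

```python
# Hedged sketch of shared control (assumed interfaces): pass the user's
# decoded per-finger targets through until contact is detected, then let
# the automation close every finger for a robust grasp.

def hand_command(user_finger_targets, contact_sensors, contact_threshold=1):
    """Return per-finger close commands in [0, 1].

    While fewer than `contact_threshold` sensors report contact, the
    user's decoded finger targets pass through unchanged. Once an object
    is detected, automation overrides with a full close on all fingers.
    """
    in_contact = sum(1 for s in contact_sensors if s) >= contact_threshold
    if in_contact:
        return {finger: 1.0 for finger in user_finger_targets}  # auto-close
    return dict(user_finger_targets)  # user keeps individual finger control

# No contact: the user's individual finger commands go through as-is.
print(hand_command({"index": 0.3, "thumb": 0.5}, [False, False]))
# Contact detected: automation closes the whole hand.
print(hand_command({"index": 0.3, "thumb": 0.5}, [True, False]))
```

This split is the essence of shared control: fine intent comes from the user, while the fast, reliable reflexes come from the machine.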

Many engineering challenges remain before the algorithm can be implemented in a commercially available prosthetic hand for amputees. For now, the algorithm is still being tested on a robot provided by an external party. “Our shared approach to control robotic hands could be used in several neuroprosthetic applications such as bionic hand prostheses and brain-to-machine interfaces, increasing the clinical impact and usability of these devices,” says Silvestro Micera, EPFL’s Bertarelli Foundation Chair in Translational Neuroengineering and Professor of Bioelectronics at Scuola Superiore Sant’Anna in Italy.


Related articles

An ultra-precise mind-controlled prosthetic

Researchers have tapped faint, latent signals from arm nerves and amplified them to enable real-time, intuitive, finger-level control of a robotic hand.

Open-source bionic leg: platform aims to advance prosthetics

The Open-Source Bionic Leg will enable investigators to efficiently solve challenges associated with controlling bionic legs across a range of activities in the lab and out in the community.

Bionic touch does not remap the brain

Neuroscientists have demonstrated that the brain does not remap itself even with long-term bionic limb use, posing challenges for the development of realistic prosthetic limbs.

Prostheses could alleviate amputees' phantom limb pain

New prosthetic technologies that stimulate the nerves could pave the way for prostheses that feel like a natural part of the body and reduce the phantom limb pain commonly endured by amputees.

Bionic leg to help amputees walk faster

Bionic breakthrough: engineers develop a computerized bionic leg to help amputees walk faster, more easily, and with better balance.

Bionic prosthesis improves amputees’ health

Thanks to a bionic prosthesis featuring sensors that connect to residual nerves in the thigh, two volunteers are the first above-knee amputees in the world to feel their prosthetic foot and knee in real time.

Prosthetic arm can move with your thoughts

Researchers are developing a prosthetic arm that can move with the person's thoughts and convey the sensation of touch via an array of electrodes implanted in the muscles of the patient.

Exceptionally sensitive e-skin for prosthetics

Researchers have developed an e-skin that may soon have a sense of touch equivalent to, or better than, the human skin with the Asynchronous Coded Electronic Skin (ACES).

A smart orthosis for a stronger back

Researchers developed ErgoJack to relieve back strain and encourage workers to perform strenuous movements more ergonomically.
