Researchers used individual fingertips fitted with stretchable tactile sensors containing liquid metal on a prosthesis attached to a robotic arm.
Source: Photo by Alex Dolce

Liquid metal sensors and AI used for prosthetics

Researchers at Florida Atlantic University are the first to incorporate stretchable tactile sensors using liquid metal on the fingertips of a prosthetic hand.

Each fingertip has more than 3,000 touch receptors, which largely respond to pressure. Humans rely heavily on sensation in their fingertips when manipulating objects, and the lack of this sensation presents a unique challenge for individuals with upper limb amputations. While several high-tech, dexterous prosthetics are available today, they all lack the sensation of touch. Without this sensory feedback, objects are inadvertently dropped or crushed by the prosthetic hand.

To enable a more natural-feeling prosthetic hand interface, researchers from Florida Atlantic University's College of Engineering and Computer Science and collaborators are the first to incorporate stretchable tactile sensors using liquid metal on the fingertips of a prosthetic hand. Encapsulated within silicone-based elastomers, this technology provides key advantages over traditional sensors, including high conductivity, compliance, flexibility and stretchability. This hierarchical integration of tactile sensation across multiple fingers could provide a higher level of intelligence for artificial hands.

For the study, researchers used individual fingertips on the prosthesis to distinguish between different speeds of a sliding motion along different textured surfaces. The four textures had a single variable parameter: the distance between the ridges. To detect the textures and speeds, the researchers trained four machine learning algorithms. Twenty trials were collected for each of ten complex surfaces, each composed of a randomly generated permutation of the four textures, to test the algorithms' ability to distinguish between the ten surfaces, as sketched below.
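
As an illustration of this experimental design only, the following minimal Python sketch composes ten multi-textured surfaces from random permutations of four base textures and enumerates 20 trials per surface. The ridge spacings and labels are hypothetical and not taken from the study.

```python
# Illustrative sketch of the experimental design: four base textures differing
# only in ridge spacing, ten multi-textured surfaces built from randomly
# generated permutations of those textures, and 20 sliding trials per surface.
# The spacings below are assumptions, not values from the paper.
import random

random.seed(42)

base_textures = {"T1": 1.0, "T2": 2.0, "T3": 3.0, "T4": 4.0}  # hypothetical ridge spacing (mm)

# Ten complex surfaces, each a random ordering of the four base textures.
surfaces = [random.sample(list(base_textures), k=4) for _ in range(10)]

# Twenty sliding trials recorded for each surface: 200 trials in total.
trials = [(surface_id, trial_id)
          for surface_id in range(len(surfaces))
          for trial_id in range(20)]

print(surfaces[0])   # e.g. ['T3', 'T1', 'T4', 'T2']
print(len(trials))   # 200
```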

Results showed that integrating tactile information from liquid metal sensors on four prosthetic fingertips simultaneously distinguished between complex, multi-textured surfaces, demonstrating a new form of hierarchical intelligence. The machine learning algorithms distinguished between all the speeds with each finger with high accuracy. This new technology could improve the control of prosthetic hands and provide haptic feedback, more commonly known as the experience of touch, allowing amputees to reconnect with a previously severed sense of touch.

"Significant research has been done on tactile sensors for artificial hands, but there is still a need for advances in lightweight, low-cost, robust multimodal tactile sensors," said Erik Engeberg, Ph.D., senior author, an associate professor in the Department of Ocean and Mechanical Engineering and a member of the FAU Stiles-Nicholson Brain Institute and the FAU Institute for Sensing and Embedded Network Systems Engineering (I-SENSE), who conducted the study with first author and Ph.D. student Moaed A. Abd. "The tactile information from all the individual fingertips in our study provided the foundation for a higher hand-level of perception enabling the distinction between ten complex, multi-textured surfaces that would not have been possible using purely local information from an individual fingertip. We believe that these tactile details could be useful in the future to afford a more realistic experience for prosthetic hand users through an advanced haptic display, which could enrich the amputee-prosthesis interface and prevent amputees from abandoning their prosthetic hand."

Researchers compared four machine learning algorithms for their classification capabilities: K-nearest neighbor (KNN), support vector machine (SVM), random forest (RF), and neural network (NN). Time-frequency features of the liquid metal sensor signals were extracted to train and test the algorithms. The NN generally performed best at speed and texture detection with a single finger, and achieved 99.2 percent accuracy in distinguishing between the ten multi-textured surfaces using liquid metal sensors from four fingers simultaneously.
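
The comparison of the four classifier families can be sketched as follows. This is a minimal, illustrative example using scikit-learn: the synthetic data, the STFT-based features, and the model hyperparameters are assumptions for demonstration, not the study's actual pipeline.

```python
# Sketch: compare KNN, SVM, RF and NN classifiers on time-frequency features
# extracted from four fingertip sensor traces. Data here is synthetic; feature
# choices and hyperparameters are assumptions, not the paper's settings.
import numpy as np
from scipy.signal import stft
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

def time_frequency_features(signal, fs=1000):
    """Reduce one sensor trace to simple time-frequency features:
    mean spectral magnitude per frequency bin of a short-time Fourier
    transform (a stand-in for the study's unspecified feature set)."""
    _, _, Z = stft(signal, fs=fs, nperseg=256)
    return np.abs(Z).mean(axis=1)

# Synthetic raw data: 10 surfaces x 20 trials, 4 fingertip sensors per trial.
rng = np.random.default_rng(0)
n_trials, n_fingers, n_samples = 200, 4, 2048
raw = rng.standard_normal((n_trials, n_fingers, n_samples))

# One feature row per trial: features from all four fingers concatenated.
X = np.array([np.concatenate([time_frequency_features(trial[f])
                              for f in range(n_fingers)])
              for trial in raw])
y = np.repeat(np.arange(10), 20)  # surface labels 0-9

# The four classifier families compared in the study.
models = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf", C=1.0),
    "RF":  RandomForestClassifier(n_estimators=200, random_state=0),
    "NN":  MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000,
                         random_state=0),
}

for name, model in models.items():
    pipe = make_pipeline(StandardScaler(), model)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: accuracy {scores.mean():.3f} +/- {scores.std():.3f}")
```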

"The loss of an upper limb can be a daunting challenge for an individual who is trying to seamlessly engage in regular activities," said Stella Batalama, Ph.D., dean, College of Engineering and Computer Science. "Although advances in prosthetic limbs have been beneficial and allow amputees to better perform their daily duties, they do not provide them with sensory information such as touch. They also don't enable them to control the prosthetic limb naturally with their minds. With this latest technology from our research team, we are one step closer to providing people all over the world with a more natural prosthetic device that can 'feel' and respond to its environment."

The study was published in the journal Sensors.
