NTU Singapore scientists have developed an Artificial Intelligence system for high precision recognition of hand gestures.

AI system for recognition of hand gestures

Scientists from Nanyang Technological University, Singapore (NTU Singapore) have developed an Artificial Intelligence (AI) system that recognises hand gestures by combining skin-like electronics with computer vision.

The recognition of human hand gestures by AI systems has been a valuable development over the last decade and has been adopted in high-precision surgical robots, health monitoring equipment and gaming systems.

AI gesture recognition systems that were initially visual-only have been improved by integrating inputs from wearable sensors, an approach known as 'data fusion'. These sensors recreate the skin's sensing ability, known as 'somatosensation'.

However, gesture recognition precision is still hampered by the low quality of data arriving from wearable sensors, typically due to their bulkiness and poor contact with the user, and by visual occlusion and poor lighting. Further challenges arise from the integration of visual and sensory data, as they are mismatched datasets that must be processed separately and merged only at the end, which is inefficient and leads to slower response times.

To tackle these challenges, the NTU team created a 'bioinspired' data fusion system that uses skin-like stretchable strain sensors made from single-walled carbon nanotubes, and an AI approach that resembles the way skin sensing and vision are handled together in the brain.

The NTU scientists developed their bio-inspired AI system by combining three neural network approaches in one system: a 'convolutional neural network', a machine learning method for early visual processing; a multilayer neural network for early somatosensory information processing; and a 'sparse neural network' that 'fuses' the visual and somatosensory information. The result is a system that can recognise human gestures more accurately and efficiently than existing methods.
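
To make the division of labour concrete, the sketch below wires the three components together in PyTorch. It is a rough illustration only, not the study's architecture: the layer sizes, the assumption of a five-channel strain signal, and the fixed random sparsity mask are all placeholders chosen for this example.

```python
# Minimal PyTorch sketch of the three-part fusion architecture described
# above. All shapes and the sparsity scheme are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseFusion(nn.Module):
    """Fusion layer with a fixed random sparse connectivity mask, standing
    in for the 'sparse neural network' that merges the two modalities."""
    def __init__(self, in_features, out_features, density=0.1):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # Keep roughly 10% of connections; zero out the rest permanently.
        self.register_buffer(
            "mask", (torch.rand(out_features, in_features) < density).float()
        )

    def forward(self, x):
        return F.relu(F.linear(x, self.linear.weight * self.mask, self.linear.bias))

class GestureFusionNet(nn.Module):
    def __init__(self, num_gestures=10):
        super().__init__()
        # 1) Convolutional branch: early processing of camera frames.
        self.vision = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),  # -> 32 * 4 * 4 = 512 features
        )
        # 2) Multilayer branch: early processing of the stretchable
        #    strain-sensor signals (five channels assumed here, e.g. one per finger).
        self.somatosensory = nn.Sequential(
            nn.Linear(5, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
        )
        # 3) Sparse fusion of both feature streams, then gesture classification.
        self.fusion = SparseFusion(512 + 64, 128)
        self.classifier = nn.Linear(128, num_gestures)

    def forward(self, image, strain):
        fused = torch.cat([self.vision(image), self.somatosensory(strain)], dim=1)
        return self.classifier(self.fusion(fused))

# Example: classify one 64x64 RGB frame paired with one strain reading.
model = GestureFusionNet()
logits = model(torch.randn(1, 3, 64, 64), torch.randn(1, 5))
```

The key design point is that both branches hand over intermediate features, not finished predictions, so the fusion stage sees the two modalities together before any gesture-level decision is made.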

Lead author of the study, Professor Chen Xiaodong, from the School of Materials Science and Engineering at NTU, said, "Our data fusion architecture has its own unique bioinspired features, which include a man-made system resembling the somatosensory-visual fusion hierarchy in the brain. We believe such features make our architecture unique among existing approaches."

"Compared to rigid wearable sensors that do not form an intimate enough contact with the user for accurate data collection, our innovation uses stretchable strain sensors that comfortably attaches onto the human skin. This allows for high-quality signal acquisition, which is vital to high-precision recognition tasks," added Prof Chen, who is also Director of the Innovative Centre for Flexible Devices (iFLEX) at NTU.

High recognition accuracy in poor environmental conditions

To capture reliable sensory data from hand gestures, the research team fabricated a transparent, stretchable strain sensor that adheres to the skin but cannot be seen in camera images.

As a proof of concept, the team tested their bio-inspired AI system using a robot controlled through hand gestures and guided it through a maze. Results showed that hand gesture recognition powered by the bio-inspired AI system was able to guide the robot through the maze with zero errors, compared to six recognition errors made by a visual-based recognition system. High accuracy was also maintained when the new AI system was tested under poor conditions including noise and unfavourable lighting. The AI system worked effectively in the dark, achieving a recognition accuracy of over 96.7 per cent.

First author of the study, Dr Wang Ming from the School of Materials Science and Engineering at NTU Singapore, said, "The secret behind the high accuracy in our architecture lies in the fact that the visual and somatosensory information can interact and complement each other at an early stage before carrying out complex interpretation. As a result, the system can rationally collect coherent information with less redundant data and less perceptual ambiguity, resulting in better accuracy."
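
Dr Wang's point about early interaction can be seen by contrasting the two fusion orders. The toy snippet below (hypothetical shapes and layers, not the study's pipeline) shows late fusion, where each modality is interpreted separately and only the finished predictions are merged, against early fusion, where intermediate features are combined before any gesture-level decision.

```python
# Toy contrast between late and early fusion; all modules are placeholders.
import torch
import torch.nn as nn

vision_branch = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 32), nn.ReLU())
strain_branch = nn.Sequential(nn.Linear(5, 32), nn.ReLU())
image, strain = torch.randn(1, 3, 64, 64), torch.randn(1, 5)
num_gestures = 10

# Late fusion: each modality is classified on its own; the two finished
# predictions are merged only at the very end (here, by averaging).
vision_head = nn.Linear(32, num_gestures)
strain_head = nn.Linear(32, num_gestures)
late_logits = (vision_head(vision_branch(image)) + strain_head(strain_branch(strain))) / 2

# Early fusion: intermediate features are concatenated first, so the
# classifier sees both modalities at once and one can disambiguate the
# other, e.g. strain data compensating for an occluded or dark frame.
fusion_head = nn.Linear(32 + 32, num_gestures)
early_logits = fusion_head(torch.cat([vision_branch(image), strain_branch(strain)], dim=1))
```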

Providing an independent view, Professor Markus Antonietti, Director of the Max Planck Institute of Colloids and Interfaces in Germany, said, "The findings from this paper bring us another step forward to a smarter and more machine-supported world. Much like the invention of the smartphone, which has revolutionised society, this work gives us hope that we could one day physically control all of our surrounding world with great reliability and precision through a gesture."

"There are simply endless applications for such technology in the marketplace to support this future. For example, from a remote robot control over smart workplaces to exoskeletons for the elderly."

The NTU research team is now looking to build a VR and AR system based on its AI system, for use in areas where high-precision recognition and control are desired, such as entertainment technologies and rehabilitation in the home.

The team published their findings in the scientific journal Nature Electronics.
