The EEG test setup.
Source: Loughborough University

Expanding human-robot collaboration in manufacturing

To enhance human-robot collaboration, researchers at Loughborough University have trained an AI to detect human intention.

Machines and robots undoubtedly make life easier. They carry out jobs with precision and speed and, unlike humans, they never tire or need breaks. As a result, companies are increasingly using them in their manufacturing processes to improve productivity and eliminate dirty, dangerous, and dull tasks. However, many tasks in the working environment still require human dexterity, adaptability, and flexibility.

Human-robot collaboration is an exciting opportunity for future manufacturing because it combines the best of both worlds. It requires close interaction between humans and robots, which could benefit greatly from the ability to anticipate a collaborative partner's next action.

Ph.D. student Achim Buerkle and a team of researchers from the Intelligent Automation Centre at Loughborough University have published promising results in the journal Robotics and Computer-Integrated Manufacturing on 'training' robots to detect arm-movement intention before humans articulate the movement.

"A robot's speed and torque need to be co-ordinated well because it can pose a serious threat to human health and safety," said Buerkle. "Ideally, for effective teamwork, the human and robot would 'understand' each other, which is difficult due to both being quite different and 'speaking' different languages. We propose to give the robot the ability to 'read' its human partner's intentions."

The researchers looked to achieve this by interfacing with the frontal lobe activity of the human brain. Every movement the human body performs is analyzed and evaluated in the brain prior to its execution, and measuring this signal can communicate an 'intention to move' to a robot. However, the brain is a highly complex organ, and detecting the pre-movement signal is challenging.

The Loughborough University researchers tackled this challenge by training an AI system to recognize pre-movement patterns in an electroencephalogram (EEG), a recording of the brain's electrical activity taken from electrodes on the scalp.

Their latest paper reports the findings of a test with eight participants. Each sat in front of a computer that displayed a randomly generated letter from A to Z and pressed the matching key on the keyboard. From the EEG data, the AI system had to predict which arm the participant was about to move; the prediction was then verified against motion sensors.
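The paper's actual model and features are not described here, so the following is a purely illustrative sketch of the kind of task involved: classifying a windowed feature vector as left- or right-arm intention. The synthetic data generator, the 4-feature windows, and the nearest-centroid classifier are all assumptions for illustration, not the authors' method.

```python
import random

random.seed(0)

def synth_window(arm):
    """Generate a fake 4-feature 'EEG' vector; left/right differ in mean.
    Purely synthetic stand-in for real pre-movement EEG features."""
    base = 0.5 if arm == "left" else -0.5
    return [base + random.gauss(0, 0.4) for _ in range(4)]

# Build a toy training set: 100 labelled windows per arm.
train = [(synth_window(a), a) for a in ("left", "right") for _ in range(100)]

# Nearest-centroid "training": average the feature vectors per class.
centroids = {}
for arm in ("left", "right"):
    vecs = [x for x, a in train if a == arm]
    centroids[arm] = [sum(col) / len(vecs) for col in zip(*vecs)]

def predict(window):
    """Assign the window to the closest class centroid (squared distance)."""
    def dist(c):
        return sum((w - ci) ** 2 for w, ci in zip(window, c))
    return min(centroids, key=lambda arm: dist(centroids[arm]))

# Evaluate on fresh synthetic windows.
test = [(synth_window(a), a) for a in ("left", "right") for _ in range(50)]
accuracy = sum(predict(x) == a for x, a in test) / len(test)
print(f"toy accuracy: {accuracy:.2f}")
```

Real EEG classification would, of course, work on noisy multi-channel time series and need far more sophisticated feature extraction and models than this toy example.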

The experimental data shows that the AI system can detect that a human is about to move an arm up to 513 milliseconds (ms) before the movement, and on average around 300 ms before actual execution.

In a simulation, the researchers tested the impact of this time advantage in a human-robot collaborative scenario. With the technology, they achieved higher productivity on the same task: completion times were 8-11% faster, even when the simulation included 'false positives', cases where the EEG wrongly signalled a person's intention to move to the robot.
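The article does not spell out how the simulation was built, but the intuition can be sketched: advance warning lets the robot's preparation overlap with the human's action, while false positives cost a small recovery penalty. Every timing constant below (step durations, false-positive rate, penalty) is an invented assumption; only the ~300 ms average lead comes from the article.

```python
import random

random.seed(1)

# Toy model of one collaborative work cycle. All timing parameters are
# illustrative assumptions, not the paper's values.
HUMAN_STEP = 2.0      # seconds the human takes per sub-task
ROBOT_PREP = 0.3      # seconds the robot needs to get ready to assist
LEAD = 0.3            # seconds of advance warning from the EEG (~300 ms avg)
FALSE_POSITIVE = 0.1  # chance the EEG fires a spurious intention signal
FP_PENALTY = 0.2      # seconds lost recovering from a false start

def cycle_time(with_eeg):
    """Time for one collaborative cycle, with or without intention prediction."""
    if not with_eeg:
        # Robot starts preparing only once the human actually moves.
        return HUMAN_STEP + ROBOT_PREP
    # With prediction, robot prep overlaps with the EEG lead time.
    t = HUMAN_STEP + max(ROBOT_PREP - LEAD, 0)
    if random.random() < FALSE_POSITIVE:
        t += FP_PENALTY  # spurious signal: robot resets, small delay
    return t

N = 10_000
baseline = sum(cycle_time(False) for _ in range(N)) / N
assisted = sum(cycle_time(True) for _ in range(N)) / N
speedup = (baseline - assisted) / baseline
print(f"baseline {baseline:.3f}s, assisted {assisted:.3f}s, speedup {speedup:.1%}")
```

Under these made-up parameters the overlap outweighs the occasional false-start penalty, which is the qualitative effect the researchers report; the actual 8-11% figure comes from their far more detailed simulation.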


Buerkle plans to build on this research and hopes to eventually create a system that can predict where a movement is directed—for example, reaching for a screwdriver or picking up a new workpiece.

Of the latest findings, he said that "we hope this study will achieve two things: first, we hope this proposed technology could help towards a closer, symbiotic human-robot collaboration, which still requires a large amount of research and engineering work to be fully established. Secondly, we hope to communicate that rather than seeing robots and artificial intelligence/machine learning as a threat to human labor in manufacturing, it could also be seen as an opportunity to keep the human as an essential part of the factory of the future."

In a joint statement, Achim's supervisors Dr. Thomas Bamber, Dr. Niels Lohse, and Dr. Pedro Ferreira said that "there is a need to transform the nature of human work in order to create a truly sustainable world no longer dependent on strenuous physical and cognitive human labor."

"Human-Robot Collaboration (HRC) is starting to innovate factory shop-floors; however, there is still a need for more substantial collaboration between humans and robots. True HRC will have a transformative effect on labor productivity, job quality, and wellness, and establish a more secure and sustainable labor market, whilst also overcoming physical disadvantages caused by gender, sex, age, or disability. Achim's work using Artificial Intelligence and EEG brings us one step closer to true HRC."
