Set of rendered faces representing six basic emotions at three intensity levels. All emotion categories were recognized well; only Disgust performed worse than the others.
Source: Viviane Herdel

Using drones to elicit emotional responses

As drones become more ubiquitous in public spaces, researchers at Ben-Gurion University of the Negev (BGU) have conducted the first studies examining how people respond to various emotional facial expressions depicted on a drone, with the goal of fostering greater social acceptance of these flying robots.

The research reveals how people react to common facial expressions superimposed on drones. "There is a lack of research on how drones are perceived and understood by humans, which is vastly different than ground robots," says Prof. Jessica Cauchard, who conducted the studies with Viviane Herdel at BGU's Magic Lab in the Department of Industrial Engineering & Management. "For the first time, we showed that people can recognize different emotions and discriminate between different emotion intensities."

The researchers conducted two studies using a set of rendered robotic facial expressions on drones that convey basic emotions. The faces use four core facial features: eyes, eyebrows, pupils, and mouth. The results showed that five different emotions (joy, sadness, fear, anger, surprise) can be recognized with high accuracy in static stimuli, and four emotions (joy, surprise, sadness, anger) in dynamic videos. Disgust was the only emotion that was poorly recognized.

"Participants were further affected by the drone and presented different responses, including empathy, depending on the drone's emotion," Prof. Cauchard says. "Surprisingly, participants created narratives around the drone's emotional states and included themselves in these scenarios."

The BGU researchers propose a number of recommendations to enhance the acceptability of drones in emotional support and other social situations. These include adding anthropomorphic features, using the five well-recognized basic emotions, and using empathetic responses to encourage compliance in health and behavior-change applications.

"BGU is spearheading some of the most remarkable robotic research in the world," says Doug Seserman, chief executive officer, Americans for Ben-Gurion University. "We foresee continued innovation leveraging human-drone interaction technologies, leading to greater adoption and more beneficial applications."
