The research reveals how people react to common facial expressions superimposed on drones. "There is a lack of research on how drones are perceived and understood by humans, which is vastly different from how ground robots are perceived," says Prof. Jessica Cauchard, who conducted the research with Viviane Herdel of BGU's Magic Lab in the BGU Department of Industrial Engineering & Management. "For the first time, we showed that people can recognize different emotions and discriminate between different emotion intensities."
The researchers conducted two studies using a set of rendered robotic facial expressions on drones that convey basic emotions. The faces use four core facial features: eyes, eyebrows, pupils, and mouth. The results showed that five different emotions (joy, sadness, fear, anger, surprise) can be recognized with high accuracy in static stimuli, and four emotions (joy, surprise, sadness, anger) in dynamic videos. Disgust was the only emotion that was poorly recognized.
"Participants were further affected by the drone and exhibited different responses, including empathy, depending on the drone's emotion," Prof. Cauchard says. "Surprisingly, participants created narratives around the drone's emotional states and included themselves in these scenarios."
The BGU researchers propose a number of recommendations that could enhance the acceptability of drones in emotional support and other social situations. These include adding anthropomorphic features, using the five well-recognized emotions, and using empathetic responses to encourage compliance in health and behavior-change applications.
"BGU is spearheading some of the most remarkable robotic research in the world," says Doug Seserman, chief executive officer of Americans for Ben-Gurion University. "We foresee continued innovation leveraging human-drone interaction technologies, leading to greater adoption and more beneficial applications."