Alena M. Buyx, Professor of Ethics in Medicine and Health Technologies at TUM, in her office.
Source: Andreas Heddergott / TUM

Robot therapists need rules

Interactions with artificial intelligence (AI) will become an increasingly common aspect of our lives. A team at the Technical University of Munich (TUM) has now completed the first study of how “embodied AI” can help treat mental illness. Their conclusion: important ethical questions raised by this technology remain unanswered, and there is an urgent need for action on the part of governments, professional associations and researchers.

Robot dolls that teach autistic children to communicate better, computer-generated avatars that help patients cope with hallucinations, and virtual chats offering support with depression: Numerous initiatives using embodied AI for improving mental health already exist. These applications are referred to as embodied because they involve interactions between individuals and an artificial agent, resulting in entirely new dynamics.

The use of AI in psychotherapy is not new as such. Back in the 1960s, the first chatbots created the illusion of a psychotherapy session. In reality, however, this was little more than a gimmick. With today’s advanced algorithms and greater computing power, much more is possible. “The algorithms behind these new applications have been trained with enormous data sets and can produce genuine therapeutic statements,” explains Alena Buyx, Professor of Ethics in Medicine and Health Technologies at TUM. With Dr. Amelia Fiske and Peter Henningsen, Professor of Psychosomatic Medicine and Psychotherapy, she has conducted the first systematic survey of embodied AI applications for mental health and drawn conclusions on the related opportunities and challenges.

The new applications have enormous potential. They can make treatment accessible to more people because they are not limited to specific times or locations. In addition, some patients find it easier to interact with AI than with a human being. But there are risks, too. “AI methods cannot and must not be used as a cheaper substitute for treatment by human doctors,” says Amelia Fiske.

“Although embodied AI has arrived in the clinical world, there are still very few recommendations from medical associations on how to deal with this issue. Urgent action is needed, however, if the benefits of these technologies are to be exploited while avoiding disadvantages and ensuring that reasonable checks are in place. Young doctors should also be exposed to this topic while still at medical school,” says Peter Henningsen, who is the dean of the TUM School of Medicine.

Ethical rules for AI still lacking

At present, there are increased efforts to draw up guidelines for AI, including the Ethics Guidelines for Trustworthy AI just issued by the EU. However, Buyx, Fiske and Henningsen also see an urgent need to regulate the use of AI in specialized fields. “Therapeutic AI applications are medical products for which we need appropriate approval processes and ethical guidelines,” says Alena Buyx. “For example, if the programs can recognize whether patients are having suicidal thoughts, then they must follow clear warning protocols, just like therapists do, in case of serious concerns.”

In addition, intensive study is needed into the social effects of embodied AI. “We have very little information on how we as human beings are affected by contact with therapeutic AI,” says Alena Buyx. “For example, through contact with a robot, a child with a disorder on the autism spectrum might only learn how to interact better with robots – but not with people.”

