“We’ve seen a sharp increase in mass shootings in the U.S. in recent years, and that can increase the need for mental health services,” says Yang Cheng, first author of the study and an assistant professor of communication at NC State. “And automated online chatbots are an increasingly common tool for providing mental health services – such as providing information or an online version of talk therapy. But there has been little work done on the use of chatbots to provide mental health services in the wake of a mass shooting. We wanted to begin exploring this area, and started with an assessment of which variables would encourage people to use chatbots under those circumstances.”
The researchers conducted a nationally representative survey of 1,114 U.S. adults who had used chatbots to seek mental health services at some point before the study. Participants were given a scenario in which a mass shooting had taken place, and were then asked a series of questions about using chatbots to seek mental health services in the wake of the shooting. The researchers also controlled for whether participants had personal experience with mass shootings.
The researchers identified a number of variables that drove people to use chatbots to address their own mental health needs. For example, people liked that chatbots were fast and easy to access, and they thought chatbots would be good sources of information. The study also found that people felt it was important for chatbots to be humanlike, because they wanted the chatbots to provide emotional support.
But the researchers were surprised to find that an even bigger reason for people to use chatbots was to help others who were struggling with mental health issues. “We found that the motivation of helping others was twice as powerful as the motivation of helping yourself,” Cheng says.
Helping others, in this context, would include talking to a chatbot to keep a loved one’s mental illness from getting worse, finding ways to encourage the loved one to access chatbot services, or demonstrating to the loved one that the services are easy to use. “Our study offers detailed insights into what is driving people to access mental health information on chatbot platforms after a disaster, as well as how they are using that information,” Cheng says. “Among other applications, these findings should be valuable for the programmers and mental healthcare providers who are responsible for developing and deploying these chatbots.”