Shown here is a question from the social needs survey as a web form (left), in the original chatbot (middle) and in the improved chatbot (right). The improvements are labeled a) through e): a) The chatbot asked people if they wanted to continue hearing it read questions out loud. b) If they said no, the chatbot gave them an option to turn it back on later. c) The chatbot varied the amount of time it spent “typing” based on the length of its response. d) The team fixed the patient response area to one place on the screen. e) The chatbot’s responses were more specific to the context of the questions and the patient’s answers.
Source: University of Washington
24.11.2021

A chatbot for emergency department patients' social needs

A team led by the University of Washington developed a chatbot that could ask emergency department visitors about social needs, including housing, food, access to medical care and physical safety.

Americans visit hospital emergency departments nearly 130 million times per year. Although the focus of these visits is to address acute illness and injury, doctors are increasingly finding that social needs — such as food and housing insecurity — place many patients at higher risk of getting sick and requiring emergency care.

To better serve patients and possibly prevent future emergency department visits, doctors need a way to screen incoming patients and establish the wider context behind their visits.

The team tested the chatbot on 41 patients in Seattle and Los Angeles emergency departments. Results show that two groups of patients preferred the chatbot: patients who had less than a middle-school level of health literacy and patients who valued establishing emotional connections.

"A few years ago there was a huge buzz around chatbots, and then people started realizing that maybe they aren't meant for everything," said co-senior author Gary Hsieh, a UW associate professor in the human-centered design and engineering department. "We have been trying to figure out opportunities where having a chatbot would actually be meaningful and make sense."

One good opportunity involved collaborating with emergency department doctors. "We want to understand the upstream issues that bring people into the emergency department. What are the social needs of the patients that we serve and how can we develop interventions that address these needs?" said co-author Dr. Herbert Duber, associate professor of emergency medicine in the UW School of Medicine. "For many people, including those with low literacy levels, a chatbot makes so much sense for collecting this information."

The team designed a chatbot named HarborBot, after the hospitals where it was tested. HarborBot takes patients through a social needs survey that was developed by the Los Angeles County Health Agency. This survey asks patients 36 questions related to demographics, finances, employment, education, housing, food and utilities. It also asks questions related to physical safety, legal needs and access to care.

HarborBot is displayed on a tablet as a typical chat window with the patient's and bot's conversation showing up in different colored bubbles. HarborBot's chat bubble shows animated ellipses when the bot is "typing."

Based on a previous study, the researchers improved the chatbot’s efficiency and social skills.

For efficiency, the researchers:

  • modified the amount of time the bot looked like it was typing to match the length of text the bot displayed. This means that the bot would "type" for a shorter amount of time for a shorter response
  • added a question at the beginning of the interaction that would allow patients to stop HarborBot from reading all of its questions and responses aloud
  • placed the patients' answer options in the same part of the screen so that patients, who were often tired or in pain, could respond without having to move their hands
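The proportional "typing" delay described above can be sketched in a few lines. This is an illustrative sketch only; the typing speed and delay bounds here are hypothetical values, not parameters published by the HarborBot team:

```python
# Hypothetical parameters chosen for illustration; the study does not
# publish HarborBot's actual typing speed or delay limits.
CHARS_PER_SECOND = 30
MIN_DELAY = 0.5   # seconds shown even for very short replies
MAX_DELAY = 3.0   # cap so long replies don't make users wait

def typing_delay(message: str) -> float:
    """Return a 'typing' animation delay proportional to message length,
    clamped to a sensible range."""
    return min(MAX_DELAY, max(MIN_DELAY, len(message) / CHARS_PER_SECOND))
```

The clamping matters for the scenario the article describes: without a lower bound the ellipsis animation would flash too briefly to register, and without an upper bound long responses would keep tired patients waiting.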

To increase the empathy of the interaction, the team changed the bot's reactions to better match the content of the questions and patient responses.

"Some of the questions are quite sensitive — there are questions about violence and sexual abuse — and the bot's original responses said 'Sure,' 'Great' or 'Thanks for sharing with us,'" said lead author Rafał Kocielnik, who completed this project as a doctoral student at the UW and is now a postdoctoral fellow at Caltech. "We tried tailoring its responses in a way that made them more appropriate for the content and specific to the patients' responses, such as 'That must be stressful, thank you for letting me know.'"
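One way to picture the tailoring Kocielnik describes is a lookup from question category to an appropriate acknowledgment. This is a minimal sketch under the assumption that each survey question carries a category label; the category names and phrasings below are hypothetical, apart from the example response quoted in the article:

```python
# Illustrative mapping, not the study's implementation. The "sensitive"
# response is the example quoted by the researchers; the rest are invented.
RESPONSES = {
    "routine":   "Thanks, got it.",
    "sensitive": "That must be stressful, thank you for letting me know.",
}

def acknowledge(question_category: str) -> str:
    # Fall back to a neutral acknowledgment for unlabeled categories.
    return RESPONSES.get(question_category, "Thank you for sharing.")
```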


After HarborBot received its upgrades, the researchers tested it at two emergency departments: one at Harborview Medical Center in Seattle and the other at the Harbor-UCLA Medical Center in Los Angeles.

For both locations, the researchers worked at night (between 8 p.m. and 1 a.m. in Seattle and between 4 p.m. and 4 a.m. in Los Angeles). The teams collaborated with triage nurses to select potential participants. Then the researchers took participants to a visitor room where they could still hear announcements. After the patients signed a consent form, they completed:

  • two surveys to gauge health literacy. One survey asks patients to pronounce health-related terms and the other asks patients to answer questions about the nutritional facts label on a pint of ice cream
  • the social needs survey as both a web form through SurveyGizmo and an interaction with HarborBot. These were given in a randomized order
  • evaluations for both the web form and HarborBot
  • a survey to gauge a patient's desire for emotional interactions

At the end, the researchers interviewed the participants about the experience.

The team was not surprised to find that many people with low health literacy preferred the HarborBot version of the survey: 17 of 20 low-literacy participants chose HarborBot, compared with 8 of 21 high-literacy participants. People who valued emotional connection also preferred the chatbot, but the two groups didn't necessarily overlap.

"We thought maybe people with low health literacy would also be more in need of emotional interaction," Kocielnik said. "But it turns out, the two groups are not strongly correlated."

Of the 23 participants who scored high on the emotional interactions questionnaire, 18 chose HarborBot. Meanwhile, only 7 of the 18 participants who scored lower on that questionnaire preferred HarborBot.

"It's important to understand that chatbots can benefit people in different ways," said co-author Raina Langevin, a UW doctoral student in human-centered design and engineering.

In the future, the team plans to design a survey system that could tailor the experience to each user. For example, it could start out as the chatbot, but then based on how a user is answering the questions, it could shift into more of a survey format.

"Our vision would be some sort of kiosk people could use while they are waiting. Or even a QR code that people can scan with their own devices and then answer these questions," Hsieh said. "Ultimately we want to connect people entering emergency departments as smoothly as possible with the resources that they need."
