Hassan Jassar (seated) wears a sensor-outfitted cap that detects changes in blood flow and oxygenation, thus sensing brain activity. That information is transmitted to a computer and interpreted. Thiago Nascimento (left) views this brain activity in real time through augmented reality glasses, while the computer image displays that particular pain signature in the brain. Also pictured, from left: Dr. Alex DaSilva, Dajung Kim, Manyoel Lim, Xiao-su Hu.
Source: University of Michigan

AR allows researchers to see patients’ real-time pain

Researchers have developed a technology to help clinicians "see" and map patient pain in real time, through special augmented reality glasses.

Many patients, especially those who are anesthetized or emotionally challenged, cannot communicate precisely about their pain. For this reason, University of Michigan researchers have developed a technology to help clinicians “see” and map patient pain in real time, through special augmented reality glasses.

The technology was tested on 21 volunteer dental patients, and researchers hope one day to extend it to other types of pain and conditions. It is years away from widespread use in a clinical setting, but the feasibility study is a good first step for dental patients, said Alex DaSilva, associate professor at the U-M School of Dentistry and director of the Headache and Orofacial Pain Effort Lab.

The portable CLARAi (clinical augmented reality and artificial intelligence) platform combines augmented reality visualization with neuroimaging brain data, letting clinicians navigate through a patient’s brain while the patient is in the chair. “It’s very hard for us to measure and express our pain, including its expectation and associated anxiety,” DaSilva said. “Right now, we have a one to 10 rating system, but that’s far from a reliable and objective pain measurement.”

In the study, researchers triggered pain by administering cold to the teeth. Researchers used brain pain data to develop algorithms that, when coupled with new software and neuroimaging hardware, predicted pain or the absence of it about 70 percent of the time.
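The article does not describe the study's actual algorithm, but a pain/no-pain decision over per-channel brain-activity features can be sketched with a simple nearest-centroid rule. Everything below (function names, two-feature vectors) is illustrative, not the researchers' implementation:

```python
# Toy pain/no-pain classifier over brain-activity features
# (e.g., averaged hemoglobin changes per sensor channel).
# Nearest-centroid is a stand-in for the study's unpublished algorithm.

def centroid(samples):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def train(pain_samples, rest_samples):
    """Learn one centroid per class from labeled feature vectors."""
    return centroid(pain_samples), centroid(rest_samples)

def predict(model, features):
    """Label a new feature vector by its nearer class centroid."""
    pain_c, rest_c = model
    d_pain = sum((f - c) ** 2 for f, c in zip(features, pain_c))
    d_rest = sum((f - c) ** 2 for f, c in zip(features, rest_c))
    return "pain" if d_pain < d_rest else "no pain"
```

In use, the training set would be labeled epochs (cold-stimulus windows versus baseline), and each new window of cap data would be scored as it arrives.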

Participants wore a sensor-outfitted cap that detected changes in blood flow and oxygenation, thus measuring brain activity and responses to pain. That information was transmitted to a computer and interpreted.
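The article does not give implementation details, but optical sensor caps of this kind typically recover oxygenation changes from light attenuation at two wavelengths via the modified Beer-Lambert law. A minimal sketch, with illustrative coefficients rather than the study's actual calibration values:

```python
# Modified Beer-Lambert law: convert changes in optical density (dOD)
# at two near-infrared wavelengths into changes in oxygenated (HbO)
# and deoxygenated (HbR) hemoglobin. All constants are illustrative.

# Extinction coefficients (HbO, HbR) at ~760 nm and ~850 nm, 1/(mM*cm)
EXT = {
    760: (1.4866, 3.8437),
    850: (2.5264, 1.7986),
}
PATH_LENGTH_CM = 3.0  # source-detector separation (assumed)
DPF = 6.0             # differential path-length factor (assumed)

def hemoglobin_changes(d_od_760, d_od_850):
    """Solve the 2x2 system dOD = EXT @ [dHbO, dHbR] * L * DPF."""
    a, b = EXT[760]
    c, d = EXT[850]
    det = (a * d - b * c) * PATH_LENGTH_CM * DPF
    d_hbo = (d * d_od_760 - b * d_od_850) / det
    d_hbr = (a * d_od_850 - c * d_od_760) / det
    return d_hbo, d_hbr
```

A rise in HbO with a fall in HbR is the classic signature of increased local brain activity, which is what the downstream pain-detection step looks for.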

Wearing special augmented reality glasses (in this case, the Microsoft HoloLens), researchers viewed the subject’s brain activity in real time on a reconstructed brain template, while the subjects sat in the clinical chair. The red and blue dots on the image denote the location and level of brain activity, and this “pain signature” was mirror-displayed on the augmented reality screen. The more pain signatures the algorithm learns to read, the more accurate the pain assessment becomes.
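The red/blue dot display described above amounts to thresholding each channel's activation into a color. A minimal sketch of that mapping (the threshold value and the rule itself are assumptions, not the platform's documented behavior):

```python
# Map per-channel activation values to display colors:
# red = activation above threshold, blue = deactivation below
# the negative threshold, None = channel not drawn.

def dot_colors(activations, threshold=0.5):
    colors = []
    for a in activations:
        if a >= threshold:
            colors.append("red")
        elif a <= -threshold:
            colors.append("blue")
        else:
            colors.append(None)
    return colors
```

Each frame of interpreted cap data would pass through a mapping like this before being mirror-displayed on the headset.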
