News Release
Researchers Receive $2.8 Million Grant to Study Hidden Biases in Healthcare
Nadir Weibel is part of a team that has received a $2.8 million grant from the National Library of Medicine to study hidden biases in health care.
San Diego, Calif., Nov. 4, 2019 -- Individuals have their own inherent biases. Most are harmless – preferred foods, favorite cars, go-to streaming services. However, biases tied to race, gender, sexual orientation and socioeconomic status have serious consequences.
This is particularly true in medicine. Unintentional, hidden biases may perpetuate healthcare disparities. While providers are not acting out of malice, these attitudes could have a significant impact on patient care.
To better understand these issues, a team of researchers has received a $2.8 million grant from the National Library of Medicine to launch UnBIASED, a project to study hidden biases in healthcare and develop methods to help rectify them. The team includes Nadir Weibel, a researcher in UC San Diego’s Computer Science and Engineering Department and head of the Human-Centered and Ubiquitous Computing Lab, along with collaborators Andrea Hartzler, Wanda Pratt and Janice Sabin at the University of Washington (UW).
“The project seeks to use social signal processing (SSP), a computational approach that detects subtle cues in behavior that are typically invisible,” says Weibel. “For example, talk time, interruptions and body movements from health care providers might differ based on a patient’s race, gender or socioeconomic status.”
The team will use SSP technology to assess hidden biases during medical appointments. SSP analyzes video and audio to make sense of social interactions, such as body language, how long providers and patients talk, how much they interrupt, whether providers are looking at their patients, and other cues that could indicate bias.
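To give a sense of what such measurements look like, here is a minimal, illustrative Python sketch of two of the conversational metrics mentioned above, talk time and interruptions, computed from a hypothetical diarized transcript. The segment data and function names are invented for illustration and are not part of the UnBIASED project's software.

```python
# Illustrative sketch only: a toy version of the kind of conversational
# metrics social signal processing can derive from a recorded visit.
# The segments below are hypothetical, not real clinical data.

from collections import defaultdict

# Hypothetical diarized transcript: (speaker, start_sec, end_sec)
segments = [
    ("provider", 0.0, 12.5),
    ("patient", 12.5, 18.0),
    ("provider", 16.5, 30.0),   # begins before the patient finishes
    ("patient", 30.0, 41.0),
]

def talk_time(segments):
    """Total speaking time per speaker, in seconds."""
    totals = defaultdict(float)
    for speaker, start, end in segments:
        totals[speaker] += end - start
    return dict(totals)

def interruptions(segments):
    """Count turns that begin before the previous speaker's turn ends."""
    counts = defaultdict(int)
    ordered = sorted(segments, key=lambda s: s[1])  # order by start time
    for (prev_spk, _, prev_end), (spk, start, _) in zip(ordered, ordered[1:]):
        if spk != prev_spk and start < prev_end:
            counts[spk] += 1
    return dict(counts)

print(talk_time(segments))      # {'provider': 26.0, 'patient': 16.5}
print(interruptions(segments))  # {'provider': 1}
```

In the actual project, signals like these would presumably be extracted automatically from clinic audio and video, and compared across patient demographics, rather than computed from hand-coded segments.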
Each university brings special expertise to the table. The UW team contributes expertise in clinical informatics. Weibel’s lab at the UC San Diego Jacobs School of Engineering will lead the engineering efforts, creating the SSP model and the UnBIASED tool to delineate possible biases. The UC San Diego group will also contribute human-centered design, working in concert with UW and with clinicians and patients at UC San Diego and San Diego community clinics.
Graduate student Steven Rick, who helped organize CSE’s Celebration of Diversity, is the lead PhD student on the project. With his background in human-computer interaction, design and biomedical informatics, he will develop SSP models to extract verbal and nonverbal cues from patient-provider interactions.
Once SSP has analyzed the video and audio, the combined team will use the data to provide feedback for providers and help them recognize potential biases to improve their interactions with patients.
This approach builds on previous work in the Human-Centered and Ubiquitous Computing Lab. For example, the team has been studying ways to use video to determine if potential stroke patients suffer from weakness on one side of their bodies (hemiparesis).
“The stroke project is looking at their bodies to determine if there are any differences between the right and left sides,” says Weibel. “Here, we’re looking at facial expression. Does it indicate frustration, happiness or other sentiments? We are using similar technologies but with very different goals. One is physiological – the stroke – the other is emotional, trying to understand the conversation.”
While there is no guarantee this research will identify specific biases, it will provide insights into how providers change their approaches across patient types, giving physicians new input for self-assessment.
“One thing that is really hard, that we will not be able to do, is tell them they are being biased,” says Weibel. “That is probably not even possible. But we can tell them how the conversation is different and understand the interaction between physician and patient.”
In the near term, the group will deploy these tools at UC San Diego and local community clinics to see if they are effective, evaluating the results against the Implicit Association Test to assess accuracy. Eventually, they hope the technology can become a more routine part of care.
“The tool might end up being deployed in clinics or medical offices to make people more aware,” says Weibel. “And though it is not a specific goal of this study, this approach might be useful as a training tool in medical schools in the future.”
Media Contacts
Ioana Patringenaru
Jacobs School of Engineering
858-822-0899
ipatrin@ucsd.edu
Joshua Baxt
Computer Science and Engineering Writer
000-000-0000
none@none.none