The Biases of Medical Technology—and How to Overcome Them

“The best advice I have is to question what is normal. Likely conceptions of what is normal are probably based on old-fashioned concepts that are heteronormative, cis-gendered, and racist.”

Dr. Philip Verhoef, Adult and Pediatric Intensivist, Associate Program Director for the Kaiser Permanente Hawaii Internal Medicine Residency

The modern medical industry increasingly relies on technology to innovate and improve healthcare. From pulse oximeters to AI, medical devices are becoming more and more prevalent in diagnosing and treating patients. However, with this increased reliance on technology comes the potential for bias within these devices, leading to adverse outcomes and exacerbating structural social inequalities.

Bias in medicine is pervasive, producing systematic errors and prejudices that can influence medical decisions, research outcomes, and patient care. These biases, conscious or unconscious, can relate to race, gender, socioeconomic status, or the personal beliefs of healthcare providers. While healthcare providers can be made aware of their biases and develop techniques to combat them, bias in medical technology can only be addressed by changing the hardware or software itself.

Tackling biases requires continuous efforts to raise awareness, promote diversity and inclusion in the medical workforce, and implement evidence-based interventions to minimize their effects on clinical practice and research. “As a scientist, I aim to question everything. Doctors are trained to take care of patients. They’re not necessarily trained to question, for example, was there some bias in the way this blood pressure cuff measured blood pressure?” says Dr. Philip Verhoef, an adult and pediatric intensivist at Kaiser Foundation Hospital and clinical assistant professor of medicine at the University of Hawaii John A. Burns School of Medicine.

As a researcher and a physician, Dr. Verhoef can see things from both sides. “As doctors, we assume that everything works. We then take our data and try to make the best decisions without questioning that data. And then the same scientist part of me questions everything. These two skills need to continue to meet so that we can really do the best for everybody instead of people that meet whatever the definition of normal may be,” he explains.

Keep reading to learn about recent discoveries of bias in medical technology, how to identify and prevent it, and Dr. Verhoef's advice for healthcare professionals.

Meet the Expert: Philip A. Verhoef, MD, PhD, FAAP, FACP


Dr. Philip Verhoef is an adult and pediatric intensivist who has dedicated his career to clinical practice and research. He serves patients at Kaiser Foundation Hospital and Kapi’olani Medical Center for Women and Children while also holding the position of associate program director for the Kaiser Permanente Hawaii Internal Medicine Residency.

Additionally, Dr. Verhoef is a clinical assistant professor of medicine at the University of Hawaii John A. Burns School of Medicine. Recently, Dr. Verhoef co-authored a paper titled “Racial Differences in Detection of Fever Using Temporal vs. Oral Temperature Measurements in Hospitalized Patients” that was published in the Journal of the American Medical Association.

Biased Technology

While researching an unrelated subject, Dr. Verhoef noticed something interesting. “I had been looking at temperature measurements of patients when they get into the hospital to understand what’s going on with their immune system. We had been doing quite a bit of work related to that, specifically looking at people who came into the hospital with high temperatures to see how this reflects what’s going to happen to them,” he explains.

“In the course of doing that, we observed that the temperature measurements for patients with darker skin were not necessarily lining up exactly the same way as they were for patients with lighter skin. So, we asked the question, ‘Is temperature measurement equally accurate between these groups?’ Particularly if you use an oral thermometer or a temporal thermometer.”

To determine if there was an issue, Dr. Verhoef and his fellow researchers compared the temperature data of white patients with those of Black patients. Specifically, they looked at patients with two temperature readings taken close together, where one used a temporal scanner and the other was oral: “We found that temperatures lined up pretty well for patients with lighter skin, but for African American patients, it seemed like the temporal thermometers were underestimating their temperature. That means that maybe we’re potentially missing them as having a fever. This matters greatly in healthcare because if you see something abnormal, like a fever, you do something about it. So if you have a device that’s underestimating temperature, then you’re going to be underestimating people that have a fever, and potentially not providing the care that they need and deserve,” he explains.
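The comparison described above can be sketched in a few lines of Python. This is a hypothetical illustration of the paired-measurement approach, not the study's actual code: the record fields, group labels, and toy readings are all assumptions for demonstration.

```python
# Hypothetical sketch of a paired-measurement comparison: for each patient
# group, average the difference between an oral reading and a temporal
# reading taken close together. Data and field names are fabricated.

from statistics import mean

def mean_device_difference(records, group):
    """Mean (oral - temporal) difference for one group, in degrees F.

    A positive value means the temporal thermometer reads lower than the
    oral one on average, i.e. it may be underestimating temperature.
    """
    diffs = [r["oral_f"] - r["temporal_f"] for r in records if r["group"] == group]
    return mean(diffs) if diffs else None

# Toy paired readings (fabricated for illustration only)
readings = [
    {"group": "white", "oral_f": 99.1, "temporal_f": 99.0},
    {"group": "white", "oral_f": 100.4, "temporal_f": 100.3},
    {"group": "black", "oral_f": 100.6, "temporal_f": 99.9},
    {"group": "black", "oral_f": 101.0, "temporal_f": 100.2},
]

for g in ("white", "black"):
    print(g, round(mean_device_difference(readings, g), 2))
```

A systematically larger gap for one group, as in the toy numbers above, is the kind of signal that would prompt the follow-up question the researchers asked: is the device equally accurate for everyone?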

Thermometers aren’t the only place where medical technology has shown bias due to skin pigmentation: “A 2020 pulse oximeter study was the big inspiration for us. Researchers found that pulse oximetry was less accurate for patients with darker skin. If an oximeter is misidentifying whether or not oxygen levels are low, then patients might not be getting the care they need,” says Dr. Verhoef. “This study raised the issue of structural racism in a world that is based on a gold standard for diagnosis being people with light skin. It’s crazy because I assume, maybe incorrectly, that devices like this are tested on a wide range of people with different colors, ages, and ethnic backgrounds. And yet, here we have sort of real-world evidence suggesting that there’s real bias, and so it raises the question, ‘Where are there other places where such levels of bias exist?’”

How to Identify Bias In Medical Technology

Identifying bias in medical technology involves a multi-faceted approach that brings together all stakeholders, including developers, researchers, clinicians, and policymakers: “It involves a systematic approach to looking at inequities and disparities in healthcare that includes asking the question, ‘What are the systems that we think are normal and how are they actually not serving people?’” says Dr. Verhoef.

“For example, a recent report said all women should begin getting mammograms at the age of 40, particularly African American women because they are 40 percent more likely to die of breast cancer. So this is good work, right? But you need to ask the question why are African American women more likely to die? Some of it is that they don’t have access to care, but there’s probably much more to it. If you’re thinking of a hypothesis, you might ask, ‘Does chemotherapy for breast cancer work as well in African American women as it does for white women? Does the kind of surgery we perform have the same success rate in this group?’” posits Dr. Verhoef.

How To Prevent Biased Medical Technology

Bias in technology that is already on the market can be identified and corrected, but what can be done to prevent bias in the first place?

“It’s an interesting question. We should hold device-makers to a strong set of standards as they craft new technology. I wonder how often, after something has been approved and assumed to be a standard, do we have to go back and change it because it is not working the way it was supposed to? This applies to drugs, medical devices, and any typical intervention,” says Dr. Verhoef. “You have to do real-world assessment of a device in a reasonable way to ensure equity.”

Other ways to minimize bias in medical technology include using diverse data sets that cover different races, genders, ages, and socioeconomic backgrounds. Algorithm transparency in developing new medical technologies can also help researchers identify potential bias. Ensuring inclusive development teams can help identify and address potential biases during development. Lastly, ethics committees that include medical, data science, and social sciences experts should review and approve medical technologies before they are implemented in clinical settings.
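One concrete form the practices above can take is a subgroup audit: comparing a device's performance against a reference ("gold standard") measurement separately for each demographic group, and flagging groups where the device underperforms. The sketch below is a minimal, hypothetical example of that idea; the field names and the disparity tolerance are assumptions, not an established standard.

```python
# Hypothetical subgroup-audit sketch: compare a device's fever-detection
# rate against a reference measurement for each group, then flag groups
# where the device misses substantially more fevers than the best group.

FEVER_F = 100.4  # common fever threshold in degrees Fahrenheit

def detection_rate_by_group(records):
    """Fraction of reference-confirmed fevers the device also detects, per group."""
    hits, totals = {}, {}
    for r in records:
        if r["reference_f"] >= FEVER_F:  # true fever by the reference method
            g = r["group"]
            totals[g] = totals.get(g, 0) + 1
            hits[g] = hits.get(g, 0) + (r["device_f"] >= FEVER_F)
    return {g: hits[g] / totals[g] for g in totals}

def flag_disparities(rates, tolerance=0.1):
    """Return groups whose detection rate trails the best group by > tolerance."""
    best = max(rates.values())
    return sorted(g for g, rate in rates.items() if best - rate > tolerance)
```

An audit like this, run on real-world data after a device is deployed, is one way to operationalize Dr. Verhoef's call for "real-world assessment of a device in a reasonable way to ensure equity."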

Advice For Healthcare Professionals

The best thing healthcare providers can do is be aware that bias exists, both in their own thinking and in the technology and protocols they may be using: “The best advice I have is to question what is normal. Likely conceptions of what is normal are probably based on old-fashioned concepts that are heteronormative, cis-gendered, and racist. We’re drawing conclusions based on a fixed mindset of what normal is—and that can cause harm. Make sure that you are willing and open to questioning your standard and recognize it may not actually be standard,” encourages Dr. Verhoef.

Kimmy Gustafson, Writer

With her passion for uncovering the latest innovations and trends, Kimmy Gustafson has provided valuable insights and has interviewed experts to provide readers with the latest information in the rapidly evolving field of medical technology since 2019. Kimmy has been a freelance writer for more than a decade, writing hundreds of articles on a wide variety of topics such as startups, nonprofits, healthcare, kiteboarding, the outdoors, and higher education. She is passionate about seeing the world and has traveled to over 27 countries. She holds a bachelor’s degree in journalism from the University of Oregon. When not working she can be found outdoors, parenting, kiteboarding, or cooking.