Guiding principles to protect the public from discrimination by healthcare algorithms

A paper published today in JAMA Network Open addresses bias in healthcare algorithms and provides the healthcare community with guiding principles to avoid repeating errors that have tainted the use of algorithms in other sectors.

This work, conducted by a technical expert panel co-chaired by Marshall Chin, MD, MPH, the Richard Parrillo Family Distinguished Service Professor of Healthcare Ethics at the University of Chicago, supports the Biden Administration's Executive Order 14091, Further Advancing Racial Equity and Support for Underserved Communities Through The Federal Government, issued on February 16, 2023. The order calls for federal agencies to consider opportunities to prevent and remedy discrimination, including by protecting the public from algorithmic discrimination.

The technical expert panel was supported by the Agency for Healthcare Research and Quality (AHRQ) and the National Institute for Minority Health and Health Disparities at the National Institutes of Health (NIH) in partnership with the HHS Office of Minority Health and the Office of the National Coordinator for Health Information Technology.

"Healthcare algorithms, including those developed by artificial intelligence, have potential for great benefit and great harm. We know that biased algorithms have harmed minoritized communities in other fields such as housing, banking, education, and criminal justice."

Marshall Chin, MD, MPH, the Richard Parrillo Family Distinguished Service Professor of Healthcare Ethics at the University of Chicago

The use of algorithms is expanding in many realms of healthcare, from diagnostics and treatments to payer systems and business processes. Every sector of the healthcare system is using these technologies to try to improve patient outcomes and reduce costs.

The panel developed a conceptual framework for applying the following guiding principles across an algorithm's life cycle to address the problems of structural racism and discrimination, with health and healthcare equity for patients and communities as the overarching goal:

  1. Promote health and healthcare equity during all healthcare algorithm life cycle phases.
  2. Ensure healthcare algorithms and their use are transparent and explainable.
  3. Authentically engage patients and communities during all healthcare algorithm life cycle phases and earn trustworthiness.
  4. Explicitly identify healthcare algorithmic fairness issues and tradeoffs.
  5. Establish accountability for equity and fairness in outcomes from healthcare algorithms.

The technical expert panel reviewed evidence, heard from stakeholders, and received community feedback. Although algorithms are widely used and can offer value in diagnostics and treatments, not all individuals benefit equally from such algorithms, creating inequities. This is due primarily to biases that result in undue harm to disadvantaged populations, which perpetuates healthcare disparities and may violate civil rights protections. To rectify these issues, the healthcare community and the public must understand how using algorithms may lead to unintended biased outcomes, how to identify biases before implementation, and what to do with biases discovered after implementation.

"Algorithmic bias is neither inevitable nor merely a mechanical or technical issue. Conscious decisions by algorithm developers, algorithm users, the healthcare industry, and regulators can mitigate and prevent bias and proactively advance health equity," Chin said.

The paper, Guiding Principles to Address the Impact of Algorithm Bias on Racial and Ethnic Disparities in Health and Health Care, may be found in JAMA Network Open. The journal also links to an accompanying podcast interview of panel co-chairs Marshall Chin, MD, MPH, and Lucila Ohno-Machado, MD, PhD, MBA.

