Editorials

Artificial intelligence and global health equity

BMJ 2024; 387 doi: https://doi.org/10.1136/bmj.q2194 (Published 11 October 2024) Cite this as: BMJ 2024;387:q2194

Linked Analysis

Health information for all: do large language models bridge or widen the digital divide?

  1. Robyn Gayle Dychiao, medical intern1,
  2. Lama Nazer, clinical affairs manager2,
  3. Donald Mlombwa, senior nursing officer3,
  4. Leo Anthony Celi, clinical research director4
  1. University of the Philippines College of Medicine, Philippine General Hospital, Manila, Philippines
  2. Department of Pharmacy, King Hussein Cancer Center, Amman, Jordan
  3. Department of Critical Care, Zomba Central Hospital, Zomba, Malawi
  4. Laboratory for Computational Physiology, Massachusetts Institute of Technology, Cambridge, MA, USA

  Correspondence to: L Celi lceli@mit.edu

Regulation and monitoring are needed to prevent harmful bias in AI tools

Artificial intelligence (AI) could help achieve global health equity by extending efficient and cost effective healthcare to historically underserved patients.123 However, biases ingrained in AI tools can reduce equity and cause harm unless their creation, deployment, and assessment take account of existing health inequities and are tailored to the settings and populations in which they are being used.45678

AI algorithms are trained using large datasets. Unconscious judgments of researchers and AI experts about which issues to work on, which populations to study, and which training datasets to use can perpetuate prejudice.9 Large language models (LLMs), artificial neural networks that encode knowledge, are trained on clinical content that is rife with societal prejudices. This plays out in, for example, LLMs offering differential diagnoses and treatment recommendations that stereotype certain demographic groups, furthering health inequities.10

Inequities ingrained in healthcare

Social patterning in data generation reflects structural inequities ingrained in healthcare systems11: data tend to be sourced from majoritised demographics (ie, white western male populations) and devices that are developed and calibrated on young healthy people, typically white college …
