BMC Med. 2025 Oct 17;23(1):567. doi: 10.1186/s12916-025-04340-3.
ABSTRACT
BACKGROUND: Algorithms are increasingly used in healthcare, yet most lack thorough evaluation and impact assessment across diverse populations. This absence of comprehensive scrutiny poses a significant risk of inequitable clinical outcomes, particularly between different demographic and socioeconomic groups.
MAIN BODY: Societal biases, rooted in structural inequalities and systemic discrimination, often shape the data used to develop these algorithms. When such biases become embedded in predictive models, algorithms frequently favor privileged populations, further deepening existing inequalities. Without proactive efforts to identify and mitigate these biases, algorithms risk disproportionately harming already marginalized groups and widening the gap between advantaged and disadvantaged patients. Various statistical metrics are available to assess algorithmic fairness, each addressing a different dimension of disparity in predictive performance across population groups. However, understanding and application of these fairness metrics in real-world healthcare settings remain limited. Transparency in both the development and communication of algorithms is essential to building a more equitable healthcare system. Openly addressing fairness concerns fosters trust and accountability, ensuring that fairness considerations become an integral part of algorithm design and implementation rather than an afterthought. Using a participatory approach involving three clinicians and three patients with lived experience of type 2 diabetes, we developed a set of guiding questions to help healthcare professionals assess algorithms critically, challenge existing practices, and stimulate discussion.
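To illustrate what such fairness metrics look like in practice, the following is a minimal sketch, assuming a binary classifier and two population groups; the function fairness_metrics and the toy data are hypothetical and are not taken from the article.

```python
# Illustrative sketch only (not from the article): group-wise fairness
# metrics for a binary classifier and a binary sensitive attribute.
# Assumes each group contains both outcome classes.
import numpy as np

def fairness_metrics(y_true, y_pred, group):
    """Per-group selection rate, TPR, and FPR, plus between-group gaps."""
    y_true, y_pred, group = map(np.asarray, (y_true, y_pred, group))
    per_group = {}
    for g in np.unique(group):
        m = group == g
        per_group[g] = {
            # P(pred=1 | group=g): demographic parity compares these.
            "selection_rate": y_pred[m].mean(),
            # P(pred=1 | y=1, group=g): equal opportunity compares these.
            "tpr": y_pred[m & (y_true == 1)].mean(),
            # P(pred=1 | y=0, group=g): equalized odds also compares these.
            "fpr": y_pred[m & (y_true == 0)].mean(),
        }
    a, b = per_group.values()  # assumes exactly two groups
    gaps = {k: abs(a[k] - b[k]) for k in a}
    return per_group, gaps

# Toy data: 8 patients, a binary risk flag, and a binary group label.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
group  = [0, 0, 0, 0, 1, 1, 1, 1]
per_group, gaps = fairness_metrics(y_true, y_pred, group)
print(gaps)  # equal selection rates, but TPR and FPR gaps of about 0.33
```

The toy example shows why no single metric suffices: the two groups are flagged at identical rates, yet their error rates differ, so a model can satisfy one fairness criterion while violating another.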
CONCLUSIONS: We aim to guide healthcare professionals in navigating the complexities of bias in healthcare algorithms by encouraging critical thinking about the biases present in society, data, algorithms, and healthcare systems.
PMID:41107862 | DOI:10.1186/s12916-025-04340-3