Mutation rate differences across populations and their association with performance disparities in pathology AI diagnostic models.
DOI:
10.1200/jco.2025.43.16_suppl.1603
Publication Date:
2025-05-28T17:02:12Z
AUTHORS (15)
ABSTRACT
1603 Background: Previous studies have established artificial intelligence (AI) algorithms that classify cancer types, providing real-time diagnostic support. In addition, AI models have identified previously unknown pathology patterns associated with genomic profiles. However, these models exhibit variable performance across demographic groups, and the causes remain largely unknown. To address this challenge, we investigated relationships between model biases and mutation rate disparities across populations, and evaluated the efficacy of a fairness-aware contrastive learning (FACL) framework in reducing these disparities.

Methods: We obtained whole-slide images, mutation rates of the 5 most frequently mutated genes in each cancer type, and age, sex, and race for 9,217 patients across 10 cancer types from The Cancer Genome Atlas. For the prediction tasks, we employed generalized linear models to quantify the relationship between model bias and mutation rate differences by cancer type. We further developed an FACL framework and evaluated its effectiveness in mitigating bias using metrics including difference in accuracy (DIA) and equal opportunity.

Results: Six mutation profile prediction tasks showed significant performance differences across population groups (Table). Variations in TP53 mutation rates are associated with differential error rates in the serous UCEC vs. nonserous UCEC, mixed IDC vs. ILC, LUAD vs. LUSC, and GBM vs. LGG classification tasks. Differences in CDH1 mutation rates were linked to a racial disparity in ILC classification and to an age-related discrepancy. Our FACL framework mitigated bias in 5 of the 6 tasks in which the standard model exhibited significant bias (p < 0.05).

Conclusions: Biases in AI-driven diagnosis stem from differences in somatic mutation prevalence across population groups. Addressing them is critical to ensuring the fairness and global applicability of these tools. Our findings demonstrate that the FACL-based framework effectively reduces disparities, making AI-powered diagnostics more reliable.

Table:
Task | Gene | Sensitive attribute | Group mutation rates | p (rate difference) | Standard model (S) DIA | FACL (F) DIA
sUCEC vs. nsUCEC | TP53 | Race | W 0.34, B 0.45 | p<0.001 | 0.13±0.10, p<0.001 | 0.10±0.05, p=0.088
Mixed IDC vs. ILC | TP53 | – | 0.29, 0.46 | – | 0.07±0.02 | 0.11±0.04, p=0.233
ILC | CDH1 | Race | 0.15, A 0.11 | p=0.047 | 0.12±0.02, p=0.023 | p=0.196
– | CDH1 | Age | ≥59 yrs –, <59 yrs 0.17 | p=0.038 | 0.05±0.02, p=0.001 | 0.05±0.05, p=0.370
LUAD vs. LUSC | TP53 | Sex | F 0.75, M 0.58 | p=0.002 | – | 0.01±0.01, p=0.154
GBM vs. LGG | TP53 | – | 0.50, 0.33 | p=0.021 | 0.20±0.01 | 0.29±0.02, p=0.005
W: White; B: Black; A: Asian. –: not available.
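The two fairness metrics named in the Methods, difference in accuracy (DIA) and equal opportunity, are not defined in the abstract. Below is a minimal sketch using their standard definitions (the largest between-group gap in accuracy, and in true-positive rate); all function and variable names are illustrative, not from the paper.

```python
import numpy as np

def group_metrics(y_true, y_pred, groups):
    """Per-group accuracy and true-positive rate (standard definitions assumed)."""
    y_true, y_pred, groups = map(np.asarray, (y_true, y_pred, groups))
    acc, tpr = {}, {}
    for g in np.unique(groups):
        m = groups == g
        acc[g] = float(np.mean(y_pred[m] == y_true[m]))   # group accuracy
        pos = m & (y_true == 1)                           # positives in this group
        tpr[g] = float(np.mean(y_pred[pos] == 1)) if pos.any() else float("nan")
    return acc, tpr

def dia(acc):
    """Difference in accuracy: largest gap between any two groups' accuracies."""
    return max(acc.values()) - min(acc.values())

def equal_opportunity_gap(tpr):
    """Equal opportunity: largest gap between any two groups' true-positive rates."""
    vals = [v for v in tpr.values() if v == v]  # drop NaNs (groups with no positives)
    return max(vals) - min(vals)

# Toy check: group B is misclassified more often than group W.
acc, tpr = group_metrics([1, 0, 1, 1], [1, 0, 0, 1], ["W", "W", "B", "B"])
print(dia(acc), equal_opportunity_gap(tpr))  # 0.5 0.5
```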
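The Methods also state that generalized linear models were used to quantify the relationship between model bias and mutation rate differences. One plausible minimal setup regresses per-task DIA on the between-group mutation rate gap with statsmodels; the Gaussian family, the data frame, and its values are assumptions for illustration, since the abstract specifies neither the model family nor the covariates.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical per-task summary: between-group mutation rate gap vs. model DIA.
# The values are placeholders, not the paper's data.
tasks = pd.DataFrame({
    "rate_gap": [0.11, 0.17, 0.04, 0.12, 0.17, 0.17],
    "dia":      [0.13, 0.07, 0.12, 0.05, 0.01, 0.20],
})

# GLM with a Gaussian family (equivalent to ordinary least squares here);
# the slope on rate_gap estimates how bias grows with the mutation rate gap.
X = sm.add_constant(tasks[["rate_gap"]])
model = sm.GLM(tasks["dia"], X, family=sm.families.Gaussian()).fit()
print(model.summary())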
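Finally, the abstract names a fairness-aware contrastive learning (FACL) framework without describing its objective. One common way to make supervised contrastive learning fairness-aware is to restrict positives to same-class samples drawn from a different sensitive group, so the embedding cannot rely on group identity; the PyTorch sketch below implements that idea as an assumption, not as the authors' actual FACL loss.

```python
import torch
import torch.nn.functional as F

def fairness_aware_contrastive_loss(z, labels, groups, tau=0.1):
    """Supervised contrastive loss whose positives are same-class samples from a
    DIFFERENT sensitive group (one illustrative fairness-aware variant, not
    necessarily the paper's FACL objective).

    z: (N, D) embeddings; labels: (N,) class ids; groups: (N,) group ids.
    """
    z = F.normalize(z, dim=1)
    sim = z @ z.t() / tau                           # pairwise cosine similarities
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=z.device)

    same_class = labels.unsqueeze(0) == labels.unsqueeze(1)
    diff_group = groups.unsqueeze(0) != groups.unsqueeze(1)
    pos = same_class & diff_group & ~eye            # cross-group, same-class positives

    # Log-softmax over all other samples, then average over positives per anchor.
    logits = sim.masked_fill(eye, float("-inf"))
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    log_prob = log_prob.masked_fill(~pos, 0.0)      # keep only positive pairs

    pos_counts = pos.sum(dim=1)
    valid = pos_counts > 0                          # anchors with >= 1 positive
    loss = -log_prob.sum(dim=1)[valid] / pos_counts[valid]
    return loss.mean()
```

In practice a term like this would typically be combined with the task's cross-entropy loss when training the slide-level encoder, so that the representation stays predictive of mutation status while becoming less separable by sensitive group.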