This study highlights the complexity of the mechanisms by which AI models determine race from medical imaging data and the difficulty of isolating the relevant features, emphasizing the need for caution and further study as AI diagnostics move toward clinical implementation.
INTERPRETATION
Health care inequality across racial groups is a pressing issue that is garnering societal awareness. In medical imaging, there is no obvious correlate of race that experts can identify, so medical imaging has been regarded as blind to race. Deep learning algorithms have shown promising success in making diagnoses from medical images. However, there is increasing concern about algorithmic bias and its potential to exacerbate health care disparities.
Using data sets of racially diverse patients, Gichoya et al. found that AI models accurately predicted race from patients' medical images across multiple modalities, including radiography, chest CT, and mammography (all with area under the curve values > 0.80). Using regression analysis, they accounted for several covariates, such as body mass index, disease distribution, and breast density.
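To make the reported metric concrete, here is a minimal sketch (not the authors' code) of how an area-under-the-curve score is computed for a classifier that outputs a probability for a racial group; the labels and scores below are randomly generated placeholders, and 0.80 is simply the threshold cited above.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical binary labels (1 = group of interest) and model probabilities.
y_true = rng.integers(0, 2, size=1000)
y_score = np.clip(y_true * 0.3 + rng.normal(0.35, 0.2, size=1000), 0, 1)

auc = roc_auc_score(y_true, y_score)
print(f"AUC = {auc:.2f}")  # values above 0.80 indicate strong class separability
```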
The authors also degraded the images using frequency filters and performed image ablation experiments. The AI models still performed at surprisingly high rates, even when the images were no longer discernible to experts as medical images.
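As an illustration of what frequency filtering does to an image, here is a hedged sketch (assumed, not taken from the paper) of a low-pass filter built with the 2-D FFT; a random array stands in for a radiograph, and the hypothetical `cutoff` parameter controls how much detail survives.

```python
import numpy as np

def low_pass_filter(image: np.ndarray, cutoff: int) -> np.ndarray:
    """Keep only spatial frequencies within `cutoff` of the spectrum center."""
    f = np.fft.fftshift(np.fft.fft2(image))           # centered frequency spectrum
    rows, cols = image.shape
    y, x = np.ogrid[:rows, :cols]
    mask = (y - rows // 2) ** 2 + (x - cols // 2) ** 2 <= cutoff ** 2
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

image = np.random.default_rng(0).random((256, 256))   # placeholder "radiograph"
blurred = low_pass_filter(image, cutoff=8)            # heavily degraded version
print(blurred.shape)
```

With a small cutoff the output retains almost no human-recognizable anatomy, which is the kind of degraded input on which the models reportedly still predicted race.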
The authors demonstrated that the features relevant to predicting race were delocalized and present throughout the image frequency spectrum, suggesting that race cannot be determined using human-identifiable image features.