Technology for detecting skin cancer is forging ahead – but not for people of color, apparently
A new analysis suggests that AI-based skin cancer screening is being optimized for light skin.
Artificial intelligence has drawn scrutiny for perpetuating the biases of the mostly white tech guys developing it. Much of the criticism has swirled around the facial recognition algorithms used in surveillance technology, shown to have higher error rates for women and BIPOC, per the ACLU, increasing their risk of wrongful arrest and police violence.
Now, a new analysis reveals an insidious way that AI can widen racial health disparities, too. Researchers found that the datasets used to train AI programs to detect skin cancer include hardly any images of dark skin, according to a National Cancer Research Institute press release. Simply put, this technology is being optimized for light skin.
First, some background on how this technology works: Engineers “train” these AI programs by showing them images and other data from many, many patients diagnosed with skin cancer, the NCRI explained. The promise of AI lies in its ability to make diagnoses quickly and cost-effectively — but its performance depends heavily on this data.
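For the curious, here is a rough sketch in Python of what "training on labeled images" boils down to. Everything in it is illustrative: the data is synthetic and the model is deliberately simple, whereas real skin cancer classifiers are deep neural networks trained on dermoscopic photos. The core point still holds, though: the model can only learn from the examples it is shown.

```python
# Illustrative sketch only, not any research group's actual pipeline.
# A classifier is fitted to labeled images; whatever the dataset
# under-represents, the model effectively never learns.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in for a dermoscopic image archive: each "image" is a flattened
# 8x8 RGB array, each label is 1 for melanoma, 0 for benign.
images = rng.random((2000, 8 * 8 * 3))
labels = rng.integers(0, 2, size=2000)

X_train, X_test, y_train, y_test = train_test_split(
    images, labels, test_size=0.2, random_state=0
)

# "Training" is simply fitting the model to the examples it is given --
# if dark skin is missing from `images`, nothing in this step compensates.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# With random synthetic labels, accuracy hovers around chance; the point is
# the workflow, not the score.
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```

If dark skin never appears in the training images, the fitted model has simply never seen it, and no amount of clever engineering downstream fully makes up for that.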
That’s why it’s crucial to know which images and patients are being used to train these programs, David Wen of the University of Oxford, the paper’s lead author, told the NCRI. “Research has shown that programs trained on images taken from people with lighter skin types only might not be as accurate for people with darker skin, and vice versa,” he said.
Wen and his team conducted what they claimed to be the first systematic review of publicly available skin cancer datasets, the press release stated. They examined a total of 21 datasets that included over 100,000 images. Information on the patients’ skin color and ethnicity accompanied only a small fraction of the images. Only 10 out of the 2,436 images that indicated skin color were of brown skin — and only one was of dark brown or Black skin. Out of the 1,585 images that indicated the ethnicity of the patient, zero were from people of African, Afro-Caribbean, or South Asian descent. The findings appeared online in The Lancet Digital Health on November 9.
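To get a feel for what that kind of audit involves, here is a hedged sketch: given a table of image records, it counts how many carry skin-type or ethnicity labels at all, and how the labeled ones break down. The column names ("fitzpatrick_type", "ethnicity") and the toy rows are assumptions for illustration, not the study's actual schema or data.

```python
# A toy metadata audit, loosely in the spirit of the review's approach.
# Column names and rows are made up for illustration.
import pandas as pd

records = pd.DataFrame(
    {
        "image_id": ["img001", "img002", "img003", "img004"],
        "diagnosis": ["melanoma", "benign", "melanoma", "benign"],
        "fitzpatrick_type": ["II", None, "I", "V"],  # I-VI, lightest to darkest
        "ethnicity": [None, None, "White", None],
    }
)

total = len(records)
with_skin_type = records["fitzpatrick_type"].notna().sum()
with_ethnicity = records["ethnicity"].notna().sum()

print(f"{with_skin_type}/{total} images record a skin type")
print(f"{with_ethnicity}/{total} images record an ethnicity")

# Among the images that do record skin type, how are the types distributed?
print(records["fitzpatrick_type"].value_counts())
```

Run against the real public datasets, checks like this are what surfaced the gaps the researchers describe: most images carry no skin-type or ethnicity label at all, and the labeled ones skew overwhelmingly toward lighter skin.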
Yes, darker-skinned people have a low risk of skin cancer, as Wen pointed out in the NCRI press release — but “low” is not the same as “zero.” What’s more, if they do get skin cancer, “they may have worse disease or be more likely to die of the disease,” he said. “One factor contributing to this could be the result of skin cancer being diagnosed too late.”
The fact that physicians don’t know enough about what skin cancer looks like on darker skin could help explain this disparity. Indeed, Wen told the NCRI that images of Black and Asian people in medical textbooks are few and far between (although there have been a few signs of progress, such as this book on how clinical features present in darker skin).
Developing AI programs without addressing this huge knowledge gap would only widen the racial disparity in skin cancer outcomes. To help close it, Wen and his team plan to create quality standards for the health datasets used to train AI programs, according to the NCRI. While it’s encouraging to see doctors turning their attention to yet another health inequity, the medical establishment really should’ve prevented it altogether.