
Inclusion4EU

Co-Design for Inclusion in Software Development Design

Google’s AI Skincare tool

Who does this case study involve?

Google’s AI Skincare tool users with dark skin

The case

In 2021 Google announced an AI-powered dermatology tool, a web-based application intended to help identify skin conditions. To use the tool, users upload three photos of a problem area taken from different angles. According to Google: “You’ll then be asked questions about your skin type, how long you’ve had the issue and other symptoms that help the tool narrow down the possibilities. The AI model analyzes this information and draws from its knowledge of 288 conditions to give you a list of possible matching conditions that you can then research further.”


Google researchers published the data and model behind the Skincare tool in the journal Nature Medicine (2020). According to the authors, the tool was developed using a set of around 65,000 anonymised images and case data of diagnosed conditions, drawn from a total of 16,114 individual cases.

Following the Nature Medicine publication by Liu et al., a number of researchers highlighted issues with the data behind Google’s skincare tool.

Dr Roxana Daneshjou tweeted her concerns that Google’s study data appeared not to include many patients with darker skin types.

Roxana Daneshjou MD/PhD @RoxanaDaneshjou · May 18, 2021

Now, I'm sure that the Google team has been continually fine tuning their algorithm. But from all the PUBLICLY published data that we have, this algorithm has a HUGE lack of skin of color (types 5 and 6 of the Fitzpatrick scale) in the test set of images used. According to the Fitzpatrick skin type scale the test set of images used were: 10.2% type 2; 64.2% type 3; 19.3% type 4; 2.7% type 5 and 0% type 6.
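The scale of the imbalance in the reported test set can be made concrete with a quick calculation (a minimal sketch using only the percentages quoted in the tweet above; the share of Fitzpatrick type 1 is not quoted, so it is omitted here):

```python
# Fitzpatrick skin-type shares reported for the test set (percent).
# Type 1 is not given in the quoted excerpt, so it is left out.
shares = {2: 10.2, 3: 64.2, 4: 19.3, 5: 2.7, 6: 0.0}

dark_skin = shares[5] + shares[6]            # types 5 and 6 combined
light_skin = sum(shares.values()) - dark_skin  # types 2 to 4

print(f"Types 5-6 (skin of colour): {dark_skin:.1f}%")   # 2.7%
print(f"Types 2-4:                  {light_skin:.1f}%")  # 93.7%
print(f"Ratio (2-4 vs 5-6):         {light_skin / dark_skin:.0f}:1")
```

On these figures, lighter skin types outnumber types 5 and 6 by roughly 35 to 1, with type 6 entirely absent.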

This lack of diversity in skin colour would make the skincare tool unreliable and potentially unusable for people with darker skin tones. Dr Daneshjou also highlighted issues with labelling in the model published by Google:

Roxana Daneshjou MD/PhD @RoxanaDaneshjou · May 18, 2021

One of my biggest concerns is regarding labeling. The images in the original paper were largely labeled by consensus of dermatologists. That's right, MOST cutaneous malignancies were not labeled based on biopsy results.

Dr Tereza Hendl highlighted that the “EU legal system prohibits discrimination also on the grounds of race and ethnicity, hence, medical tools should not discriminate against patients with brown or black skin tones” (Euronews, 2021).
