Powered by People series: Understanding skin tone bias in cancer-detecting AI

Working at speeds 100,000 times faster than the average laptop and able to process in one second what it would take the entire global population 80 years to achieve, Isambard-AI offers huge potential in fields such as robotics, big data, climate research and drug discovery.

Although Isambard-AI has been built by the Bristol Centre for Supercomputing, based at the University of Bristol, researchers from across the UK can apply to use Isambard – opening up and democratising the power of AI.

Our university colleagues are already using Isambard-AI for pioneering research projects. Below, one of these colleagues reveals his research project, how Isambard-AI is underpinning it and what the future might hold.

Dr. James Pope – understanding bias in skin-cancer-detecting AI

James Pope, a researcher who uses Isambard-AI

I’m James Pope, a Senior Lecturer in Data Science. My research largely involves artificial intelligence and machine learning.

Tell us a bit about your project

The idea is that you might take a picture of a mole or skin lesion and then an AI model would tell you whether it might be cancerous or not.

We wanted to see if the model was better at predicting cancer on darker skin or lighter skin, or whether it performed equally well on both. Simply put, was there skin tone bias that could affect the result?

We used about 4,000 images to train our model – that involved trying lots of configurations and combinations to make sure it worked as well as possible. And you need a big machine to do this...
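Trying "lots of configurations and combinations" is essentially a hyperparameter sweep. A minimal sketch of the idea is below; the parameter names, values, and scoring function are hypothetical stand-ins, not the project's actual setup (a real run would train the model on the images for each configuration):

```python
# Hypothetical sketch of a configuration sweep.
# Hyperparameter names/values and the score function are illustrative only.
from itertools import product

learning_rates = [1e-4, 1e-3, 1e-2]
batch_sizes = [16, 32, 64]
augmentations = [False, True]

def train_and_validate(lr, batch, augment):
    """Placeholder: a real version would train on the ~4,000 images
    and return validation accuracy for this configuration."""
    return 0.80 + (0.05 if augment else 0.0) - abs(lr - 1e-3)  # toy score

# Pick the configuration with the best (toy) validation score.
best = max(
    product(learning_rates, batch_sizes, augmentations),
    key=lambda cfg: train_and_validate(*cfg),
)
print("best configuration:", best)
```

Each configuration means a full training run, which is why the search quickly demands supercomputer-scale resources.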

How are you using Isambard-AI?

It is a really big computer that’s much faster than a regular computer. Things that might have taken us five years to do, maybe Isambard can do in a couple of weeks.

I was surprised how fast it was when I first started using it. My first thought was that something had gone wrong – but I already had results and I hadn’t even finished my coffee!

We used Isambard-AI to tune the best possible model and then we went through lots of images of skin to see if it was biased or not. And the answer was yes – it did better for lighter skin tones than darker skin tones.
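The bias check described above amounts to comparing the model's accuracy across skin-tone groups. A minimal sketch, with entirely invented data and labels (the real study's tone categories and results are not reproduced here):

```python
# Hypothetical sketch: group predictions by skin-tone label and
# compare per-group accuracy. The data below are invented.
from collections import defaultdict

# (true_label, predicted_label, skin_tone) triples - illustrative only
results = [
    (1, 1, "lighter"), (0, 0, "lighter"), (1, 1, "lighter"), (0, 1, "lighter"),
    (1, 0, "darker"),  (0, 0, "darker"),  (1, 1, "darker"),  (0, 1, "darker"),
]

correct = defaultdict(int)
total = defaultdict(int)
for truth, pred, tone in results:
    total[tone] += 1
    correct[tone] += int(truth == pred)

accuracy = {tone: correct[tone] / total[tone] for tone in total}
print(accuracy)  # a gap between the groups indicates tone bias
```

A sizeable accuracy gap between the groups is exactly the kind of bias the project found.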

Next steps

It’s clear the next steps should involve addressing bias in AI models. I think the work we've done here can hopefully help us put mitigations in place – such as including a test for bias.

I don't know that we can always remove it, because it may simply be harder to diagnose a dark mole on darker skin, but we need to be aware of it.

I would like apps that let me easily share my health concerns with my doctor – in this case, taking a picture of a mole and getting a screening result. I realise there could be some errors, and I would still want to see a doctor, but it would make my life easier.