They’ll be some of the weirder selfies you’ve ever taken, but a study using artificial intelligence to analyze images of skin lesions suggests smartphones may soon help humans detect skin cancer.
Published today in Nature, the study began with an unremarkable image-recognition network provided by Google, pre-trained to identify objects in images. Led by Stanford professor and former Google exec Sebastian Thrun, researchers showed the AI thousands and thousands of medical images—129,450 from Stanford University Medical Center and 18 open-source repositories, to be exact—each labeled to tell the machine what it was looking at.
After looking at hundreds of images of a specific type of lesion, the AI begins to recognize similarities between them. The algorithm learns to differentiate lesions from healthy skin, potentially based on traits like coloration and contrast. As it sees more images, it can draw more accurate conclusions about benign, malignant, and non-neoplastic lesions. (Non-neoplastic lesions include things like inflammation.) With this technology, people around the world could have access to low-cost screening for skin cancer, the most common form of the disease.
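The approach described above—reusing a network pre-trained on everyday images and training only a new classifier for lesion categories—is known as transfer learning. The following is a minimal, self-contained sketch of that idea in plain NumPy, not the Stanford team's actual system: the frozen "feature extractor" here is a toy stand-in for the deep network, and the images and class clusters are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
CLASSES = ["benign", "malignant", "non-neoplastic"]

def extract_features(images, projection):
    # Stand-in for the frozen, pre-trained network: in the real system this
    # would be the deep network's learned activations, reused unchanged.
    return np.maximum(images @ projection, 0.0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_head(feats, labels, n_classes, lr=0.5, steps=300):
    # Only this small classification "head" is trained on the labeled
    # lesion images; the feature extractor stays fixed.
    w = np.zeros((feats.shape[1], n_classes))
    onehot = np.eye(n_classes)[labels]
    for _ in range(steps):
        probs = softmax(feats @ w)
        w -= lr * feats.T @ (probs - onehot) / len(feats)
    return w

# Synthetic "lesion images": three noisy clusters, one per class.
n_per, dim = 60, 32
centers = rng.normal(size=(3, dim))
images = np.vstack([c + 0.3 * rng.normal(size=(n_per, dim)) for c in centers])
labels = np.repeat(np.arange(len(CLASSES)), n_per)

projection = rng.normal(size=(dim, 16))  # "pre-trained" weights, kept frozen
feats = extract_features(images, projection)
w = train_head(feats, labels, len(CLASSES))
preds = softmax(feats @ w).argmax(axis=1)
accuracy = (preds == labels).mean()
```

The point of the sketch is the division of labor: most of the network's knowledge comes for free from pre-training, and only a thin layer has to learn what distinguishes benign, malignant, and non-neoplastic lesions.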
How does the algorithm stack up against human dermatologists?
For the most part, quite well. In one trial that pitted the AI against two dermatologists to determine whether a lesion was benign, malignant, or non-neoplastic, the algorithm scored 71 percent accuracy to the humans' 66 percent and 65 percent.
The algorithm also outperformed humans in a separate trial involving 21 dermatologists. In another test, the team tasked the algorithm with looking at a lesion image and deciding whether to treat it or reassure the patient that it was fine. The algorithm proved slightly more accurate than most of the doctors, underperforming them in only two of the scenarios shown.
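Treat-or-reassure comparisons like this are conventionally scored on two numbers: sensitivity (the share of dangerous lesions correctly sent for treatment) and specificity (the share of harmless ones correctly reassured). A minimal sketch of that scoring, with entirely made-up decisions rather than data from the study:

```python
def sensitivity_specificity(decisions, truth):
    """decisions/truth are booleans: True means malignant / 'treat'."""
    tp = sum(d and t for d, t in zip(decisions, truth))
    tn = sum(not d and not t for d, t in zip(decisions, truth))
    fn = sum(not d and t for d, t in zip(decisions, truth))
    fp = sum(d and not t for d, t in zip(decisions, truth))
    sensitivity = tp / (tp + fn)  # malignant lesions caught
    specificity = tn / (tn + fp)  # benign lesions correctly reassured
    return sensitivity, specificity

# Hypothetical example: the first three lesions are malignant, the rest benign.
truth     = [True, True, True, False, False, False, False, False]
decisions = [True, True, False, False, False, False, True, False]
sens, spec = sensitivity_specificity(decisions, truth)
# sens ≈ 0.67 (2 of 3 malignant treated); spec = 0.80 (4 of 5 benign reassured)
```

The two numbers pull against each other: flagging everything for treatment drives sensitivity to 100 percent while collapsing specificity, which is why a classifier can only be judged on both at once.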
There is a crucial caveat, though: The Nature paper does not mention skin color, and all of the examples shown feature people with lighter skin. Dark skin has traditionally confounded machine-learning engineers; algorithms have labeled black people as gorillas, and AI-based evaluations of beauty have favored whites. Until researchers train the algorithm on more examples of lesions on darker skin, it will only be useful for a segment of the global population.
For now, the Stanford team's algorithm runs only on full computers, but the researchers are interested in pursuing a smartphone app.
“Everyone will have a supercomputer in their pockets with a number of sensors in it, including a camera,” Andre Esteva, co-lead author of the paper, told Stanford News Service. “What if we could use it to visually screen for skin cancer? Or other ailments?”