Skin cancer is the most commonly diagnosed cancer in the United States. Over five million cases are diagnosed each year, costing the U.S. healthcare system over $8 billion. More than 100,000 of these cases involve melanoma, the deadliest form of skin cancer, which leads to over 9,000 deaths a year, and the numbers continue to grow. Internationally, melanoma also poses a major public health threat. In Australia, there are over 13,000 new cases of melanoma each year, leading to more than 1,600 deaths. In Europe, it causes more than 20,000 deaths a year.
To combat the rising mortality rate of melanoma, detection mechanisms are needed that can diagnose the disease in its earliest stages, when proper treatment can still produce a five-year survival rate above 98 percent. If the disease progresses to the lymphatic system or beyond, the survival rate drops as low as 16 percent.
Today, using an imaging technique called dermoscopy, expert dermatologists can detect disease in its early stages, but two challenges remain. The first is that these specialized physicians may be in limited supply, making them costly to visit or difficult to access in many geographic regions. Convenient, affordable access to regular screening, if available, could help maximize the chances of identifying the disease early. The second challenge is the potential for human error, despite extensive training. For example, up to nine lesions are surgically biopsied for every melanoma discovered. Biopsies of non-melanoma lesions can lead to patient discomfort, disfigurement, and higher healthcare costs.
Therefore, there remains a need for innovation that can improve both access to and accuracy of melanoma detection. Historically, for many diseases, diagnostic blood tests have been one way to provide widespread access to accurate screening. Blood can be drawn by technicians with minimal training, and diagnostics can be performed at local care centers or mailed to labs. Clinical staff, including primary care physicians, can then use the results to reach a diagnosis in the context of the patient's symptoms and history. However, no reliable blood test for melanoma currently exists.
Can doctors someday use pictures to routinely diagnose skin cancer?
At IBM Research, we have been focusing on potential technology applications in skin image analysis; our expertise in healthcare, machine learning, computer vision, and cloud computing, as well as our partnerships with Memorial Sloan Kettering Cancer Center (MSK) and others, puts us in a unique position to approach this task. Recently, we've been developing computer vision techniques that could one day enable clinical staff to use pictures to help screen for disease. Our vision is that taking pictures to diagnose melanoma might one day be as routine as drawing blood to detect other diseases. Equipped with a smartphone or other camera attached to a dermatoscope, doctors, nurses, or support staff could take a photo of a concerning lesion, send it to a cloud-based analytics service, and receive a detailed report on the lesion in response. The report may contain a confidence indicator to help a clinician determine whether the lesion has characteristics of melanoma or another type of skin cancer. The report might also include other relevant supporting information, such as visual patterns observed on the lesion that might indicate certain underlying cellular structures, further assisting clinical staff in assessing a patient's risk.
Screenshot of the IBM Research melanoma image analysis system outlining areas of the skin believed to be part of a lesion.
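To make the envisioned workflow concrete, here is a minimal sketch of what a report with a confidence indicator might look like on the clinician's side. All names, fields, and thresholds here are hypothetical illustrations, not IBM's actual service or clinically validated cutoffs.

```python
from dataclasses import dataclass, field

@dataclass
class LesionReport:
    """Hypothetical report returned by a cloud-based lesion-analysis service."""
    lesion_id: str
    melanoma_confidence: float                 # confidence indicator in [0.0, 1.0]
    observed_patterns: list = field(default_factory=list)  # e.g. "atypical pigment network"

def triage(report: LesionReport, review_threshold: float = 0.5) -> str:
    """Map the confidence indicator to a suggested next step for clinical staff.

    The threshold is purely illustrative; in practice any cutoff would have
    to be chosen and validated against clinical outcomes.
    """
    if report.melanoma_confidence >= review_threshold:
        return "refer for specialist review"
    return "routine monitoring"
```

A clinician's software could then call `triage(LesionReport("L-001", 0.82, ["atypical pigment network"]))` and surface the suggested next step alongside the supporting visual patterns, leaving the diagnosis itself to the clinician.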
In 2015, our team published preliminary research on this topic, describing how the computer vision approaches we are developing can help find disease markers in dermoscopy images, using an initial dataset provided through a collaboration with MSK and the International Skin Imaging Collaboration (ISIC). While this preliminary research demonstrated promising results, the algorithms studied required a human user to outline the skin lesion of interest in each image. In addition, no direct comparison to the diagnostic performance of human experts was made.
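To give a flavor of what such pipelines do with a human-drawn outline, here is a toy example: given an image and an annotator's lesion mask, compute how much darker the outlined region is than the surrounding skin (melanomas are often darker and more varied in color). This is a deliberately simplified stand-in, not the actual feature set used in the published work.

```python
def mean_intensity(image, mask, inside=True):
    """Average grayscale intensity inside (or outside) a lesion mask.

    `image` is a 2-D list of pixel values; `mask` is a 2-D list of booleans
    marking the pixels a human annotator outlined as lesion.
    """
    values = [
        pixel
        for img_row, mask_row in zip(image, mask)
        for pixel, in_lesion in zip(img_row, mask_row)
        if in_lesion == inside
    ]
    return sum(values) / len(values)

def lesion_contrast(image, mask):
    """Toy disease-marker feature: brightness gap between skin and lesion."""
    return mean_intensity(image, mask, inside=False) - mean_intensity(image, mask, inside=True)
```

The limitation the paragraph above describes is visible here: nothing in this code finds the lesion; the quality of every downstream feature depends on a human supplying `mask`.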
In the past two years, we have continued to advance computer vision methods that automatically outline the lesion of interest and more effectively analyze the lesion and surrounding skin to screen for melanoma. With support from our collaborators, we have also directly compared the performance of these new methods to that of eight specialists. On clinically relevant metrics evaluated on the most recent datasets, the approaches we've developed are now about three times better at recognizing melanoma than our previous methods, and perform as well as the specialists at recognizing disease in the dataset. This work is scheduled to be published in a 2017 issue of the IBM Journal of Research and Development. A preprint has been made available for immediate access on arXiv, the online library managed by Cornell University.
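As a simple illustration of what "automatically outlining the lesion" means, the classic Otsu method picks, with no human input, the gray level that best separates darker lesion-like pixels from lighter skin. The actual methods in the paper are far more sophisticated; this sketch only demonstrates the general idea of replacing a human-drawn outline with a computed one.

```python
def otsu_threshold(pixels, levels=256):
    """Classic Otsu method: choose the gray level that maximizes the
    between-class variance separating dark pixels from light ones."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))

    best_t, best_var = 0, -1.0
    weight_bg, sum_bg = 0, 0
    for t in range(levels):
        weight_bg += hist[t]          # pixels at or below candidate threshold
        if weight_bg == 0:
            continue
        weight_fg = total - weight_bg
        if weight_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / weight_bg
        mean_fg = (total_sum - sum_bg) / weight_fg
        between_var = weight_bg * weight_fg * (mean_bg - mean_fg) ** 2
        if between_var > best_var:
            best_var, best_t = between_var, t
    return best_t

def outline_mask(image, threshold):
    """Mark pixels at or below the threshold (the darker region) as lesion."""
    return [[pixel <= threshold for pixel in row] for row in image]
```

On a strongly bimodal image this produces a lesion mask automatically, which is the step that previously required a human annotator.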
It's important to recognize that this work is still in its early stages. As an analogy, a well-established dermatologist may see 25 or more patients per day. Assuming approximately 20 workdays a month, that amounts to over 6,000 patients per year, with potentially multiple lesions observed on each patient. By comparison, the IBM system has seen fewer than 3,000 lesions, putting its experience level closer to that of a medical student. More research and testing are necessary before the technology can contribute to clinical practice. But the hope is that one day, the simple act of taking a picture could help clinicians quickly and accurately test for melanoma and other dangerous skin diseases, allowing medical staff to focus more of their efforts on the appropriate treatment and management of disease.