
Using Artificial Intelligence to Detect Cervical Cancer


Caption: Doctor reviewing cell phone
Credit: Getty Images/Dean Mitchell

My last post highlighted the use of artificial intelligence (AI) to create an algorithm capable of detecting 10 different kinds of irregular heart rhythms. But that’s just one of the many potential medical uses of AI. In this post, I’ll tell you how NIH researchers are pairing AI analysis with smartphone cameras to help more women avoid cervical cancer.

In work described in the Journal of the National Cancer Institute [1], researchers used a high-performance computer to analyze thousands of cervical photographs, obtained more than 20 years ago from volunteers in a cancer screening study. The computer learned to recognize specific patterns associated with pre-cancerous and cancerous changes of the cervix, and that information was used to develop an algorithm for reliably detecting such changes in the collection of images. In fact, the AI-generated algorithm outperformed human expert reviewers and all standard screening tests in detecting pre-cancerous changes.

Nearly all cervical cancers are caused by the human papillomavirus (HPV). Cervical cancer screening, first with Pap smears and now also with HPV testing, has greatly reduced deaths from cervical cancer. But this cancer still claims the lives of more than 4,000 U.S. women each year, with higher frequency among women who are black or older [2]. Around the world, more than a quarter-million women die each year of this preventable disease, mostly in poor and remote areas [3].

These troubling numbers have kept researchers on the lookout for low-cost, easy-to-use tools that could be highly effective at detecting the HPV infections most likely to advance to cervical cancer. Such tools would also need to work well in areas with limited resources for sample preparation and lab analysis. That’s what led to this collaboration between researchers at NIH’s National Cancer Institute (NCI) and Global Good, Bellevue, WA, an Intellectual Ventures collaboration with Bill Gates to invent life-changing technologies for the developing world.

Global Good researchers contacted NCI experts hoping to apply AI to a large dataset of cervical images. The NCI experts suggested an 18-year cervical cancer screening study in Costa Rica. The NCI-supported project, completed in the 1990s, generated nearly 60,000 cervical images, later digitized by NIH’s National Library of Medicine and stored away safely.

The researchers agreed that all these images, obtained in a highly standardized way, would serve as perfect training material for a computer to develop a detection algorithm for cervical cancer. This type of AI, called machine learning, involves feeding tens of thousands of images into a computer equipped with one or more high-powered graphics processing units (GPUs), similar to what you’d find in an Xbox or PlayStation. The GPUs let the computer crunch the large amounts of visual data in the images and devise a set of rules, or algorithm, for learning to “see” physical features.
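
To make the GPU idea concrete, here is a minimal sketch, assuming PyTorch is available; the batch size and image dimensions are illustrative choices, not the study’s actual settings.

```python
import torch

# Use a graphics processing unit (GPU) if one is available; otherwise fall
# back to the ordinary CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A batch of 32 color images, each 224 x 224 pixels, stored as a grid of
# numbers that the GPU can process in parallel.
batch = torch.rand(32, 3, 224, 224, device=device)
print("Crunching", batch.shape[0], "images on", device)
```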

Here’s how they did it. First, the researchers got the computer to create a convolutional neural network. That’s a fancy way of saying that they trained it to read images, filter out the millions of non-essential bytes, and retain the few hundred bytes in the photo that make it uniquely identifiable. They fed 1.28 million color images covering hundreds of common objects into the computer to create layers of processing ability that, like the human visual system, can distinguish objects and their qualities.
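
For readers curious what this pre-training step looks like in code, here is a minimal sketch, assuming PyTorch and torchvision; the ResNet-50 architecture is an illustrative stand-in, not necessarily the network used in the study.

```python
import torchvision.models as models

# Load a convolutional neural network whose layers were already trained on
# roughly 1.28 million labeled photos of everyday objects (the ImageNet set).
backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)

# The early layers now act like a general-purpose visual system, responding
# to edges, colors, and textures rather than to any particular object.
print(backbone.conv1)  # the first convolutional layer
```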

Once the convolutional neural network was formed, the researchers took the next big step: training the system to see the physical properties of a healthy cervix, a cervix with worrisome cellular changes, or a cervix with pre-cancer. That’s where the thousands of cervical images from the Costa Rican screening trial literally entered the picture.
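
A rough sketch of that training step, again assuming PyTorch: the folder name "cervix_images/train" and the three class labels below are hypothetical placeholders for the standardized screening photos and diagnostic categories described above, not the study’s actual data layout.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

classes = ["healthy", "equivocal_changes", "precancer"]  # illustrative labels

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),   # match the size the network expects
    transforms.ToTensor(),
])

# Hypothetical folder of training photos, organized one subfolder per class.
train_set = datasets.ImageFolder("cervix_images/train", transform=preprocess)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from the object-recognition network and swap in a new final layer
# that outputs one score per cervical category.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, len(classes))

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:        # a single pass, for illustration only
    images, labels = images.to(device), labels.to(device)
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```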

When all these layers of processing ability were formed, the researchers had created the “automated visual evaluation” algorithm. It went on to identify with remarkable accuracy the images associated with the Costa Rican study’s 241 known precancers and 38 known cancers. The algorithm’s few minor hiccups came mainly from suboptimal images with faded colors or slightly blurred focus.
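
How well such an algorithm separates precancers from healthy tissue is typically summarized with a measure like the area under the ROC curve. Here is a tiny sketch, assuming scikit-learn; the "truth" and "scores" arrays are made-up numbers, not results from the study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

truth = np.array([0, 0, 1, 1, 0, 1])                # 1 = confirmed precancer
scores = np.array([0.1, 0.3, 0.8, 0.9, 0.2, 0.7])   # algorithm's predicted risk

print("Area under the ROC curve:", roc_auc_score(truth, scores))
```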

These minor glitches now have the researchers working hard to optimize the process, including determining how health workers can capture good-quality photos of the cervix with a smartphone during a routine pelvic exam and how to outfit smartphones with the software needed to analyze cervical photos quickly in real-world settings. The goal is to enable health workers to use a smartphone or similar device to provide women with cervical screening and treatment during a single visit.
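
One common way to get a trained network onto a phone is to export it as a self-contained model file that a mobile app can load. The sketch below assumes PyTorch’s TorchScript export; the file name and input size are illustrative, and the actual smartphone software described here may use a different toolchain entirely.

```python
import torch
import torchvision.models as models

# Placeholder for the trained cervical-screening network.
model = models.resnet50(weights=None)
model.eval()

# Trace the network with one example input (a single 224 x 224 color image)
# and save a self-contained file that a mobile app can load and run.
example_input = torch.rand(1, 3, 224, 224)
scripted = torch.jit.trace(model, example_input)
scripted.save("cervical_screening_model.pt")
```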

In fact, the researchers are already field testing their AI-inspired approach on smartphones in the United States and abroad. If all goes well, this low-cost, mobile approach could provide a valuable new tool to help reduce the burden of cervical cancer among underserved populations.

The day that cervical cancer no longer steals the lives of hundreds of thousands of women a year worldwide will be a joyful moment for cancer researchers, as well as a major victory for women’s health.

References:

[1] An observational study of deep learning and automated evaluation of cervical images for cancer screening. Hu L, Bell D, Antani S, Xue Z, Yu K, Horning MP, Gachuhi N, Wilson B, Jaiswal MS, Befano B, Long LR, Herrero R, Einstein MH, Burk RD, Demarco M, Gage JC, Rodriguez AC, Wentzensen N, Schiffman M. J Natl Cancer Inst. 2019 Jan 10. [Epub ahead of print]

[2] “Study: Death Rate from Cervical Cancer Higher Than Thought,” American Cancer Society, Jan. 25, 2017.

[3] “World Cancer Day,” World Health Organization, Feb. 2, 2017.

Links:

Cervical Cancer (National Cancer Institute/NIH)

Global Good (Intellectual Ventures, Bellevue, WA)

NIH Support: National Cancer Institute; National Library of Medicine


Head and Neck Cancer: Building the Evidence Base for Precision Oncology



Caption: Triple immunohistochemical stained oral squamous cell carcinoma: nuclei in brown, cytoplasm in red, and cytoplasmic membranes in blue green.
Credit: Alfredo A. Molinolo, Oral and Pharyngeal Cancer Branch, National Institute of Dental and Craniofacial Research, NIH

An exciting new era in cancer research is emerging, called precision oncology. It builds on decades of research establishing that cancers start with glitches in the genome, the cell’s instruction book. Researchers have now identified numerous ways that mutations in susceptible genes can drive the cancer process. Knowing where and how to look for them brings greater precision to diagnosing cancers and gives doctors key clues about which treatments might work and which ones won’t.

To build a firmer evidence base for precision oncology, more and more cancer genomes, from many different body sites, must be analyzed for clues about the drivers of the malignant process. That’s why it’s always exciting to see a new genomic analysis that adds substantially to our understanding of a common tumor. The latest to appear, published online at the journal Nature, comes from an NIH-supported study on the most common type of head and neck cancer, called squamous cell carcinoma. The technologically advanced analysis confirms that many previously suspected genes do indeed play a role in head and neck cancer. But that’s not all. The new data also identify several previously unknown subtypes of this cancer. The first descriptions of the abnormal molecular wiring in these subtypes are outlined, suggesting possible strategies  to neutralize or destroy the cancer cells. That’s potentially good news to help guide and inform the treatment of the estimated 55,000 Americans who are diagnosed with a head and neck cancer each year.