
Using Artificial Intelligence to Detect Cervical Cancer


Doctor reviewing cell phone
Credit: gettyimages/Dean Mitchell

My last post highlighted the use of artificial intelligence (AI) to create an algorithm capable of detecting 10 different kinds of irregular heart rhythms. But that’s just one of the many potential medical uses of AI. In this post, I’ll tell you how NIH researchers are pairing AI analysis with smartphone cameras to help more women avoid cervical cancer.

In work described in the Journal of the National Cancer Institute [1], researchers used a high-performance computer to analyze thousands of cervical photographs, obtained more than 20 years ago from volunteers in a cancer screening study. The computer learned to recognize specific patterns associated with pre-cancerous and cancerous changes of the cervix, and that information was used to develop an algorithm for reliably detecting such changes in the collection of images. In fact, the AI-generated algorithm outperformed human expert reviewers and all standard screening tests in detecting pre-cancerous changes.

Nearly all cervical cancers are caused by the human papillomavirus (HPV). Cervical cancer screening, first with Pap smears and now also with HPV testing, has greatly reduced deaths from cervical cancer. But this cancer still claims the lives of more than 4,000 U.S. women each year, with higher frequency among women who are black or older [2]. Around the world, more than a quarter-million women die each year of this preventable disease, mostly in poor and remote areas [3].

These troubling numbers have kept researchers on the lookout for low-cost, easy-to-use tools that could be highly effective at detecting the HPV infections most likely to advance to cervical cancer. Such tools would also need to work well in areas with limited resources for sample preparation and lab analysis. That need led to this collaboration between researchers at NIH’s National Cancer Institute (NCI) and Global Good, Bellevue, WA, an Intellectual Ventures collaboration with Bill Gates to invent life-changing technologies for the developing world.

Global Good researchers contacted NCI experts hoping to apply AI to a large dataset of cervical images. The NCI experts suggested images from an 18-year cervical cancer screening study in Costa Rica. That NCI-supported project, completed in the 1990s, generated nearly 60,000 cervical images, which were later digitized by NIH’s National Library of Medicine and safely archived.

The researchers agreed that all these images, obtained in a highly standardized way, would serve as perfect training material for a computer to develop a detection algorithm for cervical cancer. This type of AI, called machine learning, involves feeding tens of thousands of images into a computer equipped with one or more high-powered graphics processing units (GPUs), similar to those found in an Xbox or PlayStation. The GPUs allow the computer to crunch the large sets of visual data in the images and devise a set of rules, or algorithm, that lets it learn to “see” physical features.

Here’s how they did it. First, the researchers got the computer to create a convolutional neural network. That’s a fancy way of saying that they trained it to read images, filter out the millions of non-essential bytes, and retain the few hundred bytes in the photo that make it uniquely identifiable. They fed 1.28 million color images covering hundreds of common objects into the computer to create layers of processing ability that, like the human visual system, can distinguish objects and their qualities.
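To make that pretraining step a bit more concrete, here is a minimal Python sketch using PyTorch and torchvision. It is purely illustrative: the ResNet architecture, the library calls, and the ImageNet weights are assumptions chosen for the example, not details reported by the study.

```python
# Illustrative sketch only, not the study's actual code. torchvision ships
# convolutional neural networks already trained on ~1.28 million everyday
# color images (the ImageNet set), which supplies the general-purpose
# "layers of processing ability" described above.
import torch
from torchvision import models

# A GPU, if available, does the heavy number-crunching.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Load a stock convolutional neural network with its pretrained weights.
cnn = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
cnn = cnn.to(device)

print(cnn.fc)  # the final layer, originally sized for 1,000 everyday object classes
```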

Once the convolutional neural network was formed, the researchers took the next big step: training the system to see the physical properties of a healthy cervix, a cervix with worrisome cellular changes, or a cervix with pre-cancer. That’s where the thousands of cervical images from the Costa Rican screening trial literally entered the picture.
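In code, that retraining step might look something like the sketch below, which continues the example above: it swaps the network’s final layer for one that scores three cervix categories and then fine-tunes on labeled cervical photos. The folder layout, image size, and training settings are hypothetical placeholders, not values from the study.

```python
# Illustrative fine-tuning sketch; paths and hyperparameters are hypothetical.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# Start from the pretrained network and swap its final layer for one that
# scores three cervix categories: healthy, worrisome changes, pre-cancer.
cnn = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
cnn.fc = nn.Linear(cnn.fc.in_features, 3)
cnn = cnn.to(device)

# Hypothetical folder of labeled cervical photos, one subfolder per category.
prep = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_set = datasets.ImageFolder("cervix_images/train", transform=prep)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

optimizer = torch.optim.Adam(cnn.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

cnn.train()
for epoch in range(5):                       # a few passes over the training images
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = loss_fn(cnn(images), labels)  # how far off are the predictions?
        loss.backward()                      # nudge the network's internal rules
        optimizer.step()
```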

When all these layers of processing ability were formed, the researchers had created the “automated visual evaluation” algorithm. It went on to identify with remarkable accuracy the images associated with the Costa Rican study’s 241 known precancers and 38 known cancers. The algorithm’s few minor hiccups came mainly from suboptimal images with faded colors or slightly blurred focus.
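To give a sense of how such an algorithm’s accuracy is typically scored, the sketch below, continuing the fine-tuning example above, computes an area-under-the-curve (AUC) value on held-out images. The held-out folder and the AUC summary are illustrative conventions for this kind of evaluation, not the study’s published analysis.

```python
# Illustrative scoring sketch; not the study's published analysis.
import torch
from torch.utils.data import DataLoader
from torchvision import datasets
from sklearn.metrics import roc_auc_score

test_set = datasets.ImageFolder("cervix_images/test", transform=prep)  # held-out photos
test_loader = DataLoader(test_set, batch_size=32)

cnn.eval()
scores, truth = [], []
with torch.no_grad():
    for images, labels in test_loader:
        probs = torch.softmax(cnn(images.to(device)), dim=1)
        # Treat "worrisome changes or pre-cancer" (classes 1 and 2) as positive.
        scores.extend((probs[:, 1] + probs[:, 2]).cpu().tolist())
        truth.extend((labels >= 1).int().tolist())

print("Area under the curve on held-out images:", roc_auc_score(truth, scores))
```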

Prompted by these minor glitches, the researchers are now working hard to optimize the process: determining how health workers can capture good-quality photos of the cervix with a smartphone during a routine pelvic exam, and how to outfit smartphones with software that can analyze cervical photos quickly in real-world settings. The goal is to enable health workers to use a smartphone or similar device to provide women with cervical screening and treatment during a single visit.
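One generic way to make a trained network runnable on a phone is to export it in a mobile-friendly format. The TorchScript export below is a sketch of that idea under the same assumptions as the previous examples; it is not a description of the researchers’ actual field-test software, and the file name is hypothetical.

```python
# Generic sketch of packaging the trained network for on-device use.
import torch

cnn.eval()
example = torch.rand(1, 3, 224, 224).to(device)   # stand-in for a phone photo tensor
scripted = torch.jit.trace(cnn, example)          # freeze the model into a portable form
scripted.save("cervix_screen.pt")                 # hypothetical file name

# A phone app would then load "cervix_screen.pt", run the camera image through
# the same resizing step used in training, and display the three category scores.
```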

In fact, the researchers are already field testing their AI-inspired approach on smartphones in the United States and abroad. If all goes well, this low-cost, mobile approach could provide a valuable new tool to help reduce the burden of cervical cancer among underserved populations.

The day that cervical cancer no longer steals the lives of hundreds of thousands of women a year worldwide will be a joyful moment for cancer researchers, as well as a major victory for women’s health.

References:

[1] An observational study of deep learning and automated evaluation of cervical images for cancer screening. Hu L, Bell D, Antani S, Xue Z, Yu K, Horning MP, Gachuhi N, Wilson B, Jaiswal MS, Befano B, Long LR, Herrero R, Einstein MH, Burk RD, Demarco M, Gage JC, Rodriguez AC, Wentzensen N, Schiffman M. J Natl Cancer Inst. 2019 Jan 10. [Epub ahead of print]

[2] “Study: Death Rate from Cervical Cancer Higher Than Thought,” American Cancer Society, Jan. 25, 2017.

[3] “World Cancer Day,” World Health Organization, Feb. 2, 2017.

Links:

Cervical Cancer (National Cancer Institute/NIH)

Global Good (Intellectual Ventures, Bellevue, WA)

NIH Support: National Cancer Institute; National Library of Medicine


Building a Smarter Bandage


Smart Bandage

Credit: Tufts University, Medford, MA

Smartphones, smartwatches, and smart electrocardiograms. How about a smart bandage?

This image features a prototype of a smart bandage equipped with temperature and pH sensors (lower right) printed directly onto the surface of a thin, flexible medical tape. You also see the “brains” of the operation: a microprocessor (upper left). When the sensors prompt the microprocessor, it heats up a hydrogel heating element in the bandage, releasing drugs and/or other healing substances on demand. It can also wirelessly transmit messages directly to a smartphone to keep patients and doctors updated.
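To illustrate that sensors-prompt-the-microprocessor logic in the simplest possible terms, here is a hypothetical control-loop sketch in Python. Every threshold, reading, and function name below is a stand-in for illustration; none of it describes the Tufts prototype’s actual firmware.

```python
# Conceptual sketch only: thresholds, readings, and function names are hypothetical.
import random
import time

FEVERISH_TEMP_C = 38.0   # hypothetical threshold for a warm, possibly infected wound
ALKALINE_PH = 7.8        # hypothetical pH threshold suggesting poor healing

def read_temperature_c():
    # Placeholder for the printed temperature sensor (simulated reading).
    return random.uniform(33.0, 39.5)

def read_ph():
    # Placeholder for the printed pH sensor (simulated reading).
    return random.uniform(6.5, 8.5)

def heat_hydrogel():
    # Placeholder: the microprocessor warms the hydrogel to release the drug.
    print("Heating hydrogel: releasing on-demand treatment")

def notify_smartphone(message):
    # Placeholder for the wireless link to the patient's or doctor's phone.
    print("Sent to phone:", message)

for _ in range(3):                         # a few demo cycles; real firmware loops continuously
    temp, ph = read_temperature_c(), read_ph()
    if temp > FEVERISH_TEMP_C or ph > ALKALINE_PH:
        heat_hydrogel()
        notify_smartphone(f"Wound alert: temperature {temp:.1f} C, pH {ph:.1f}")
    time.sleep(1)                          # a real device would check far less often
```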

While the smart bandage might help mend everyday cuts and scrapes, it was designed with the intent of helping people with hard-to-heal chronic wounds, such as leg and foot ulcers. Chronic wounds affect millions of Americans, including many seniors [1]. Such wounds are often treated at home and, if managed incorrectly, can lead to infections and potentially serious health problems.


Seven More Awesome Technologies Made Possible by Your Tax Dollars


We live in a world energized by technological advances, from that new app on your smartphone to drones and self-driving cars. As you can see from this video, NIH-supported researchers are also major contributors, developing a wide range of amazing biomedical technologies that offer tremendous potential to improve our health.

Produced by the NIH’s National Institute of Biomedical Imaging and Bioengineering (NIBIB), this video starts by showcasing some cool fluorescent markers that are custom-designed to light up specific cells in the body. This technology is already helping surgeons see and remove tumor cells with greater precision in people with head and neck cancer [1]. Further down the road, it might also be used to light up nerves, which can be very difficult to see—and spare—during operations for cancer and other conditions.

Other great things to come include:

  • A wearable tattoo that detects alcohol levels in perspiration and wirelessly transmits the information to a smartphone.
  • Flexible coils that produce high-quality images during magnetic resonance imaging (MRI) [2-3]. In the future, these individualized, screen-printed coils may improve comfort and decrease scan times for people undergoing MRI, especially infants and small children.
  • A time-release capsule filled with a star-shaped polymer containing the anti-malarial drug ivermectin. The capsule slowly dissolves in the stomach over two weeks, with the goal of reducing the need for daily doses of ivermectin to prevent malaria infections in at-risk people [4].
  • A new radiotracer to detect prostate cancer that has spread to other parts of the body. Early clinical trial results show the radiotracer, made up of carrier molecules bonded tightly to a radioactive atom, appears to be safe and effective [5].
  • A new supercooling technique that promises to extend the time that organs donated for transplantation can remain viable outside the body [6-7]. For example, current technology can preserve donated livers outside the body for just 24 hours. In animal studies, this new technique quadruples that storage time to up to four days.
  • A wearable skin patch with dissolvable microneedles capable of effectively delivering an influenza vaccine. This painless technology, which has produced promising early results in humans, may offer a simple, affordable alternative to needle-and-syringe immunization [8].

If you like what you see here, be sure to check out this previous NIH video that shows six more awesome biomedical technologies that your tax dollars are helping to create. So, let me extend a big thanks to you from those of us at NIH—and from all Americans who care about the future of their health—for your strong, continued support!

References:

[1] Image-guided surgery in cancer: A strategy to reduce incidence of positive surgical margins. Wiley Interdiscip Rev Syst Biol Med. 2018 Feb 23.

[2] Screen-printed flexible MRI receive coils. Corea JR, Flynn AM, Lechêne B, Scott G, Reed GD, Shin PJ, Lustig M, Arias AC. Nat Commun. 2016 Mar 10;7:10839.

[3] Printed Receive Coils with High Acoustic Transparency for Magnetic Resonance Guided Focused Ultrasound. Corea J, Ye P, Seo D, Butts-Pauly K, Arias AC, Lustig M. Sci Rep. 2018 Feb 21;8(1):3392.

[4] Oral, ultra-long-lasting drug delivery: Application toward malaria elimination goals. Bellinger AM, Jafari M, Grant TM, Zhang S, Slater HC, Wenger EA, Mo S, Lee YL, Mazdiyasni H, Kogan L, Barman R, Cleveland C, Booth L, Bensel T, Minahan D, Hurowitz HM, Tai T, Daily J, Nikolic B, Wood L, Eckhoff PA, Langer R, Traverso G. Sci Transl Med. 2016 Nov 16;8(365):365ra157.

[5] Clinical Translation of a Dual Integrin αvβ3– and Gastrin-Releasing Peptide Receptor–Targeting PET Radiotracer, 68Ga-BBN-RGD. Zhang J, Niu G, Lang L, Li F, Fan X, Yan X, Yao S, Yan W, Huo L, Chen L, Li Z, Zhu Z, Chen X. J Nucl Med. 2017 Feb;58(2):228-234.

[6] Supercooling enables long-term transplantation survival following 4 days of liver preservation. Berendsen TA, Bruinsma BG, Puts CF, Saeidi N, Usta OB, Uygun BE, Izamis ML, Toner M, Yarmush ML, Uygun K. Nat Med. 2014 Jul;20(7):790-793.

[7] The promise of organ and tissue preservation to transform medicine. Giwa S, Lewis JK, Alvarez L, Langer R, Roth AE, et al. Nat Biotechnol. 2017 Jun 7;35(6):530-542.

[8] The safety, immunogenicity, and acceptability of inactivated influenza vaccine delivered by microneedle patch (TIV-MNP 2015): a randomised, partly blinded, placebo-controlled, phase 1 trial. Rouphael NG, Paine M, Mosley R, Henry S, McAllister DV, Kalluri H, Pewin W, Frew PM, Yu T, Thornburg NJ, Kabbani S, Lai L, Vassilieva EV, Skountzou I, Compans RW, Mulligan MJ, Prausnitz MR; TIV-MNP 2015 Study Group.

Links:

National Institute of Biomedical Imaging and Bioengineering (NIH)

Center for Wearable Sensors (University of California, San Diego)

Hyperpolarized MRI Technology Resource Center (University of California, San Francisco)

Center for Engineering in Medicine (Massachusetts General Hospital, Boston)

Center for Drug Design, Development and Delivery (Georgia Institute of Technology, Atlanta)

NIH Support: National Institute of Biomedical Imaging and Bioengineering; National Institute of Diabetes and Digestive and Kidney Diseases; National Institute of Allergy and Infectious Diseases


Cool Videos: Insulin from Bacteria to You


If you have a smartphone, you’ve probably used it to record a video or two. But could you use it to produce a video that explains a complex scientific topic in 2 minutes or less? That was the challenge posed by the RCSB Protein Data Bank last spring to high school students across the nation. And the winning result is the video that you see above!

This year’s contest, which asked students to provide a molecular view of diabetes treatment and management, attracted 53 submissions from schools coast to coast. The winning team—Andrew Ma, George Song, and Anirudh Srikanth—created their video as their final project for their advanced placement (AP) biology class at West Windsor-Plainsboro High School South, Princeton Junction, NJ.


Portable System Uses Light to Diagnose Bacterial Infections Faster


PAD system

Caption: PAD system. Left, four optical testing cubes (blue and white) stacked on the electronic base station (white with initials); right, a smartphone with a special app to receive test results transmitted by the electronic base station.
Credit: Park et al. Sci. Adv. 2016

Every year, hundreds of thousands of Americans acquire potentially life-threatening bacterial infections while in the hospital, nursing home, or other health-care settings [1]. Such infections can be caused by a variety of bacteria, which may respond quite differently to different antibiotics. To match a patient with the most appropriate antibiotic therapy, it’s crucial to determine as quickly as possible what type of bacteria is causing his or her infection. In an effort to improve that process, an NIH-funded team is working to develop a point-of-care system and smartphone app aimed at diagnosing bacterial infections in a faster, more cost-effective manner.

The new portable system, described recently in the journal Science Advances, uses a novel light-based method for detecting telltale genetic sequences from bacteria in bodily fluids, such as blood, urine, or drainage from a skin abscess. Testing takes place within small optical cubes that, when placed on an electronic base station, deliver test results within a couple of hours via a simple readout sent directly to a smartphone [2]. When the system was tested on clinical samples from a small number of hospitalized patients, the researchers found that it diagnosed bacterial infections about as accurately as current methods, but more swiftly and at lower cost. The new system could potentially also be used to test for antibiotic-resistant bacteria and contamination of medical devices.