Posted on by Dr. Francis Collins
The herringbone motif is familiar as the classic, V-shaped patterned weave long popular in tweed jackets. But the nano-sized herringbone pattern seen here is much more than a fashion statement. It helps to solve a tricky design problem for a cancer-detecting “lab-on-a-chip” device.
A research team, led by Yong Zeng, University of Kansas, Lawrence, and Andrew Godwin at the University of Kansas Medical Center, Kansas City, previously developed a lab-on-a-chip that senses exosomes, tiny bubble-shaped structures that most mammalian cells constantly secrete into the bloodstream. Once thought of primarily as trash bags used by cells to rid themselves of waste products, exosomes carry important molecular information (RNA, protein, and metabolites) used by cells to communicate and influence the behavior of other cells.
What’s also interesting is that tumor cells produce more exosomes than healthy cells do. That makes these 30-to-150-nanometer structures (a nanometer is a billionth of a meter) potentially useful for detecting cancer. In fact, these NIH-funded researchers found that their microfluidic device can detect exosomes from ovarian cancer within a 2-microliter blood sample. That’s just 1/25th of a drop!
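To see where the "1/25th of a drop" figure comes from, here's a quick back-of-the-envelope check. It assumes the common laboratory convention that one drop is roughly 50 microliters; that conversion is an assumption for illustration, not a number from the paper:

```python
# Back-of-the-envelope check: what fraction of a drop is a 2-microliter sample?
# ASSUMPTION: ~50 microliters per drop (a common lab rule of thumb).
DROP_UL = 50      # approximate volume of one drop, in microliters
SAMPLE_UL = 2     # blood sample volume used by the device, in microliters

fraction = SAMPLE_UL / DROP_UL
print(fraction)   # 0.04, i.e. 1/25 of a drop
```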
But there was a technical challenge. When such tiny samples are placed into microfluidic channels, the fluid and any particles within it tend to flow in parallel layers without any mixing between them. As a result, exosomes can easily pass through undetected, without ever touching the biosensors on the surface of the chip.
That’s where the herringbone comes in. As reported in Nature Biomedical Engineering, when fluid flows over those 3D herringbone structures, it produces a whirlpool-like effect. As a result, exosomes are more reliably swept into contact with the biosensors.
The team’s distinctive herringbone structures also increase the surface area within the chip. Because the surface is also porous, it allows fluid to drain out slowly to further encourage exosomes to reach the biosensors.
Zeng’s team put their “lab-on-a-chip” to the test using blood samples from 20 patients with ovarian cancer and 10 age-matched controls. The chip was able to rapidly detect the presence of exosomal proteins known to be associated with ovarian cancer.
The researchers report that their device is sensitive enough to detect just 10 exosomes in a 1-microliter sample. It also could be easily adapted to detect exosomal proteins associated with other cancers, and perhaps other conditions as well.
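For context, that reported limit of detection translates into a concentration by straight unit conversion, using only the figures quoted above:

```python
# Converting the reported detection limit into a concentration.
# The device detects as few as 10 exosomes in a 1-microliter sample.
EXOSOMES = 10
SAMPLE_UL = 1

per_ul = EXOSOMES / SAMPLE_UL   # 10 exosomes per microliter
per_ml = per_ul * 1_000         # 10,000 exosomes per milliliter
print(per_ul, per_ml)
```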
Zeng and colleagues haven’t mentioned whether they’re also looking into trying other geometric patterns in their designs. But the next time you see a tweed jacket, just remember that there’s more to its herringbone pattern than meets the eye.
Ultrasensitive microfluidic analysis of circulating exosomes using a nanostructured graphene oxide/polydopamine coating. Zhang P, He M, Zeng Y. Lab Chip. 2016 Aug 2;16(16):3033-3042.
Ultrasensitive detection of circulating exosomes with a 3D-nanopatterned microfluidic chip. Zhang P, Zhou X, He M, Shang Y, Tetlow AL, Godwin AK, Zeng Y. Nature Biomedical Engineering. February 25, 2019.
Ovarian, Fallopian Tube, and Primary Peritoneal Cancer—Patient Version (National Cancer Institute/NIH)
Extracellular RNA Communication (Common Fund/NIH)
Zeng Lab (University of Kansas, Lawrence)
Godwin Laboratory (University of Kansas Medical Center, Kansas City)
NIH Support: National Cancer Institute
Posted on by Dr. Francis Collins
My last post highlighted the use of artificial intelligence (AI) to create an algorithm capable of detecting 10 different kinds of irregular heart rhythms. But that’s just one of the many potential medical uses of AI. In this post, I’ll tell you how NIH researchers are pairing AI analysis with smartphone cameras to help more women avoid cervical cancer.
In work described in the Journal of the National Cancer Institute, researchers used a high-performance computer to analyze thousands of cervical photographs, obtained more than 20 years ago from volunteers in a cancer screening study. The computer learned to recognize specific patterns associated with pre-cancerous and cancerous changes of the cervix, and that information was used to develop an algorithm for reliably detecting such changes in the collection of images. In fact, the AI-generated algorithm outperformed human expert reviewers and all standard screening tests in detecting pre-cancerous changes.
Nearly all cervical cancers are caused by the human papillomavirus (HPV). Cervical cancer screening—first with Pap smears and now also using HPV testing—has greatly reduced deaths from cervical cancer. But this cancer still claims the lives of more than 4,000 U.S. women each year, with higher frequency among women who are black or older. Around the world, more than a quarter-million women die each year of this preventable disease, mostly in poor and remote areas.
These troubling numbers have kept researchers on the lookout for low-cost, easy-to-use tools that could be highly effective at detecting the HPV infections most likely to advance to cervical cancer. Such tools would also need to work well in areas with limited resources for sample preparation and lab analysis. That’s what led to this collaboration involving researchers from NIH’s National Cancer Institute (NCI) and Global Good, Bellevue, WA, which is an Intellectual Ventures collaboration with Bill Gates to invent life-changing technologies for the developing world.
Global Good researchers contacted NCI experts hoping to apply AI to a large dataset of cervical images. The NCI experts suggested an 18-year cervical cancer screening study in Costa Rica. The NCI-supported project, completed in the 1990s, generated nearly 60,000 cervical images, later digitized by NIH’s National Library of Medicine and stored away safely.
The researchers agreed that all these images, obtained in a highly standardized way, would serve as perfect training material for a computer to develop a detection algorithm for cervical cancer. This type of AI, called machine learning, involves feeding tens of thousands of images into a computer equipped with one or more high-powered graphics processing units (GPUs), similar to something you’d find in an Xbox or PlayStation. The GPUs allow the computer to crunch large sets of visual data in the images and devise a set of rules, or algorithms, that allow it to learn to “see” physical features.
Here’s how they did it. First, the researchers got the computer to create a convolutional neural network. That’s a fancy way of saying that they trained it to read images, filter out the millions of non-essential bytes, and retain the few hundred bytes in the photo that make it uniquely identifiable. They fed 1.28 million color images covering hundreds of common objects into the computer to create layers of processing ability that, like the human visual system, can distinguish objects and their qualities.
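The building block of such a network is the convolution itself: a small filter that slides across an image and responds wherever a particular visual feature appears. Here is a minimal sketch of that idea (not the study's actual model, which stacks many learned filters), using a hand-picked vertical-edge filter on a tiny grayscale "image":

```python
import numpy as np

def convolve2d(image, kernel):
    """Slide `kernel` over `image` ("valid" mode) and return the feature map."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            # Each output value measures how well this patch matches the filter.
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

# A 6x6 image: dark on the left (0), bright on the right (1).
image = np.array([[0, 0, 0, 1, 1, 1]] * 6, dtype=float)

# A classic vertical-edge filter: responds where brightness jumps left-to-right.
kernel = np.array([[-1.0, 1.0]])

feature_map = convolve2d(image, kernel)
print(feature_map)
# The strongest responses line up with the column where dark meets bright.
```

A real convolutional neural network learns the values in hundreds of such filters from training data rather than hand-picking them, and stacks many layers of them so that later layers respond to increasingly complex shapes.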
Once the convolutional neural network was formed, the researchers took the next big step: training the system to see the physical properties of a healthy cervix, a cervix with worrisome cellular changes, or a cervix with pre-cancer. That’s where the thousands of cervical images from the Costa Rican screening trial literally entered the picture.
When all these layers of processing ability were formed, the researchers had created the “automated visual evaluation” algorithm. It went on to identify with remarkable accuracy the images associated with the Costa Rican study’s 241 known precancers and 38 known cancers. The algorithm’s few minor hiccups came mainly from suboptimal images with faded colors or slightly blurred focus.
These minor glitches have the researchers now working hard to optimize the process, including determining how health workers can capture good quality photos of the cervix with a smartphone during a routine pelvic exam and how to outfit smartphones with the necessary software to analyze cervical photos quickly in real-world settings. The goal is to enable health workers to use a smartphone or similar device to provide women with cervical screening and treatment during a single visit.
In fact, the researchers are already field testing their AI-inspired approach on smartphones in the United States and abroad. If all goes well, this low-cost, mobile approach could provide a valuable new tool to help reduce the burden of cervical cancer among underserved populations.
The day that cervical cancer no longer steals the lives of hundreds of thousands of women a year worldwide will be a joyful moment for cancer researchers, as well as a major victory for women’s health.
An observational study of Deep Learning and automated evaluation of cervical images for cancer screening. Hu L, Bell D, Antani S, Xue Z, Yu K, Horning MP, Gachuhi N, Wilson B, Jaiswal MS, Befano B, Long LR, Herrero R, Einstein MH, Burk RD, Demarco M, Gage JC, Rodriguez AC, Wentzensen N, Schiffman M. J Natl Cancer Inst. 2019 Jan 10. [Epub ahead of print]
“Study: Death Rate from Cervical Cancer Higher Than Thought,” American Cancer Society, Jan. 25, 2017.
“World Cancer Day,” World Health Organization, Feb. 2, 2017.