This year, too many Americans will go to the doctor for tissue biopsies to find out if they have cancer. Highly trained pathologists will examine the biopsies under a microscope for unusual cells that show the telltale physical features of a suspected cancer. As informative as the pathology will be for considering the road ahead, it would be even more helpful if pathologists had the tools to look widely inside cells for the actual molecules giving rise to the tumor.
Working this “molecular information” into the pathology report would bring greater diagnostic precision, drilling down to the actual biology driving the growth of the tumor. It also would help doctors to match the right treatments to a patient’s tumor and not waste time on drugs that will be ineffective.
That’s why researchers have been busy building the needed tools and also mapping out molecular atlases of common cancers. These atlases, really a series of 3D spatial maps detailing various biological features within the tumor, keep getting better all the time. That includes the comprehensive atlas of colorectal cancer just published in the journal Cell.
This colorectal atlas comes from an NIH-supported team led by Sandro Santagata, Brigham and Women’s Hospital, Boston, and Peter Sorger, Harvard Medical School, Boston, in collaboration with investigators at Vanderbilt University, Nashville, TN. The colorectal atlas joins their previously published high-definition map of melanoma, and both are part of the Human Tumor Atlas Network that’s supported by NIH’s National Cancer Institute.
What’s so interesting about the colorectal atlas is that the team combined traditional pathology with a sophisticated technique for imaging single cells, enabling them to capture the cells’ fine molecular details in an unprecedented way.
They did it using a cutting-edge technique known as cyclic immunofluorescence, or CyCIF. In CyCIF, researchers use many rounds of highly detailed molecular imaging on each tissue sample to generate a rich collection of molecular-level data, cell by cell. Altogether, the researchers captured this fine-scale visual information for nearly 100 million cancer cells isolated from tumor samples representing 93 individuals diagnosed with colorectal cancer.
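Conceptually, the output of an approach like CyCIF is a table of marker measurements, one row of intensities per protein marker, built up round by round and linked back to individual cells. Here’s a minimal sketch of that bookkeeping; the marker names and intensity values are made up for illustration, not the study’s actual data:

```python
# Toy model of cyclic immunofluorescence (CyCIF) data collection:
# each imaging round measures a few protein markers in the same tissue,
# and the rounds accumulate into one marker-intensity profile per cell.
# Marker names and values here are invented for illustration only.

rounds = [
    {"DNA": [0.9, 0.8, 0.7], "CD45": [0.1, 0.7, 0.05]},        # round 1
    {"E-cadherin": [0.8, 0.1, 0.9], "Ki-67": [0.6, 0.2, 0.1]},  # round 2
]

def merge_rounds(rounds):
    """Combine per-round measurements into one table:
    table[marker][cell_index] = intensity for that cell."""
    table = {}
    for rnd in rounds:
        for marker, intensities in rnd.items():
            table[marker] = intensities
    return table

table = merge_rounds(rounds)
n_markers = len(table)                       # 4 markers across 2 rounds
n_cells = len(next(iter(table.values())))    # 3 cells profiled
```

The point of the cycling is in that merge step: a single round can only measure a handful of markers, but repeated rounds on the same sample yield a deep molecular profile for every cell.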
With this single-cell information in hand, they next created detailed 2D maps covering the length and breadth of large portions of the colorectal cancers under study. Finally, with the aid of first author Jia-Ren Lin, also at Harvard Medical School, and colleagues, they stitched together their 2D maps to produce detailed 3D reconstructions showing the length, breadth, and height of the tumors.
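The stitching step can be pictured as stacking serial 2D section maps into a single 3D volume. A minimal sketch with NumPy, assuming the sections have already been aligned (in practice, registering sections to one another is the hard part):

```python
import numpy as np

def stack_sections(sections):
    """Stack pre-aligned 2D section maps (each an H x W array)
    into a single (depth, H, W) volume."""
    return np.stack(sections, axis=0)

# Three fake 4x5 "sections", each filled with its depth index so we
# can see where each one lands in the assembled volume.
sections = [np.full((4, 5), z, dtype=float) for z in range(3)]
volume = stack_sections(sections)
print(volume.shape)  # (3, 4, 5): depth, height, width
```

Once the sections live in one volume like this, features that look like separate blobs in any single 2D slice can be traced through depth to reveal their true connected shape.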
This more detailed view of colorectal cancer has allowed the team to explore differences between normal and tumor tissues, as well as variations within an individual tumor. In fact, they’ve uncovered physical features that had never been seen before.
For instance, an individual tumor has regions populated with malignant cells, while other areas look less affected by the cancer. In between are transitional areas marked by gradients of molecular information. With this high-resolution map as their guide, researchers can now study what this all might mean for the diagnosis, treatment, and prognosis of colorectal cancer.
The atlas also shows that the presence of immune cells varies dramatically within a single tumor. That’s an important discovery because of its potential implications for immunotherapies, in which treatments aim to unleash the immune system in the fight against cancer.
The maps also provide new insights into tumor structure. For example, scientists had previously identified what they thought were 2D pools of a mucus-like substance called mucin with clusters of cancer cells suspended inside. However, the new 3D reconstructions make clear that these aren’t simple mucin pools. Rather, they are cross sections of larger intricate caverns of mucin interconnected by channels, into which cancer cells make finger-like projections.
The good news is the researchers already are helping to bring these methods into the cancer clinic. They also hope to train other scientists to build their own cancer atlases and grow the collection even more.
In the meantime, the team will refine its 3D tumor reconstructions by integrating new imaging technologies and even more data into their maps. It also will map many more colorectal cancer samples to capture the diversity of their basic biology. Also of note, having created atlases for melanoma and colorectal cancer, the team has plans to tackle breast and brain cancers next.
Let me close by saying, if you’re between the ages of 45 and 75, don’t forget to stay up to date on your colorectal cancer screenings. These tests are very good, and they could save your life.
Computers are now being trained to “see” the patterns of disease often hidden in our cells and tissues. Now comes word of yet another remarkable use of computer-generated artificial intelligence (AI): swiftly providing neurosurgeons with valuable, real-time information about what type of brain tumor is present, while the patient is still on the operating table.
This latest advance comes from an NIH-funded clinical trial of 278 patients undergoing brain surgery. The researchers found they could take a small tumor biopsy during surgery, feed it into a trained computer in the operating room, and receive a diagnosis that rivals the accuracy of an expert pathologist.
Traditionally, sending out a biopsy to an expert pathologist and getting back a diagnosis optimally takes about 40 minutes. But the computer can do it in the operating room on average in under 3 minutes. The time saved helps surgeons decide how to proceed with their delicate surgery and make immediate, potentially life-saving treatment decisions for their patients.
As reported in Nature Medicine, researchers led by Daniel Orringer, NYU Langone Health, New York, and Todd Hollon, University of Michigan, Ann Arbor, took advantage of AI and another technological advance called stimulated Raman histology (SRH). The latter is an emerging clinical imaging technique that makes it possible to generate detailed images of a tissue sample without the usual processing steps.
The SRH technique starts off by bouncing laser light rapidly through a tissue sample. This light enables a nearby fiberoptic microscope to capture the cellular and structural details within the sample. Remarkably, it does so by picking up on subtle differences in the way lipids, proteins, and nucleic acids vibrate when exposed to the light.
Then, using a virtual coloring program, the microscope quickly pieces together and colors in the fine structural details, pixel by pixel. The result: a high-resolution, detailed image that you might expect from a pathology lab, minus the staining of cells, mounting of slides, and the other time-consuming processing procedures.
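The virtual coloring step amounts to mapping the measured vibrational-contrast channels to colors, pixel by pixel. Here’s a toy sketch of the idea; the real SRH virtual staining uses a calibrated mapping to H&E-like colors, so the simple linear blend and channel assignments below are illustrative assumptions, not the actual method:

```python
import numpy as np

def virtual_color(chan_a, chan_b):
    """Toy 'virtual stain': blend two grayscale contrast channels
    into one RGB image, pixel by pixel. Channel-to-color choices
    here are made up to mimic a nuclei/cytoplasm look."""
    h, w = chan_a.shape
    rgb = np.zeros((h, w, 3))
    rgb[..., 2] = chan_a          # channel A -> blue (nuclei-like)
    rgb[..., 0] = chan_b          # channel B -> red/pink (cytoplasm-like)
    rgb[..., 1] = 0.3 * chan_b
    return np.clip(rgb, 0.0, 1.0)

# Two fake 8x8 contrast channels standing in for the microscope's output.
a = np.random.default_rng(1).random((8, 8))
b = np.random.default_rng(2).random((8, 8))
img = virtual_color(a, b)
```

The output is an ordinary RGB image, which is why the result can look like a conventionally stained slide without any physical staining having occurred.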
To interpret the SRH images, the researchers turned to computers and machine learning. To learn how to perform a given task, a computer must be fed large datasets of examples. In this case, they used a special class of machine learning called deep neural networks, or deep learning. It’s inspired by the way neural networks in the human brain process information.
In deep learning, computers look for patterns in large collections of data. As they begin to recognize complex relationships, some connections in the network are strengthened while others are weakened. The finished network is typically composed of multiple information-processing layers, which operate on the data to return a result, in this case a brain tumor diagnosis.
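Those “information-processing layers” can be sketched in a few lines. The toy network below is untrained (random weights) and far smaller than the study’s actual convolutional network; it only illustrates how stacked layers transform an input into scores over 13 diagnostic categories:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy two-layer network: an input feature vector flows through a
# hidden layer, then to one score per tissue category. Weights are
# random stand-ins, not the trained SRH classifier.
W1 = rng.normal(size=(64, 16))   # input features -> 16 hidden units
W2 = rng.normal(size=(16, 13))   # hidden units -> 13 category scores

def predict(x):
    h = np.maximum(0.0, x @ W1)           # hidden layer (ReLU)
    scores = h @ W2                       # one score per category
    exp = np.exp(scores - scores.max())   # softmax -> probabilities
    probs = exp / exp.sum()
    return int(np.argmax(probs)), probs

x = rng.normal(size=64)                   # stand-in image features
label, probs = predict(x)                 # label is one of 13 categories
```

Training amounts to nudging the weight matrices so that, over millions of labeled examples, the highest-probability category matches the pathologist’s label; that is the strengthening and weakening of connections described above.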
The team trained the computer to classify tissue samples into one of 13 categories commonly found in a brain tumor sample. Those categories included the most common brain tumors: malignant glioma, lymphoma, metastatic tumors, and meningioma. The training was based on more than 2.5 million labeled images representing samples from 415 patients.
Next, they put the machine to the test. The researchers split each of 278 brain tissue samples into two specimens. One was sent to a conventional pathology lab for prepping and diagnosis. The other was imaged with SRH, and then the trained machine made a diagnosis.
Overall, the machine’s performance was quite impressive, returning the right answer about 95 percent of the time. That’s compared to an accuracy of 94 percent for conventional pathology.
Interestingly, the machine made a correct diagnosis in all 17 cases that a pathologist got wrong. Likewise, the pathologist got the right answer in all 14 cases in which the machine slipped up.
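The head-to-head comparison comes down to simple counting over paired calls on the same specimens: each rater’s accuracy, plus the cases where one was wrong and the other right. A tiny sketch with made-up diagnoses (not the trial’s data):

```python
def accuracy(predictions, truth):
    """Fraction of cases where the prediction matches ground truth."""
    correct = sum(p == t for p, t in zip(predictions, truth))
    return correct / len(truth)

# Five hypothetical specimens with a ground-truth diagnosis and the
# machine's and pathologist's calls on each.
truth       = ["glioma", "meningioma", "lymphoma", "glioma", "metastasis"]
machine     = ["glioma", "meningioma", "lymphoma", "lymphoma", "metastasis"]
pathologist = ["glioma", "meningioma", "glioma",   "glioma",  "metastasis"]

acc_machine = accuracy(machine, truth)       # 0.8
acc_path = accuracy(pathologist, truth)      # 0.8

# Complementary errors: cases the pathologist missed but the machine
# caught (and the symmetric list, swapping the two raters).
machine_saves = [i for i, t in enumerate(truth)
                 if pathologist[i] != t and machine[i] == t]
```

That complementarity, each rater catching the other’s misses, is what makes the result interesting: it suggests the machine and the human pathologist err on different kinds of cases.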
The findings show that the combination of SRH and AI can be used to make real-time predictions of a patient’s brain tumor diagnosis to inform surgical decision-making. That may be especially important in places where expert neuropathologists are hard to find.
Ultimately, the researchers suggest that AI may yield even more useful information about a tumor’s underlying molecular alterations, adding ever greater precision to the diagnosis. Similar approaches are also likely to work in supplying timely information to surgeons operating on patients with other cancers too, including cancers of the skin and breast. The research team has made a brief video to give you a more detailed look at the new automated tissue-to-diagnosis pipeline.
 Near real-time intraoperative brain tumor diagnosis using stimulated Raman histology and deep neural networks. Hollon TC, Pandian B, Adapa AR, Urias E, Save AV, Khalsa SSS, Eichberg DG, D’Amico RS, Farooq ZU, Lewis S, Petridis PD, Marie T, Shah AH, Garton HJL, Maher CO, Heth JA, McKean EL, Sullivan SE, Hervey-Jumper SL, Patil PG, Thompson BG, Sagher O, McKhann GM 2nd, Komotar RJ, Ivan ME, Snuderl M, Otten ML, Johnson TD, Sisti MB, Bruce JN, Muraszko KM, Trautman J, Freudiger CW, Canoll P, Lee H, Camelo-Piragua S, Orringer DA. Nat Med. 2020 Jan 6.
Caption: University of Washington team that developed new light-sheet microscope (center) includes (l-r) Jonathan Liu, Adam Glaser, Larry True, Nicholas Reder, and Ye Chen. Credit: Mark Stone/University of Washington
After surgically removing a tumor from a cancer patient, doctors like to send off some of the tissue for evaluation by a pathologist to get a better idea of whether the margins are cancer free and to guide further treatment decisions. But for technical reasons, completing the pathology report can take days, much to the frustration of patients and their families. Sometimes the results even require an additional surgical procedure.
Now, NIH-funded researchers have developed a groundbreaking new microscope to help perform the pathology in minutes, not days. How’s that possible? The device works like a scanner for tissues, using a thin sheet of light to capture a series of thin cross sections within a tumor specimen without having to section it with a knife, as is done with conventional pathology. The rapidly acquired 2D “optical sections” are processed by a computer that assembles them into a high-resolution 3D image for immediate analysis.