
January 2020

NIH HEAL Investigator Meeting


HEAL Investigators Meeting
The NIH HEAL Investigator Meeting is now underway. It was my pleasure to welcome more than 350 researchers to the two-day meeting. The Helping to End Addiction Long-term Initiative, or NIH HEAL Initiative, is a cross-cutting research effort launched last year to improve prevention and treatment strategies for opioid misuse and addiction, as well as to enhance pain management. This meeting will help to establish the HEAL investigator network, increase awareness of the initiative’s programs and scope, and allow the investigators to exchange ideas. The meeting opened on January 16, 2020, at the Hyatt Regency, Bethesda, MD. Credit: NIH

Time Well Spent in North Carolina


Visiting NCSSM
I had a fantastic time visiting with students at North Carolina School of Science and Mathematics (NCSSM), Durham. My grandson Sellers attends NCSSM, and I was touched when he introduced me in the school auditorium before my speech to the student body titled “The Golden Era of Biomedical Research is Now.” The NCSSM is the nation’s first public, residential STEM high school. I visited the school on January 10, 2020. Credit: Brian Faircloth, North Carolina School of Science and Mathematics.


A Special Honor from Washingtonian Magazine


Washingtonian Luncheon
What a thrill it was for me and my wife Diane Baker to join nine others in being named Washingtonians of the Year 2019. The award, now in its 48th year, is sponsored by Washingtonian Magazine and honors people whose “hard work, creativity, innovation, and commitment” help to make the Washington, D.C., area a great place to live. That certainly describes Diane, as well as our commitment as a couple to give back to the community. During the awards luncheon, the honorees gathered for a photo. Diane is in the front row wearing a gold sweater. The luncheon was held on January 15, 2020, at the Willard Hotel, Washington, D.C. Credit: NIH

A Real-Time Look at Value-Based Decision Making


All of us make many decisions every day. For most things, such as which jacket to wear or where to grab a cup of coffee, there’s usually no right answer, so we often decide using values rooted in our past experiences. Now, neuroscientists have identified the part of the mammalian brain that stores information essential to such value-based decision making.

Researchers zeroed in on this particular brain region, known as the retrosplenial cortex (RSC), by analyzing movies—including the clip shown about 32 seconds into this video—that captured in real time what goes on in the brains of mice as they make decisions. Each white circle is a neuron, and the flickers of light reflect their activity: the brighter the light, the more active the neuron at that point in time.

All told, the NIH-funded team, led by Ryoma Hattori and Takaki Komiyama, University of California at San Diego, La Jolla, made recordings of more than 45,000 neurons across six regions of the mouse brain [1]. Neural activity isn’t usually visible. But, in this case, researchers used mice that had been genetically engineered so that their neurons, when activated, expressed a protein that glowed.
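For readers curious about how such glowing-neuron recordings are turned into numbers, activity in this kind of imaging is commonly expressed as the fractional change in fluorescence over a baseline (ΔF/F). The short Python sketch below illustrates that calculation on simulated traces; the array shapes and the percentile baseline convention are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def delta_f_over_f(traces, baseline_percentile=10):
    """Convert raw fluorescence traces to dF/F activity estimates.

    traces: array of shape (n_neurons, n_timepoints) of raw fluorescence.
    The baseline F0 is taken as a low percentile of each neuron's trace,
    one common convention in calcium imaging (an assumption here).
    """
    f0 = np.percentile(traces, baseline_percentile, axis=1, keepdims=True)
    return (traces - f0) / f0

# Toy example: 3 simulated neurons, 1,000 time points
rng = np.random.default_rng(0)
raw = 100 + rng.normal(0, 2, size=(3, 1000))
raw[1, 400:420] += 50                # one neuron "lights up" briefly
dff = delta_f_over_f(raw)
print(dff.shape, dff[1, 400:420].mean())  # elevated dF/F during the event
```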

Their system was also set up to encourage the mice to make value-based decisions, including choosing between two drinking tubes, each with a different probability of delivering water. During this decision-making process, the RSC proved to be the region of the brain where neurons persistently lit up, reflecting how the mouse evaluated one option over the other.
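The paper frames this as history-dependent value coding. One standard, generic way to capture how an animal might learn such values is a delta-rule (reinforcement-learning) update combined with a softmax choice rule. The sketch below simulates the two-tube task in that spirit; the parameters, the learning rule, and the function name are illustrative assumptions, not the authors' fitted model.

```python
import math
import random

def run_two_tube_task(p_left=0.7, p_right=0.3, alpha=0.2, beta=3.0, n_trials=500):
    """Simulate value-based choices between two water tubes.

    Each tube delivers water with a fixed probability. Value estimates are
    updated with a simple delta rule, and choices are drawn from a softmax
    over the current values. A generic reinforcement-learning sketch, not
    the model analyzed in the Cell paper.
    """
    values = {"left": 0.0, "right": 0.0}
    probs = {"left": p_left, "right": p_right}
    left_choices = 0
    for _ in range(n_trials):
        # Softmax (logistic) choice between the two tubes
        p_choose_left = 1.0 / (1.0 + math.exp(-beta * (values["left"] - values["right"])))
        choice = "left" if random.random() < p_choose_left else "right"
        reward = 1.0 if random.random() < probs[choice] else 0.0
        values[choice] += alpha * (reward - values[choice])   # delta-rule update
        left_choices += choice == "left"
    return values, left_choices / n_trials

# Over time the value estimates track the reward probabilities,
# and the higher-probability tube is chosen more often.
print(run_two_tube_task())
```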

The new discovery, described in the journal Cell, comes as something of a surprise to neuroscientists because the RSC hadn’t previously been implicated in value-based decisions. To gather additional evidence, the researchers turned to optogenetics, a technique that enabled them to use light to inactivate neurons in the RSCs of living animals. These studies confirmed that, with the RSC turned off, the mice couldn’t retrieve value information based on past experience.

The researchers note that the RSC is heavily interconnected with other key brain regions, including those involved in learning, memory, and controlling movement. This indicates that the RSC may be well situated to serve as a hub for storing value information, allowing it to be accessed and acted upon when it is needed.

The findings are yet another amazing example of how advances coming out of the NIH-led Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative are revolutionizing our understanding of the brain. In the future, the team hopes to learn more about how the RSC stores this information and sends it to other parts of the brain. They note that it will also be important to explore how activity in this brain area may be altered in schizophrenia, dementia, substance abuse, and other conditions that can affect decision-making abilities, and how the capacity for value-based decision making develops during childhood and adolescence.

Reference:

[1] Area-Specificity and Plasticity of History-Dependent Value Coding During Learning. Hattori R, Danskin B, Babic Z, Mlynaryk N, Komiyama T. Cell. 2019 Jun 13;177(7):1858-1872.e15.

Links:

Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative (NIH)

Komiyama Lab (UCSD, La Jolla)

NIH Support: National Institute of Neurological Disorders and Stroke; National Eye Institute; National Institute on Deafness and Other Communication Disorders


Artificial Intelligence Speeds Brain Tumor Diagnosis


Real time diagnostics in the operating room
Caption: Artificial intelligence speeds diagnosis of brain tumors. Top, doctor reviews digitized tumor specimen in operating room; left, the AI program predicts diagnosis; right, surgeons review results in near real-time.
Credit: Joe Hallisy, Michigan Medicine, Ann Arbor

Computers are now being trained to “see” the patterns of disease often hidden in our cells and tissues. Now comes word of yet another remarkable use of computer-generated artificial intelligence (AI): swiftly providing neurosurgeons with valuable, real-time information about what type of brain tumor is present, while the patient is still on the operating table.

This latest advance comes from an NIH-funded clinical trial of 278 patients undergoing brain surgery. The researchers found they could take a small tumor biopsy during surgery, feed it into a trained computer in the operating room, and receive a diagnosis that rivals the accuracy of an expert pathologist.

Traditionally, sending a biopsy out to an expert pathologist and getting back a diagnosis takes about 40 minutes at best. The computer can do it in the operating room, on average, in under 3 minutes. The time saved helps surgeons decide how to proceed with their delicate work and make immediate, potentially life-saving treatment decisions for their patients.

As reported in Nature Medicine [1], researchers led by Daniel Orringer, NYU Langone Health, New York, and Todd Hollon, University of Michigan, Ann Arbor, took advantage of AI and another technological advance called stimulated Raman histology (SRH). The latter is an emerging clinical imaging technique that makes it possible to generate detailed images of a tissue sample without the usual processing steps.

The SRH technique starts off by bouncing laser light rapidly through a tissue sample. This light enables a nearby fiberoptic microscope to capture the cellular and structural details within the sample. Remarkably, it does so by picking up on subtle differences in the way lipids, proteins, and nucleic acids vibrate when exposed to the light.

Then, using a virtual coloring program, the microscope quickly pieces together and colors in the fine structural details, pixel by pixel. The result: a high-resolution, detailed image that you might expect from a pathology lab, minus the staining of cells, mounting of slides, and the other time-consuming processing procedures.
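The post doesn’t spell out the recoloring step, but the general idea is that the lipid-weighted and protein/nucleic-acid-weighted contrast channels can be mapped onto hematoxylin-and-eosin-like tones. The Python sketch below shows one illustrative linear mapping; the channel names, weights, and color choices are assumptions for demonstration, not the published recoloring scheme.

```python
import numpy as np

def virtual_color(lipid_channel, protein_channel):
    """Map two SRH contrast channels to an RGB 'virtual H&E' image.

    lipid_channel / protein_channel: 2-D arrays of the two contrast
    channels. The color assignment is an illustrative linear mapping,
    not the published algorithm.
    """
    lipid = lipid_channel / (lipid_channel.max() + 1e-9)
    protein = np.clip(protein_channel - lipid_channel, 0, None)
    protein = protein / (protein.max() + 1e-9)
    rgb = np.ones(lipid_channel.shape + (3,))
    # Protein-rich pixels (nuclei, cell bodies) pushed toward purple/blue,
    # lipid-rich pixels (white matter) toward pink, mimicking H&E tones.
    rgb[..., 0] -= 0.6 * protein
    rgb[..., 1] -= 0.7 * protein + 0.3 * lipid
    rgb[..., 2] -= 0.2 * lipid
    return np.clip(rgb, 0, 1)

# Toy example on random stand-in channels
rng = np.random.default_rng(1)
img = virtual_color(rng.random((64, 64)), rng.random((64, 64)))
print(img.shape, img.min(), img.max())
```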

To interpret the SRH images, the researchers turned to computers and machine learning. To learn how to perform a given task, a computer must be fed large datasets of examples. In this case, the researchers used a special class of machine learning called deep neural networks, or deep learning, which is inspired by the way neural networks in the human brain process information.

In deep learning, computers look for patterns in large collections of data. As they begin to recognize complex relationships, some connections in the network are strengthened while others are weakened. The finished network is typically composed of multiple information-processing layers, which operate on the data to return a result, in this case a brain tumor diagnosis.

The team trained the computer to classify tissue samples into one of 13 categories commonly found in a brain tumor sample. Those categories included the most common brain tumors: malignant glioma, lymphoma, metastatic tumors, and meningioma. The training was based on more than 2.5 million labeled images representing samples from 415 patients.
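The published classifier is a large convolutional neural network trained on those millions of labeled SRH patches. As a rough illustration of the setup, here is a deliberately tiny PyTorch sketch with 13 output classes; the architecture, patch size, and one-step training loop are placeholders, not the published model.

```python
import torch
import torch.nn as nn

class SmallSRHClassifier(nn.Module):
    """Tiny convolutional network classifying image patches into 13 classes.

    Illustrative only: the published model is far larger and trained on
    >2.5 million labeled patches; the layer sizes here are placeholders.
    """
    def __init__(self, n_classes=13):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

# One training step on a dummy batch of patches
model = SmallSRHClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
patches = torch.randn(8, 3, 300, 300)      # stand-in for SRH image patches
labels = torch.randint(0, 13, (8,))        # stand-in for diagnostic labels
loss = loss_fn(model(patches), labels)
loss.backward()
optimizer.step()
print(float(loss))
```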

Next, they put the machine to the test. The researchers split each of 278 brain tissue samples into two specimens. One was sent to a conventional pathology lab for prepping and diagnosis. The other was imaged with SRH, and then the trained machine made a diagnosis.

Overall, the machine’s performance was quite impressive, returning the right answer about 95 percent of the time. That’s compared to an accuracy of 94 percent for conventional pathology.

Interestingly, the machine made a correct diagnosis in all 17 cases that a pathologist got wrong. Likewise, the pathologist got the right answer in all 14 cases in which the machine slipped up.
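Those head-to-head numbers come down to comparing two sets of predictions against the ground-truth diagnoses and counting the cases where exactly one reader erred. Here is a small Python sketch of that bookkeeping, with made-up labels standing in for the real data; the function and variable names are illustrative.

```python
import numpy as np

def compare_readers(truth, machine, pathologist):
    """Compare two sets of diagnoses against ground truth.

    Returns each reader's accuracy plus the counts of cases where exactly
    one of them was wrong -- the complementary errors noted in the study.
    Inputs are equal-length label sequences.
    """
    truth, machine, pathologist = map(np.asarray, (truth, machine, pathologist))
    machine_ok = machine == truth
    path_ok = pathologist == truth
    return {
        "machine_accuracy": machine_ok.mean(),
        "pathologist_accuracy": path_ok.mean(),
        "machine_right_pathologist_wrong": int((machine_ok & ~path_ok).sum()),
        "pathologist_right_machine_wrong": int((path_ok & ~machine_ok).sum()),
    }

# Toy example with three diagnostic classes
truth       = ["glioma", "lymphoma", "meningioma", "glioma"]
machine     = ["glioma", "lymphoma", "meningioma", "lymphoma"]
pathologist = ["glioma", "meningioma", "meningioma", "glioma"]
print(compare_readers(truth, machine, pathologist))
```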

The findings show that the combination of SRH and AI can be used to make real-time predictions of a patient’s brain tumor diagnosis to inform surgical decision-making. That may be especially important in places where expert neuropathologists are hard to find.

Ultimately, the researchers suggest that AI may yield even more useful information about a tumor’s underlying molecular alterations, adding ever greater precision to the diagnosis. Similar approaches are also likely to supply timely information to surgeons operating on patients with other cancers, including cancers of the skin and breast. The research team has made a brief video to give you a more detailed look at the new automated tissue-to-diagnosis pipeline.

Reference:

[1] Near real-time intraoperative brain tumor diagnosis using stimulated Raman histology and deep neural networks. Hollon TC, Pandian B, Adapa AR, Urias E, Save AV, Khalsa SSS, Eichberg DG, D’Amico RS, Farooq ZU, Lewis S, Petridis PD, Marie T, Shah AH, Garton HJL, Maher CO, Heth JA, McKean EL, Sullivan SE, Hervey-Jumper SL, Patil PG, Thompson BG, Sagher O, McKhann GM 2nd, Komotar RJ, Ivan ME, Snuderl M, Otten ML, Johnson TD, Sisti MB, Bruce JN, Muraszko KM, Trautman J, Freudiger CW, Canoll P, Lee H, Camelo-Piragua S, Orringer DA. Nat Med. 2020 Jan 6.

Links:

Video: Artificial Intelligence: Collecting Data to Maximize Potential (NIH)

New Imaging Technique Allows Quick, Automated Analysis of Brain Tumor Tissue During Surgery (National Institute of Biomedical Imaging and Bioengineering/NIH)

Daniel Orringer (NYU Langone, Perlmutter Cancer Center, New York City)

Todd Hollon (University of Michigan, Ann Arbor)

NIH Support: National Cancer Institute; National Institute of Biomedical Imaging and Bioengineering

