
Dr. Francis Collins

The Amazing Brain: Tracking Molecular Events with Calling Cards


In days mostly gone by, it was fashionable in some circles for people to hand out calling cards to mark their arrival at special social events. This genteel human tradition is now being adapted to the lab to allow certain benign viruses to issue their own high-tech calling cards and mark their arrival at precise locations in the genome. These special locations show where there’s activity involving transcription factors, specialized proteins that switch genes on and off and help determine cell fate.

The idea is that myriad, well-placed calling cards can track brain development over time in mice and detect changes in transcription factor activity associated with certain neuropsychiatric disorders. This colorful image, which won first place in this year’s Show Us Your BRAINs! Photo and Video contest, provides a striking display of these calling cards in action in living brain tissue.

The image comes from Allen Yen, a PhD candidate in the lab of Joseph Dougherty, collaborating with the nearby lab of Rob Mitra. Both labs are located at the Washington University School of Medicine, St. Louis.

Yen and colleagues zoomed in on this section of mouse brain tissue under a microscope to capture dozens of detailed images, which they then stitched together into this high-resolution overview. The image shows neural cells (red) and cell nuclei (blue). But focus on the green cells concentrated in the brain’s outer cortex (top) and hippocampus (two lobes in the upper center). They have been labeled with calling cards dropped off by an adeno-associated virus [1].

Once dropped off, a calling card doesn’t bear a pretentious name or title. Rather, the calling card is a small, mobile snippet of DNA called a transposon. It gets dropped off along with the other essential component of the technology: a specialized enzyme called a transposase, which the researchers fuse to one of many specific transcription factors of interest.

Each time one of these transcription factors of interest binds DNA to help turn a gene on or off, the attached transposase “grabs” a transposon calling card and inserts it into the genome. As a result, it leaves behind a permanent record of the interaction.

What’s also nice is that the calling cards are programmed to give away their general locations. That’s because they encode a fluorescent marker (in this image, a green fluorescent protein). In fact, Yen and colleagues could look under a microscope and tell from all the green that their calling card technology was in place and working as intended.
The final step, though, was to find out precisely where in the genome those calling cards had been left. For this, the researchers used next-generation sequencing to produce a cumulative history and map of each and every calling card dropped off in the genome.
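To give a sense of how such a genome-wide map might be summarized, here is a minimal sketch (illustrative only, not the labs' actual pipeline; all coordinates are hypothetical) that bins calling-card insertion sites recovered from aligned sequencing reads into fixed-size genomic windows. Windows that accumulate multiple insertions hint at repeated transcription factor binding nearby:

```python
from collections import Counter

def tally_calling_cards(insertions, bin_size=1000):
    """Group calling-card insertion sites (chromosome, position)
    into fixed-size genomic bins and count insertions per bin."""
    counts = Counter()
    for chrom, pos in insertions:
        counts[(chrom, pos // bin_size * bin_size)] += 1
    return counts

# Hypothetical insertion coordinates recovered from sequencing reads
insertions = [("chr1", 10_250), ("chr1", 10_730), ("chr1", 98_400),
              ("chr2", 5_120)]
hotspots = tally_calling_cards(insertions)
# Two insertions fall in the same 1-kb bin on chr1
print(hotspots[("chr1", 10_000)])  # 2
```

In real analyses the binning would be replaced by peak-calling against a background model, but the core idea is the same: the density of recovered calling cards marks where the transcription factor bound.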

These comprehensive maps allow them to identify important DNA-protein binding events well after the fact. This innovative technology also enables scientists to link past molecular interactions to observable developmental outcomes in a way that isn’t otherwise possible.

While the Mitra and Dougherty labs continue to improve upon this technology, it’s already readily adaptable to answer many important questions about the brain and brain disorders. In fact, Yen is now applying the technology to study neurodevelopment in mouse models of neuropsychiatric disorders, specifically autism spectrum disorder (ASD) [2]. This calling card technology also is available for any lab to deploy for studying a transcription factor of interest.

This research is supported by the Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative. One of the major goals of the BRAIN Initiative is to accelerate the development and application of innovative technologies to gain new understanding of the brain. This award-winning image is certainly a prime example of striving to meet this goal. I’ll look forward to what these calling cards will tell us in the future about ASD and other important neurodevelopmental conditions affecting the brain.

References:

[1] A viral toolkit for recording transcription factor-DNA interactions in live mouse tissues. Cammack AJ, Moudgil A, Chen J, Vasek MJ, Shabsovich M, McCullough K, Yen A, Lagunas T, Maloney SE, He J, Chen X, Hooda M, Wilkinson MN, Miller TM, Mitra RD, Dougherty JD. Proc Natl Acad Sci U S A. 2020 May 5;117(18):10003-10014.

[2] A MYT1L Syndrome mouse model recapitulates patient phenotypes and reveals altered brain development due to disrupted neuronal maturation. Jiayang Chen, Mary E. Lambo, Xia Ge, Joshua T. Dearborn, Yating Liu, Katherine B. McCullough, Raylynn G. Swift, Dora R. Tabachnick, Lucy Tian, Kevin Noguchi, Joel R. Garbow, John N. Constantino. bioRxiv. May 27, 2021.

Links:

Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative (NIH)

Autism Spectrum Disorder (National Institute of Mental Health/NIH)

Dougherty Lab (Washington University School of Medicine, St. Louis)

Mitra Lab (Washington University School of Medicine)

Show Us Your BRAINs! Photo and Video Contest (BRAIN Initiative/NIH)

NIH Support: National Institute of Neurological Disorders and Stroke; National Institute of Mental Health; National Center for Advancing Translational Sciences; National Human Genome Research Institute; National Institute of General Medical Sciences


Thanking NIH’s Call Center and Contact Investigation Teams



Introduced by the leader of NIH’s Occupational Medical Service, Dr. Heike Bailin, and with my wife Diane Baker at my side, I recently met with the NIH Call Center and Contact Investigation teams to express my gratitude for the vital role they play in keeping our community safe from COVID-19. This screenshot of our virtual meeting on August 11 shows some of the more than 100 people who make up these important teams. At the same event, I also thanked the Positive Results and Return to Work teams for providing compassionate, knowledgeable guidance to NIH staff facing uncertainty and stress at home and at work. Credit: NIH

The Amazing Brain: A Sharper Image of the Pyramidal Tract


Flip the image above upside down, and the shape may remind you of something. If you think it resembles a pyramid, then you and a lot of great neuroscientists are thinking alike. What you are viewing is a colorized, 3D reconstruction of a pyramidal tract, a bundle of nerve fibers that originates in the brain’s cerebral cortex and relays signals to the brainstem or the spinal cord. These signals control many important activities, including the voluntary movement of our arms, legs, head, and face.

For a while now, it’s been possible to combine a specialized form of magnetic resonance imaging (MRI) with computer modeling tools to produce 3D reconstructions of complicated networks of nerve fibers, such as the pyramidal tract. Still, for technical reasons, the quality of these reconstructions has remained poor in parts of the brain where nerve fibers cross at angles of 40 degrees or less.
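To see what that 40-degree threshold means in practice, here is a small sketch (illustrative only, not the authors' code) that computes the crossing angle between two fiber orientations. Because a fiber's orientation has no preferred sign (pointing along +x or -x is the same orientation), the absolute value of the dot product is used:

```python
import math

def crossing_angle_deg(u, v):
    """Angle in degrees between two fiber orientations,
    ignoring direction sign."""
    dot = abs(sum(a * b for a, b in zip(u, v)))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(min(1.0, dot / norm)))

# Two hypothetical fiber directions crossing at a shallow angle
u = (1.0, 0.0, 0.0)
v = (math.cos(math.radians(30)), math.sin(math.radians(30)), 0.0)
angle = crossing_angle_deg(u, v)
print(round(angle))  # 30, below the ~40-degree range where reconstructions degrade
```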

The video above demonstrates how adding a sophisticated algorithm, called Orientation Distribution Function (ODF)-Fingerprinting, to such modeling can help overcome this problem when reconstructing a pyramidal tract. It has the potential to enhance the reliability of these 3D reconstructions as neurosurgeons begin to use them to plan surgeries, helping ensure the procedures are carried out with the utmost safety and precision.

In the first second of the video, you see gray, fuzzy images from a diffusion MRI of the pyramidal tract. But, very quickly, a more colorful, detailed 3D reconstruction begins to appear, swiftly filling in from the top down. Colors are used to indicate the primary orientations of the nerve fibers: left to right (red), back to front (green), and top to bottom (blue). The orange, magenta, and other colors represent combinations of these primary directional orientations.
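This red/green/blue scheme is the standard directionally encoded color convention in diffusion MRI: each color channel is the absolute value of the corresponding component of the fiber's unit direction vector, so mixed colors arise naturally from oblique fibers. A minimal sketch:

```python
def orientation_to_rgb(direction):
    """Standard diffusion-MRI color convention:
    red = left-right (|x|), green = back-front (|y|), blue = top-bottom (|z|)."""
    norm = sum(c * c for c in direction) ** 0.5
    return tuple(abs(c) / norm for c in direction)

# A purely left-right fiber maps to pure red
print(orientation_to_rgb((1.0, 0.0, 0.0)))  # (1.0, 0.0, 0.0)

# A fiber angled between left-right and back-front mixes red and green,
# giving the orange/yellow hues seen in such reconstructions
r, g, b = orientation_to_rgb((1.0, 1.0, 0.0))
```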

About three seconds into the video, a rough draft of the 3D reconstruction is complete. The top of the pyramidal tract looks pretty good. However, looking lower down, you can see distortions in color and relatively poor resolution of the nerve fibers in the middle of the tract—exactly where the fibers cross each other at angles of less than 40 degrees. So, researchers tapped into the power of their new ODF-Fingerprinting software to improve the image—and, starting about nine seconds into the video, you can see an impressive final result.

The researchers who produced this amazing video are Patryk Filipiak and colleagues in the NIH-supported lab of Steven Baete, Center for Advanced Imaging Innovation and Research, New York University Grossman School of Medicine, New York. The work paired diffusion MRI data from the NIH Human Connectome Project with the ODF-Fingerprinting algorithm, which was created by Baete to incorporate additional MRI imaging data on the shape of nerve fibers to infer their directionality [1].

This innovative approach to imaging recently earned Baete’s team second place in the 2021 Show Us Your BRAINs! Photo and Video contest, sponsored by the NIH-led Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative. But researchers aren’t stopping there! They are continuing to refine ODF-Fingerprinting, with the aim of modeling the pyramidal tract in even higher resolution for use in devising new and better ways of helping people undergoing neurosurgery.

Reference:

[1] Fingerprinting Orientation Distribution Functions in diffusion MRI detects smaller crossing angles. Baete SH, Cloos MA, Lin YC, Placantonakis DG, Shepherd T, Boada FE. Neuroimage. 2019 Sep;198:231-241.

Links:

Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative (NIH)

Human Connectome Project (University of Southern California, Los Angeles)

Steven Baete (Center for Advanced Imaging Innovation and Research, New York University Grossman School of Medicine, New York)

Show Us Your BRAINs! Photo and Video Contest (BRAIN Initiative/NIH)

NIH Support: National Institute of Biomedical Imaging and Bioengineering; National Institute of Neurological Disorders and Stroke; National Cancer Institute


The Amazing Brain: Visualizing Data to Understand Brain Networks


The NIH-led Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative continues to teach us about the world’s most sophisticated computer: the human brain. This striking image offers a spectacular case in point, thanks to a new tool called Visual Neuronal Dynamics (VND).

VND is not a camera. It is a powerful software program that can display, animate, and analyze models of neurons and their connections, or networks, using 3D graphics. What you’re seeing in this colorful image is a strip of mouse primary visual cortex, the area in the brain where incoming sensory information gets processed into vision.

This strip contains more than 230,000 neurons of 17 different cell types. Long and spindly excitatory neurons that point upward (purple, blue, red, orange) are intermingled with short and stubby inhibitory neurons (green, cyan, magenta). Slicing through the neuronal landscape is a Neuropixels probe (silver): a tiny silicon detector that can record brain activity in awake animals [1].

Developed by Emad Tajkhorshid and his team at the University of Illinois at Urbana-Champaign, along with Anton Arkhipov of the Allen Institute, Seattle, VND represents a scientific milestone for neuroscience: using an adept software tool to see and analyze massive neuronal datasets on a computer. What’s also nice is that the computer doesn’t have to be a fancy one, and VND’s instructions, or code, are publicly available for anyone to use.

VND is the neuroscience-adapted cousin of Visual Molecular Dynamics (VMD), a popular molecular biology visualization tool to see life up close in 3D, also developed by Tajkhorshid’s group [2]. By modeling and visualizing neurons and their connections, VND helps neuroscientists understand at their desktops how neural networks are organized and what happens when they are manipulated. Those visualizations then lay the groundwork for follow-up lab studies to validate the data and build upon them.

Through the Allen Institute, the NIH BRAIN Initiative is compiling a comprehensive whole-brain atlas of cell types in the mouse, and Arkhipov’s work integrates these data into computer models. In May 2020, his group published comprehensive models of the mouse primary visual cortex [3].

Arkhipov and team are now working to understand how the primary visual cortex’s physical structure (the cell shapes and connections within its complicated circuits) determines its outputs. For example, how do specific connections determine network activity? Or, how fast do cells fire under different conditions?
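As a toy illustration of that last question (how input drive determines firing rate), here is a minimal leaky integrate-and-fire neuron. The Allen Institute models are vastly more detailed, and all parameters here are illustrative, in arbitrary units:

```python
def lif_spike_count(current, t_steps=1000, dt=0.1, tau=10.0,
                    v_rest=0.0, v_thresh=1.0):
    """Count spikes of a leaky integrate-and-fire neuron driven by
    a constant input current over t_steps simulation steps."""
    v, spikes = v_rest, 0
    for _ in range(t_steps):
        # Voltage decays toward rest and is pushed up by the input current
        v += dt * (-(v - v_rest) + current) / tau
        if v >= v_thresh:
            spikes += 1
            v = v_rest  # reset after a spike
    return spikes

# Stronger drive produces faster firing
weak, strong = lif_spike_count(1.2), lif_spike_count(3.0)
```

Scaling this idea up to hundreds of thousands of interconnected model neurons, each with realistic shape and connectivity, is exactly the kind of simulation whose outputs VND is built to visualize.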

Ultimately, such computational research may help us understand how brain injuries or disease affect the structure and function of these neural networks. VND should also propel understanding of many other areas of the brain, for which data are accumulating rapidly, to answer similar questions that remain mysterious to scientists.

In the meantime, VND is also creating some award-winning art. The image above was the second-place photo in the 2021 Show Us Your BRAINs! Photo and Video Contest sponsored by the NIH BRAIN Initiative.

References:

[1] Fully integrated silicon probes for high-density recording of neural activity. Jun JJ, Steinmetz NA, Siegle JH, Denman DJ, Bauza M, Barbarits B, Lee AK, Anastassiou CA, Andrei A, Aydın Ç, Barbic M, Blanche TJ, Bonin V, Couto J, Dutta B, Gratiy SL, Gutnisky DA, Häusser M, Karsh B, Ledochowitsch P, Lopez CM, Mitelut C, Musa S, Okun M, Pachitariu M, Putzeys J, Rich PD, Rossant C, Sun WL, Svoboda K, Carandini M, Harris KD, Koch C, O’Keefe J, Harris TD. Nature. 2017 Nov 8;551(7679):232-236.

[2] VMD: visual molecular dynamics. Humphrey W, Dalke A, Schulten K. J Mol Graph. 1996 Feb;14(1):33-8, 27-8.

[3] Systematic integration of structural and functional data into multi-scale models of mouse primary visual cortex. Billeh YN, Cai B, Gratiy SL, Dai K, Iyer R, Gouwens NW, Abbasi-Asl R, Jia X, Siegle JH, Olsen SR, Koch C, Mihalas S, Arkhipov A. Neuron. 2020 May 6;106(3):388-403.e18

Links:

The Brain Research Through Advancing Innovative Neurotechnologies® (BRAIN) Initiative (NIH)

Models of the Mouse Primary Visual Cortex (Allen Institute, Seattle)

Visual Neuronal Dynamics (NIH Center for Macromolecular Modeling and Bioinformatics, University of Illinois at Urbana-Champaign)

Tajkhorshid Lab (University of Illinois at Urbana-Champaign)

Arkhipov Lab (Allen Institute)

Show Us Your BRAINs! Photo & Video Contest (BRAIN Initiative/NIH)

NIH Support: National Institute of Neurological Disorders and Stroke


A Gala Event for Down Syndrome


What a nice evening it was attending this year’s virtual AcceptAbility Gala, hosted by the Global Down Syndrome Foundation (GLOBAL). I was part of a panel that answered questions posed by “self-advocates,” folks with Down syndrome and/or their siblings. The questions generated a wide-ranging discussion and gave me the chance to highlight the excellent research now underway through NIH’s INCLUDE Project. GLOBAL, part of a network of affiliate organizations that work to improve the lives of people with Down syndrome, held the gala on July 21.
