

New Microscope Technique Provides Real-Time 3D Views


Most of the “cool” videos shared on my blog are born of countless hours behind a microscope. Researchers must move a biological sample through a microscope’s focus, slowly acquiring hundreds of high-resolution 2D snapshots, one painstaking snap at a time. Afterwards, sophisticated computer software takes this ordered “stack” of images, calculates how the object would look from different perspectives, and displays the result as 3D views of life that can be streamed as short videos.
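To make the conventional pipeline concrete, here is a minimal sketch in Python: take an ordered z-stack of 2D slices and render it from different viewpoints by rotating the volume and computing a maximum-intensity projection for each frame. The volume below is synthetic; in practice the stack would come from the microscope.

```python
import numpy as np
from scipy import ndimage

# Synthetic 3D sample: a bright blob inside a 64x64x64 volume.
z, y, x = np.mgrid[0:64, 0:64, 0:64]
volume = np.exp(-(((z - 32) ** 2 + (y - 32) ** 2 + (x - 20) ** 2) / 100.0))

def view_from_angle(vol, angle_deg):
    """Rotate the stack about the y-axis, then project along z
    (maximum-intensity projection) to get one 2D frame."""
    rotated = ndimage.rotate(vol, angle_deg, axes=(0, 2),
                             reshape=False, order=1)
    return rotated.max(axis=0)

# One frame every 30 degrees gives a short rotating "movie" of the sample.
frames = [view_from_angle(volume, a) for a in range(0, 360, 30)]
print(len(frames), frames[0].shape)
```

Note how each rendered frame requires the entire pre-acquired stack, which is exactly the time-consuming step the new device described below avoids.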

But this video is different. It was created by what’s called a multi-angle projection imaging system. This new optical device requires just a few camera snapshots and two mirrors to image a biological sample from multiple angles at once. Because the device eliminates the time-consuming process of acquiring individual image slices, it’s up to 100 times faster than current technologies and doesn’t require computer software to construct the movie. The kicker is that the video can be displayed in real time, which isn’t possible with existing image-stacking methods.

The video here shows two human melanoma cells, rotating several times between overhead and side views. You can see large amounts of the protein PI3K (brighter orange hues indicate higher concentrations), which helps some cancer cells divide and move around. Near the cells’ perimeters are small, dynamic surface protrusions. PI3K in these “blebs” is thought to help tumor cells navigate and survive in foreign tissues as the tumor spreads to other organs, a process known as metastasis.

The new multi-angle projection imaging system was described in a paper published recently in the journal Nature Methods [1]. It was developed by Reto Fiolka and Kevin Dean at the University of Texas Southwestern Medical Center, Dallas.

Like most technology, this device is complicated. Rather than the microscope and camera doing all the work, as is customary, two mirrors within the microscope play a starring role. During a camera exposure, these mirrors rotate ever so slightly and warp the acquired image in such a way that successive, unique perspectives of the sample magically come into view. By changing the amount of warp, the sample appears to rotate in real time. As such, each view shown in the video requires only one camera snapshot, rather than the hundreds of slices acquired in a conventional scheme.

The concept traces back to computer science and an algorithm called the shear-warp transform. It’s used to display 3D objects from different perspectives on a 2D computer monitor. Fiolka, Dean, and team found they could implement a similar algorithm optically in a microscope. What’s more, their multi-angle projection imaging system is easy to use, inexpensive, and can be adapted for use on any camera-based microscope.
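A toy numerical analog of the shear-warp idea can help build intuition: instead of rotating the full volume, laterally shift (“shear”) each z-slice by an amount proportional to its depth and then project along z. In the new device, the rotating mirrors apply an equivalent shear optically during a single camera exposure. This sketch is purely illustrative, not the authors’ implementation.

```python
import numpy as np

def shear_project(volume, shear_per_slice):
    """Shear each slice in x by (depth * shear_per_slice) pixels,
    then take a maximum-intensity projection along z."""
    nz, ny, nx = volume.shape
    out = np.zeros((ny, nx))
    for k in range(nz):
        shift = int(round(k * shear_per_slice))
        sheared = np.roll(volume[k], shift, axis=1)  # simple wrap-around shear
        out = np.maximum(out, sheared)
    return out

vol = np.zeros((8, 16, 16))
vol[:, 8, 4] = 1.0  # a vertical "fiber" running through all depth slices
oblique = shear_project(vol, shear_per_slice=1)  # roughly a 45-degree view
print(np.flatnonzero(oblique[8]))  # the fiber smears across x with depth
```

Varying `shear_per_slice` frame to frame makes the sample appear to rotate, which is the optical trick the mirrors perform with a single snapshot per view.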

The researchers have used the device to view samples spanning a range of sizes: from mitochondria and other tiny organelles inside cells to the beating heart of a young zebrafish. And, as the video shows, it has been applied to study cancer and other human diseases.

In a neat, but also scientifically valuable, twist, the new optical method can generate a virtual reality view of a sample. Any microscope user wearing appropriately colored 3D glasses immediately sees the objects in 3D.

While virtual reality viewing of cellular life might sound like a gimmick, Fiolka and Dean believe that it will help researchers use their current microscopes to see any sample in 3D—offering the chance to find rare and potentially important biological events much faster than is possible with even the most advanced microscopes today.

Fiolka, Dean, and team are still just getting started. Because the method analyzes tissue very quickly within a single image frame, they say it will enable scientists to observe the fastest events in biology, such as the movement of calcium throughout a neuron—or even a whole bundle of neurons at once. For neuroscientists trying to understand the brain, that’s a movie they will really want to see.

Reference:

[1] Real-time multi-angle projection imaging of biological dynamics. Chang BJ, Manton JD, Sapoznik E, Pohlkamp T, Terrones TS, Welf ES, Murali VS, Roudot P, Hake K, Whitehead L, York AG, Dean KM, Fiolka R. Nat Methods. 2021 Jul;18(7):829-834.

Links:

Metastatic Cancer: When Cancer Spreads (National Cancer Institute)

Fiolka Lab (University of Texas Southwestern Medical Center, Dallas)

Dean Lab (University of Texas Southwestern)

Microscopy Innovation Lab (University of Texas Southwestern)

NIH Support: National Cancer Institute; National Institute of General Medical Sciences


The Amazing Brain: Motor Neurons of the Cervical Spine


Today, you may have opened a jar, done an upper body workout, played a guitar or a piano, texted a friend, or maybe even jotted down a grocery list longhand. All of these “skilled” arm, wrist, and hand movements are made possible by the bundled nerves, or circuits, running through a part of the central nervous system in the neck area called the cervical spine.

This video, which combines sophisticated imaging and computation with animation, shows the density of three types of nerve cells in the mouse cervical spine. There are the V1 interneurons (red), which sit between sensory and motor neurons; motor neurons associated with controlling the movement of the bicep (blue); and motor neurons associated with controlling the tricep (green).

At 4 seconds, the 3D animation morphs to show all the colors and cells intermixed, as they are naturally in the cervical spine. At 8 seconds, the animation highlights the density of these three cell types. Notice that in the bottom left corner a light icon appears, indicating the different imaging perspectives. What’s unique here is the frontal, or rostral, view of the cervical spine, which is typically imaged from a lateral, or side, perspective.

Starting at 16 seconds, the animation highlights the location and density of each of the individual neurons. For the grand finale, viewers zoom off on a brief fly-through of the cervical spine and a flurry of reds, blues, and greens.

The video comes from Jamie Anne Mortel, a research assistant in the lab of Samuel Pfaff, Salk Institute, La Jolla, CA. Mortel is part of a team supported by the NIH-led Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative that’s developing a comprehensive atlas of the circuitry within the cervical spine that controls skilled forelimb movements in mice, such as reaching and grasping.

This basic research will provide a better understanding of how the mammalian brain and spinal cord work together to produce movement. More than that, it may provide valuable clues to better treat paralysis of the arms, wrists, and hands caused by neurological diseases and spinal cord injuries.

As a part of this project, the Pfaff lab has been busy developing a software tool to take their imaging data from different parts of the cervical spine and present it in 3D. Mortel, who likes to make cute cartoon animations in her spare time, noticed that the software lacked animation capability. So she took the initiative and spent the next three weeks working after hours to produce this video—her first attempt at scientific animation. No doubt she must have been using a lot of wrist and hand movements!

With a positive response from her Salk labmates, Mortel decided to enter her scientific animation debut in the 2021 Show Us BRAINs! Photo and Video Contest. To her great surprise and delight, Mortel won third place in the video competition. Congratulations, and continued success for you and the team in producing this much-needed atlas to define the circuitry underlying skilled arm, wrist, and hand movements.

Links:

Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative (NIH)

Spinal Cord Injury Information Page (National Institute of Neurological Disorders and Stroke/NIH)

Samuel Pfaff (Salk Institute, La Jolla, CA)

Show Us Your BRAINs! Photo and Video Contest (BRAIN Initiative/NIH)

NIH Support: National Institute of Neurological Disorders and Stroke


The Amazing Brain: Tracking Molecular Events with Calling Cards


In days mostly gone by, it was fashionable in some circles for people to hand out calling cards to mark their arrival at special social events. This genteel human tradition is now being adapted to the lab to allow certain benign viruses to issue their own high-tech calling cards and mark their arrival at precise locations in the genome. These special locations show where there’s activity involving transcription factors, specialized proteins that switch genes on and off and help determine cell fate.

The idea is that myriad, well-placed calling cards can track brain development over time in mice and detect changes in transcription factor activity associated with certain neuropsychiatric disorders. This colorful image, which won first place in this year’s Show Us Your BRAINs! Photo and Video Contest, provides a striking display of these calling cards in action in living brain tissue.

The image comes from Allen Yen, a PhD candidate in the lab of Joseph Dougherty, collaborating with the nearby lab of Rob Mitra. Both labs are located in the Washington University School of Medicine, St. Louis.

Yen and colleagues zoomed in on this section of mouse brain tissue under a microscope to capture dozens of detailed images that they then stitched together to create this high-resolution overview. The image shows neural cells (red) and cell nuclei (blue). But focus on the green neural cells concentrated in the brain’s outer cortex (top) and hippocampus (the two lobes in the upper center). They’ve been labeled with calling cards that were dropped off by an adeno-associated virus [1].

Once dropped off, a calling card doesn’t bear a pretentious name or title. Rather, the calling card is a small, mobile snippet of DNA called a transposon. It gets dropped off with the other essential component of the technology: a specialized enzyme called a transposase, which the researchers fuse to one of many specific transcription factors of interest.

Each time one of these transcription factors of interest binds DNA to help turn a gene on or off, the attached transposase “grabs” a transposon calling card and inserts it into the genome. As a result, it leaves behind a permanent record of the interaction.

What’s also nice is that the calling cards are programmed to give away their general locations. That’s because they encode a fluorescent marker (in this image, a green fluorescent protein). In fact, Yen and colleagues could look under a microscope and tell from all the green that their calling card technology was in place and working as intended.

The final step, though, was to find out precisely where in the genome those calling cards had been left. For this, the researchers used next-generation sequencing to produce a cumulative history and map of each and every calling card dropped off in the genome.

These comprehensive maps allow the researchers to identify important DNA-protein binding events well after the fact. This innovative technology also lets scientists link past molecular interactions to observable developmental outcomes in a way that isn’t otherwise possible.
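The mapping step above can be sketched as a simple tally: given the genomic coordinates of recovered calling-card insertions (from sequencing), bin them into fixed-size windows to find regions where the tagged transcription factor bound most often. The bin size and coordinates below are made up for illustration, not values from the study.

```python
from collections import Counter

BIN_SIZE = 10_000  # 10 kb genomic bins (an arbitrary choice for this sketch)

def bin_insertions(insertions):
    """Count calling-card insertions per (chromosome, bin) window.

    insertions: iterable of (chromosome, position) tuples recovered
    from sequencing reads.
    """
    counts = Counter()
    for chrom, pos in insertions:
        counts[(chrom, pos // BIN_SIZE)] += 1
    return counts

# Hypothetical insertion coordinates for illustration only.
reads = [("chr1", 10_500), ("chr1", 12_900), ("chr1", 55_000), ("chr2", 10_100)]
counts = bin_insertions(reads)
print(counts.most_common(1))  # the most frequently tagged bin
```

Bins with many insertions point to genomic regions the transcription factor visited repeatedly, which is the permanent record the calling cards leave behind.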

While the Mitra and Dougherty labs continue to improve upon this technology, it’s already readily adaptable to answer many important questions about the brain and brain disorders. In fact, Yen is now applying the technology to study neurodevelopment in mouse models of neuropsychiatric disorders, specifically autism spectrum disorder (ASD) [2]. This calling card technology also is available for any lab to deploy for studying a transcription factor of interest.

This research is supported by the Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative. One of the major goals of the BRAIN Initiative is to accelerate the development and application of innovative technologies to gain new understanding of the brain. This award-winning image is certainly a prime example of striving to meet this goal. I’ll look forward to what these calling cards will tell us in the future about ASD and other important neurodevelopmental conditions affecting the brain.

References:

[1] A viral toolkit for recording transcription factor-DNA interactions in live mouse tissues. Cammack AJ, Moudgil A, Chen J, Vasek MJ, Shabsovich M, McCullough K, Yen A, Lagunas T, Maloney SE, He J, Chen X, Hooda M, Wilkinson MN, Miller TM, Mitra RD, Dougherty JD. Proc Natl Acad Sci U S A. 2020 May 5;117(18):10003-10014.

[2] A MYT1L syndrome mouse model recapitulates patient phenotypes and reveals altered brain development due to disrupted neuronal maturation. Chen J, Lambo ME, Ge X, Dearborn JT, Liu Y, McCullough KB, Swift RG, Tabachnick DR, Tian L, Noguchi K, Garbow JR, Constantino JN. bioRxiv. 2021 May 27.

Links:

Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative (NIH)

Autism Spectrum Disorder (National Institute of Mental Health/NIH)

Dougherty Lab (Washington University School of Medicine, St. Louis)

Mitra Lab (Washington University School of Medicine)

Show Us Your BRAINs! Photo and Video Contest (BRAIN Initiative/NIH)

NIH Support: National Institute of Neurological Disorders and Stroke; National Institute of Mental Health; National Center for Advancing Translational Sciences; National Human Genome Research Institute; National Institute of General Medical Sciences


Celebrating the Fourth with Neuroscience Fireworks


There’s so much to celebrate about our country this Fourth of July. That includes giving thanks to all those healthcare providers who have put themselves in harm’s way to staff the ERs, hospital wards, and ICUs to care for those afflicted with COVID-19, and also for everyone who worked so diligently to develop, test, and distribute COVID-19 vaccines.

These “shots of hope,” created with rigorous science and in record time, are making it possible for a great many Americans to gather safely once again with family and friends. So, if you’re vaccinated (and I really hope you are—because these vaccines have been proven safe and highly effective), fire up the grill, crank up the music, and get ready to show your true red, white, and blue colors. My wife and I—both fully vaccinated—intend to do just that!

To help get the celebration rolling, I’d like to share a couple of minutes of some pretty amazing biological fireworks. While the track of a John Philip Sousa march is added just for fun, what you see in the video above is the result of some very serious neuroscience research that is scientifically, as well as visually, breathtaking. Credit for this work goes to an NIH-supported team that includes Ricardo Azevedo and Sunil Gandhi, at the Center for the Neurobiology of Learning and Memory, University of California, Irvine, and their collaborator Damian Wheeler, Translucence Biosystems, Irvine, CA. Azevedo is also an NIH National Research Service Award fellow and a Medical Scientist Training Program trainee with Gandhi.

The team’s video starts off with 3D, colorized renderings of a mouse brain at cellular resolution. About 25 seconds in, the video flashes to a bundle of nerve fibers called the fornix. Thanks to the wonders of fluorescent labeling combined with “tissue-clearing” and other innovative technologies, you can clearly see the round cell bodies of individual neurons, along with the long, arm-like axons that they use to send out signals and connect with other neurons to form signaling circuits. The human brain has nearly 100 trillion of these circuits and, when activated, they process incoming sensory information and provide outputs that lead to our thoughts, words, feelings, and actions.

As shown in the video, the nerve fibers of the fornix provide a major output pathway from the hippocampus, a region of the brain involved in memory. Next, we travel to the brain’s neocortex, the outermost part of the brain that’s responsible for complex behaviors, and then move on to explore an intricate structure called the corticospinal tract, which carries motor commands to the spinal cord. The final stop is the olfactory tubercle, toward the base of the frontal lobe, a key player in odor processing and motivated behaviors.

Azevedo and his colleagues imaged the brain in this video in about 40 minutes using their imaging platform called the Translucence Biosystems’ Mesoscale Imaging System™. This process starts with a tissue-clearing method that eliminates light-scattering lipids, leaving the mouse brain transparent. From there, advanced light-sheet microscopy makes thin optical sections of the tissue, and 3D data processing algorithms reconstruct the image to high resolution.

Using this platform, researchers can take brain-wide snapshots of neuronal activity linked to a specific behavior. They can also use it to trace neural circuits that span various regions of the brain, allowing them to form new hypotheses about the brain’s connectivity and how such connectivity contributes to memory and behavior.

The video that you see here is a special, extended version of the team’s first-place video from the NIH-supported BRAIN Initiative’s 2020 “Show Us Your BRAINs!” imaging contest. Because of the great potential of this next-generation technology, Translucence Biosystems has received Small Business Innovation Research grants from NIH’s National Institute of Mental Health to disseminate its “brain-clearing” imaging technology to the neuroscience community.

As more researchers try out this innovative approach, one can only imagine how much more data will be generated to enhance our understanding of how the brain functions in health and disease. That is what will be truly spectacular for everyone working on new and better ways to help people suffering from Alzheimer’s disease, Parkinson’s disease, schizophrenia, autism, epilepsy, traumatic brain injury, depression, and so many other neurological and psychiatric disorders.

Wishing all of you a happy and healthy July Fourth!

Links:

Brain Research Through Advancing Innovative Neurotechnologies® (BRAIN) Initiative (NIH)

NIH National Research Service Award

Medical Scientist Training Program (National Institute of General Medical Sciences/NIH)

Small Business Innovation Research and Small Business Technology Transfer (NIH)

Translucence Biosystems (Irvine, CA)

Sunil Gandhi (University of California, Irvine)

Ricardo Azevedo (University of California, Irvine)

Video: iDISCO-cleared whole brain from a Thy1-GFP mouse (Translucence Biosystems)

Show Us Your BRAINs! Photo & Video Contest (BRAIN Initiative/NIH)

NIH Support: National Institute of Mental Health; National Eye Institute


Finding Better Ways to Image the Retina


Two light microscopy fields of the retina, showing small blue dots (rods) surrounding larger orange dots (cones)
Credit: Johnny Tam, National Eye Institute, NIH

Every day, all around the world, eye care professionals are busy performing dilated eye exams. By looking through a patient’s widened pupil, they can view the retina—the postage stamp-sized tissue lining the back of the inner eye—and look for irregularities that may signal the development of vision loss.

The great news is that, thanks to research, retinal imaging just keeps getting better and better. The images above, which show the same cells viewed with two different microscopic techniques, provide good examples of how tweaking existing approaches can significantly improve our ability to visualize the retina’s two types of light-sensitive neurons: rod and cone cells.

Specifically, these images show an area of the outer retina, which is the part of the tissue that’s observed during a dilated eye exam. Thanks to colorization and other techniques, a viewer can readily distinguish between the light-sensing, color-detecting cone cells (orange) and the much smaller, low-light-sensing rod cells (blue).

These high-res images come from Johnny Tam, a researcher with NIH’s National Eye Institute. Working with Alfredo Dubra, Stanford University, Palo Alto, CA, Tam and his team figured out how to limit light distortion of the rod cells. The key was illuminating the eye using less light, provided as a halo instead of the usual solid, circular beam.

But the researchers’ solution hit a temporary snag when the halo reflected from the rods and cones created another undesirable ring of light. To block it out, Tam’s team introduced a tiny pinhole, called a sub-Airy disk. Along with the use of adaptive optics technology [1] to correct for other distortions of light, the scientists were excited to see such a clear view of individual rods and cones. They published their findings recently in the journal Optica [2].

The resolution produced using these techniques is so much improved (33 percent better than with current methods) that it’s even possible to visualize the tiny inner segments of both rods and cones. In the cones, for example, these inner segments help direct light coming into the eye to other, photosensitive parts that absorb single photons of light. The light is then converted into electrical signals that stream to the brain’s visual centers in the occipital cortex, which makes it possible for us to experience vision.
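For a rough sense of scale, a quick back-of-the-envelope calculation using the Rayleigh criterion (r = 0.61 × wavelength / NA) shows what a 33 percent resolution gain can mean in micrometers. The wavelength and numerical aperture below are illustrative assumptions, not values taken from the paper, and "33 percent better" is read here as dividing the resolvable distance by 1.33.

```python
# Illustrative optics arithmetic; the input values are assumptions.
wavelength_um = 0.79      # near-infrared imaging light, in micrometers (assumed)
numerical_aperture = 0.2  # rough NA of a dilated human eye (assumed)

# Rayleigh criterion: smallest resolvable separation between two points.
rayleigh_um = 0.61 * wavelength_um / numerical_aperture
improved_um = rayleigh_um / 1.33  # one reading of "33 percent better"

print(f"baseline ~{rayleigh_um:.2f} um, improved ~{improved_um:.2f} um")
```

On these assumed numbers, the gain is the difference between barely separating neighboring photoreceptors and resolving structure within them, which is consistent with the inner segments becoming visible.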

Tam and team are currently working with physician-scientists in the NIH Clinical Center to image the retinas of people with a variety of retinal diseases, including age-related macular degeneration (AMD), a leading cause of vision loss in older adults. These research studies are ongoing, but offer hopeful possibilities for safe and non-intrusive monitoring of individual rods and cones over time, as well as across disease types. That’s obviously good news for patients. Plus it will help scientists understand how a rod or cone cell stops working, as well as more precisely test the effects of gene therapy and other experimental treatments aimed at restoring vision.

References:

[1] Noninvasive imaging of the human rod photoreceptor mosaic using a confocal adaptive optics scanning ophthalmoscope. Dubra A, Sulai Y, Norris JL, Cooper RF, Dubis AM, Williams DR, Carroll J. Biomed Opt Express. 2011 Jul 1;2(7):1864-76.

[2] In-vivo sub-diffraction adaptive optics imaging of photoreceptors in the human eye with annular pupil illumination and sub-Airy detection. Rongwen L, Aguilera N, Liu T, Liu J, Giannini JP, Li J, Bower AJ, Dubra A, Tam J. Optica. 2021;8(3):333-343. https://doi.org/10.1364/OPTICA.414206

Links:

Get a Dilated Eye Exam (National Eye Institute/NIH)

How the Eyes Work (NEI)

Eye Health Data and Statistics (NEI)

Tam Lab (NEI)

Dubra Lab (Stanford University, Palo Alto, CA)

NIH Support: National Eye Institute

