

A Real-Time Look at Value-Based Decision Making


All of us make many decisions every day. For most things, such as which jacket to wear or where to grab a cup of coffee, there’s usually no right answer, so we often decide using values rooted in our past experiences. Now, neuroscientists have identified the part of the mammalian brain that stores information essential to such value-based decision making.

Researchers zeroed in on this particular brain region, known as the retrosplenial cortex (RSC), by analyzing movies—including the clip shown about 32 seconds into this video—that captured in real time what goes on in the brains of mice as they make decisions. Each white circle is a neuron, and the flickers of light reflect their activity: the brighter the light, the more active the neuron at that point in time.

All told, the NIH-funded team, led by Ryoma Hattori and Takaki Komiyama, University of California San Diego, La Jolla, made recordings of more than 45,000 neurons across six regions of the mouse brain [1]. Neural activity isn’t usually visible. But, in this case, researchers used mice that had been genetically engineered so that their neurons, when activated, expressed a protein that glowed.
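For readers curious how that glow gets turned into a measure of activity, the snippet below computes ΔF/F, a standard way fluorescence traces are normalized so that brighter flickers translate into larger activity values. It is only an illustrative sketch with made-up numbers; the delta_f_over_f helper is not drawn from the paper’s analysis pipeline.

```python
import numpy as np

# A common way to quantify activity from fluorescence: express each time
# point as a fractional change over the neuron's resting (baseline) glow.
def delta_f_over_f(trace, baseline_percentile=10):
    f0 = np.percentile(trace, baseline_percentile)  # estimate baseline fluorescence
    return (trace - f0) / f0

# Synthetic example: a noisy baseline with two brief bursts of activity.
rng = np.random.default_rng(0)
raw = 100 + rng.normal(0, 1, 600)
raw[200:220] += 40   # strong burst
raw[450:470] += 25   # weaker burst
dff = delta_f_over_f(raw)
print(f"peak dF/F: {dff.max():.2f}")  # brighter flickers give larger dF/F values
```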

Their system was also set up to encourage the mice to make value-based decisions, including choosing between two drinking tubes, each with a different probability of delivering water. During this decision-making process, the RSC proved to be the region of the brain where neurons persistently lit up, reflecting how the mouse evaluated one option over the other.
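To make “value-based” concrete, here is a toy simulation of such a two-tube task: a software “mouse” keeps a running value estimate for each tube and nudges it after every trial. This is only a minimal delta-rule learner under assumed water-delivery probabilities, not the behavioral model used in the study.

```python
import math
import random

def simulate_two_tube_task(n_trials=200, p_water=(0.7, 0.2),
                           learning_rate=0.1, temperature=0.2):
    """Toy value learner: pick a tube, see whether water arrives, and nudge
    that tube's value estimate toward the outcome (a simple delta rule)."""
    values = [0.5, 0.5]  # starting value estimate for each drinking tube
    for _ in range(n_trials):
        # Softmax choice: the higher-valued tube is picked more often,
        # but the other tube is still sampled occasionally.
        weights = [math.exp(v / temperature) for v in values]
        choice = random.choices([0, 1], weights=weights)[0]
        reward = 1.0 if random.random() < p_water[choice] else 0.0
        values[choice] += learning_rate * (reward - values[choice])
    return values

# After enough trials the value estimates drift toward the true
# water-delivery probabilities of the two tubes.
print(simulate_two_tube_task())
```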

The new discovery, described in the journal Cell, comes as something of a surprise to neuroscientists because the RSC hadn’t previously been implicated in value-based decisions. To gather additional evidence, the researchers turned to optogenetics, a technique that enabled them to use light to inactivate neurons in the RSC of living animals. These studies confirmed that, with the RSC turned off, the mice couldn’t retrieve value information based on past experience.

The researchers note that the RSC is heavily interconnected with other key brain regions, including those involved in learning, memory, and controlling movement. This indicates that the RSC may be well situated to serve as a hub for storing value information, allowing it to be accessed and acted upon when it is needed.

The findings are yet another amazing example of how advances coming out of the NIH-led Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative are revolutionizing our understanding of the brain. In the future, the team hopes to learn more about how the RSC stores this information and sends it to other parts of the brain. They note that it will also be important to explore how activity in this brain area may be altered in schizophrenia, dementia, substance abuse, and other conditions that may affect decision-making abilities. It will also be interesting to see how this brain area develops during childhood and adolescence.

Reference:

[1] Area-Specificity and Plasticity of History-Dependent Value Coding During Learning. Hattori R, Danskin B, Babic Z, Mlynaryk N, Komiyama T. Cell. 2019 Jun 13;177(7):1858-1872.e15.

Links:

Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative (NIH)

Komiyama Lab (UCSD, La Jolla)

NIH Support: National Institute of Neurological Disorders and Stroke; National Eye Institute; National Institute on Deafness and Other Communication Disorders


Seeing the Cytoskeleton in a Whole New Light


It’s been 25 years since researchers coaxed a bacterium to synthesize an unusual jellyfish protein that fluoresced bright green when irradiated with blue light. Within months, another group had fused this small green fluorescent protein (GFP) to larger proteins, revealing their whereabouts inside the cell like never before.

To mark the anniversary of this Nobel Prize-winning work and show off the rainbow of color that is now being used to illuminate the inner workings of the cell, the American Society for Cell Biology (ASCB) recently held its Green Fluorescent Protein Image and Video Contest. Over the next few months, my blog will feature some of the most eye-catching entries—starting with this video that will remind those who grew up in the 1980s of those plasma balls that, when touched, light up with a simulated bolt of colorful lightning.

This video, which took third place in the ASCB contest, shows the cytoskeleton of a frequently studied human breast cancer cell line. The cytoskeleton includes protein structures called microtubules, made visible here by fluorescently tagging a protein called doublecortin (orange). Filaments of another protein called actin (purple) are seen here as the fine meshwork in the cell periphery.

The cytoskeleton plays an important role in giving cells shape and structure. But it also allows a cell to move and divide. Indeed, the motion in this video shows that the complex network of cytoskeletal components is constantly being organized and reorganized in ways that researchers are still working hard to understand.

Jeffrey van Haren, Erasmus University Medical Center, Rotterdam, the Netherlands, shot this video using the tools of fluorescence microscopy when he was a postdoctoral researcher in the NIH-funded lab of Torsten Wittmann, University of California, San Francisco.

All good movies have unusual plot twists, and that’s truly the case here. Though the researchers are using a breast cancer cell line, their primary interest is in the doublecortin protein, which is normally found in association with microtubules in the developing brain. In fact, in people with mutations in the gene that encodes this protein, neurons fail to migrate properly during development. The resulting condition, called lissencephaly, leads to epilepsy, cognitive disability, and other neurological problems.

Cancer cells don’t usually express doublecortin. But, in some of their initial studies, the Wittmann team thought it would be much easier to visualize and study doublecortin in these cancer cells than in neurons. And so, the researchers tagged doublecortin with an orange fluorescent protein, engineered its expression in the breast cancer cells, and van Haren started taking pictures.

This movie and others helped lead to the intriguing discovery that doublecortin binds to microtubules in some places and not others [1]. It appears to recognize and bind to certain microtubule geometries. The researchers have since moved on to studies in cultured neurons.

This video is certainly a good example of the illuminating power of fluorescent proteins: enabling us to see cells and their cytoskeletons as incredibly dynamic, constantly moving entities. And, if you’d like to see more where this came from, consider visiting van Haren’s Twitter gallery of microtubule videos.

Reference:

[1] Doublecortin is excluded from growing microtubule ends and recognizes the GDP-microtubule lattice. Ettinger A, van Haren J, Ribeiro SA, Wittmann T. Curr Biol. 2016 Jun 20;26(12):1549-1555.

Links:

Lissencephaly Information Page (National Institute of Neurological Disorders and Stroke/NIH)

Wittmann Lab (University of California, San Francisco)

Green Fluorescent Protein Image and Video Contest (American Society for Cell Biology, Bethesda, MD)

NIH Support: National Institute of General Medical Sciences


A Neuronal Light Show


Credit: Chen X, Cell, 2019

These colorful lights might look like a video vignette from one of the spectacular evening light shows taking place this holiday season. But they actually aren’t. These lights are illuminating the way to a much fuller understanding of the mammalian brain.

The video features a new research method called BARseq (Barcoded Anatomy Resolved by Sequencing). Created by a team of NIH-funded researchers led by Anthony Zador, Cold Spring Harbor Laboratory, NY, BARseq enables scientists to map in a matter of weeks the location of thousands of neurons in the mouse brain with greater precision than has ever been possible before.

How does it work? With BARseq, researchers generate uniquely identifying RNA barcodes and then tag each individual neuron within brain tissue with one. As reported recently in the journal Cell, those barcodes allow them to keep track of the location of an individual cell amid millions of neurons [1]. This also enables researchers to map the tangled paths of individual neurons from one region of the mouse brain to the next.

The video shows how the researchers read the barcodes. Each twinkling light is a barcoded neuron within a thin slice of mouse brain tissue. The changing colors from frame to frame correspond to one of the four letters, or chemical bases, in RNA (A=purple, G=blue, U=yellow, and C=white). A neuron that flashes blue, purple, yellow, white is tagged with a barcode that reads GAUC, while yellow, white, white, white is UCCC.
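To make the decoding concrete, here is a minimal Python sketch of the color-to-base lookup just described. The mapping comes straight from the post; the color names and the read_barcode helper are purely illustrative and are not part of the BARseq software.

```python
# Color-to-base mapping described above: A=purple, G=blue, U=yellow, C=white.
COLOR_TO_BASE = {"purple": "A", "blue": "G", "yellow": "U", "white": "C"}

def read_barcode(frame_colors):
    """Translate one neuron's sequence of per-frame colors into its RNA barcode."""
    return "".join(COLOR_TO_BASE[color] for color in frame_colors)

print(read_barcode(["blue", "purple", "yellow", "white"]))   # -> GAUC
print(read_barcode(["yellow", "white", "white", "white"]))   # -> UCCC
```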

By sequencing and reading the barcodes to distinguish among seemingly identical cells, the researchers mapped the connections of more than 3,500 neurons in a mouse’s auditory cortex, a part of the brain involved in hearing. In fact, they report they’re now able to map tens of thousands of individual neurons in a mouse in a matter of weeks.

What makes BARseq even better than the team’s previous mapping approach, called MAPseq, is its ability to read the barcodes at their original location in the brain tissue [2]. As a result, they can produce maps with much finer resolution. It’s also possible to maintain other important information about each mapped neuron’s identity and function, including the expression of its genes.

Zador reports that they’re continuing to use BARseq to produce maps of other essential areas of the mouse brain with more detail than had previously been possible. Ultimately, these maps will provide a firm foundation for better understanding of human thought, consciousness, and decision-making, along with how such mental processes get altered in conditions such as autism spectrum disorder, schizophrenia, and depression.

Here’s wishing everyone a safe and happy holiday season. It’s been a fantastic year in science, and I look forward to bringing you more cool NIH-supported research in 2020!

References:

[1] High-Throughput Mapping of Long-Range Neuronal Projection Using In Situ Sequencing. Chen X, Sun YC, Zhan H, Kebschull JM, Fischer S, Matho K, Huang ZJ, Gillis J, Zador AM. Cell. 2019 Oct 17;179(3):772-786.e19.

[2] High-Throughput Mapping of Single-Neuron Projections by Sequencing of Barcoded RNA. Kebschull JM, Garcia da Silva P, Reid AP, Peikon ID, Albeanu DF, Zador AM. Neuron. 2016 Sep 7;91(5):975-987.

Links:

Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative (NIH)

Zador Lab (Cold Spring Harbor Laboratory, Cold Spring Harbor, NY)

NIH Support: National Institute of Neurological Disorders and Stroke; National Institute on Drug Abuse; National Cancer Institute


3D Neuroscience at the Speed of Life


This fluorescent worm makes for much more than a mesmerizing video. It showcases a significant technological leap forward in our ability to capture in real time the firing of individual neurons in a living, freely moving animal.

As this Caenorhabditis elegans worm undulates, 113 neurons throughout its brain and body (green/yellow spots) get brighter and darker as each neuron activates and deactivates. In fact, about halfway through the video, you can see streaks tracking the positions of individual neurons (blue/purple-colored lines) from one frame to the next. Until now, it would have been technologically impossible to capture this “speed of life” with such clarity.
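Those tracking streaks boil down to linking each neuron’s position in one frame to its position in the next. Below is a minimal nearest-neighbor sketch of that idea; the link_frames helper and its max_jump threshold are hypothetical and far simpler than the tracking actually used with SCAPE data.

```python
import numpy as np

def link_frames(prev_xy, curr_xy, max_jump=5.0):
    """Greedily match each centroid in the previous frame to its nearest
    centroid in the current frame, skipping implausibly large jumps."""
    links = {}
    for i, point in enumerate(prev_xy):
        dists = np.linalg.norm(curr_xy - point, axis=1)  # distance to every current centroid
        j = int(np.argmin(dists))
        if dists[j] <= max_jump:
            links[i] = j  # neuron i in the previous frame continues as neuron j
    return links

# Example: three neurons drift slightly between two consecutive frames.
prev_frame = np.array([[10.0, 12.0], [40.0, 8.0], [25.0, 30.0]])
curr_frame = np.array([[10.5, 12.3], [39.2, 8.4], [25.1, 29.5]])
print(link_frames(prev_frame, curr_frame))  # {0: 0, 1: 1, 2: 2}
```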

With funding from the NIH-led Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative, Elizabeth Hillman at Columbia University’s Zuckerman Institute, New York, has pioneered the pairing of a 3D live-imaging microscope with an ultra-fast camera. This pairing, showcased above, is a technique called Swept Confocally Aligned Planar Excitation (SCAPE) microscopy.

Since first demonstrating SCAPE in February 2015 [1], Hillman and her team have worked hard to improve, refine, and expand the approach. Recently, they used SCAPE 1.0 to image how proprioceptive neurons in fruit-fly larvae sense body position while crawling. Now, as described in Nature Methods, they introduce SCAPE “2.0,” with boosted resolution and a much faster camera—enabling 3D imaging at speeds hundreds of times faster than conventional microscopes [2]. To track a very wiggly worm, the researchers image their target 25 times a second!

As with the first-generation SCAPE, version 2.0 uses a scanning mirror to sweep a slanted sheet of light across a sample. This same mirror redirects light coming from the illuminated plane to focus onto a stationary high-speed camera. The approach lets SCAPE capture 3D images at very high speeds, while causing far less photobleaching than conventional point-scanning microscopes, reducing the sample damage that often occurs during time-lapse microscopy.

Like SCAPE 1.0, the upgraded 2.0 system uses only a single, stationary objective lens, so it doesn’t need to hold, move, or disturb a sample during imaging. This flexibility enables scientists to use SCAPE in a wide range of experiments in which they can present stimuli or probe an animal’s behavior, all while imaging how the underlying cells drive and reflect those behaviors.

The SCAPE 2.0 paper shows the system’s biological versatility by also recording the beating heart of a zebrafish embryo at record-breaking speeds. In addition, SCAPE 2.0 can rapidly image large fixed, cleared, and expanded tissues such as the retina, brain, and spinal cord—enabling tracing of the shape and connectivity of cellular circuits. Hillman and her team are dedicated to exporting their technology; they provide guidance and a parts list for SCAPE 2.0 so that researchers can build their own version using inexpensive off-the-shelf parts.

Watching worms wriggling around may remind us of middle-school science class. But to neuroscientists, these images represent progress toward understanding the nervous system in action, literally at the speed of life!

References:

[1] Swept confocally-aligned planar excitation (SCAPE) microscopy for high-speed volumetric imaging of behaving organisms. Bouchard MB, Voleti V, Mendes CS, Lacefield C, et al. Nat Photonics. 2015;9(2):113-119.

[2] Real-time volumetric microscopy of in vivo dynamics and large-scale samples with SCAPE 2.0. Voleti V, Patel KB, Li W, Campos CP, et al. Nat Methods. 2019 Sep 27;16:1054-1062.

Links:

Using Research Organisms to Study Health and Disease (National Institute of General Medical Sciences/NIH)

The Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative (NIH)

Hillman Lab (Columbia University, New York)

NIH Support: National Institute of Neurological Disorders and Stroke; National Heart, Lung, and Blood Institute


Giving Thanks for Biomedical Research


This Thanksgiving, Americans have an abundance of reasons to be grateful—loving family and good food often come to mind. Here’s one more to add to the list: exciting progress in biomedical research. To check out some of that progress, I encourage you to watch this short video, produced by NIH’s National Institute of Biomedical Imaging and Engineering (NIBIB), that showcases a few cool gadgets and devices now under development.

Among the technological innovations is a wearable ultrasound patch for monitoring blood pressure [1]. The patch was developed by a research team led by Sheng Xu and Chonghe Wang, University of California San Diego, La Jolla. When this small patch is worn on the neck, it measures blood pressure in the central arteries and veins by emitting continuous ultrasound waves.

Other great technologies featured in the video include:

Laser-Powered Glucose Meter. Peter So and Jeon Woong Kang, researchers at Massachusetts Institute of Technology (MIT), Cambridge, and their collaborators at MIT and the University of Missouri, Columbia, have developed a laser-powered device that measures glucose through the skin [2]. They report that this device potentially could provide accurate, continuous glucose monitoring for people with diabetes without the painful finger pricks.

15-Second Breast Scanner. Lihong Wang, a researcher at California Institute of Technology, Pasadena, and colleagues have combined laser light and sound waves to create a rapid, noninvasive, painless breast scan. It can be performed while a woman rests comfortably on a table without the radiation or compression of a standard mammogram [3].

White Blood Cell Counter. Carlos Castro-Gonzalez, then a postdoc at Massachusetts Institute of Technology, Cambridge, and colleagues developed a portable, noninvasive home monitor to count white blood cells as they pass through capillaries inside a finger [4]. The test, which takes about 1 minute, can be carried out at home, and will help those undergoing chemotherapy to determine whether their white cell count has dropped too low for the next dose, avoiding the risk of treatment-compromising infections.

Neural-Enabled Prosthetic Hand (NEPH). Ranu Jung, a researcher at Florida International University, Miami, and colleagues have developed a prosthetic hand that restores a sense of touch, grip, and finger control for amputees [5]. NEPH is a fully implantable, wirelessly controlled system that directly stimulates nerves. More than two years ago, the FDA approved a first-in-human trial of the NEPH system.

If you want to check out more taxpayer-supported innovations, take a look at NIBIB’s two previous videos from 2013 and 2018. As always, let me offer thanks to you from the NIH family—and from all Americans who care about the future of their health—for your continued support. Happy Thanksgiving!

References:

[1] Monitoring of the central blood pressure waveform via a conformal ultrasonic device. Wang C, Li X, Hu H, Zhang L, Huang Z, Lin M, Zhang Z, Yun Z, Huang B, Gong H, Bhaskaran S, Gu Y, Makihata M, Guo Y, Lei Y, Chen Y, Wang C, Li Y, Zhang T, Chen Z, Pisano AP, Zhang L, Zhou Q, Xu S. Nature Biomedical Engineering. September 2018, 687-695.

[2] Evaluation of accuracy dependence of Raman spectroscopic models on the ratio of calibration and validation points for non-invasive glucose sensing. Singh SP, Mukherjee S, Galindo LH, So PTC, Dasari RR, Khan UZ, Kannan R, Upendran A, Kang JW. Anal Bioanal Chem. 2018 Oct;410(25):6469-6475.

[3] Single-breath-hold photoacoustic computed tomography of the breast. Lin L, Hu P, Shi J, Appleton CM, Maslov K, Li L, Zhang R, Wang LV. Nat Commun. 2018 Jun 15;9(1):2352.

[4] Non-invasive detection of severe neutropenia in chemotherapy patients by optical imaging of nailfold microcirculation. Bourquard A, Pablo-Trinidad A, Butterworth I, Sánchez-Ferro Á, Cerrato C, Humala K, Fabra Urdiola M, Del Rio C, Valles B, Tucker-Schwartz JM, Lee ES, Vakoc BJ, Padera TP, Ledesma-Carbayo MJ, Chen YB, Hochberg EP, Gray ML, Castro-González C. Sci Rep. 2018 Mar 28;8(1):5301.

[5] Enhancing Sensorimotor Integration Using a Neural Enabled Prosthetic Hand System

Links:

Sheng Xu Lab (University of California San Diego, La Jolla)

So Lab (Massachusetts Institute of Technology, Cambridge)

Lihong Wang (California Institute of Technology, Pasadena)

Video: Lihong Wang: Better Cancer Screenings

Carlos Castro-Gonzalez (Madrid-MIT M + Visión Consortium, Cambridge, MA)

Video: Carlos Castro-Gonzalez (YouTube)

Ranu Jung (Florida International University, Miami)

Video: New Prosthetic System Restores Sense of Touch (Florida International)

NIH Support: National Institute of Biomedical Imaging and Bioengineering; National Institute of Neurological Disorders and Stroke; National Heart, Lung, and Blood Institute; National Cancer Institute; Common Fund

