

What a Memory Looks Like


Engram Image
Credit: Stephanie Grella, Ramirez Group, Boston University

Your brain has the capacity to store a lifetime of memories, covering everything from the name of your first pet to your latest computer password. But what does a memory actually look like? Thanks to some very cool neuroscience, you are looking at one.

The physical manifestation of a memory, or engram, consists of clusters of brain cells active when a specific memory was formed. Your brain’s hippocampus plays an important role in storing and retrieving these memories. In this cross-section of a mouse hippocampus, imaged by the lab of NIH-supported neuroscientist Steve Ramirez, at Boston University, cells belonging to an engram are green, while blue indicates those not involved in forming the memory.

When a memory is recalled, the cells within an engram reactivate and turn on, to varying degrees, other neural circuits (e.g., sight, sound, smell, emotions) that were active when that memory was recorded. It’s not clear how these brain-wide connections are made. But it appears that engrams are the gatekeepers that mediate memory.

The story of this research dates back several years, when Ramirez helped develop a system that made it possible to image engrams by tagging cells in the mouse brain with fluorescent dyes. Using an innovative technology developed by other researchers, called optogenetics, Ramirez’s team then discovered it could shine light onto a collection of hippocampal neurons storing a specific memory and reactivate the sensation associated with the memory [1].

Ramirez has since gone on to show that, at least in mice, optogenetics can be used to trick the brain into creating a false memory [2]. From this work, he has also come to the interesting and somewhat troubling conclusion that the most accurate memories appear to be the ones that are never recalled. The reason: the mammalian brain edits—and slightly changes—memories whenever they are accessed.

All of the above suggested to Ramirez that, given its tremendous plasticity, the brain may possess the power to downplay a traumatic memory or to boost a pleasant recollection. Toward that end, Ramirez’s team is now using its mouse system to explore ways of suppressing one engram while enhancing another [3].

For Ramirez, though, the ultimate goal is to develop brain-wide maps that chart all of the neural networks involved in recording, storing, and retrieving memories. He recently was awarded an NIH Director’s Transformative Research Award to begin the process. Such maps will be invaluable in determining how stress affects memory, as well as what goes wrong in dementia and other devastating memory disorders.

References:

[1] Optogenetic stimulation of a hippocampal engram activates fear memory recall. Liu X, Ramirez S, Pang PT, Puryear CB, Govindarajan A, Deisseroth K, Tonegawa S. Nature. 2012 Mar 22;484(7394):381-385.

[2] Creating a false memory in the hippocampus. Ramirez S, Liu X, Lin PA, Suh J, Pignatelli M, Redondo RL, Ryan TJ, Tonegawa S. Science. 2013 Jul 26;341(6144):387-391.

[3] Artificially Enhancing and Suppressing Hippocampus-Mediated Memories. Chen BK, Murawski NJ, Cincotta C, McKissick O, Finkelstein A, Hamidi AB, Merfeld E, Doucette E, Grella SL, Shpokayte M, Zaki Y, Fortin A, Ramirez S. Curr Biol. 2019 Jun 3;29(11):1885-1894.

Links:

The Ramirez Group (Boston University, MA)

Ramirez Project Information (Common Fund/NIH)

NIH Director’s Early Independence Award (Common Fund)

NIH Director’s Transformative Research Award (Common Fund)

NIH Support: Common Fund


The Amazing Brain: Making Up for Lost Vision


Recently, I’ve highlighted just a few of the many amazing advances coming out of the NIH-led Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative. And for our grand finale, I’d like to share a cool video that reveals how this revolutionary effort to map the human brain is opening up once-unimaginable possibilities for helping people with disabilities, such as vision loss.

This video, produced by Jordi Chanovas and narrated by Stephen Macknik, State University of New York Downstate Health Sciences University, Brooklyn, outlines a new strategy aimed at restoring loss of central vision in people with age-related macular degeneration (AMD), a leading cause of vision loss among people age 50 and older. The researchers’ ultimate goal is to give such people the ability to see the faces of their loved ones or possibly even read again.

In the innovative approach you see here, neuroscientists aren’t even trying to repair the part of the eye destroyed by AMD: the light-sensitive retina. Instead, they are attempting to recreate the light-recording function of the retina within the brain itself.

How is that possible? Normally, the retina streams visual information continuously to the brain’s primary visual cortex, which receives the information and processes it into the vision that allows you to read these words. In folks with AMD-related vision loss, even though many cells in the center of the retina have stopped streaming, the primary visual cortex remains fully capable of receiving and processing visual information.

About five years ago, Macknik and his collaborator Susana Martinez-Conde, also at Downstate, wondered whether it might be possible to circumvent the eyes and stream an alternative source of visual information to the brain’s primary visual cortex, thereby restoring vision in people with AMD. They sketched out some possibilities and settled on an innovative system that they call OBServ.

Among the vital components of this experimental system are tiny, implantable neuro-prosthetic recording devices. Created in the Macknik and Martinez-Conde labs, each 1-centimeter device is powered by induction coils similar to those in the cochlear implants used to help people with profound hearing loss. The researchers propose to surgically implant two of these devices in the rear of the brain, where they will orchestrate the visual process.

For technical reasons, the restoration of central vision will likely be partial, with the window of vision spanning only about the size of one-third of an adult thumbnail held at arm’s length. But researchers think that would be enough central vision for people with AMD to regain some of their lost independence.
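
For a rough sense of scale, here’s the back-of-the-envelope arithmetic behind that thumbnail comparison. The thumbnail width and arm length below are common rules of thumb assumed for illustration, not figures from the researchers.

import math

# Hypothetical, ballpark values: an adult thumbnail ~1.8 cm wide held ~60 cm away.
thumbnail_cm, arm_cm = 1.8, 60.0

# Visual angle subtended by the full thumbnail, then one-third of it.
full_deg = math.degrees(2 * math.atan(thumbnail_cm / (2 * arm_cm)))
window_deg = full_deg / 3

print(f"Full thumbnail: ~{full_deg:.1f} degrees of visual angle")
print(f"Restored window: ~{window_deg:.1f} degrees")   # roughly half a degree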

As demonstrated in this video from the BRAIN Initiative’s “Show Us Your Brain!” contest, here’s how researchers envision the system would ultimately work (a simplified sketch of the pipeline follows the list):

• A person with vision loss puts on a specially designed set of glasses. Each lens contains two cameras: one to record visual information in the person’s field of vision; the other to track the person’s eye movements, which are guided by residual peripheral vision.
• The eyeglass cameras wirelessly stream the visual information they have recorded to two neuro-prosthetic devices implanted in the rear of the brain.
• The neuro-prosthetic devices process and project this information onto a specific set of excitatory neurons in the brain’s hard-wired visual pathway. Researchers have previously used genetic engineering to turn these neurons into surrogate photoreceptor cells, which function much like those in the eye’s retina.
• The surrogate photoreceptor cells in the brain relay visual information to the primary visual cortex for processing.
• All the while, the neuro-prosthetic devices perform quality control of the visual signals, calibrating them to optimize their contrast and clarity.

While this might sound like the stuff of science fiction (and this particular application still lies several years in the future), the OBServ project is now conceivable thanks to decades of advances in the fields of neuroscience, vision, bioengineering, and bioinformatics research. All this hard work has made the primary visual cortex, with its switchboard-like wiring system, among the brain’s best-understood regions.

OBServ also has implications that extend far beyond vision loss. This project provides hope that once other parts of the brain are fully mapped, it may be possible to design equally innovative systems to help make life easier for people with other disabilities and conditions.

Links:

Age-Related Macular Degeneration (National Eye Institute/NIH)

Macknik Lab (SUNY Downstate Health Sciences University, Brooklyn)

Martinez-Conde Laboratory (SUNY Downstate Health Sciences University)

Show Us Your Brain! (BRAIN Initiative/NIH)

Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative (NIH)

NIH Support: BRAIN Initiative


‘Nanoantennae’ Make Infrared Vision Possible


Nanoparticles for infrared vision
Caption: Nanoparticles (green) bind to light-sensing rod (violet) and cone (red) cells in the mouse retina. Dashed lines (white) highlight cells’ inner/outer segments.
Credit: Ma et al. Cell, 2019

Infrared vision often brings to mind night-vision goggles that allow soldiers to see in the dark, like you might have seen in the movie Zero Dark Thirty. But those bulky goggles may not be needed one day to scope out enemy territory or just the usual things that go bump in the night. In a dramatic advance that brings together materials science and the mammalian visual system, researchers have just shown that specialized lab-made nanoparticles applied to the retina, the thin tissue lining the back of the eye, can extend natural vision to see in infrared light.

The researchers showed in mouse studies that their specially crafted nanoparticles bind to the retina’s light-sensing cells, where they act like “nanoantennae” for the animals to see and recognize shapes in infrared—day or night—for at least 10 weeks. Even better, the mice maintained their normal vision the whole time and showed no adverse health effects. In fact, some of the mice are still alive and well in the lab, although their ability to see in infrared may have worn off.

When light enters the eyes of mice, humans, or any mammal, light-sensing cells in the retina absorb wavelengths within the range of visible light. (That’s roughly from 400 to 700 nanometers.) While visible light includes all the colors of the rainbow, it actually accounts for only a fraction of the full electromagnetic spectrum. Left out are the longer wavelengths of infrared light. That makes infrared light invisible to the naked eye.

In the study reported in the journal Cell, an international research team including Gang Han, University of Massachusetts Medical School, Worcester, wanted to find a way for mammalian light-sensing cells to absorb and respond to the longer wavelengths of infrared [1]. It turns out Han’s team had just the thing to do it.

His NIH-funded team was already working on the nanoparticles now under study for application in a field called optogenetics—the use of light to control living brain cells [2]. Optogenetics normally involves the stimulation of genetically modified brain cells with blue light. The trouble is that blue light doesn’t penetrate brain tissue well.

That’s where Han’s so-called upconversion nanoparticles (UCNPs) came in. They attempt to get around the normal limitations of optogenetic tools by incorporating certain rare earth metals. Those metals have a natural ability to absorb lower energy infrared light and convert it into higher energy visible light (hence the term upconversion).
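
As a rough illustration of why this is called “upconversion,” consider the photon energies involved. The wavelengths used below (980-nanometer near-infrared in, roughly 535-nanometer green out) are typical for this class of rare-earth nanoparticle, not figures quoted in this post.

PLANCK = 6.626e-34      # Planck constant, J*s
LIGHT_SPEED = 2.998e8   # speed of light, m/s
EV = 1.602e-19          # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    """Energy of a single photon at the given wavelength, in electron-volts."""
    return PLANCK * LIGHT_SPEED / (wavelength_nm * 1e-9) / EV

ir_in = photon_energy_ev(980)      # ~1.27 eV per near-infrared photon
green_out = photon_energy_ev(535)  # ~2.32 eV per green photon

# A green photon carries nearly twice the energy of a 980 nm photon, so the
# particle must pool the energy of two or more absorbed infrared photons before
# it can emit one visible photon -- that energy pooling is the "upconversion."
print(f"{ir_in:.2f} eV in, {green_out:.2f} eV out (ratio {green_out / ir_in:.2f})")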

But could those UCNPs also serve as miniature antennae in the eye, receiving infrared light and emitting readily detected visible light? To find out in mouse studies, the researchers injected a dilute solution containing UCNPs into the back of the eye. Such sub-retinal injections are used routinely by ophthalmologists to treat people with various eye problems.

These UCNPs were modified with a protein that allowed them to stick to the retina’s light-sensing cells. Once bound, because of the way UCNPs absorb and emit light energy, the particles should make otherwise invisible infrared light visible as green light.

Their hunch proved correct, as mice treated with the UCNP solution began seeing in infrared! How could the researchers tell? First, they shined infrared light into the eyes of the mice. Their pupils constricted in response just as they would with visible light. Then the treated mice aced a series of maneuvers in the dark that their untreated counterparts couldn’t manage. The treated animals also could rely on infrared signals to make out shapes.

The research is not only fascinating, but its findings may also have a wide range of intriguing applications. One could imagine using the technology to hide encrypted messages in infrared or to give people a temporary, built-in ability to see in complete darkness.

With some tweaks and continued research to confirm the safety of these nanoparticles, the system might also find use in medicine. For instance, the nanoparticles could potentially improve vision in those who can’t see certain colors. While such infrared vision technologies will take time to become more widely available, it’s a great example of how one area of science can cross-fertilize another.

References:

[1] Mammalian Near-Infrared Image Vision through Injectable and Self-Powered Retinal Nanoantennae. Ma Y, Bao J, Zhang Y, Li Z, Zhou X, Wan C, Huang L, Zhao Y, Han G, Xue T. Cell. 2019 Feb 27. [Epub ahead of print]

[2] Near-Infrared-Light Activatable Nanoparticles for Deep-Tissue-Penetrating Wireless Optogenetics. Yu N, Huang L, Zhou Y, Xue T, Chen Z, Han G. Adv Healthc Mater. 2019 Jan 11:e1801132.

Links:

Diagram of the Eye (National Eye Institute/NIH)

Infrared Waves (NASA)

Visible Light (NASA)

Han Lab (University of Massachusetts, Worcester)

NIH Support: National Institute of Mental Health; National Institute of General Medical Sciences


‘Tis the Season for Good Cheer


Whether it’s Rockefeller Center, the White House, or somewhere else across the land, ’tis the season to gather with neighbors for a communal holiday tree-lighting ceremony. But this festive image has more to do with those cups of cider in everyone’s hands than admiring the perfect Douglas fir. What looks like lights and branches are actually components of a high-resolution map from a part of the brain that controls thirst.

The map, drawn up from mouse studies, shows that when thirst arises, neurons activate a gene called c-fos (red)—lighting up the tree—indicating it’s time for a drink. In response, other neurons (green) direct additional parts of the brain to compensate by managing internal water levels. In a mouse that’s no longer thirsty, the tree would look almost all green.

This wiring map comes from a part of the brain called the hypothalamus, which is best known for its role in hunger, thirst, and energy balance. Thanks to powerful molecular tools from NIH’s Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative, Yuki Oka of the California Institute of Technology, Pasadena, and his team were able to draw detailed maps of the tree-shaped region, called the median preoptic nucleus (MnPO).

Using a technique called optogenetics, Oka’s team, led by Vineet Augustine, could selectively activate neurons in the MnPO [1]. By doing so, they could control a mouse’s thirst and trace the precise pathways that determine whether or not it drinks.

This holiday season, as you gather with loved ones, take a moment to savor the beautiful complexity of biology and the gift of human health. Happy holidays to all of you, and peace and joy into the new year!

Reference:

[1] Hierarchical neural architecture underlying thirst regulation. Augustine V, Gokce SK, Lee S, Wang B, Davidson TJ, Reimann F, Gribble F, Deisseroth K, Lois C, Oka Y. Nature. 2018 Mar 8;555(7695):204-209. 

Links:

Oka Lab (California Institute of Technology, Pasadena)

The BRAIN Initiative (NIH)

NIH Support: National Institute of Neurological Disorders and Stroke


Unlocking the Brain’s Memory Retrieval System


Memory Trace in Mouse Hippocampus

Credit: Sahay Lab, Massachusetts General Hospital, Boston

Play the first few bars of any widely known piece of music, be it The Star-Spangled Banner, Beethoven’s Fifth, or The Rolling Stones’ (I Can’t Get No) Satisfaction, and you’ll find that many folks can’t resist filling in the rest of the melody. That’s because the human brain thrives on completing familiar patterns. But, as we grow older, our pattern completion skills often become more error prone.

This image shows some of the neural wiring that controls pattern completion in the mammalian brain. Specifically, you’re looking at a cross-section of a mouse hippocampus that’s packed with dentate granule neurons and their signal-transmitting arms, called axons (light green). Note how the axons’ short, finger-like projections, called filopodia (bright green), are interacting with a neuron (red) to form a “memory trace” network. Functioning much like an online search engine, memory traces use bits of incoming information, like the first few notes of a song, to locate and pull up more detailed information, like the complete song, from the brain’s repository of memories in the cerebral cortex.
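
The search-engine analogy can be made concrete with a toy sketch of pattern completion. This is purely illustrative and assumes nothing about the Sahay lab’s actual models or data; it simply retrieves the stored item whose opening best matches a short cue.

# A handful of "stored memories," each indexed by its opening notes/words.
stored_memories = {
    "The Star-Spangled Banner": ["O", "say", "can", "you", "see"],
    "Beethoven's Fifth": ["da", "da", "da", "dum"],
    "(I Can't Get No) Satisfaction": ["I", "can't", "get", "no", "satisfaction"],
}

def complete(cue):
    """Return the stored memory whose opening best overlaps a partial cue."""
    def overlap(opening):
        return sum(a == b for a, b in zip(cue, opening))
    return max(stored_memories, key=lambda name: overlap(stored_memories[name]))

# A few notes are enough to pull up the whole item.
print(complete(["da", "da"]))           # -> Beethoven's Fifth
print(complete(["O", "say", "can"]))    # -> The Star-Spangled Banner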

