

New Microscope Technique Provides Real-Time 3D Views


Most of the “cool” videos shared on my blog are born of countless hours behind a microscope. Researchers must move a biological sample through a microscope’s focus, slowly acquiring hundreds of high-res 2D snapshots, one painstaking snap at a time. Afterwards, sophisticated computer software takes this ordered “stack” of images, calculates how the object would look from different perspectives, and displays the results as 3D views of life that can be streamed as short videos.

But this video is different. It was created by what’s called a multi-angle projection imaging system. This new optical device requires just a few camera snapshots and two mirrors to image a biological sample from multiple angles at once. Because the device eliminates the time-consuming process of acquiring individual image slices, it’s up to 100 times faster than current technologies and doesn’t require computer software to construct the movie. The kicker is that the video can be displayed in real time, which isn’t possible with existing image-stacking methods.

The video here shows two human melanoma cells, rotating several times between overhead and side views. You can see large amounts of the protein PI3K (brighter orange hues indicate higher concentrations), which helps some cancer cells divide and move around. Near the cells’ perimeters are small, dynamic surface protrusions. PI3K in these “blebs” is thought to help tumor cells navigate and survive in foreign tissues as the tumor spreads to other organs, a process known as metastasis.

The new optical device was described in a paper published recently in the journal Nature Methods [1]. It was created by Reto Fiolka and Kevin Dean at the University of Texas Southwestern Medical Center, Dallas.

Like most new technologies, this device has a lot going on under the hood. Rather than the microscope and camera doing all the work, as is customary, two mirrors within the microscope play a starring role. During a camera exposure, these mirrors rotate ever so slightly and warp the acquired image in such a way that successive, unique perspectives of the sample come into view. By changing the amount of warp, the sample appears to rotate in real time. As such, each view shown in the video requires only one camera snapshot, instead of the hundreds of slices acquired in a conventional scheme.

The concept traces to computer science and an algorithm called the shear-warp transform. It’s used to render 3D objects from different perspectives on a 2D computer monitor. Fiolka, Dean, and team found they could implement a similar algorithm optically in a microscope. What’s more, their multi-angle projection imaging system is easy to use, inexpensive, and can be adapted for use on any camera-based microscope.
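The shear-warp idea is simple to sketch in software: shift (shear) each depth slice of a 3D volume sideways in proportion to its depth, then sum the slices into a 2D projection. Below is a minimal, hypothetical NumPy sketch of that computational version; the team’s innovation was doing the equivalent optically, with mirrors, during a single camera exposure. The function name and the simplified integer shear are illustrative assumptions, and a full renderer would also apply a final 2D warp.

```python
import numpy as np

def shear_warp_projection(volume, angle_deg):
    """Project a 3D volume (z, y, x) along a tilted view axis:
    shear each z-slice sideways in proportion to its depth,
    then sum the slices into one 2D image (the projection).
    The final 2D warp of a full shear-warp renderer is omitted
    to keep the geometry easy to follow."""
    shear = np.tan(np.radians(angle_deg))
    nz, ny, nx = volume.shape
    projection = np.zeros((ny, nx))
    for z in range(nz):
        shift = int(round(z * shear))  # integer sideways shear for this depth
        projection += np.roll(volume[z], shift, axis=1)
    return projection

# A single bright voxel in an otherwise empty volume: as the viewing
# angle changes, its projected position slides across the image.
vol = np.zeros((8, 16, 16))
vol[4, 8, 8] = 1.0
head_on = shear_warp_projection(vol, 0)   # voxel stays at (y=8, x=8)
tilted = shear_warp_projection(vol, 30)   # voxel shifts sideways with depth
print(np.argwhere(head_on > 0)[0], np.argwhere(tilted > 0)[0])
```

Changing `angle_deg` between frames makes the projected sample appear to rotate, which is the same trick the rotating mirrors perform in hardware.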

The researchers have used the device to view samples spanning a range of sizes: from mitochondria and other tiny organelles inside cells to the beating heart of a young zebrafish. And, as the video shows, it has been applied to study cancer and other human diseases.

In a neat, but also scientifically valuable twist, the new optical method can generate a virtual reality view of a sample. Any microscope user wearing the appropriately colored 3D glasses immediately sees the objects.

While virtual reality viewing of cellular life might sound like a gimmick, Fiolka and Dean believe that it will help researchers use their current microscopes to see any sample in 3D—offering the chance to find rare and potentially important biological events much faster than is possible with even the most advanced microscopes today.

Fiolka, Dean, and team are still just getting started. Because the method analyzes tissue very quickly within a single image frame, they say it will enable scientists to observe the fastest events in biology, such as the movement of calcium throughout a neuron—or even a whole bundle of neurons at once. For neuroscientists trying to understand the brain, that’s a movie they will really want to see.


[1] Real-time multi-angle projection imaging of biological dynamics. Chang BJ, Manton JD, Sapoznik E, Pohlkamp T, Terrones TS, Welf ES, Murali VS, Roudot P, Hake K, Whitehead L, York AG, Dean KM, Fiolka R. Nat Methods. 2021 Jul;18(7):829-834.


Metastatic Cancer: When Cancer Spreads (National Cancer Institute)

Fiolka Lab (University of Texas Southwestern Medical Center, Dallas)

Dean Lab (University of Texas Southwestern)

Microscopy Innovation Lab (University of Texas Southwestern)

NIH Support: National Cancer Institute; National Institute of General Medical Sciences

The Amazing Brain: A Sharper Image of the Pyramidal Tract


Flip the image above upside down, and the shape may remind you of something. If you think it resembles a pyramid, then you and a lot of great neuroscientists are thinking alike. What you are viewing is a colorized, 3D reconstruction of a pyramidal tract, a bundle of nerve fibers that originates in the brain’s cerebral cortex and relays signals to the brainstem or the spinal cord. These signals control many important activities, including the voluntary movement of our arms, legs, head, and face.

For a while now, it’s been possible to combine a specialized form of magnetic resonance imaging (MRI) with computer modeling tools to produce 3D reconstructions of complicated networks of nerve fibers, such as the pyramidal tract. Still, for technical reasons, the quality of these reconstructions has remained poor in parts of the brain where nerve fibers cross at angles of 40 degrees or less.

The video above demonstrates how adding a sophisticated algorithm, called Orientation Distribution Function (ODF)-Fingerprinting, to such modeling can help overcome this problem when reconstructing a pyramidal tract. It has the potential to enhance the reliability of these 3D reconstructions as neurosurgeons begin to use them to plan surgeries with greater safety and precision.

In the first second of the video, you see gray, fuzzy images from a diffusion MRI of the pyramidal tract. But, very quickly, a more colorful, detailed 3D reconstruction begins to appear, swiftly filling in from the top down. Colors are used to indicate the primary orientations of the nerve fibers: left to right (red), back to front (green), and top to bottom (blue). The orange, magenta, and other colors represent combinations of these primary directional orientations.
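This color scheme follows a standard diffusion-MRI convention that can be sketched directly: the absolute value of each component of a unit fiber-direction vector becomes one color channel. The function name below is hypothetical, but the mapping itself is the usual one (left-right to red, back-front to green, top-bottom to blue).

```python
import numpy as np

def direction_to_rgb(v):
    """Map a fiber direction to an RGB color, diffusion-MRI style:
    normalize the vector, then take the absolute value of each
    component as one color channel. Opposite directions (v and -v)
    get the same color, since a fiber has no preferred sign."""
    v = np.asarray(v, dtype=float)
    v = v / np.linalg.norm(v)
    return np.abs(v)  # (R, G, B), each in [0, 1]

print(direction_to_rgb([1, 0, 0]))  # pure left-right fiber: red
print(direction_to_rgb([0, 1, 1]))  # mixed back-front/top-bottom: green-blue blend
```

Fibers angled between the primary axes get blended colors, which is why the reconstruction shows oranges and magentas where directions mix.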

About three seconds into the video, a rough draft of the 3D reconstruction is complete. The top of the pyramidal tract looks pretty good. However, looking lower down, you can see distortions in color and relatively poor resolution of the nerve fibers in the middle of the tract—exactly where the fibers cross each other at angles of less than 40 degrees. So, researchers tapped into the power of their new ODF-Fingerprinting software to improve the image—and, starting about nine seconds into the video, you can see an impressive final result.

The researchers who produced this amazing video are Patryk Filipiak and colleagues in the NIH-supported lab of Steven Baete, Center for Advanced Imaging Innovation and Research, New York University Grossman School of Medicine, New York. The work paired diffusion MRI data from the NIH Human Connectome Project with the ODF-Fingerprinting algorithm, which was created by Baete to incorporate additional MRI imaging data on the shape of nerve fibers to infer their directionality [1].

This innovative approach to imaging recently earned Baete’s team second place in the 2021 “Show Us Your BRAINs!” Photo and Video contest, sponsored by the NIH-led Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative. But the researchers aren’t stopping there! They are continuing to refine ODF-Fingerprinting, with the aim of modeling the pyramidal tract in even higher resolution for use in devising new and better ways of helping people undergoing neurosurgery.


[1] Fingerprinting Orientation Distribution Functions in diffusion MRI detects smaller crossing angles. Baete SH, Cloos MA, Lin YC, Placantonakis DG, Shepherd T, Boada FE. Neuroimage. 2019 Sep;198:231-241.


Brain Research through Advancing Innovative Neurotechnologies® (BRAIN) Initiative (NIH)

Human Connectome Project (University of Southern California, Los Angeles)

Steven Baete (Center for Advanced Imaging Innovation and Research, New York University Grossman School of Medicine, New York)

Show Us Your BRAINs! Photo and Video Contest (BRAIN Initiative/NIH)

NIH Support: National Institute of Biomedical Imaging and Bioengineering; National Institute of Neurological Disorders and Stroke; National Cancer Institute

Watching Cancer Cells Play Ball


Credit: Ning Wang, University of Illinois at Urbana-Champaign

As tumor cells divide and grow, they push, pull, and squeeze one another. While scientists have suspected those mechanical stresses may play important roles in cancer, it’s been tough to figure out how. That’s in large part because there hadn’t been a good way to measure those forces within a tissue. Now, there is.

As described in Nature Communications, an NIH-funded research team has developed a technique for measuring those subtle mechanical forces in cancer and also during development [1]. Their ingenious approach is called the elastic round microgel (ERMG) method. It relies on round elastic microspheres—similar to miniature basketballs, only filled with fluorescent nanoparticles in place of air. In the time-lapse video above, you see growing and dividing melanoma cancer cells as they squeeze and spin one of those cell-sized “balls” over the course of 24 hours.
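The principle behind the measurement can be illustrated with a back-of-the-envelope calculation: if the microsphere’s stiffness (its Young’s modulus) is known from calibration, the strain inferred from how far the embedded fluorescent nanoparticles move gives an estimate of the compressive stress via Hooke’s law. All numbers below are illustrative assumptions, not values or calibrations from the paper.

```python
# Hedged sketch of elastic-microsphere force sensing (Hooke's law:
# stress = E * strain). Every number here is an assumed, illustrative
# value, chosen only to show the arithmetic.
E = 1_400.0          # Pa: assumed Young's modulus of the soft microgel
d_rest = 30.0        # micrometers: assumed resting sphere diameter
d_squeezed = 28.5    # micrometers: assumed diameter under compression

strain = (d_rest - d_squeezed) / d_rest   # fractional deformation
stress = E * strain                       # compressive stress in Pa
print(f"strain = {strain:.3f}, stress = {stress:.1f} Pa")
```

The real method is richer than this one-liner: tracking many nanoparticles lets the researchers reconstruct the full 3D deformation of the sphere, and with it the pattern of forces the surrounding cells exert.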

Creative Minds: Reprogramming the Brain


Cells of a mouse retina

Caption: Neuronal circuits in the mouse retina. Cone photoreceptors (red) enable color vision; bipolar neurons (magenta) relay information further along the circuit; and a subtype of bipolar neuron (green) helps process signals sensed by other photoreceptors in dim light.
Credit: Brian Liu and Melanie Samuel, Baylor College of Medicine, Houston.

When most people think of reprogramming something, they probably think of writing code for a computer or typing commands into their smartphone. Melanie Samuel thinks of brain circuits, the networks of interconnected neurons that allow different parts of the brain to work together in processing information.

Samuel, a researcher at Baylor College of Medicine, Houston, wants to learn to reprogram the connections, or synapses, of brain circuits that function less well in aging and disease and limit our memory and ability to learn. She has received a 2016 NIH Director’s New Innovator Award to decipher the molecular cues that encourage the repair of damaged synapses or enable neurons to form new connections with other neurons. Because extensive synapse loss is central to most degenerative brain diseases, Samuel’s reprogramming efforts could help point the way to preventing or correcting wiring defects before they advance to serious and potentially irreversible cognitive problems.

Tumor Scanner Promises Fast 3D Imaging of Biopsies


UW light sheet microscope team

Caption: University of Washington team that developed new light-sheet microscope (center) includes (l-r) Jonathan Liu, Adam Glaser, Larry True, Nicholas Reder, and Ye Chen.
Credit: Mark Stone/University of Washington

After surgically removing a tumor from a cancer patient, doctors like to send off some of the tissue for evaluation by a pathologist to get a better idea of whether the margins are cancer free and to guide further treatment decisions. But for technical reasons, completing the pathology report can take days, much to the frustration of patients and their families. Sometimes the results even require an additional surgical procedure.

Now, NIH-funded researchers have developed a groundbreaking new microscope to help perform the pathology in minutes, not days. How’s that possible? The device works like a scanner for tissues, using a thin sheet of light to capture a series of thin cross sections within a tumor specimen without having to section it with a knife, as is done with conventional pathology. The rapidly acquired 2D “optical sections” are processed by a computer that assembles them into a high-resolution 3D image for immediate analysis.
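In software terms, that assembly step amounts to stacking the 2D optical sections along the depth axis; a maximum-intensity projection then gives a quick 2D overview of the whole specimen. Here is a minimal sketch with synthetic data, not the team’s actual pipeline, with array sizes chosen arbitrarily for illustration.

```python
import numpy as np

# Synthetic stand-ins for rapidly acquired 2D "optical sections":
# one grayscale image per light-sheet plane through the specimen.
rng = np.random.default_rng(0)
sections = [rng.random((256, 256)) for _ in range(100)]

# Assemble the sections into a 3D volume ordered (z, y, x).
volume = np.stack(sections, axis=0)

# A maximum-intensity projection collapses the volume along depth,
# giving a single 2D overview image for immediate inspection.
mip = volume.max(axis=0)

print(volume.shape, mip.shape)
```

In a real instrument the z-spacing between sections is set by the light sheet’s step size, so the volume’s voxels are typically anisotropic and must be rescaled before measurements are taken from the 3D image.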
