New tool removes motion artifacts when imaging dynamic samples.

Imaging microscopic samples often requires capturing multiple sequential measurements and then using computational algorithms to reconstruct them into a single high-resolution image. This process works well when the sample is static, but if it is moving, as is common with live biological specimens, the final image can come out blurry or distorted.

Now, Berkeley researchers have developed a method to improve temporal resolution for these dynamic samples. In a study published in Nature Methods, they demonstrated a new computational imaging tool, dubbed the neural space-time model (NSTM), that uses a small, lightweight neural network to reduce motion artifacts and solve for the sample's motion trajectories.
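The published NSTM implementation is not reproduced here, but the core idea the article describes, jointly fitting a small neural network for motion alongside a model of the scene so that one static image plus a recovered motion field explains every measurement in time, can be illustrated conceptually. The sketch below is a loose illustration in PyTorch under assumed details: the network names (motion_net, scene_net), the toy "drifting blob" measurements, and the simple per-frame loss are all stand-ins for this example, not the authors' method or data.

```python
# Conceptual sketch only (not the published NSTM code): one small MLP maps
# spatiotemporal coordinates (x, y, t) to a motion displacement, another maps
# motion-corrected coordinates to intensity. Both are fit jointly so a single
# static scene plus the recovered motion explains all time points.
import torch
import torch.nn as nn


def mlp(in_dim, out_dim, width=64):
    return nn.Sequential(
        nn.Linear(in_dim, width), nn.ReLU(),
        nn.Linear(width, width), nn.ReLU(),
        nn.Linear(width, out_dim),
    )


motion_net = mlp(3, 2)   # (x, y, t) -> displacement (dx, dy)
scene_net = mlp(2, 1)    # (x, y)    -> intensity of the static scene

# Toy "measurements": a Gaussian blob drifting to the right over 8 frames.
n, frames = 32, 8
ys, xs = torch.meshgrid(torch.linspace(-1, 1, n),
                        torch.linspace(-1, 1, n), indexing="ij")
coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)          # (n*n, 2)
times = torch.linspace(-1, 1, frames)
measurements = [torch.exp(-((xs - 0.3 * t) ** 2 + ys ** 2) / 0.05).reshape(-1, 1)
                for t in times]

opt = torch.optim.Adam(list(motion_net.parameters()) +
                       list(scene_net.parameters()), lr=1e-3)
for step in range(500):
    opt.zero_grad()
    loss = 0.0
    for t, meas in zip(times, measurements):
        xyt = torch.cat([coords, t.expand(coords.shape[0], 1)], dim=1)
        warped = coords + motion_net(xyt)   # map coords at time t to the static frame
        pred = scene_net(warped)            # render the static scene at those points
        loss = loss + ((pred - meas) ** 2).mean()
    loss.backward()
    opt.step()
```

After fitting, scene_net gives a single motion-corrected image and motion_net gives the recovered motion trajectory at any point in space and time; in a real reconstruction the per-frame loss would also go through the imaging system's forward model rather than comparing intensities directly.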