News Release

Event-based ‘neuromorphic camera’ enables kilohertz vascular imaging and functional reconstruction in the living brain

Neuromorphic sensors, typically used for motion tracking, are repurposed for functional imaging: they capture fast cortical blood flow and reconstruct neuronal activity, converting sparse events into continuous signals via a self-supervised algorithm.

Peer-Reviewed Publication

Chinese Society for Optical Engineering

In vivo cortical vascular imaging using an event camera.

Figure 1: In vivo cortical vascular imaging using an event camera. a, Schematic of the mouse preparation. A mouse with a cranial window was anesthetized for cortical imaging. b, Left, widefield fluorescence image of pial vasculature acquired using an sCMOS camera. Scale bar, 20 μm. Middle, accumulated events from the event camera over time. Right, standard deviation image from naïve accumulation, highlighting three regions of interest (ROIs) selected for kymograph-based flow analysis. c, 100 Hz reconstructed sequential event frames displayed at 0.1 s intervals. Positive (green) and negative (magenta) events correspond to RBC-induced brightness fluctuations in narrow vessels. d, Mean blood flow velocities measured from 1000 Hz reconstructions across the three ROIs. Error bars represent standard deviation over time. e, Kymographs generated from 1000 Hz reconstructions. Diagonal band slopes correspond to flow velocity. f, Mean event counts over time from a single-vessel ROI (white box in left image). Alternating green and magenta spikes reflect RBC passages through the vessel.

Credit: Prof. Myunghwan Choi and Prof. Kunyoo Shin at Seoul National University, Prof. Young-gyu Yoon at KAIST, Prof. Euiheon Chung at GIST, PhotoniX

Event cameras have mostly been used to track movement: they fire events when brightness changes sharply at moving edges, making them ideal for robotics, autonomous driving, and other computer vision tasks that involve fast motion and large contrast changes. Applying event cameras to functional brain imaging, however, presents a very different challenge. Here, the signals of interest are subtle changes in fluorescence tied to blood flow or neuronal activity. Whether an event camera can reliably detect such small brightness changes, and do so in a way that is scientifically useful, has been an open question.
In a new paper in PhotoniX, researchers in South Korea and the United States present what they describe as the first comprehensive event-based framework tailored to functional biological imaging. The work is led by the Neurophotonics Lab in the School of Biological Sciences at Seoul National University and the NICA Lab at the Korea Advanced Institute of Science and Technology (KAIST), in collaboration with groups at Seoul National University and the Gwangju Institute of Science and Technology (GIST). Together, they show that event cameras can move beyond motion edges to capture both vascular and neuronal dynamics in vivo, using a combination of careful sensor characterization, multimodal animal experiments, and a new unsupervised reconstruction algorithm.

A central goal in modern neurophotonics is to monitor activity over large fields of view at high temporal resolution without being overwhelmed by data. Conventional frame-based cameras acquire full images at fixed frame rates, discarding most of the acquired information during analysis while still limiting the maximum speed and field of view. Event cameras flip this acquisition model: each pixel independently reports only changes in brightness, producing a sparse stream of asynchronous “events” with sub-millisecond timing and a very wide dynamic range. These properties are attractive for functional imaging in principle, but they had not previously been validated in vivo for small-amplitude activity signals.
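The per-pixel change-detection principle described above can be illustrated with a toy simulation. The sketch below is hypothetical and simplified (the function name, threshold, and log-intensity model are assumptions, not the authors' implementation): each pixel fires a signed event whenever its log-intensity drifts past a contrast threshold relative to a per-pixel reference level, which is the standard idealized event-sensor model.

```python
import numpy as np

def event_stream(frames, times, threshold=0.2):
    """Toy event-sensor model: each pixel emits a +1/-1 event when its
    log-intensity drifts past a contrast threshold (simplified sketch)."""
    ref = np.log(frames[0] + 1e-6)          # per-pixel reference log level
    events = []                             # (time, row, col, polarity)
    for frame, t in zip(frames[1:], times[1:]):
        logf = np.log(frame + 1e-6)
        diff = logf - ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for y, x in zip(ys, xs):
            events.append((t, y, x, int(np.sign(diff[y, x]))))
            ref[y, x] = logf[y, x]          # reset reference after an event fires
    return events
```

Feeding in a short intensity movie yields a sparse list of timestamped events rather than dense frames, which is why the data rate scales with activity instead of frame rate.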

To test whether event cameras are workable for functional imaging, the team first performed a quantitative optical characterization of the sensor in a regime that mimics functional signals rather than large object motion. They measured how reliably the camera reports small, slowly varying changes in brightness, assessed its temporal precision under these conditions, and evaluated noise behavior and sensitivity relevant for biological imaging. This calibration provides a link between the binary events recorded by the neuromorphic sensor and the underlying changes in intensity that neuroscientists typically interpret as blood-flow or calcium signals.

Armed with this characterization, the researchers deployed the event camera to monitor cortical vascular dynamics in mice. By comparing the event streams to conventional widefield measurements and other physiological readouts, they showed that the sensor can faithfully track fast changes in blood flow. Importantly, the vascular activity was captured at an effective rate of 1 kHz, highlighting that event-based acquisition can reach kilohertz temporal resolution while covering a large field of view without the overhead of reading out full image frames at those speeds.
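The kymograph-based flow analysis shown in Figure 1 can be sketched in a few lines: a kymograph stacks intensity profiles along a vessel over time, and the slope of its diagonal bands corresponds to flow speed. The snippet below is purely illustrative (the function name and the peak-tracking strategy are assumptions, not the paper's analysis code); it fits a line to the brightest position per time bin.

```python
import numpy as np

def flow_velocity(kymograph, dx_um, dt_s):
    """Estimate flow speed from a kymograph (rows = time bins, cols =
    position along the vessel) by fitting a line to the brightest
    position in each row; slope * (dx/dt) gives velocity in um/s."""
    peak_pos = kymograph.argmax(axis=1).astype(float)  # position per time bin
    t = np.arange(kymograph.shape[0], dtype=float)
    slope, _ = np.polyfit(t, peak_pos, 1)              # pixels per time bin
    return slope * dx_um / dt_s                        # micrometers per second
```

With 1 kHz reconstructions, each time bin is 1 ms, so even fast red-blood-cell motion traces clean diagonal bands rather than jumping many pixels between bins.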

The same framework was also applied to neuronal calcium dynamics, both in cultured neurons and in the mouse cortex in vivo. However, raw event data are not in the form that neuroscientists typically work with. Each event simply encodes that a pixel has become brighter or dimmer at a specific time, without directly providing a continuous intensity value or ΔF/F0 trace. To turn these binary, asynchronous measurements into interpretable functional images, the authors developed a self-supervised reconstruction framework called Implicit Neural Factorization (INF). Rather than requiring paired frame-based “ground truth” movies, INF learns a continuous representation of the underlying activity directly from the event stream, using the precise timing of events as a constraint. INF maps the sparse event data back into smooth, time-resolved functional ΔF/F0 images. For neuronal calcium activity, INF enabled direct comparison to conventional sCMOS recordings while still benefiting from the data efficiency of event-based acquisition.
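As a rough illustration of the final data format, once a continuous intensity movie has been reconstructed it can be normalized into ΔF/F0 traces. This is a generic sketch of the standard definition ΔF/F0 = (F − F0)/F0 applied to a baseline window, not the INF pipeline itself; the function name and baseline choice are assumptions.

```python
import numpy as np

def dff(traces, baseline_frames=100):
    """Convert reconstructed intensity traces (time x pixels) to dF/F0,
    using the mean of an early quiescent window as the baseline F0."""
    f0 = traces[:baseline_frames].mean(axis=0)  # per-pixel baseline fluorescence
    return (traces - f0) / f0                   # fractional change over baseline
```

This is the representation that makes event-based recordings directly comparable to conventional calcium-imaging movies from an sCMOS camera.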

By connecting quantitative sensor characterization, in vivo vascular and neuronal experiments, and an unsupervised reconstruction that yields familiar functional data formats, the study establishes event cameras as practical tools for biological imaging rather than just theoretical curiosities. The authors argue that the same principles could extend naturally to faster optical reporters, including genetically encoded voltage indicators or voltage-sensitive dyes, where millisecond-scale dynamics and data bottlenecks are major challenges. In that regime, event cameras and algorithms like INF could offer a path toward large-field, high-speed recording that avoids much of the redundancy and bandwidth cost of conventional frame-based approaches.

More broadly, the work highlights how neuromorphic sensors can be repurposed for basic neuroscience and biomedical imaging when combined with appropriate modeling and reconstruction methods. By showing that event cameras can be calibrated for small functional signals, used to image cortical blood flow at kilohertz rates, and coupled to an unsupervised algorithm that reconstructs ΔF/F0 functional images from binary events, this collaboration between Seoul National University, KAIST, and GIST suggests a new direction for high-speed, data-efficient functional imaging across scales.


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.