cala, mio, & noob
Fully integrated streaming calcium imaging and analysis
Abstract
Naturalistic neuroscientific experiments require us to acquire and analyze long-term, multimodal data. Calcium imaging analysis tools are typically monolithic and challenging to integrate with the rest of the tooling used in an experiment. Graph-based frameworks like LabVIEW and Bonsai are widely used for acquisition, but face similar integration challenges. This forces researchers to compromise their experimental workflows to fit the available tools, rather than shaping the tools around their experimental designs.
We present cala, a streaming calcium fluorescence imaging analysis package with a focus on reproducibility, usability, and extensibility. cala is easy to deploy and efficient enough to run on a laptop without compromising accuracy.
Powering cala is noob, a streaming pipeline framework for Python that can wrap existing code with minimal adaptation. This allows us to provide high-performance analysis and acquisition tools as structured compositions of smaller operations. Devices and analysis methods are written as composable pipelines that still allow full access to their individual parts, giving researchers much finer control: for example, patching plotting, I/O, or custom algorithms into a preset pipeline, or building their own pipelines outright. This design supports both the freeform creativity needed throughout the experimental process and the structure needed for reproducible, standards-driven science. Processing pipelines are declaratively configured with YAML, aiming for a zero-code UI and end-to-end experimental replication from a single file.
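As a rough illustration of this idea (the function names, registry, and config keys below are hypothetical and are not noob's actual API), small operations can be registered under names and then assembled into a streaming pipeline from a declarative YAML description:

```python
# Hypothetical sketch, not noob's actual API: named operations are resolved
# from a declarative YAML description and applied to frames as they arrive.
from typing import Callable, Iterable
import numpy as np
import yaml  # PyYAML

def subtract_background(frame: np.ndarray, quantile: float = 0.5) -> np.ndarray:
    """Toy background removal: subtract a per-frame intensity quantile."""
    return frame - np.quantile(frame, quantile)

def denoise(frame: np.ndarray, threshold: float = 0.0) -> np.ndarray:
    """Toy denoiser: zero out pixels at or below a threshold."""
    return np.where(frame > threshold, frame, 0.0)

# Registry mapping config names to callables; a real framework would let
# users register their own operations the same way.
REGISTRY: dict[str, Callable[..., np.ndarray]] = {
    "subtract_background": subtract_background,
    "denoise": denoise,
}

# A declarative pipeline description, as it might appear in a YAML file.
CONFIG = """
pipeline:
  - op: subtract_background
    params: {quantile: 0.5}
  - op: denoise
    params: {threshold: 0.0}
"""

def build_pipeline(spec: dict) -> list[tuple[Callable, dict]]:
    """Resolve each config entry to a (callable, params) pair."""
    return [(REGISTRY[step["op"]], step.get("params", {})) for step in spec["pipeline"]]

def run(frames: Iterable[np.ndarray], steps) -> Iterable[np.ndarray]:
    """Push each frame through every step, in order, as it arrives."""
    for frame in frames:
        for op, params in steps:
            frame = op(frame, **params)
        yield frame

if __name__ == "__main__":
    steps = build_pipeline(yaml.safe_load(CONFIG))
    stream = (np.random.rand(64, 64) for _ in range(3))  # stand-in for a camera feed
    for out in run(stream, steps):
        print(out.shape, float(out.mean()))
```

In a sketch like this the pipeline is fully described by data, so the same YAML file can be versioned alongside the experiment and replayed for replication.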
This allows cala to run three loops simultaneously: the top-level GUI loop, the analysis loop that yields an initial pass of fluorescence footprints and traces, and the refinement loop that optimizes both over longer timescales. It also provides clear visibility into data transformations across the entire processing pipeline.
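A minimal sketch of this three-loop structure, assuming an asyncio-style cooperative scheduler (our illustration only, not cala's actual implementation), might look like:

```python
# Hypothetical sketch: GUI, analysis, and refinement loops cooperating over
# shared state, in the spirit of the split described above.
import asyncio
import random

state = {"frames_seen": 0, "traces": []}
N_FRAMES = 20

async def analysis_loop():
    """Fast pass: update estimates as each new frame arrives."""
    while state["frames_seen"] < N_FRAMES:
        state["frames_seen"] += 1
        state["traces"].append(random.random())  # stand-in for a trace update
        await asyncio.sleep(0.01)                # simulate per-frame work

async def refinement_loop():
    """Slow pass: periodically re-optimize the accumulated estimates."""
    while state["frames_seen"] < N_FRAMES:
        await asyncio.sleep(0.05)
        state["traces"] = [round(t, 3) for t in state["traces"]]  # mock refinement

async def gui_loop():
    """Display loop: report current state without blocking the other loops."""
    while state["frames_seen"] < N_FRAMES:
        print(f"frames={state['frames_seen']} traces={len(state['traces'])}")
        await asyncio.sleep(0.05)

async def main():
    await asyncio.gather(analysis_loop(), refinement_loop(), gui_loop())

asyncio.run(main())
```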
We further show how such a pipelining framework can extend into acquisition hardware and software with our Miniscope I/O SDK, mio. This creates a unified programming interface from photons to figures, dissolving distinctions between analysis vs. acquisition tools and open- vs. closed-loop experiments. By focusing on the glue between tools, rather than building vertically-integrated, single-purpose hardware/software rigs, we facilitate the kind of long-term, heterogeneous, collaborative experiments required to understand the naturally behaving brain.
At SfN
- Time: Monday, November 17th, 1PM-5PM
- Location: SDCC Halls B-H, PSTR254.08 / ZZ13
