Deep-learning two-photon fiberscopy for video-rate brain imaging in freely-behaving mice

Honghua Guan, Dawei Li, Hyeon cheol Park, Ang Li, Yuanlei Yue, Yungtian A. Gau, Ming Jun Li, Dwight E. Bergles, Hui Lu, Xingde Li

Research output: Contribution to journal › Article › peer-review


Scanning two-photon (2P) fiberscopes (also termed endomicroscopes) have the potential to transform our understanding of how discrete neural activity patterns produce distinct behaviors, as they are capable of high-resolution, subcellular imaging yet are small and light enough to allow free movement of mice. However, their acquisition speed is currently suboptimal due to opto-mechanical size and weight constraints. Here we demonstrate significant advances in 2P fiberscopy that allow high-resolution imaging at high speed (26 fps) in freely-behaving mice. A high-speed scanner and a down-sampling scheme are developed to boost imaging speed, and a deep learning (DL) algorithm is introduced to recover image quality. For the DL algorithm, a two-stage learning transfer strategy is established to generate proper training datasets for enhancing the quality of in vivo images. This implementation enables video-rate imaging at ~26 fps, a 10-fold improvement in imaging speed over previous 2P fiberscopy technology, while maintaining a high signal-to-noise ratio and imaging resolution. The DL-assisted 2P fiberscope is capable of imaging arousal-induced activity changes in populations of layer 2/3 pyramidal neurons in the primary motor cortex of freely-behaving mice, providing opportunities to define the neural basis of behavior.
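The speed gain described above comes from acquiring fewer samples per frame and restoring image quality computationally. A minimal sketch of that idea, assuming simple row-wise down-sampling of a frame and plain linear interpolation as a stand-in for the paper's DL restoration network (the function names, down-sampling factor, and synthetic frame here are illustrative, not taken from the paper):

```python
import numpy as np

def downsample_rows(frame: np.ndarray, factor: int) -> np.ndarray:
    """Keep every `factor`-th scan row, cutting acquisition time per frame."""
    return frame[::factor, :]

def restore_rows(sparse: np.ndarray, factor: int, full_rows: int) -> np.ndarray:
    """Linearly interpolate the missing rows -- a crude stand-in for the
    learned restoration step used in the actual system."""
    kept = np.arange(0, full_rows, factor)   # row indices actually scanned
    all_rows = np.arange(full_rows)
    restored = np.empty((full_rows, sparse.shape[1]))
    for col in range(sparse.shape[1]):
        restored[:, col] = np.interp(all_rows, kept, sparse[:, col])
    return restored

# Example: a smooth synthetic "frame" survives 4x row down-sampling well.
rows, cols, factor = 61, 64, 4
y, x = np.mgrid[0:rows, 0:cols]
frame = np.sin(y / 10.0) + np.cos(x / 7.0)
sparse = downsample_rows(frame, factor)          # 16 rows acquired instead of 61
restored = restore_rows(sparse, factor, rows)    # full frame recovered
```

In the published system the interpolation step is replaced by a trained network, and the down-sampling pattern follows the scanner's trajectory rather than whole rows; the sketch only shows why fewer acquired samples translate directly into a higher frame rate.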

Original language: English (US)
Article number: 1534
Journal: Nature Communications
Issue number: 1
State: Published - Dec 2022

ASJC Scopus subject areas

  • General Physics and Astronomy
  • General Chemistry
  • General Biochemistry, Genetics and Molecular Biology

