End-to-end deep learning approach to mouse behavior classification from cortex-wide calcium imaging


Abstract

Deep learning is a powerful tool for neural decoding, broadly applied in systems neuroscience and clinical studies. Interpretable and transparent models that can explain how a decoder maps brain activity to intended behaviors are crucial for identifying the features on which deep learning decoders rely. In this study, we examine the performance of deep learning in classifying mouse behavioral states from mesoscopic cortex-wide calcium imaging data. Our end-to-end decoder, a convolutional neural network (CNN) combined with a recurrent neural network (RNN), classifies behavioral states with high accuracy, is robust to individual differences, and operates on sub-second temporal scales. Using the CNN-RNN decoder, we find that the forelimb and hindlimb areas of the somatosensory cortex contribute significantly to behavioral classification. Our findings imply that the end-to-end approach has the potential to be an interpretable deep learning method with unbiased visualization of critical brain regions.
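The CNN-RNN architecture described above can be illustrated with a minimal sketch: a convolutional stage extracts a feature vector from each imaging frame, and a recurrent stage integrates those features over time before emitting class logits (rest vs. move). This is a toy numpy illustration, not the authors' implementation; all dimensions, filter counts, and weights are arbitrary stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_valid(img, kernel):
    """Naive 'valid' 2D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def cnn_features(frame, kernels):
    """CNN stage: convolve, ReLU, global average pool -> feature vector."""
    return np.array([np.maximum(conv2d_valid(frame, k), 0).mean()
                     for k in kernels])

def rnn_classify(frames, kernels, Wxh, Whh, Why):
    """Vanilla RNN over per-frame CNN features; returns class logits."""
    h = np.zeros(Whh.shape[0])
    for frame in frames:
        x = cnn_features(frame, kernels)
        h = np.tanh(Wxh @ x + Whh @ h)
    return Why @ h  # logits for (rest, move)

# Toy example: 8 frames of a 16x16 "cortex-wide" image.
frames = rng.standard_normal((8, 16, 16))
kernels = rng.standard_normal((4, 3, 3))   # 4 conv filters
Wxh = rng.standard_normal((6, 4)) * 0.5    # feature -> hidden
Whh = rng.standard_normal((6, 6)) * 0.5    # hidden -> hidden
Why = rng.standard_normal((2, 6)) * 0.5    # hidden -> class logits

logits = rnn_classify(frames, kernels, Wxh, Whh, Why)
```

A trained version of such a decoder would learn the kernels and recurrent weights from labeled imaging sequences; here they are random, so only the data flow (per-frame convolution, temporal recurrence, final logits) reflects the described architecture.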

Author Summary

Deep learning is widely used in neuroscience, where it has become possible to classify and predict behavior from large-scale neural signal data recorded from animals, including humans. However, little is known about how deep learning discriminates the features of neural signals. In this study, we perform behavioral classification from calcium imaging data of the mouse cortex and investigate which brain regions are important for the classification. Using an end-to-end approach, an unbiased method that requires no data pre-processing, we show that information from the somatosensory areas of the cortex is important for distinguishing between resting and moving states in mice. This study will contribute to the development of interpretable deep learning technology.
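One common way to localize the brain regions that drive a decoder's output is occlusion sensitivity: mask one spatial region at a time and measure how much the classification evidence drops. The sketch below illustrates that general idea with a deterministic toy classifier; it is not the authors' analysis, and the stand-in classifier (which draws its "move" evidence only from one corner region, playing the role of a limb area) is purely hypothetical.

```python
import numpy as np

def toy_classifier(frames):
    """Hypothetical stand-in decoder: 'move' evidence comes only from
    the upper-left 4x4 region of each frame."""
    s = frames[:, :4, :4].mean()
    return np.array([-s, s])  # (rest, move) logits

def occlusion_importance(frames, classify, patch=4):
    """Zero out each spatial patch across all frames and record how much
    the 'move' logit drops relative to the unoccluded baseline."""
    base = classify(frames)[1]
    h, w = frames.shape[1:]
    imp = np.zeros((h // patch, w // patch))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            masked = frames.copy()
            masked[:, i:i + patch, j:j + patch] = 0.0  # occlude region
            imp[i // patch, j // patch] = base - classify(masked)[1]
    return imp

# 8 uniform frames; only occluding the upper-left patch hurts the decoder.
frames = np.ones((8, 16, 16))
imp = occlusion_importance(frames, toy_classifier)
```

In the resulting map, the patch whose occlusion most degrades the classifier stands out; applied to a real decoder, this kind of map is one route to the unbiased visualization of critical regions that the study aims for.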
