Neural Population Dynamics of Computing with Synaptic Modulations


Abstract

In addition to long-time-scale rewiring, synapses in the brain are subject to significant modulation on much shorter time scales, which allows them to process short-term information. Despite this, models of the brain such as recurrent neural networks (RNNs) often have their weights frozen after training, relying on an internal state stored in neuron activity to process temporal information. Although networks with dynamical synapses have been explored previously, such dynamics are usually added to networks that also have recurrent connections, so the short-time-scale computational capabilities of synaptic modulation alone remain unclear. In this work, we analyze the dynamics of a network that relies solely on synaptic modulations to process short-time-scale information: the multi-plasticity network (MPN). We thoroughly examine the neural population dynamics of the MPN trained on integration-based tasks and compare it to known RNN dynamics, finding the two to have fundamentally different behavior and attractor structure. These differences in dynamics allow the MPN to outperform its RNN counterparts on several neuroscience-relevant tasks. Notably, the MPN has a significantly simpler attractor structure, which makes it more flexible in training and sequential-learning settings. Lastly, we investigate how the dynamics change for MPNs trained on contextual and continuous integration tasks.
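The core idea described above, a feedforward network whose effective weights carry short-time-scale information through fast synaptic modulation rather than recurrence, can be illustrated with a minimal sketch. The specific update rule, nonlinearity, and rate parameters (`eta`, `lam`) below are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def mpn_forward(xs, W, Wout, eta=0.1, lam=0.9):
    """Sketch of an MPN-style forward pass: a feedforward layer whose
    fixed weights W are multiplicatively modulated by a fast trace M.
    There are no recurrent connections; temporal information is stored
    in M via an assumed Hebbian-style update with decay."""
    M = np.zeros_like(W)  # short-term modulation, reset at sequence start
    for x in xs:
        # effective weights = slow weights scaled by (1 + modulation)
        h = np.tanh((W * (1.0 + M)) @ x)
        # fast, decaying Hebbian-style update (hypothetical rule)
        M = lam * M + eta * np.outer(h, x)
    return Wout @ h  # readout from the final hidden state

rng = np.random.default_rng(0)
d_in, d_h, T = 4, 8, 10
W = rng.normal(scale=0.5, size=(d_h, d_in))
Wout = rng.normal(scale=0.5, size=(2, d_h))
xs = rng.normal(size=(T, d_in))

y = mpn_forward(xs, W, Wout)
y_rev = mpn_forward(xs[::-1], W, Wout)
```

Because `M` accumulates input history order-dependently, `y` and `y_rev` generally differ even though the network has no recurrent state, which is the sense in which synaptic modulation alone can process temporal information.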
