Understanding neural circuit principles for representation learning through joint-embedding predictive architectures


Abstract

Tracking prey or recognizing a lurking predator is as crucial for survival as anticipating their actions. To guide behavior, the brain must extract information about object identities and their dynamics from entangled sensory inputs. How it accomplishes this feat remains an open question. Classical predictive coding theories propose that this ability arises by comparing predicted sensory signals with actual inputs and reducing the associated prediction errors. While such models capture important aspects of cortical computation, they typically focus on faithfully predicting sensory input and do not explicitly address how abstract, untangled representations of objects and their dynamics emerge solely through experience. Here, we develop a theory of representation learning in neural circuits that shifts the focus from prediction in the input space to prediction in representation space, without relying on external supervision or labeled data. Specifically, we introduce recurrent predictive learning (RPL), a recurrent joint-embedding predictive architecture inspired by self-supervised machine learning, which learns abstract representations of object identities and their dynamics and predicts future object motion from continuous sensory streams. Crucially, the model learns sequence representations that resemble successor-like representations observed in the human primary visual cortex. The model also develops abstract sequence representations comparable to those reported in the macaque prefrontal cortex. Finally, we outline how RPL's modular feedforward-recurrent organization could map onto cortical microcircuits. Our work establishes a circuit-centric theoretical framework that provides new perspectives on how the brain may acquire an internal model of the world through experience.
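The core idea of predicting in representation space rather than input space can be illustrated with a minimal sketch. The snippet below is a hypothetical toy version of a joint-embedding predictive step, not the authors' RPL implementation: a linear encoder maps two consecutive sensory frames into an embedding space, a recurrent predictor maps the current embedding forward in time, and only the predictor is updated to reduce the prediction error between the predicted and the actual next embedding (the target embedding is held fixed, mimicking the stop-gradient used in joint-embedding predictive architectures). All names and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_rep = 8, 4  # illustrative input / representation dimensions

# Hypothetical linear encoder and recurrent predictor weights
W_enc = rng.normal(scale=0.1, size=(d_rep, d_in))
W_rec = rng.normal(scale=0.1, size=(d_rep, d_rep))

def encode(x):
    """Map a sensory frame into representation space."""
    return W_enc @ x

def predict(z):
    """Recurrent predictor: advance the representation one time step."""
    return np.tanh(W_rec @ z)

def rep_loss(x_t, x_next):
    """Squared prediction error measured in representation space."""
    return 0.5 * np.sum((predict(encode(x_t)) - encode(x_next)) ** 2)

# Two consecutive "frames" of a toy sensory stream
x_t, x_next = rng.normal(size=d_in), rng.normal(size=d_in)

loss_before = rep_loss(x_t, x_next)

lr = 0.05
for _ in range(200):
    z_t = encode(x_t)
    z_pred = predict(z_t)
    z_target = encode(x_next)          # fixed target (stop-gradient analogue)
    err = z_pred - z_target            # error in representation space, not pixels
    grad_pre = err * (1.0 - z_pred**2) # backprop through tanh
    W_rec -= lr * np.outer(grad_pre, z_t)

loss_after = rep_loss(x_t, x_next)
```

Note that the loss never compares raw inputs: the objective is defined entirely on embeddings, which is what distinguishes this family of models from classical predictive coding of the sensory signal itself.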
