

In-Person Poster Presentation / Poster Accept

Interneurons accelerate learning dynamics in recurrent neural networks for statistical adaptation

David Lipshutz · Cengiz Pehlevan · Dmitri Chklovskii

MH1-2-3-4 #130

Keywords: [ implicit acceleration ] [ recurrent neural networks ] [ statistical whitening ] [ Interneurons ] [ gradient flows ] [ Neuroscience and Cognitive Science ]


Abstract:

Early sensory systems in the brain rapidly adapt to fluctuating input statistics, which requires recurrent communication between neurons. Mechanistically, such recurrent communication is often indirect and mediated by local interneurons. In this work, we explore the computational benefits of mediating recurrent communication via interneurons compared with direct recurrent connections. To this end, we consider two mathematically tractable recurrent neural networks that statistically whiten their inputs: one with direct recurrent connections and the other with interneurons that mediate recurrent communication. By analyzing the corresponding continuous-time synaptic dynamics and numerically simulating the networks, we show that the network with interneurons is more robust to initialization than the network with direct recurrent connections, in the sense that the convergence time of the synaptic dynamics scales logarithmically with the spectrum of the initialization for the network with interneurons, but linearly for the network with direct recurrent connections. Our results suggest that interneurons are computationally useful for rapid adaptation to changing input statistics. Interestingly, the network with interneurons is an overparameterized solution of the whitening objective for the network with direct recurrent connections, so our results can be viewed as a recurrent-neural-network analogue of the implicit acceleration phenomenon observed in overparameterized feedforward linear networks.
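The following is a minimal numerical sketch of the comparison described in the abstract: a whitening network whose recurrent interactions are parameterized directly by a matrix M versus an overparameterized network whose effective recurrent interactions are W Wᵀ, as with interneurons. The specific response functions (y = (I + M)⁻¹ x and y = (I + W Wᵀ)⁻¹ x), the plasticity rules (ΔM ∝ ⟨y yᵀ⟩ − I and ΔW ∝ (⟨y yᵀ⟩ − I) W), and all variable names and constants are illustrative assumptions chosen to make the comparison concrete; they are not taken verbatim from the paper.

```python
# Toy sketch (assumed dynamics, not the paper's exact equations): compare
# whitening dynamics with a direct recurrent matrix M against an
# overparameterized network whose recurrent interactions are W @ W.T.
import numpy as np

rng = np.random.default_rng(0)
n = 5                                    # number of primary neurons / input dimension

# Random symmetric positive-definite input covariance with eigenvalues in [2, 10]
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
C = Q @ np.diag(np.linspace(2.0, 10.0, n)) @ Q.T

def output_cov(R):
    """Assumed steady-state output covariance <y y^T> for recurrent interaction R."""
    A = np.linalg.inv(np.eye(n) + R)
    return A @ C @ A.T

def whitening_error(R):
    return np.linalg.norm(output_cov(R) - np.eye(n))

# Large (poorly scaled) initializations with matching effective recurrent matrices
alpha = 50.0
M = alpha * np.eye(n)                    # direct recurrent connections
W = np.sqrt(alpha) * np.eye(n)           # interneuron weights, so W @ W.T == M initially

eta, steps = 1e-2, 20000                 # Euler step size and number of steps
for t in range(steps):
    # Direct network: additive update driving <y y^T> toward the identity
    M += eta * (output_cov(M) - np.eye(n))
    # Interneuron network: the same error signal, applied through the factor W
    W += eta * (output_cov(W @ W.T) - np.eye(n)) @ W
    if t % 2000 == 0:
        print(f"step {t:6d}  direct err {whitening_error(M):.3e}  "
              f"interneuron err {whitening_error(W @ W.T):.3e}")

print("final direct err      ", whitening_error(M))
print("final interneuron err ", whitening_error(W @ W.T))
```

In this toy setting, the factorized updates shrink a large initial recurrent gain multiplicatively while the direct updates are bounded and additive, so the interneuron variant reaches a whitened output covariance much sooner from the same initialization. This only captures the flavor of the logarithmic-versus-linear scaling claimed in the abstract; the paper's precise statements concern its own network equations and their continuous-time gradient flows.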
