

Poster in Workshop: 5th Workshop on practical ML for limited/low resource settings (PML4LRS) @ ICLR 2024

A variational framework for local learning with probabilistic latent representations

David Kappel · Khaleelulla Khan Nazeer · Cabrel Teguemne Fokam · Christian Mayr · Anand Subramoney


Abstract:

We propose a new method for distributed learning that divides a deep neural network into blocks and introduces a feedback network that propagates information from the targets backward to provide auxiliary local losses. Forward and backward propagation can operate in parallel and with different sets of weights, addressing the problems of locking and weight transport. Our approach derives from a statistical interpretation of training that treats the output activations of network blocks as parameters of probability distributions. The resulting learning framework uses these parameters to evaluate the agreement between forward and backward information. Error backpropagation is then performed locally within each block, leading to "block-local" learning. We present preliminary results on a variety of tasks and architectures, demonstrating state-of-the-art performance using block-local learning. Our approach thus provides a new, principled framework for distributed asynchronous learning.
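To make the idea concrete, below is a minimal PyTorch sketch of block-local training under illustrative assumptions: a small MLP split into blocks, one linear feedback layer per hidden block that maps the one-hot target to a local target representation, and a squared-error agreement loss (the negative log-likelihood of a unit-variance Gaussian, up to constants). The class name, layer sizes, and loss choices are assumptions for illustration, not the paper's exact formulation; in particular, the full variational objective and the parallel forward/backward scheduling are not reproduced here.

import torch
import torch.nn as nn
import torch.nn.functional as F

class BlockLocalNet(nn.Module):
    def __init__(self, dims=(784, 256, 128, 10)):
        super().__init__()
        # Forward blocks; each is updated only through its own local loss.
        self.blocks = nn.ModuleList()
        for k, (d_in, d_out) in enumerate(zip(dims[:-1], dims[1:])):
            layers = [nn.Linear(d_in, d_out)]
            if k < len(dims) - 2:          # hidden blocks get a nonlinearity
                layers.append(nn.ReLU())
            self.blocks.append(nn.Sequential(*layers))
        # Feedback network: a separate set of weights carries target
        # information backward, so no forward weights are transported.
        self.feedback = nn.ModuleList(
            nn.Linear(dims[-1], d_out) for d_out in dims[1:-1]
        )

    def forward(self, x, y_onehot):
        local_losses = []
        h = x
        for k, block in enumerate(self.blocks):
            h = block(h.detach())          # detach cuts inter-block gradients
            if k < len(self.feedback):
                # Treat activations as means of unit-variance Gaussians and
                # score their agreement with the feedback signal; squared
                # error equals that Gaussian's negative log-likelihood up
                # to additive constants.
                target = self.feedback[k](y_onehot)
                local_losses.append(F.mse_loss(h, target))
        # Last block: ordinary cross-entropy against the class labels.
        local_losses.append(F.cross_entropy(h, y_onehot.argmax(dim=1)))
        return h, sum(local_losses)

# Usage: one optimizer step updates all blocks and feedback layers, but each
# gradient stays inside its own block because of the detach calls above.
net = BlockLocalNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x = torch.randn(32, 784)
y = F.one_hot(torch.randint(0, 10, (32,)), num_classes=10).float()
logits, loss = net(x, y)
opt.zero_grad()
loss.backward()
opt.step()

The detach calls are what make learning block-local in this sketch: each block's weights receive gradient only from its own agreement loss, while the feedback layers, having their own weights, avoid weight transport.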
