Associated Learning: an Alternative to End-to-End Backpropagation that Works on CNN, RNN, and Transformer

Dennis Wu · Di-Nan Lin · Vincent Chen · Hung-Hsuan Chen

Keywords: [ backpropagation ]

[ Abstract ]

Poster at Spot C2, Tue 26 Apr 10:30 a.m. – 12:30 p.m. PDT


This paper studies Associated Learning (AL), an alternative to end-to-end backpropagation (BP). We introduce a workflow that converts a neural network into a structure on which AL can learn the weights, making AL applicable to various types of neural networks. We compare AL and BP on some of the most successful neural network architectures -- the Convolutional Neural Network (CNN), the Recurrent Neural Network (RNN), and the Transformer. Experimental results show that AL consistently outperforms BP on various open datasets. We discuss possible reasons for AL's success and its limitations.
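To make the layer-wise idea behind AL concrete, the following is a minimal, schematic sketch of training one block with a purely local objective, so that no gradient flows end-to-end through the network. The linear maps, shapes, and the plain association loss here are illustrative assumptions for exposition, not the authors' actual architecture or implementation.

```python
import numpy as np

# Schematic sketch of local, gradient-isolated training in the spirit of
# Associated Learning (AL): a block transforms the input on one side and
# the target on the other, and is updated only by the local loss that
# associates the two. All names and shapes below are illustrative
# assumptions, not the paper's actual design.

rng = np.random.default_rng(0)

def train_local_block(X, Y, dim_h, steps=200, lr=0.01):
    """One AL-style block: map inputs with W and targets with V, then
    minimize the local association loss mean ||W X - V Y||^2 by
    gradient descent, using only quantities local to this block."""
    W = rng.normal(scale=0.1, size=(dim_h, X.shape[0]))
    V = rng.normal(scale=0.1, size=(dim_h, Y.shape[0]))
    losses = []
    n = X.shape[1]  # batch size
    for _ in range(steps):
        H = W @ X          # input-side forward pass
        T = V @ Y          # target-side forward pass
        diff = H - T
        losses.append(float(np.mean(diff ** 2)))
        # Manual gradients of the local loss; nothing is backpropagated
        # to other blocks.
        W -= lr * (2.0 / n) * diff @ X.T
        V += lr * (2.0 / n) * diff @ Y.T
    return W, V, losses

X = rng.normal(size=(8, 64))   # toy inputs  (features x batch)
Y = rng.normal(size=(3, 64))   # toy targets (classes  x batch)
W, V, losses = train_local_block(X, Y, dim_h=5)
print(losses[0], losses[-1])   # local loss should decrease
```

Note that this bare association loss alone admits a degenerate solution (both maps collapsing to zero); a full AL setup pairs it with additional reconstruction-style terms on each side to prevent collapse, which are omitted here for brevity.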
