

Multilinear Operator Networks

Yixin Cheng · Grigorios Chrysos · Markos Georgopoulos · Volkan Cevher

Halle B #247
Tue 7 May 7:30 a.m. PDT — 9:30 a.m. PDT


Despite the remarkable capabilities of deep neural networks in image recognition, the dependence on activation functions remains a largely unexplored area and has yet to be eliminated. On the other hand, Polynomial Networks are a class of models that do not require activation functions but have yet to perform on par with modern architectures. In this work, we aim to close this gap and propose MONet, which relies solely on multilinear operators. The core layer of MONet, called Mu-Layer, captures multiplicative interactions among the elements of the input token. MONet captures high-degree interactions of the input elements, and we demonstrate the efficacy of our approach on a series of image recognition and scientific computing benchmarks. The proposed model outperforms prior polynomial networks and performs on par with modern architectures. We believe that MONet can inspire further research on models that use entirely multilinear operations.
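To make the idea of a multiplicative layer concrete, here is a minimal NumPy sketch. It assumes a Mu-Layer of the general form used by polynomial networks: two linear projections of the input token combined with an elementwise (Hadamard) product, followed by a third linear map. All names (`mu_layer`, `U`, `V`, `W`) and dimensions are illustrative, not the paper's exact parameterization.

```python
import numpy as np

def mu_layer(x, U, V, W):
    """Illustrative multiplicative layer: two linear projections of the
    input token are fused with an elementwise (Hadamard) product, then
    mapped back by a third linear map. No activation function is used."""
    return W @ ((U @ x) * (V @ x))

rng = np.random.default_rng(0)
d, h = 8, 16                                   # token dim, hidden dim (illustrative)
U = rng.normal(size=(h, d))
V = rng.normal(size=(h, d))
W = rng.normal(size=(d, h))
x = rng.normal(size=d)

y = mu_layer(x, U, V, W)

# Each output element is a degree-2 polynomial of the input elements:
# scaling x by c scales the output by c**2 (degree-2 homogeneity).
assert np.allclose(mu_layer(2 * x, U, V, W), 4 * y)
```

Stacking k such layers yields outputs that are polynomials of degree 2**k in the input elements, which is one way high-degree interactions can arise entirely from multilinear operations, with no activation functions anywhere.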
