

Poster

On the Expressiveness of Rational ReLU Neural Networks With Bounded Depth

Gennadiy Averkov · Christopher Hojny · Maximilian Merkert

Hall 3 + Hall 2B #420
Sat 26 Apr midnight PDT — 2:30 a.m. PDT

Abstract: To confirm that the expressive power of ReLU neural networks grows with their depth, the function $F_n = \max(0, x_1, \dots, x_n)$ has been considered in the literature. A conjecture by Hertrich, Basu, Di Summa, and Skutella [NeurIPS 2021] states that any ReLU network that exactly represents $F_n$ has at least $\lceil \log_2(n+1) \rceil$ hidden layers. The conjecture has recently been confirmed for networks with integer weights by Haase, Hertrich, and Loho [ICLR 2023]. We follow up on this line of research and show that, within ReLU networks whose weights are decimal fractions, $F_n$ can only be represented by networks with at least $\lceil \log_3(n+1) \rceil$ hidden layers. Moreover, if all weights are $N$-ary fractions, then $F_n$ can only be represented by networks with at least $\Omega\!\left(\frac{\ln n}{\ln \ln N}\right)$ layers. These results are a partial confirmation of the above conjecture for rational ReLU networks, and provide the first non-constant lower bound on the depth of practically relevant ReLU networks.
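For intuition about the quantity being bounded, the following NumPy sketch (not taken from the paper; the function names and test input are illustrative) computes $F_n$ exactly via the standard balanced pairwise-max construction, which uses $\lceil \log_2(n+1) \rceil$ hidden layers. This is the upper bound whose tightness the Hertrich–Basu–Di Summa–Skutella conjecture asserts.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def max2_relu(a, b):
    # Exact max of two reals with one hidden layer of ReLUs:
    # max(a, b) = ReLU(a - b) + b, where b = ReLU(b) - ReLU(-b).
    return relu(a - b) + relu(b) - relu(-b)

def F_relu(x):
    # F_n = max(0, x_1, ..., x_n) via a balanced pairwise-max tree over
    # the n + 1 values (0, x_1, ..., x_n). Each tree level corresponds to
    # one hidden layer, so the depth used is ceil(log2(n + 1)).
    vals = [0.0] + list(x)
    depth = 0
    while len(vals) > 1:
        nxt = [max2_relu(vals[i], vals[i + 1]) for i in range(0, len(vals) - 1, 2)]
        if len(vals) % 2 == 1:
            # Odd value carried to the next level; in an actual ReLU network
            # it would pass through as ReLU(z) - ReLU(-z) at no extra depth.
            nxt.append(vals[-1])
        vals = nxt
        depth += 1
    return vals[0], depth

x = np.array([-1.5, 2.0, 0.3, -4.0, 1.7])
value, layers = F_relu(x)
print(value, max(0.0, *x), layers)  # 2.0 2.0 3, and ceil(log2(5 + 1)) = 3
```

The results in the abstract concern the reverse direction: how many hidden layers any exact ReLU representation of $F_n$ must have once the weights are restricted to decimal (or, more generally, $N$-ary) fractions.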
