

Poster

Higher-Order Graphon Neural Networks: Approximation and Cut Distance

Daniel Herbst · Stefanie Jegelka

Hall 3 + Hall 2B #196
Wed 23 Apr 7 p.m. PDT — 9:30 p.m. PDT

Abstract: Graph limit models, like *graphons* for limits of dense graphs, have recently been used to study size transferability of graph neural networks (GNNs). While most literature focuses on message passing GNNs (MPNNs), in this work we attend to the more powerful *higher-order* GNNs. First, we extend the k-WL test for graphons (Böker, 2023) to the graphon-signal space and introduce *signal-weighted homomorphism densities* as a key tool. As an exemplary focus, we generalize *Invariant Graph Networks* (IGNs) to graphons, proposing *Invariant Graphon Networks* (IWNs) defined via a subset of the IGN basis corresponding to bounded linear operators. Even with this restricted basis, we show that IWNs of order k are at least as powerful as the k-WL test, and we establish universal approximation results for graphon-signals in $L^p$ distances. This significantly extends the prior work of Cai & Wang (2022), showing that IWNs—a subset of their *IGN-small*—retain effectively the same expressivity as the full IGN basis in the limit. In contrast to their approach, our blueprint of IWNs also aligns better with the geometry of graphon space, for example facilitating comparability to MPNNs. We highlight that, while typical higher-order GNNs are discontinuous w.r.t. cut distance—which causes their lack of convergence and is inherently tied to the definition of k-WL—their transferability remains comparable to MPNNs.
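For orientation, the classical homomorphism density of a simple graph F in a graphon W : [0,1]² → [0,1] is the standard quantity from graphon theory,

$$t(F, W) \;=\; \int_{[0,1]^{V(F)}} \prod_{\{i,j\} \in E(F)} W(x_i, x_j) \, \prod_{i \in V(F)} \mathrm{d}x_i .$$

The *signal-weighted* variant below is only an illustrative sketch of the kind of object the abstract names: on a graphon-signal (W, f) with f : [0,1] → ℝ, one could additionally weight each vertex by a test function g_i applied to the signal (the test functions g_i and this exact form are our assumptions, not the paper's definition):

$$t\bigl(F, (W, f)\bigr) \;=\; \int_{[0,1]^{V(F)}} \prod_{\{i,j\} \in E(F)} W(x_i, x_j) \, \prod_{i \in V(F)} g_i\bigl(f(x_i)\bigr) \, \mathrm{d}x_i .$$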
