Poster in Affinity Workshop: Tiny Papers Poster Session 1

GNN-VPA: A Variance-Preserving Aggregation Strategy for Graph Neural Networks

Lisa Schneckenreiter

#302
[ Project Page ]
Tue 7 May 1:45 a.m. PDT — 3:45 a.m. PDT

Abstract:

The success of graph neural networks (GNNs), and in particular of message passing neural networks, critically depends on the functions employed for message aggregation and graph-level readout. Using signal propagation theory, we propose a variance-preserving aggregation function, which maintains the expressivity of GNNs while improving learning dynamics. Our results could pave the way towards normalizer-free or self-normalizing GNNs.
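The abstract does not spell out the aggregation formula. A minimal sketch of one natural variance-preserving choice is given below, under the assumption that the aggregated message is the sum of neighbor messages rescaled by the inverse square root of the neighborhood size: a plain sum of n unit-variance messages scales the variance by n, a mean scales it by 1/n, and dividing the sum by sqrt(n) keeps it at 1. The function and variable names are illustrative, not taken from the paper's code.

```python
import torch

def variance_preserving_aggregate(messages: torch.Tensor) -> torch.Tensor:
    """Sum the incoming messages and rescale by 1/sqrt(n).

    Assumption (not stated in the abstract): if the n messages are
    roughly independent with unit variance, this scaling keeps the
    variance of the aggregated message at 1, unlike sum (variance n)
    or mean (variance 1/n).
    """
    n = messages.shape[0]
    return messages.sum(dim=0) / (n ** 0.5)

# Usage: aggregate 4 neighbor messages of dimension 8.
msgs = torch.randn(4, 8)
agg = variance_preserving_aggregate(msgs)
print(agg.shape)  # torch.Size([8])
```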
