

Poster in Affinity Workshop: Tiny Papers Poster Session 1

GNN-VPA: A Variance-Preserving Aggregation Strategy for Graph Neural Networks

Lisa Schneckenreiter

#302

Abstract:

The success of graph neural networks (GNNs), and in particular message passing neural networks, critically depends on the functions employed for message aggregation and graph-level readout. Using signal propagation theory, we propose a variance-preserving aggregation function, which maintains the expressivity of GNNs while improving learning dynamics. Our results could pave the way towards normalizer-free or self-normalizing GNNs.
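To make the variance-preserving idea concrete, the following is a minimal PyTorch sketch, not the authors' implementation: incoming messages are summed per node and rescaled by the inverse square root of the node's in-degree, so that under the (assumed) signal-propagation setting of i.i.d., unit-variance messages the aggregated signal also has unit variance. The function name `vpa_aggregate` and the edge/tensor layout are illustrative assumptions.

```python
import torch

def vpa_aggregate(messages: torch.Tensor, dst: torch.Tensor, num_nodes: int) -> torch.Tensor:
    """Variance-preserving aggregation (illustrative sketch).

    messages: [num_edges, d] message vectors, one per edge
    dst:      [num_edges] destination node index of each edge
    Sums messages per destination node, then rescales by 1/sqrt(in-degree),
    which keeps the variance of the aggregate equal to the variance of the
    individual messages when they are i.i.d.
    """
    d = messages.size(-1)
    out = torch.zeros(num_nodes, d, dtype=messages.dtype)
    out.index_add_(0, dst, messages)  # sum aggregation per node

    deg = torch.zeros(num_nodes, dtype=messages.dtype)
    deg.index_add_(0, dst, torch.ones(dst.size(0), dtype=messages.dtype))
    scale = deg.clamp(min=1).rsqrt().unsqueeze(-1)  # 1/sqrt(degree), guard isolated nodes

    return out * scale

# Toy usage: node 0 receives three messages, node 1 receives one.
messages = torch.randn(4, 8)
dst = torch.tensor([0, 0, 0, 1])
agg = vpa_aggregate(messages, dst, num_nodes=2)
```

By contrast, plain sum aggregation inflates the variance by the degree and mean aggregation shrinks it by the degree; dividing by the square root of the degree is the scaling that leaves the variance unchanged while preserving the injectivity (and hence expressivity) of sum aggregation.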
