Poster in Workshop: 5th Workshop on practical ML for limited/low resource settings (PML4LRS) @ ICLR 2024
GNN-VPA: A Variance-Preserving Aggregation Strategy for Graph Neural Networks
Lisa Schneckenreiter · Richard Freinschlag · Florian Sestak · Johannes Brandstetter · Günter Klambauer · Andreas Mayr
Graph neural networks (GNNs), and especially message-passing neural networks, excel in a variety of domains such as physics, drug discovery, and molecular modeling. In low-resource settings, it is crucial that stochastic gradient descent makes meaningful progress on the objective from the very first iterations, rather than spending them adjusting weights towards value ranges that allow the loss to be reduced efficiently. In accordance with signal propagation theory, we propose a variance-preserving aggregation function (VPA) for message aggregation and graph-level readout to achieve such favorable forward and backward dynamics. Moreover, VPA maintains the expressivity of GNNs with respect to their ability to discriminate non-isomorphic graphs. Experiments demonstrate that VPA leads to increased predictive performance for popular GNN architectures as well as improved learning dynamics. Our results could pave the way towards even more efficient GNNs by enabling normalizer-free or self-normalizing architectures.
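The abstract does not spell out the functional form of VPA; the following is a minimal sketch, assuming (in line with signal propagation theory) that VPA rescales the sum of n incoming messages by 1/sqrt(n), the scaling under which the aggregate keeps the variance of a single message for i.i.d. zero-mean messages. The function name vpa_aggregate and the NumPy demo are illustrative assumptions, not the authors' implementation.

import numpy as np

def vpa_aggregate(messages: np.ndarray) -> np.ndarray:
    """Variance-preserving aggregation over the first axis.

    For n i.i.d. zero-mean messages, Var(sum) = n * Var(m), so dividing
    the sum by sqrt(n) keeps the aggregate's variance equal to that of a
    single message (unlike sum, which inflates it by n, or mean, which
    shrinks it by 1/n).
    """
    n = messages.shape[0]
    return messages.sum(axis=0) / np.sqrt(n)

# Compare the variance of sum, mean, and VPA aggregation on random messages.
rng = np.random.default_rng(0)
n_messages, dim = 16, 64
m = rng.standard_normal((n_messages, dim))
print("per-message var:", m.var())                 # ~1.0
print("sum var:       ", m.sum(axis=0).var())      # ~n
print("mean var:      ", m.mean(axis=0).var())     # ~1/n
print("VPA var:       ", vpa_aggregate(m).var())   # ~1.0

Because the scaled sum remains injective on multisets of messages (it differs from sum aggregation only by a node-dependent positive factor), this sketch is consistent with the abstract's claim that VPA preserves the ability to discriminate non-isomorphic graphs.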