

Poster

Simple GNN Regularisation for 3D Molecular Property Prediction and Beyond

Jonathan Godwin · Michael Schaarschmidt · Alexander Gaunt · Alvaro Sanchez Gonzalez · Yulia Rubanova · Petar Veličković · James Kirkpatrick · Peter Battaglia

Keywords: [ deep learning ] [ graph neural networks ] [ GNNs ]


Abstract:

In this paper we show that simple noisy regularisation can be an effective way to address oversmoothing. We first argue that regularisers addressing oversmoothing should both penalise node latent similarity and encourage meaningful node representations. From this observation we derive “Noisy Nodes”, a simple technique in which we corrupt the input graph with noise and add a noise-correcting node-level loss. The diverse node-level loss encourages latent node diversity, and the denoising objective encourages graph manifold learning. Our regulariser applies well-studied methods in simple, straightforward ways that allow even generic architectures to overcome oversmoothing and achieve state-of-the-art results on quantum chemistry tasks such as QM9 and Open Catalyst, and to improve results significantly on Open Graph Benchmark (OGB) datasets. Our results suggest Noisy Nodes can serve as a complementary building block in the GNN toolkit.
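To make the described recipe concrete, here is a minimal PyTorch-style sketch of a Noisy-Nodes-style training objective: corrupt the node inputs with noise and add an auxiliary node-level loss that asks the network to recover that noise alongside the primary graph-level target. The function and argument names (`gnn`, `node_feats`, `edge_index`, `noise_scale`, `denoise_weight`) are illustrative assumptions, not the paper's implementation; in the 3D molecular setting the noise is applied to atom positions rather than generic features.

```python
import torch
import torch.nn.functional as F

def noisy_nodes_loss(gnn, node_feats, edge_index, targets,
                     noise_scale=0.02, denoise_weight=0.1):
    """Hypothetical sketch of a Noisy-Nodes-style loss.

    Assumes `gnn(noisy_feats, edge_index)` returns a graph-level
    prediction and per-node outputs used as a denoising head.
    """
    # Corrupt the input node features with zero-mean Gaussian noise.
    noise = noise_scale * torch.randn_like(node_feats)
    noisy_feats = node_feats + noise

    graph_pred, node_pred = gnn(noisy_feats, edge_index)

    # Primary supervised objective (e.g. a molecular property).
    primary = F.mse_loss(graph_pred, targets)

    # Auxiliary node-level denoising objective: predict the added noise,
    # which encourages diverse, meaningful per-node representations.
    denoise = F.mse_loss(node_pred, noise)

    return primary + denoise_weight * denoise
```

Because the auxiliary target is different for every node, the denoising head cannot be satisfied by collapsed (oversmoothed) node latents, which is the intuition the abstract describes.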
