

Poster

A PAC-Bayesian Approach to Generalization Bounds for Graph Neural Networks

Renjie Liao · Raquel Urtasun · Richard Zemel

Keywords: [ generalization bounds ] [ PAC-Bayes ] [ graph neural networks ] [ graph convolutional neural networks ] [ message passing GNNs ]


Abstract:

In this paper, we derive generalization bounds for two primary classes of graph neural networks (GNNs), namely graph convolutional networks (GCNs) and message passing GNNs (MPGNNs), via a PAC-Bayesian approach. Our results reveal that the maximum node degree and the spectral norms of the weights govern the generalization bounds of both models. We also show that our bound for GCNs is a natural generalization of the results developed in (Neyshabur et al., 2017) for fully-connected and convolutional neural networks. For MPGNNs, our PAC-Bayes bound improves over the Rademacher-complexity-based bound of (Garg et al., 2020), showing a tighter dependency on the maximum node degree and the maximum hidden dimension. The key ingredients of our proofs are a perturbation analysis of GNNs and a generalization of PAC-Bayes analysis to non-homogeneous GNNs. We perform an empirical study on several synthetic and real-world graph datasets and verify that our PAC-Bayes bound is tighter than existing bounds.
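As a rough illustration of the quantities the abstract says govern the bounds, the sketch below computes the maximum node degree and the product of layer-wise spectral norms for a hypothetical two-layer GCN. The `d**(l-1)` scaling of the bound proxy and the `1/sqrt(m)` sample-size dependence are illustrative assumptions consistent with spectral PAC-Bayes bounds of this kind; the exact constants, exponents, and logarithmic factors are given in the paper, not here.

```python
import numpy as np

def max_node_degree(adj):
    """Maximum node degree of an undirected graph, given its adjacency matrix."""
    return int(adj.sum(axis=1).max())

def spectral_norm(w):
    """Largest singular value of a weight matrix (ord=2 on a 2-D array)."""
    return float(np.linalg.norm(w, ord=2))

# Hypothetical weights of a 2-layer GCN (dimensions chosen for illustration).
rng = np.random.default_rng(0)
weights = [rng.normal(size=(16, 32)), rng.normal(size=(32, 8))]

# Random undirected graph on 10 nodes: symmetric adjacency, no self-loops.
adj = (rng.random((10, 10)) < 0.3).astype(float)
adj = np.triu(adj, 1)
adj = adj + adj.T

d = max_node_degree(adj)
l = len(weights)
norm_product = float(np.prod([spectral_norm(w) for w in weights]))

# Illustrative bound proxy: combines the two governing quantities from the
# abstract (max degree, spectral norms) with an assumed d**(l-1) * 1/sqrt(m)
# scaling; m is a hypothetical number of training samples.
m = 1000
proxy = d ** (l - 1) * norm_product / np.sqrt(m)
print(f"max degree={d}, prod of spectral norms={norm_product:.3f}, "
      f"bound proxy={proxy:.3f}")
```

Tracking a proxy like this across training runs is one way such bounds are compared empirically: a bound that grows more slowly with `d` or with the hidden dimension is the "tighter dependency" the abstract refers to.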
