

In-Person Poster presentation / poster accept

$\mathscr{N}$-WL: A New Hierarchy of Expressivity for Graph Neural Networks

Qing Wang · Dillon Chen · Asiri Wijesinghe · Shouheng Li · Muhammad Farhan

MH1-2-3-4 #40

Keywords: [ Deep Learning and representational learning ] [ graph classification ] [ k-WL hierarchy ] [ Weisfeiler-Lehman algorithm ] [ graph neural network ]


Abstract: The expressive power of Graph Neural Networks (GNNs) is fundamental for understanding their capabilities and limitations, i.e., what graph properties can or cannot be learnt by a GNN. Since standard GNNs have been shown to be upper-bounded by the Weisfeiler-Lehman (1-WL) algorithm, recent attempts have concentrated on developing more expressive GNNs in terms of the $k$-WL hierarchy, a well-established framework for graph isomorphism testing. In this work we show that, contrary to the widely accepted view, the $k$-WL hierarchy is not well-suited for measuring the expressive power of GNNs. This is due to limitations that are inherent to high-dimensional WL algorithms, such as the lack of a natural interpretation and high computational costs, which make it difficult to draw any firm conclusions about the expressive power of GNNs beyond 1-WL. Thus, we propose a novel hierarchy of graph isomorphism tests, namely Neighbourhood WL ($\mathscr{N}$-WL), and also establish a new theorem on the equivalence of expressivity between induced connected subgraphs and induced subgraphs within this hierarchy. Further, we design a GNN model upon $\mathscr{N}$-WL, Graph Neighbourhood Neural Network (G3N), and empirically verify its expressive power on synthetic and real-world benchmarks.
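For readers unfamiliar with the 1-WL bound mentioned in the abstract, the sketch below illustrates the classical 1-WL (colour refinement) test and its well-known limitation on regular graphs. This is an illustrative implementation only, not the paper's $\mathscr{N}$-WL hierarchy or G3N model; all function names and the example graphs are hypothetical choices for this sketch.

```python
# A minimal sketch of the 1-WL (colour refinement) test referenced above.
# Graphs are given as adjacency lists; this is NOT the paper's N-WL or G3N code.

def wl_1_colours(adj, init_colours=None, rounds=None):
    """Run 1-WL colour refinement and return the sorted list of final colours."""
    n = len(adj)
    colours = list(init_colours) if init_colours else [0] * n
    rounds = rounds if rounds is not None else n
    for _ in range(rounds):
        # Each node's signature combines its colour with the multiset of
        # its neighbours' colours.
        signatures = [
            (colours[v], tuple(sorted(colours[u] for u in adj[v]))) for v in range(n)
        ]
        # Compress signatures into fresh integer colours.
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures)))}
        new_colours = [palette[sig] for sig in signatures]
        if new_colours == colours:  # stable colouring reached
            break
        colours = new_colours
    return sorted(colours)

def wl_1_possibly_isomorphic(adj_g, adj_h):
    """1-WL test: distinct colour histograms prove non-isomorphism;
    identical histograms are inconclusive (the test's known limitation)."""
    return wl_1_colours(adj_g) == wl_1_colours(adj_h)

# Two 6-node graphs 1-WL cannot distinguish: a 6-cycle vs. two triangles.
c6 = [[1, 5], [0, 2], [1, 3], [2, 4], [3, 5], [4, 0]]
two_triangles = [[1, 2], [0, 2], [0, 1], [4, 5], [3, 5], [3, 4]]
print(wl_1_possibly_isomorphic(c6, two_triangles))  # True, yet the graphs differ
```

The final example (a 6-cycle versus two disjoint triangles) shows why 1-WL upper-bounds standard message-passing GNNs: both graphs are 2-regular, so colour refinement stabilises immediately with identical histograms, and any GNN bounded by 1-WL likewise cannot separate them.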
