

Poster in Workshop: Geometrical and Topological Representation Learning

Graph Anisotropic Diffusion

Ahmed Elhag · Gabriele Corso · Hannes Stärk · Michael Bronstein


Abstract:

Traditional Graph Neural Networks (GNNs) rely on message passing, which amounts to a permutation-invariant local aggregation of neighbour features. Such a process is isotropic: there is no notion of "direction" on the graph. We present a new GNN architecture called Graph Anisotropic Diffusion. Our model alternates between linear diffusion, for which a closed-form solution is available, and local anisotropic filters, yielding efficient multi-hop anisotropic kernels. We evaluate our model on two common molecular property prediction benchmarks (ZINC and QM9) and demonstrate its competitive performance.
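
The abstract describes an alternation between closed-form linear diffusion and local anisotropic filters. The sketch below illustrates that alternation only; it is not the authors' implementation. The heat-kernel diffusion exp(-tL), the use of the Fiedler vector as a direction field, and the "uphill-neighbour" weighting are illustrative assumptions made for this example.

```python
# Minimal sketch (illustrative assumptions, not the paper's method):
# alternate a closed-form linear diffusion step with a local anisotropic filter.
import numpy as np

def heat_diffusion(X, L, t=1.0):
    """Closed-form linear diffusion X <- exp(-t L) X via eigendecomposition."""
    w, U = np.linalg.eigh(L)                 # L is symmetric, so eigh is exact
    return U @ (np.exp(-t * w)[:, None] * (U.T @ X))

def anisotropic_filter(X, A, direction):
    """Toy anisotropic aggregation: weight neighbour j of node i by how much the
    direction field increases from i to j (a hypothetical choice of filter)."""
    diff = A * (direction[None, :] - direction[:, None])  # edge-wise differences
    pos = np.maximum(diff, 0.0)                           # keep 'uphill' edges only
    norm = pos.sum(axis=1, keepdims=True) + 1e-8
    return (pos / norm) @ X                               # directional neighbour mean

# Small example graph: a 5-node path.
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A                            # combinatorial Laplacian

X = np.random.default_rng(0).normal(size=(5, 3))          # node features

# Direction field: first nontrivial Laplacian eigenvector (Fiedler vector),
# used here as a stand-in for a global 'direction' on the graph (assumption).
_, U = np.linalg.eigh(L)
direction = U[:, 1]

# One block in the spirit of the abstract: diffuse, then filter anisotropically.
H = heat_diffusion(X, L, t=0.5)
H = anisotropic_filter(H, A, direction)
print(H.shape)  # (5, 3)
```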
