

Poster

Graph Transformers Dream of Electric Flow

Xiang Cheng · Lawrence Carin · Suvrit Sra

Hall 3 + Hall 2B #136
Fri 25 Apr midnight PDT — 2:30 a.m. PDT

Abstract:

We show theoretically and empirically that the linear Transformer, when applied to graph data, can implement algorithms that solve canonical problems such as electric flow and eigenvector decomposition. The Transformer has access to information on the input graph only via the graph's incidence matrix. We present explicit weight configurations for implementing each algorithm, and we bound the constructed Transformers' errors by the errors of the underlying algorithms. Our theoretical findings are corroborated by experiments on synthetic data. Additionally, on a real-world molecular regression task, we observe that the linear Transformer is capable of learning a more effective positional encoding than the default one based on Laplacian eigenvectors. Our work is an initial step towards elucidating the inner workings of the Transformer for graph data. Code is available at https://github.com/chengxiang/LinearGraphTransformer
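As context for the abstract, the following is a minimal NumPy sketch of the electric-flow problem it references, not the paper's Transformer construction. It uses the standard formulation: given a graph's incidence matrix B (nodes × edges) and a demand vector b, the electric flow is f = Bᵀ L⁺ b, where L = B Bᵀ is the graph Laplacian and L⁺ its Moore–Penrose pseudoinverse. The example graph and variable names are illustrative assumptions.

```python
import numpy as np

# Path graph on 3 nodes with edges (0,1) and (1,2), unit resistances.
# Each column of B is an oriented edge: +1 at its tail, -1 at its head.
B = np.array([[ 1.0,  0.0],
              [-1.0,  1.0],
              [ 0.0, -1.0]])

L = B @ B.T                       # graph Laplacian
b = np.array([1.0, 0.0, -1.0])    # inject 1 unit at node 0, extract at node 2

# Node potentials solve L x = b (up to a constant); the pseudoinverse
# picks the solution orthogonal to the all-ones vector.
x = np.linalg.pinv(L) @ b

f = B.T @ x                       # electric flow along each oriented edge
print(f)                          # one unit flows along each edge of the path
```

On this path graph the demand routes one unit of flow through both edges, so `f` is `[1., 1.]`; the Transformer constructions in the paper approximate iterative solvers for this kind of Laplacian system using only B as input.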
