

Poster
in
Workshop: Bridging the Gap Between Practice and Theory in Deep Learning

Long-Range Synthetic Knowledge Graph Benchmarks for Double-Equivariant Models

Bruna Jasinowodolinski · Yucheng Zhang · Jincheng Zhou · Beatrice Bevilacqua · Bruno Ribeiro


Abstract:

In the landscape of relational data, Knowledge Graphs (KGs) structure triplets of the form (head entity, relation, tail entity). Recent methods, namely double-equivariant models, address the task of predicting missing triplets in fully-inductive scenarios, where both new entities and novel relations appear at test time. Despite their promise, a consensus on their practical capabilities, particularly in capturing long-range dependencies, remains elusive. This paper investigates the ability of double-equivariant models to capture long-range dependencies in an input KG and to transfer that knowledge to a new test KG whose predictions likewise depend on distant information. We present multiple synthetic yet semantically sound datasets that require distant information for accurate predictions. Our preliminary empirical results show that existing double-equivariant models face significant challenges in effectively incorporating distant information.
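To make the setting concrete, the following is a minimal illustrative sketch (not the authors' code or datasets) of a KG as a set of (head, relation, tail) triplets, and of the fully-inductive split the abstract describes, where the test KG shares neither entities nor relations with the training KG. All entity and relation names here are invented for illustration.

```python
# Hypothetical example: a KG represented as a set of triplets.
train_kg = {
    ("alice", "works_at", "acme"),
    ("acme", "located_in", "paris"),
    ("paris", "capital_of", "france"),
}

# Fully-inductive test KG: entirely new entities AND new relation types.
test_kg = {
    ("bob", "studies_at", "mit"),
    ("mit", "based_in", "boston"),
}

def entities(kg):
    """All entities appearing as head or tail of some triplet."""
    return {h for h, _, _ in kg} | {t for _, _, t in kg}

def relations(kg):
    """All relation types appearing in the KG."""
    return {r for _, r, _ in kg}

# The fully-inductive condition: no overlap in entities or relations,
# so a model must generalize from structure alone.
assert entities(train_kg).isdisjoint(entities(test_kg))
assert relations(train_kg).isdisjoint(relations(test_kg))
```

A long-range prediction in this toy setup would be one such as inferring a missing triplet about "alice" and "france", which requires composing information along the three-hop chain alice → acme → paris → france rather than looking at immediate neighbors only.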
