Poster in Workshop: AI4MAT-ICLR-2025: AI for Accelerated Materials Design

Transformer as a Neural Knowledge Graph

Yuki Nishihori · Yusei Ito · Yuta Suzuki · Ryo Igarashi · Yoshitaka Ushiku · Kanta Ono

Keywords: [ Contrastive learning ] [ Multimodal learning ] [ Crystal structure ] [ Knowledge graph ]


Abstract:

In this study, we propose an effective contrastive learning method that bridges crystal structures with their linguistic properties (e.g., "superconductor"). Contrastive learning enables both the retrieval of crystal structures based on linguistic characteristics and the inference of linguistic properties from crystal structures, both of which are essential for accelerating materials discovery. A major challenge, however, lies in the limitations of available datasets, which currently pair crystal structures only with the titles and abstracts of their corresponding articles. Because many papers rely on referenced works and shared domain knowledge, often explored in detail within the main text, titles and abstracts alone do not sufficiently capture the full characteristics of a crystal. To address this issue, we introduce a neural knowledge graph by incorporating a transformer into the text encoder of the existing contrastive learning framework, rather than expanding the dataset. This modification enables the model to dynamically incorporate related knowledge, thereby enriching its representation of linguistic properties and facilitating more accurate correlations between crystal structures and their properties.
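As a rough illustration of the contrastive objective described in the abstract (not the authors' implementation; the function and variable names below are hypothetical), a CLIP-style symmetric InfoNCE loss between crystal-structure embeddings and text embeddings could be sketched as:

```python
import numpy as np

def symmetric_info_nce(crystal_emb, text_emb, temperature=0.07):
    """Symmetric contrastive loss pairing crystal-structure and text embeddings.

    Row i of each matrix is assumed to come from the same (structure, text) pair;
    matching pairs are pulled together, mismatched pairs pushed apart.
    """
    # L2-normalize so dot products become cosine similarities
    c = crystal_emb / np.linalg.norm(crystal_emb, axis=1, keepdims=True)
    t = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    logits = c @ t.T / temperature        # (N, N) similarity matrix
    labels = np.arange(len(logits))       # correct matches lie on the diagonal

    def cross_entropy(lg, lb):
        lg = lg - lg.max(axis=1, keepdims=True)  # numerical stability
        log_probs = lg - np.log(np.exp(lg).sum(axis=1, keepdims=True))
        return -log_probs[np.arange(len(lb)), lb].mean()

    # Average of structure-to-text and text-to-structure retrieval losses
    return 0.5 * (cross_entropy(logits, labels) + cross_entropy(logits.T, labels))
```

In this sketch, `crystal_emb` would come from a structure encoder and `text_emb` from the transformer-based text encoder; perfectly aligned pairs drive the loss toward zero, while shuffled pairings yield a large loss.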