Spotlight
in
Workshop: AI4MAT-ICLR-2025: AI for Accelerated Materials Design
All-atom Diffusion Transformers: Unified generative modelling of molecules and materials
Chaitanya Joshi · Xiang Fu · Yi-Lun Liao · Vahe Gharakhanyan · Benjamin Kurt Miller · Anuroop Sriram · Zachary Ulissi
Keywords: [ Transformers ] [ Crystals ] [ Diffusion ] [ Foundation models ] [ Molecules ]
Diffusion models are the standard toolkit for generative modelling of 3D atomic systems. However, for different types of atomic systems – such as molecules and materials – the generative processes are usually highly specific to the target system, despite the underlying physics being the same. We introduce the All-atom Diffusion Transformer (ADiT), a unified latent diffusion framework for jointly generating both periodic materials and non-periodic molecular systems using the same model: (1) an autoencoder maps a unified, all-atom representation of molecules and materials to a shared latent embedding space; and (2) a diffusion model is trained to generate new latent embeddings that the autoencoder can decode to sample new molecules or materials. Experiments on the QM9 and MP20 datasets demonstrate that a jointly trained ADiT generates realistic and valid molecules as well as materials, exceeding state-of-the-art results from molecule- and crystal-specific models. ADiT uses standard Transformers, resulting in significant speedups during training and inference compared to equivariant diffusion models. Scaling ADiT up to half a billion parameters predictably improves performance, representing a step towards broadly generalizable foundation models for generative chemistry.
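To make the two-stage design concrete, below is a minimal PyTorch sketch of a latent diffusion pipeline of this kind: an autoencoder compresses per-atom features into a shared latent space, and a plain (non-equivariant) Transformer is trained with a denoising objective on those latents. All class names, feature dimensions, and the noise schedule here are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of a two-stage latent diffusion pipeline (assumed names/dims).
import torch
import torch.nn as nn

class AllAtomAutoencoder(nn.Module):
    """Stage 1 (illustrative): encode per-atom features into a shared
    latent space and decode them back to atom-level attributes."""
    def __init__(self, atom_feat_dim=16, latent_dim=8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(atom_feat_dim, 64), nn.SiLU(), nn.Linear(64, latent_dim))
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.SiLU(), nn.Linear(64, atom_feat_dim))

    def forward(self, x):                       # x: (batch, atoms, features)
        z = self.encoder(x)
        return self.decoder(z), z

class LatentDiffusionTransformer(nn.Module):
    """Stage 2 (illustrative): a standard Transformer that predicts the
    noise added to latent embeddings, as in DDPM-style training."""
    def __init__(self, latent_dim=8, n_layers=4, n_heads=4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=latent_dim, nhead=n_heads, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.time_embed = nn.Linear(1, latent_dim)

    def forward(self, z_noisy, t):              # z_noisy: (batch, atoms, latent)
        h = z_noisy + self.time_embed(t.view(-1, 1, 1).float())
        return self.backbone(h)                 # predicted noise, same shape

# Toy denoising training step on latents of random "structures".
autoenc = AllAtomAutoencoder()
denoiser = LatentDiffusionTransformer()
x = torch.randn(2, 10, 16)                      # 2 structures, 10 atoms each
_, z = autoenc(x)
z = z.detach()                                  # stages trained separately here
t = torch.randint(0, 1000, (2,))
noise = torch.randn_like(z)
alpha = 1.0 - t.view(-1, 1, 1) / 1000.0         # crude linear noise schedule
z_noisy = alpha.sqrt() * z + (1 - alpha).sqrt() * noise
loss = nn.functional.mse_loss(denoiser(z_noisy, t), noise)
loss.backward()
```

Because the denoiser operates only on latent embeddings with a standard Transformer backbone, no equivariant layers are needed; this is the design choice the abstract credits for the training and inference speedups.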