

Oral
in
Workshop: Towards Agentic AI for Science: Hypothesis Generation, Comprehension, Quantification, and Validation

Large Language Models Are Innate Crystal Structure Generators

Jingru Gan · Peichen Zhong · Yuanqi Du · Yanqiao Zhu · Chenru Duan · Haorui Wang · Daniel Schwalbe-Koda · Carla Gomes · Kristin Persson · Wei Wang


Abstract:

Crystal structure generation is fundamental to materials discovery, enabling the prediction of novel materials with desired properties. While existing approaches leverage Large Language Models (LLMs) through extensive fine-tuning on materials databases, we show that pre-trained LLMs can inherently generate stable crystal structures without additional training. Our novel framework MatLLMSearch integrates pre-trained LLMs with evolutionary search algorithms, achieving a 78.38% metastable rate validated by machine learning interatomic potentials and 31.7% DFT-verified stability via quantum mechanical calculations, outperforming specialized models such as CrystalTextLLM. Beyond crystal structure generation, we further demonstrate that our framework can be readily adapted to diverse materials design tasks, including crystal structure prediction and multi-objective optimization of properties such as deformation energy and bulk modulus, all without fine-tuning. These results establish pre-trained LLMs as versatile and effective tools for materials discovery, opening up new avenues for crystal structure generation with reduced computational overhead and broader accessibility.
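The loop the abstract describes, a pre-trained LLM proposing candidate structures that an evolutionary search scores and filters, can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the LLM proposal step and the ML interatomic potential are replaced by hypothetical toy stand-ins (`propose_offspring`, `surrogate_energy`), and a structure is reduced to a single lattice parameter.

```python
import random


def propose_offspring(parents, n_children, rng):
    """Stand-in for the LLM proposal step. In MatLLMSearch the prompt
    contains parent crystal structures and the LLM generates new ones;
    here we merely perturb a parent's lattice parameter (hypothetical)."""
    children = []
    for _ in range(n_children):
        parent = rng.choice(parents)
        children.append({"a": parent["a"] + rng.gauss(0.0, 0.1)})
    return children


def surrogate_energy(structure):
    """Toy stand-in for a machine learning interatomic potential:
    energy minimum at lattice parameter a = 4.0 (purely illustrative)."""
    return (structure["a"] - 4.0) ** 2


def evolutionary_search(pop_size=8, generations=20, seed=0):
    """Elitist evolutionary loop: select the lowest-energy half as
    parents, ask the proposal step for offspring, keep the best."""
    rng = random.Random(seed)
    population = [{"a": rng.uniform(3.0, 5.0)} for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(population, key=surrogate_energy)[: pop_size // 2]
        children = propose_offspring(parents, pop_size, rng)
        population = sorted(parents + children, key=surrogate_energy)[:pop_size]
    return population[0]


best = evolutionary_search()
```

The key design point, consistent with the abstract's claim, is that no component is trained: the proposal model is used as-is, and improvement comes entirely from the selection pressure of the outer loop.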
