

Poster in Workshop: Frontiers in Probabilistic Inference: learning meets Sampling

Adjoint Sampling: Highly-Scalable Diffusion Samplers via Adjoint Matching

Aaron Havens · Benjamin Kurt Miller · Bing Yan · Carles Domingo i Enrich · Anuroop Sriram · Daniel Levine · Brandon Wood · Bin Hu · Brandon Amos · Brian Karrer · Xiang Fu · Guan-Horng Liu · Ricky T. Q. Chen


Abstract:

We introduce Adjoint Sampling, a highly scalable and efficient algorithm for learning diffusion processes that sample from unnormalized densities, or energy functions. It is the first method of its kind to allow significantly more gradient updates than energy evaluations and model samples, letting us scale to much larger problem settings than previously explored by similar methods. Our framework is theoretically grounded in stochastic optimal control and shares the same theoretical guarantees as Adjoint Matching: it can be trained without corrective measures, such as sequential Monte Carlo, that push samples toward the target distribution. We show how to incorporate key symmetries, as well as periodic boundary conditions, for modeling molecules in both Cartesian and torsional representations. We demonstrate the effectiveness of our approach through extensive experiments on classical energy functions, and further scale up to neural network-based energy models, where we perform amortized conformer generation across many molecular systems.
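To make the scalability claim concrete, the sketch below illustrates the general idea of decoupling expensive energy evaluations from cheap gradient updates: trajectories are simulated and the energy gradient is evaluated once per batch, then many optimizer steps reuse the cached gradients. This is a minimal, assumption-laden illustration of that training pattern, not the authors' Adjoint Sampling objective; the toy energy, interpolation path, and regression target used here are invented for demonstration only.

```python
# Minimal sketch (NOT the paper's implementation): a diffusion-style sampler
# trained with many more gradient updates than energy evaluations.
# The toy energy, the interpolation path, and the regression target below
# are simplifying assumptions made purely for illustration.
import torch
import torch.nn as nn

torch.manual_seed(0)


def energy(x):
    """Toy unnormalized negative log-density: a 2D mixture of two Gaussians (assumed example)."""
    centers = torch.tensor([[-2.0, 0.0], [2.0, 0.0]])
    d2 = ((x[:, None, :] - centers[None]) ** 2).sum(-1)
    return -torch.logsumexp(-0.5 * d2, dim=-1)


class ControlNet(nn.Module):
    """Small time-conditioned network predicting the drift/control u(x, t)."""

    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=-1))


model = ControlNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
sigma, n_steps, dt = 1.0, 20, 1.0 / 20

for outer in range(10):
    # Phase 1: simulate the controlled SDE and evaluate the energy gradient
    # ONCE per batch of samples (the expensive step).
    with torch.no_grad():
        x = torch.randn(256, 2)  # samples from the base (noise) distribution
        for k in range(n_steps):
            t = torch.full((x.shape[0], 1), k * dt)
            x = x + model(x, t) * dt + sigma * (dt ** 0.5) * torch.randn_like(x)
    x_terminal = x.detach().requires_grad_(True)
    grad_E = torch.autograd.grad(energy(x_terminal).sum(), x_terminal)[0].detach()
    buffer = (x_terminal.detach(), grad_E)  # cache the expensive quantities

    # Phase 2: many cheap gradient updates that reuse the cached energy gradients
    # (far more optimizer steps than energy evaluations).
    for inner in range(100):
        x1, gE = buffer
        t = torch.rand(x1.shape[0], 1)
        noise = torch.randn_like(x1)
        x_t = t * x1 + (1 - t) * noise   # simple interpolation path (assumed)
        target = -gE                     # simplified regression target (assumed)
        loss = ((model(x_t, t) - target) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
```

The design point this sketch tries to convey is the one highlighted in the abstract: because the regression targets depend only on cached samples and energy gradients, the number of optimizer steps is not tied to the number of energy evaluations or model rollouts.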
