Invited Talk
in
Workshop: Frontiers in Probabilistic Inference: learning meets Sampling

Efficient variational inference with generative models

Grant Rotskoff


Abstract:

Neural networks continue to surprise us with their remarkable capabilities for high-dimensional function approximation. Applications of machine learning now pervade essentially every scientific discipline, but predictive models that describe the optimization dynamics, inference properties, and flexibility of modern neural networks remain limited. In this talk, I will introduce several approaches to both analyzing and building generative models that augment Monte Carlo sampling of high-dimensional distributions. I will focus, in particular, on two applications from chemistry: sampling conformational ensembles of disordered protein domains and molecular optimization. I will also introduce a self-distillation strategy for large-scale models that shares conceptual similarities with preference optimization via reinforcement learning, but does not require proximal policy optimization (PPO) and outperforms direct preference optimization (DPO).
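To make the core idea concrete, here is a minimal, hedged sketch (not code from the talk) of variational inference with a generative sampler: a reparameterized Gaussian stands in for a richer generative model such as a normalizing flow, its parameters are fit by stochastic gradient descent on the reverse KL divergence to an unnormalized target, and self-normalized importance weights then correct Monte Carlo estimates under the target. The double-well target and all variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized log-density of a 1D double-well "energy landscape"
    # (illustrative stand-in for a molecular target distribution).
    return -(x**2 - 1.0)**2 / 0.5

# Variational parameters of the generative sampler: mean and log std.
mu, log_sigma = 0.0, 0.0

def sample_q(n):
    # Reparameterization: x = mu + sigma * eps, eps ~ N(0, 1).
    eps = rng.standard_normal(n)
    return mu + np.exp(log_sigma) * eps, eps

def log_q(x):
    sigma = np.exp(log_sigma)
    return -0.5 * ((x - mu) / sigma)**2 - log_sigma - 0.5 * np.log(2 * np.pi)

# Minimize reverse KL(q || p) with pathwise (reparameterized) gradients.
lr = 1e-2
for step in range(2000):
    x, eps = sample_q(256)
    sigma = np.exp(log_sigma)
    dlogq_dx = -(x - mu) / sigma**2
    dlogp_dx = -4.0 * x * (x**2 - 1.0) / 0.5
    g = dlogq_dx - dlogp_dx                       # gradient of the KL integrand in x
    mu -= lr * np.mean(g)                         # dx/dmu = 1
    log_sigma -= lr * np.mean(g * eps * sigma)    # dx/dlog_sigma = eps * sigma

# Augment Monte Carlo: self-normalized importance weights reweight samples
# from the learned sampler q to unbiased-in-the-limit estimates under p.
x, _ = sample_q(10000)
w = np.exp(log_target(x) - log_q(x))
w /= w.sum()
mean_x2 = np.sum(w * x**2)   # estimate of E_p[x^2]
```

Because reverse KL is mode-seeking, a single Gaussian can cover this symmetric target only crudely; the importance-weighting step is what recovers correct expectations, which is the sense in which the generative model augments, rather than replaces, Monte Carlo sampling.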