Invited Talk
in
Workshop: Frontiers in Probabilistic Inference: learning meets Sampling
Mixture models: a lens on score estimation, feature localization, and guidance
Sitan Chen
In this talk, I present a few vignettes demonstrating how mixture models serve as a fruitful abstraction for understanding algorithmic and empirical aspects of diffusion models. In the first part, I describe a new algorithm for learning Gaussian mixture models via score estimation that, in a relevant range of parameters, is exponentially faster than prior work based on the method of moments. In the second part, I explain how to use mixture models to understand “critical windows” in diffusions (and localization-based samplers more generally), i.e., narrow windows in the generation process during which important features of the final output are determined. Time permitting, in the third part, I describe a simple toy mixture model in which one can precisely characterize the behavior of guidance. Based on the following joint works: https://arxiv.org/abs/2404.18893, https://arxiv.org/abs/2502.00921, https://arxiv.org/abs/2409.13074.
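To make the mixture/score connection concrete, here is a minimal sketch (not from the talk itself) of the exact score function of a one-dimensional Gaussian mixture — the quantity that a diffusion model's score network is trained to estimate. The score of a mixture is a responsibility-weighted average of the component scores; the function name `gmm_score` and all parameter choices below are illustrative assumptions, not notation from the papers.

```python
import numpy as np

def gmm_score(x, means, sigma, weights):
    """Score (gradient of log-density) of a 1-D Gaussian mixture
    with component means `means`, shared std `sigma`, and mixing
    weights `weights`, evaluated at point(s) x.

    Uses the identity: score(x) = sum_i r_i(x) * (mu_i - x) / sigma^2,
    where r_i(x) are the posterior responsibilities of each component.
    (Illustrative helper; names are assumptions.)
    """
    x = np.atleast_1d(np.asarray(x, dtype=float))
    # Log of (weight_i * Gaussian density_i), up to a shared constant.
    log_comp = (np.log(weights)[:, None]
                - (x[None, :] - means[:, None]) ** 2 / (2 * sigma ** 2))
    # Normalize in log space for numerical stability.
    log_comp -= log_comp.max(axis=0, keepdims=True)
    resp = np.exp(log_comp)
    resp /= resp.sum(axis=0, keepdims=True)  # responsibilities r_i(x)
    return (resp * (means[:, None] - x[None, :])).sum(axis=0) / sigma ** 2
```

A quick sanity check is to compare against a finite-difference derivative of the mixture's log-density, which agrees to high precision.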