Poster in Workshop: Frontiers in Probabilistic Inference: Learning Meets Sampling
Debiasing Guidance for Discrete Diffusion with Sequential Monte Carlo
Lee Kit · Paul Jeha · Jes Frellsen · Pietro Lio · Michael Albergo · Francisco Vargas
Abstract:
Discrete diffusion models are a class of generative models that produce samples from an approximated data distribution over a discrete state space. Often, there is a need to target specific regions of the data distribution. Current guidance methods aim to sample from a distribution with mass proportional to $p_0(x_0) p(\zeta|x_0)^\alpha$ but fail to achieve this in practice. We introduce a Sequential Monte Carlo algorithm that samples without bias from this target distribution, utilising the learnt unconditional and guided processes. We validate our approach on low-dimensional distributions, controlled image generation, and text generation. For text generation, our method provides strong control while maintaining low perplexity compared to guidance-based approaches.
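To make the target concrete, the sketch below samples from a tilted distribution $\pi(x) \propto p_0(x)\,p(\zeta|x)^\alpha$ over a toy discrete state space using tempered Sequential Monte Carlo (importance weighting plus adaptive resampling). This is only an illustration of the SMC principle the abstract invokes: the state space, `p0`, `lik`, and the annealing schedule are all made-up stand-ins, and the paper's actual algorithm runs on a learnt discrete diffusion process, which is not modelled here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discrete state space {0, ..., K-1}. `p0` plays the role of the
# unconditional model and `lik` the guidance likelihood p(zeta|x);
# both are invented for illustration.
K = 5
p0 = np.array([0.40, 0.30, 0.15, 0.10, 0.05])
lik = np.array([0.05, 0.10, 0.20, 0.30, 0.35])
alpha = 2.0

# Exact tilted target pi(x) ∝ p0(x) * lik(x)**alpha, for reference.
target = p0 * lik**alpha
target /= target.sum()

# Tempered SMC: anneal from p0 (beta=0) to the target (beta=alpha),
# accumulating importance weights and resampling when the effective
# sample size (ESS) degenerates.
N = 100_000
particles = rng.choice(K, size=N, p=p0)
logw = np.zeros(N)
betas = np.linspace(0.0, alpha, 11)
for b_prev, b_next in zip(betas[:-1], betas[1:]):
    logw += (b_next - b_prev) * np.log(lik[particles])
    w = np.exp(logw - logw.max())
    w /= w.sum()
    ess = 1.0 / np.sum(w**2)
    if ess < N / 2:
        # Multinomial resampling; weights reset to uniform afterwards.
        particles = particles[rng.choice(N, size=N, p=w)]
        logw = np.zeros(N)

w = np.exp(logw - logw.max())
w /= w.sum()
est = np.array([w[particles == k].sum() for k in range(K)])
print("SMC estimate:", np.round(est, 3))
print("exact target:", np.round(target, 3))
```

With enough particles the weighted empirical distribution matches the exact tilted target closely, whereas naively sampling from `p0` alone would not; this is the consistency property that the paper's SMC construction extends to the sequential diffusion setting.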