

In-Person Poster Presentation (Poster Accept)

Discrete Predictor-Corrector Diffusion Models for Image Synthesis

José Lezama · Tim Salimans · Lu Jiang · Huiwen Chang · Jonathan Ho · Irfan Essa

MH1-2-3-4 #75

Keywords: [ Generative models ] [ discrete diffusion models ] [ image synthesis ]


Abstract:

We introduce Discrete Predictor-Corrector diffusion models (DPC), extending predictor-corrector samplers in Gaussian diffusion models to the discrete case. Predictor-corrector samplers are a class of samplers for diffusion models, which improve on ancestral samplers by correcting the sampling distribution of intermediate diffusion states using MCMC methods. In DPC, the Langevin corrector, which does not have a direct counterpart in discrete space, is replaced with a discrete MCMC transition defined by a learned corrector kernel. The corrector kernel is trained to make the correction steps achieve asymptotic convergence, in distribution, to the correct marginal of the intermediate diffusion states. Equipped with DPC, we revisit recent transformer-based non-autoregressive generative models through the lens of discrete diffusion, and find that DPC can alleviate the compounding decoding error due to the parallel sampling of visual tokens. Our experiments show that DPC improves upon existing discrete latent space models for class-conditional image generation on ImageNet, and outperforms continuous diffusion models and GANs, according to standard metrics and user preference studies.
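The sampling procedure described above alternates a parallel (ancestral) predictor step with one or more corrector steps drawn from a learned discrete MCMC kernel. The following is a minimal, hedged sketch of that loop; the function names, the number of corrected positions, and the random logits standing in for the trained denoiser and corrector networks are all illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def predictor_step(seq_len, denoiser_logits, rng):
    """Ancestral predictor (sketch): resample every token position in
    parallel from the model's per-position categorical distribution."""
    probs = np.exp(denoiser_logits - denoiser_logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    return np.array([rng.choice(probs.shape[-1], p=p) for p in probs])

def corrector_step(tokens, corrector_logits, n_corrected, rng):
    """Learned discrete MCMC corrector (sketch): re-draw a random subset
    of positions from the corrector kernel's distribution, nudging the
    joint sample toward the correct intermediate marginal."""
    probs = np.exp(corrector_logits - corrector_logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    idx = rng.choice(len(tokens), size=n_corrected, replace=False)
    tokens = tokens.copy()
    for i in idx:
        tokens[i] = rng.choice(probs.shape[-1], p=probs[i])
    return tokens

def dpc_sample(seq_len, vocab, steps, corrector_iters, rng):
    """Toy DPC loop: predictor step, then a few corrector iterations,
    repeated over the reverse diffusion schedule."""
    tokens = rng.integers(0, vocab, size=seq_len)  # fully noised start
    for _ in range(steps):
        # Stand-in for the trained denoiser network's output logits.
        pred_logits = rng.normal(size=(seq_len, vocab))
        tokens = predictor_step(seq_len, pred_logits, rng)
        for _ in range(corrector_iters):
            # Stand-in for the learned corrector kernel's output logits.
            corr_logits = rng.normal(size=(seq_len, vocab))
            tokens = corrector_step(tokens, corr_logits, seq_len // 4, rng)
    return tokens
```

In practice the logits would come from a transformer over the visual token grid; the corrector's repeated partial resampling is what counteracts the compounding error of sampling all tokens independently in the predictor step.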
