

Poster

Simple Guidance Mechanisms for Discrete Diffusion Models

Yair Schiff · Subham Sahoo · Hao Phung · Guanghan Wang · Sam Boshar · Hugo Dalla-torre · Bernardo Almeida · Alexander Rush · Thomas Pierrot · Volodymyr Kuleshov

Hall 3 + Hall 2B #174
Wed 23 Apr 7 p.m. PDT — 9:30 p.m. PDT

Abstract:

Diffusion models for continuous data have gained widespread adoption owing to their high-quality generation and control mechanisms. However, controllable diffusion on discrete data remains challenging because continuous guidance methods do not directly apply to discrete diffusion. Here, we provide a straightforward derivation of classifier-free and classifier-based guidance for discrete diffusion, as well as a new class of diffusion models that leverage uniform noise and are more guidable because they can continuously edit their outputs. We improve the quality of these models with a novel continuous-time variational lower bound that yields state-of-the-art performance, especially in settings involving guidance or fast generation. Empirically, we demonstrate that our guidance mechanisms, combined with uniform-noise diffusion, improve controllable generation relative to autoregressive and diffusion baselines on several discrete data domains, including genomic sequences, small molecule design, and discretized image generation.
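As a rough illustration of the kind of guidance the abstract describes (not necessarily the paper's exact formulation), classifier-free guidance for a discrete denoiser can be sketched by mixing conditional and unconditional per-token logits with a guidance weight gamma; the function name, tensor shapes, and the specific log-linear interpolation below are assumptions made for illustration only.

import torch.nn.functional as F

def guided_denoising_probs(cond_logits, uncond_logits, gamma=2.0):
    """Classifier-free-style guidance for a discrete (categorical) denoiser.

    cond_logits, uncond_logits: tensors of shape (batch, seq_len, vocab)
    holding the denoising logits with and without conditioning.
    gamma: guidance weight; gamma = 1 recovers the purely conditional model.
    The log-space interpolation corresponds per token to
    p_gamma(x | y) proportional to p(x | y)^gamma * p(x)^(1 - gamma).
    """
    guided = gamma * cond_logits + (1.0 - gamma) * uncond_logits
    return F.softmax(guided, dim=-1)  # guided categorical denoising probabilities

In this sketch, sampling each token from the returned distribution at every reverse-diffusion step would steer generation toward the conditioning signal, with larger gamma trading diversity for stronger adherence.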
