

Poster

Convergence of Score-Based Discrete Diffusion Models: A Discrete-Time Analysis

Zikun Zhang · Zixiang Chen · Quanquan Gu

Hall 3 + Hall 2B #455
Thu 24 Apr midnight PDT — 2:30 a.m. PDT

Abstract: Diffusion models have achieved great success in generating high-dimensional samples across various applications. While the theoretical guarantees for continuous-state diffusion models have been extensively studied, the convergence analysis of the discrete-state counterparts remains under-explored. In this paper, we study the theoretical aspects of score-based discrete diffusion models under the Continuous Time Markov Chain (CTMC) framework. We introduce a discrete-time sampling algorithm in the general state space [S]^d that utilizes score estimators at predefined time points. We derive convergence bounds for the Kullback-Leibler (KL) divergence and total variation (TV) distance between the generated sample distribution and the data distribution, considering both scenarios with and without early stopping under reasonable assumptions. Notably, our KL divergence bounds are nearly linear in the dimension d, aligning with state-of-the-art results for diffusion models. Our convergence analysis employs a Girsanov-based method and establishes key properties of the discrete score function, which are essential for characterizing the discrete-time sampling process.
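To make the abstract's setting concrete, the following is a minimal, hypothetical sketch of a discrete-time reverse sampler on the state space [S]^d. It is not the paper's algorithm: it assumes a uniform forward CTMC (each coordinate jumps to a uniformly random state at rate 1) and an Euler-style update driven by a user-supplied `score_fn` that estimates the discrete score ratios p_t(x with coordinate i set to s) / p_t(x) at predefined time points. All names and the rate normalization are illustrative assumptions.

```python
import random

def discrete_time_sampler(score_fn, S, d, time_points, seed=0):
    """Euler-style reverse sampler on {0, ..., S-1}^d (illustrative sketch).

    score_fn(x, i, t) -> length-S list: hypothetical estimator of the
    discrete score ratios p_t(x with coordinate i set to s) / p_t(x).
    The forward process is assumed to be the uniform CTMC in which each
    coordinate jumps to a uniformly random state at rate 1.
    time_points must decrease toward 0 (optionally stopping early at t > 0).
    """
    rng = random.Random(seed)
    x = [rng.randrange(S) for _ in range(d)]  # start from the uniform reference
    for t, t_next in zip(time_points, time_points[1:]):
        h = t - t_next                         # current step size
        for i in range(d):
            ratios = score_fn(x, i, t)
            # reverse jump rate to state s is (forward rate 1/S) * score ratio;
            # a single Euler step converts rates to jump probabilities
            probs = [min(1.0, (ratios[s] / S) * h) if s != x[i] else 0.0
                     for s in range(S)]
            probs[x[i]] = max(0.0, 1.0 - sum(probs))  # probability of staying put
            total = sum(probs)
            # inverse-CDF sampling of the next state for coordinate i
            u, acc = rng.random() * total, 0.0
            for s in range(S):
                acc += probs[s]
                if u <= acc:
                    x[i] = s
                    break
    return x

# Toy usage: ratios identically 1 (uniform target), early stopping at t = 0.05.
times = [1.0 - 0.05 * k for k in range(20)]
uniform_score = lambda x, i, t: [1.0] * 5
sample = discrete_time_sampler(uniform_score, S=5, d=4, time_points=times)
```

Under this toy score the reverse dynamics simply mix toward the uniform distribution; a learned score estimator would steer the chain toward the data distribution instead.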
