Poster
in
Workshop: Deep Generative Model in Machine Learning: Theory, Principle and Efficacy
A Universal Isoentropic Time Scheduler for Continuous Generative Diffusion Models
Dejan Stancevic · Luca Ambrogioni
Keywords: [ Noise Scheduling ] [ Information Theory ] [ Generative Diffusion Models ] [ Stochastic Differential Equations (SDEs) ] [ Conditional Entropy ]
The practical performance of generative diffusion models depends on the appropriate choice of the noise scheduling function, which can equivalently be expressed as a time reparameterization. In this paper, we present a time scheduler that selects sampling points based on entropy rather than uniform time spacing, ensuring that each point contributes an equal amount of information to the final generation. We prove that this time reparameterization is invariant to the initial choice of time parameterization. We provide a tractable formula for estimating this isoentropic time in a trained model from its training loss. In our experiments with mixtures of Gaussians and 2D patterns, we show that the isoentropic time discretization greatly improves the inference performance of trained models when the number of function evaluations is small.
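Below is a minimal sketch of how such an entropy-based schedule could be constructed, assuming (as the abstract suggests) that a per-time training-loss curve is available and, as a further assumption made here, that this curve is proportional to the conditional entropy rate. The function name `isoentropic_schedule` and the example loss profile are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def isoentropic_schedule(t_grid, loss_grid, n_steps):
    """Place sampling times at equal increments of estimated entropy.

    Assumption: loss_grid, the training loss evaluated at each time in
    t_grid, is proportional to the entropy rate dH/dt, so integrating
    it over time gives a (normalized) conditional entropy curve.
    """
    # Cumulative entropy estimate via trapezoidal integration of the loss.
    dH = np.concatenate([[0.0], np.cumsum(
        0.5 * (loss_grid[1:] + loss_grid[:-1]) * np.diff(t_grid))])
    H = dH / dH[-1]  # normalize so the entropy curve runs from 0 to 1

    # Invert H(t): find the times at which the entropy crosses
    # n_steps equally spaced targets, so each sampling interval
    # carries the same amount of information.
    targets = np.linspace(0.0, 1.0, n_steps)
    return np.interp(targets, H, t_grid)

# Example with a hypothetical loss profile concentrated at small t.
t = np.linspace(0.0, 1.0, 1000)
loss = np.exp(-10.0 * t)
print(isoentropic_schedule(t, loss, n_steps=8))
```

With a loss curve concentrated at small times, the resulting schedule clusters sampling points near t = 0, where this heuristic estimates most of the information is generated, instead of spacing them uniformly.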