Invited Talk in Workshop: Bridging the Gap Between Practice and Theory in Deep Learning

Invited Talk 5: Capitalizing on Generative AI: Diffusion Models Towards High-Dimensional Generative Optimization

Mengdi Wang


Abstract:

Diffusion models represent a significant breakthrough in generative AI: they operate by progressively transforming random noise into structured outputs, and they can be adapted to specific tasks through guidance or fine-tuning. In this presentation, we delve into the statistical aspects of diffusion models and establish their connection to theoretical optimization frameworks. In the first part, we explore how unconditioned diffusion models efficiently capture complex high-dimensional data, particularly when low-dimensional structures are present. We present the first efficient sample complexity bound for diffusion models that depends on the small intrinsic dimension rather than the ambient dimension, effectively circumventing the curse of dimensionality. In the second part, we leverage this understanding of diffusion models to introduce an optimization method we term "generative optimization," in which diffusion models serve as data-driven solution generators for maximizing an unknown objective function. We introduce reward guidance techniques that incorporate the target function value to steer the diffusion model. Theoretical analysis in the offline setting shows that the generated solutions yield higher function values on average, with optimality gaps matching off-policy bandit regret. Moreover, these solutions remain faithful to the intrinsic structures of the training data, suggesting a promising avenue for optimization in complex, structured spaces through generative AI.
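The abstract does not spell out the exact form of the reward guidance, so the sketch below is a minimal, hypothetical illustration in the spirit of classifier guidance: at each reverse denoising step, the DDPM posterior mean is shifted by the gradient of a differentiable reward surrogate, steering samples toward higher objective values while the learned score keeps them near the data manifold. The names score_model, reward, and guidance_scale are placeholders for this sketch, not the speaker's API.

import torch

def reward_guided_sample(score_model, reward, shape, n_steps=1000,
                         guidance_scale=1.0, device="cpu"):
    # score_model(x, t) is assumed to be a pretrained noise predictor
    # eps_theta(x_t, t); reward(x) is an assumed differentiable surrogate
    # of the unknown objective f(x).
    betas = torch.linspace(1e-4, 0.02, n_steps, device=device)
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)

    x = torch.randn(shape, device=device)  # start from pure noise
    for t in reversed(range(n_steps)):
        eps = score_model(x, t)  # predicted noise at step t
        # Standard DDPM posterior mean for x_{t-1}.
        mean = (x - betas[t] / torch.sqrt(1 - alpha_bars[t]) * eps) \
               / torch.sqrt(alphas[t])
        # Reward guidance: nudge the mean uphill on the surrogate reward,
        # scaled by the step variance as in classifier guidance.
        with torch.enable_grad():
            x_in = x.detach().requires_grad_(True)
            grad = torch.autograd.grad(reward(x_in).sum(), x_in)[0]
        mean = mean + guidance_scale * betas[t] * grad
        noise = torch.randn_like(x) if t > 0 else torch.zeros_like(x)
        x = mean + torch.sqrt(betas[t]) * noise
    return x

In this sketch, increasing guidance_scale trades fidelity to the training distribution for higher objective values, mirroring the tension between optimality gap and structure preservation discussed in the talk.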
