

Inner Classifier-Free Guidance and Its Taylor Expansion for Diffusion Models

Shikun Sun · Longhui Wei · Zhicai Wang · Zixuan Wang · Junliang Xing · Jia Jia · Qi Tian

Halle B #35
Tue 7 May 7:30 a.m. PDT — 9:30 a.m. PDT


Classifier-free guidance (CFG) is a pivotal technique for balancing the diversity and fidelity of samples in conditional diffusion models. It uses a single model to jointly learn the conditional and unconditional score predictors, eliminating the need for an additional classifier. It delivers impressive results and can be applied to both continuous and discrete condition representations. When the condition is continuous, however, a natural question is whether the trade-off can be improved further. Our proposed inner classifier-free guidance (ICFG) offers an alternative perspective on CFG when the condition has a specific structure, showing that CFG is the first-order case of ICFG. We also provide a second-order implementation, demonstrating that even without altering the training policy, our second-order approach introduces valuable new information and achieves a better balance between fidelity and diversity for Stable Diffusion.
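The abstract describes CFG as combining a conditional and an unconditional score prediction from a single model; a minimal sketch of that standard combination (which the paper identifies as the first-order case of ICFG) might look like the following. The function name, the guidance weight, and the toy score values are illustrative stand-ins, not taken from the paper, and the second-order ICFG correction is not shown since its formula is not given in this abstract.

```python
import numpy as np

def cfg_combine(eps_cond, eps_uncond, w):
    """Standard classifier-free guidance: extrapolate from the
    unconditional score prediction toward the conditional one
    with guidance weight w (w = 1 recovers pure conditioning)."""
    return eps_uncond + w * (eps_cond - eps_uncond)

# Toy 2-D score predictions (hypothetical values for illustration).
eps_uncond = np.array([0.1, -0.2])
eps_cond = np.array([0.3, 0.1])

# With w > 1 the conditional direction is amplified, trading
# diversity for fidelity as described in the abstract.
guided = cfg_combine(eps_cond, eps_uncond, w=2.0)
print(guided)  # → [0.5 0.4]
```

In a sampler, `guided` would replace the raw score prediction at each denoising step; the weight `w` is the knob that moves along the fidelity-diversity trade-off the paper aims to improve.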
