

Poster

Domain Guidance: A Simple Transfer Approach for a Pre-trained Diffusion Model

Jincheng Zhong · XiangCheng Zhang · Jianmin Wang · Mingsheng Long

Hall 3 + Hall 2B #170
Fri 25 Apr 7 p.m. PDT — 9:30 p.m. PDT

Abstract: Recent advancements in diffusion models have revolutionized generative modeling. However, the impressive and vivid outputs they produce often come at the cost of significant model scaling and increased computational demands. Consequently, building personalized diffusion models based on off-the-shelf models has emerged as an appealing alternative. In this paper, we introduce a novel perspective on conditional generation for transferring a pre-trained model. From this viewpoint, we propose *Domain Guidance*, a straightforward transfer approach that leverages pre-trained knowledge to guide the sampling process toward the target domain. Domain Guidance shares a formulation similar to advanced classifier-free guidance, facilitating better domain alignment and higher-quality generations. We provide both empirical and theoretical analyses of the mechanisms behind Domain Guidance. Our experimental results demonstrate its substantial effectiveness across various transfer benchmarks, achieving over a 19.6% improvement in FID and a 23.4% improvement in FD_DINOv2 compared to standard fine-tuning. Notably, existing fine-tuned models can seamlessly integrate Domain Guidance to leverage these benefits, without additional training.
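Since the abstract says Domain Guidance "shares a formulation similar to advanced classifier-free guidance," one plausible reading is a classifier-free-guidance-style combination in which the pre-trained model supplies the reference prediction and the fine-tuned model supplies the target-domain prediction. The sketch below is an illustrative assumption, not the paper's verified formulation; the function name `domain_guidance` and the scalar inputs are hypothetical (real noise predictions would be tensors at a shared timestep).

```python
def domain_guidance(eps_pretrained, eps_finetuned, w):
    """Hypothetical CFG-style combination of two noise predictions.

    eps_pretrained: prediction from the off-the-shelf pre-trained model
                    (playing the role of the unconditional branch in CFG).
    eps_finetuned:  prediction from the model fine-tuned on the target domain.
    w:              guidance scale; w = 1 recovers plain fine-tuned sampling,
                    w > 1 pushes samples further toward the target domain.
    """
    return eps_pretrained + w * (eps_finetuned - eps_pretrained)
```

Under this reading, an existing fine-tuned model needs no retraining: at sampling time one simply runs both networks and blends their outputs, which matches the abstract's claim that fine-tuned models "can seamlessly integrate Domain Guidance … without additional training."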
