FAST‑DIPS: Adjoint‑Free Analytic Steps and Hard‑Constrained Likelihood Correction for Diffusion‑Prior Inverse Problems
Minwoo Kim · Seunghyeok Shin · Hongki Lim
Abstract
$\textbf{FAST-DIPS}$ is a training-free solver for diffusion-prior inverse problems, including those with nonlinear forward operators. At each noise level, a pretrained denoiser provides an anchor $\mathbf{x}_{0|t}$; we then perform a hard-constrained proximal correction in measurement space (under AWGN) by solving $\min_\mathbf{x} \tfrac{1}{2\gamma_t}\|\mathbf{x}-\mathbf{x}_{0|t}\|^2 \ \text{s.t.}\ \|\mathcal{A}(\mathbf{x})-\mathbf{y}\|\le\varepsilon$. The correction is implemented via an adjoint-free ADMM with a closed-form projection onto the Euclidean ball and a few steepest-descent updates whose step size is analytic, computable from one VJP and one JVP (or a forward-difference surrogate), followed by decoupled re-annealing. We show that this step minimizes a local quadratic model (with backtracking guaranteeing descent), that any ADMM fixed point satisfies the KKT conditions of the hard-constrained problem, and that mode substitution yields a bounded time-marginal error. We also derive a latent-space variant via $\mathcal{A}\mapsto\mathcal{A}\circ\mathcal{D}$ and a one-parameter pixel$\rightarrow$latent hybrid schedule. FAST-DIPS delivers PSNR/SSIM/LPIPS comparable to or better than existing diffusion-prior solvers while being substantially faster, requiring only autodiff access to $\mathcal{A}$ and no hand-coded adjoints or inner MCMC.
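For concreteness, here is a minimal JAX sketch of the two adjoint-free primitives named above: the closed-form projection onto the $\varepsilon$-ball around $\mathbf{y}$, and one steepest-descent update on the data-fit term whose analytic step size uses exactly one VJP and one JVP. The function names, the Gauss-Newton form of the local quadratic model, and the numerical guards are illustrative assumptions, not the paper's exact implementation.

```python
import jax
import jax.numpy as jnp

def project_ball(z, y, eps):
    """Closed-form projection of z onto the set {v : ||v - y|| <= eps}."""
    d = z - y
    n = jnp.linalg.norm(d)
    return y + d * jnp.minimum(1.0, eps / jnp.maximum(n, 1e-12))

def steepest_descent_step(A, x, r):
    """One adjoint-free step on f(x) = 0.5 * ||A(x) - r||^2.

    The gradient comes from a single VJP; the step size is the exact
    minimizer of the local Gauss-Newton quadratic along -g, obtained
    from a single JVP (no hand-coded adjoint of A is needed).
    """
    res, vjp_fn = jax.vjp(A, x)            # res = A(x); vjp_fn applies J^T
    g = vjp_fn(res - r)[0]                 # g = J^T (A(x) - r), one VJP
    _, Jg = jax.jvp(A, (x,), (g,))         # Jg = J g, one JVP
    alpha = jnp.vdot(g, g) / jnp.maximum(jnp.vdot(Jg, Jg), 1e-12)
    return x - alpha * g

# Toy check with a linear operator, where this step is an exact line search:
A = lambda x: jnp.array([[2.0, 0.0], [0.0, 0.5]]) @ x
x_new = steepest_descent_step(A, jnp.zeros(2), jnp.array([1.0, 1.0]))
```

For a nonlinear $\mathcal{A}$, the JVP can be replaced by the forward-difference surrogate mentioned in the abstract, and backtracking can be layered on top of `alpha` to guarantee descent.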