Learning Energy-Based Generative Models via Coarse-to-Fine Expanding and Sampling

Yang Zhao · Jianwen Xie · Ping Li

Keywords: [ image translation ] [ energy-based model ] [ generative model ]

Abstract: Energy-based models (EBMs) parametrized by neural networks can be trained by Markov chain Monte Carlo (MCMC) sampling-based maximum likelihood estimation. Despite the recent significant success of EBMs in data generation, current approaches to training EBMs can be unstable and sometimes struggle to synthesize diverse, high-fidelity images. In this paper, we propose to train EBMs via a multistage coarse-to-fine expanding and sampling strategy, which we call CF-EBM. To improve the learning procedure, we propose an effective network architecture and advocate the use of smooth activations. The resulting approach is computationally efficient and achieves the best image-generation performance among EBMs and the spectral normalization GAN. Furthermore, we provide a recipe that makes CF-EBM the first EBM to successfully synthesize $512\times512$-pixel images, and it also improves out-of-distribution detection. Finally, we generalize CF-EBM to one-sided unsupervised image-to-image translation, outperforming baseline methods while greatly reducing model size and training budget. In addition, we present a gradient-based discriminative saliency method to interpret the translation dynamics, which aligns with human behavior.
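As background for the training procedure the abstract refers to, the following is a minimal sketch of MCMC sampling-based maximum likelihood estimation for an EBM: draw approximate model samples with Langevin dynamics, then update the parameters by contrasting the energy gradient on data against model samples. Everything here is an illustrative assumption — a toy one-parameter Gaussian energy `E(x) = 0.5 * theta * ||x||^2` stands in for the paper's neural-network energy, and none of the names come from CF-EBM.

```python
import numpy as np

def grad_energy_x(x, theta):
    # dE/dx for the toy energy E(x) = 0.5 * theta * ||x||^2
    return theta * x

def langevin_sample(x0, theta, step=0.01, n_steps=200, rng=None):
    """Approximate a sample from p(x) ∝ exp(-E(x)) via Langevin dynamics."""
    rng = rng if rng is not None else np.random.default_rng(0)
    x = x0.copy()
    for _ in range(n_steps):
        noise = rng.normal(size=x.shape)
        x = x - step * grad_energy_x(x, theta) + np.sqrt(2.0 * step) * noise
    return x

def mle_update(theta, data, lr=0.1, rng=None):
    """One maximum-likelihood gradient-ascent step on theta.

    d/dtheta log p(x) = -dE/dtheta + E_model[dE/dtheta], so the update
    contrasts dE/dtheta (= 0.5 * ||x||^2 here) under the model against
    the data: too-flat energies get sharpened, too-sharp ones relaxed.
    """
    rng = rng if rng is not None else np.random.default_rng(1)
    samples = langevin_sample(rng.normal(size=data.shape), theta, rng=rng)
    g = 0.5 * (samples ** 2).mean() - 0.5 * (data ** 2).mean()
    return theta + lr * g
```

The coarse-to-fine strategy in the paper changes *where* this loop runs (growing resolution over stages), but the inner alternation of short-run MCMC sampling and a contrastive gradient step is the standard EBM training recipe sketched above.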
