

Poster

Optimal Transport Maps For Distribution Preserving Operations on Latent Spaces of Generative Models

Eirikur Agustsson · Alexander Sage · Radu Timofte · Luc Van Gool

Great Hall BC #17

Keywords: [ distribution preserving operations ] [ optimal transport ] [ generative models ]


Abstract:

Generative models such as Variational Auto-Encoders (VAEs) and Generative Adversarial Networks (GANs) are typically trained for a fixed prior distribution in the latent space, such as uniform or Gaussian. After a trained model is obtained, one can sample the generator in various ways for exploration and understanding, such as interpolating between two samples, sampling in the vicinity of a sample, or exploring differences between a pair of samples applied to a third sample. However, the latent space operations commonly used in the literature so far induce a distribution mismatch between the resulting outputs and the prior distribution the model was trained on. Previous works have attempted to reduce this mismatch with heuristic modifications to the operations or by changing the latent distribution and re-training the models. In this paper, we propose a framework for modifying the latent space operations such that the distribution mismatch is fully eliminated. Our approach is based on optimal transport maps, which adapt the latent space operations such that they fully match the prior distribution, while minimally modifying the original operation. Our matched operations are readily obtained for the commonly used operations and distributions and require no adjustment to the training procedure.
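The following is a minimal sketch (not the authors' code) of the mismatch the abstract describes and of a matched operation, assuming a standard Gaussian prior, an illustrative latent dimensionality `d = 512`, and a midpoint interpolation `t = 0.5`. For a Gaussian prior the linear interpolant is itself Gaussian with a smaller variance, so the transport map back to the prior reduces to a per-coordinate rescaling; for other priors and operations the paper derives matched operations from optimal transport maps more generally.

```python
import numpy as np

# Minimal sketch: linear interpolation under a N(0, I) prior does not
# match the prior, and a simple transport (rescaling) restores the match.

rng = np.random.default_rng(0)
d = 512          # latent dimensionality (assumed for illustration)
n = 10_000       # number of sample pairs
t = 0.5          # interpolation coefficient (midpoint)

z1 = rng.standard_normal((n, d))
z2 = rng.standard_normal((n, d))

# Linear interpolation: distributed as N(0, ((1-t)^2 + t^2) I),
# i.e. it does NOT match the N(0, I) prior the generator was trained on.
y_linear = (1 - t) * z1 + t * z2

# Matched interpolation: transport y_linear back to the prior.
# For a Gaussian prior the optimal transport map is a per-coordinate
# rescaling by 1 / sqrt((1-t)^2 + t^2).
scale = np.sqrt((1 - t) ** 2 + t ** 2)
y_matched = y_linear / scale

print("mean squared norm, prior samples :", np.mean(np.sum(z1 ** 2, axis=1)))
print("mean squared norm, linear interp :", np.mean(np.sum(y_linear ** 2, axis=1)))
print("mean squared norm, matched interp:", np.mean(np.sum(y_matched ** 2, axis=1)))
```

Running the sketch shows the linear midpoint concentrating at roughly half the prior's expected squared norm, while the matched interpolation recovers the prior statistics, which is the mismatch and correction the abstract refers to.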
