Poster
Convex Potential Flows: Universal Probability Distributions with Optimal Transport and Convex Optimization
Chin-Wei Huang · Tian Qi Chen · Christos Tsirigotis · Aaron Courville
Keywords: [ generative models ] [ variational inference ] [ optimal transport ] [ convex optimization ] [ invertible neural networks ] [ normalizing flows ] [ universal approximation ]
Flow-based models are powerful tools for designing probabilistic models with tractable density. This paper introduces Convex Potential Flows (CP-Flow), a natural and efficient parameterization of invertible models inspired by optimal transport (OT) theory. A CP-Flow is the gradient map of a strongly convex neural potential function. Convexity implies invertibility and lets us compute the inverse by solving the convex conjugate problem with standard convex optimization. To enable maximum likelihood training, we derive a new gradient estimator of the log-determinant of the Jacobian, which involves solving an inverse-Hessian vector product with the conjugate gradient method. The estimator has constant memory cost and can be made effectively unbiased by tightening the error tolerance of the convex optimization routine. Theoretically, we prove that CP-Flows are universal density approximators and are optimal in the OT sense. Empirically, CP-Flow performs competitively on standard benchmarks for density estimation and variational inference.
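The core ideas in the abstract can be sketched in a few lines of NumPy. This is a toy illustration, not the paper's implementation: the quadratic-plus-logsumexp potential, the Newton-based inversion, and the dense log-determinant below are simplified stand-ins for the ICNN potential, the scalable convex-optimization routine, and the stochastic conjugate-gradient estimator described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 3, 8
W = rng.standard_normal((m, d)) / np.sqrt(m)
b = rng.standard_normal(m)
alpha = 0.5  # strong-convexity weight (hypothetical choice)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def potential(x):
    # F(x) = (alpha/2)||x||^2 + logsumexp(Wx + b): strongly convex in x,
    # since logsumexp of an affine map is convex.
    z = W @ x + b
    return 0.5 * alpha * x @ x + np.log(np.exp(z - z.max()).sum()) + z.max()

def flow(x):
    # The flow maps x through the gradient of the potential: f(x) = ∇F(x).
    return alpha * x + W.T @ softmax(W @ x + b)

def hessian(x):
    # ∇²F(x) = alpha·I + Wᵀ(diag(s) - s sᵀ)W, positive definite by strong convexity.
    s = softmax(W @ x + b)
    return alpha * np.eye(d) + W.T @ (np.diag(s) - np.outer(s, s)) @ W

def inverse(y, iters=50):
    # Inversion amounts to solving the convex conjugate problem
    #   x* = argmin_x F(x) - y·x,   whose optimum satisfies ∇F(x*) = y.
    # Newton's method suffices at toy scale; a scalable convex solver
    # would replace it in practice.
    x = np.zeros(d)
    for _ in range(iters):
        x = x - np.linalg.solve(hessian(x), flow(x) - y)
    return x

x = rng.standard_normal(d)
y = flow(x)
x_rec = inverse(y)

# The log-density change is log|det ∇f(x)| = log det ∇²F(x) (positive definite,
# so the sign is +1). At scale, the paper estimates the *gradient* of this term
# stochastically via inverse-Hessian vector products instead of forming ∇²F.
sign, logdet = np.linalg.slogdet(hessian(x))
print(np.allclose(x, x_rec), sign == 1.0)
```

The round-trip check `flow(inverse(y)) = y` is exactly the invertibility that convexity guarantees: the conjugate problem is smooth and strongly convex, so the inverse is the unique minimizer.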