Learning Continuous Normalizing Flows For Faster Convergence To Target Distribution via Ascent Regularizations
Shuangshuang Chen · Sihao Ding · Yiannis Karayiannidis · Mårten Björkman
Keywords:
unbiased sampling
density estimation
variational inference
gradient flows
normalizing flows
Probabilistic Methods
Abstract
Normalizing flows (NFs) have been shown to be advantageous for modeling complex distributions and improving sampling efficiency in unbiased sampling. In this work, we propose a new class of continuous NFs, ascent continuous normalizing flows (ACNFs), which drive a base distribution toward a target distribution faster. Since solving such a flow exactly is intractable, we propose a practical implementation that learns flexible parametric ACNFs via ascent regularization and apply it in two learning settings: maximum likelihood learning for density estimation, and minimizing the reverse KL divergence for unbiased sampling and variational inference. The learned ACNFs converge faster toward the target distributions and therefore achieve better density estimation, unbiased sampling, and variational approximation at lower computational cost. Furthermore, the flows stabilize themselves, mitigating performance deterioration, and are less sensitive to the choice of training flow length $T$.
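To make the reverse-KL training setting concrete, the sketch below shows a minimal continuous normalizing flow pushed forward from a standard Gaussian base and trained against a known unnormalized target, with a hypothetical "ascent" penalty that rewards velocity fields aligned with the gradient of the target log-density. The specific regularizer, the weight `lambda_ascent`, the forward-Euler integrator, and the two-mode Gaussian target are illustrative assumptions rather than the paper's exact formulation.

```python
# Minimal sketch (assumptions noted below), not the authors' implementation.
import math
import torch
import torch.nn as nn

dim, T, n_steps = 2, 1.0, 20          # flow length T and Euler steps (assumed)
dt = T / n_steps

class Velocity(nn.Module):
    """Time-dependent velocity field f_theta(x, t) of the CNF."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, dim),
        )
    def forward(self, x, t):
        t_col = torch.full_like(x[:, :1], t)
        return self.net(torch.cat([x, t_col], dim=1))

def log_target(x):
    """Unnormalized log-density of a toy two-mode Gaussian mixture (assumed target)."""
    mu = torch.tensor([[2.0, 0.0], [-2.0, 0.0]])
    d2 = ((x.unsqueeze(1) - mu) ** 2).sum(-1)          # (batch, 2)
    return torch.logsumexp(-0.5 * d2, dim=1)

def divergence(f, x):
    """Exact trace of the Jacobian df/dx (fine in low dimensions)."""
    div = torch.zeros(x.shape[0], device=x.device)
    for i in range(x.shape[1]):
        div += torch.autograd.grad(f[:, i].sum(), x, create_graph=True)[0][:, i]
    return div

vel = Velocity(dim)
opt = torch.optim.Adam(vel.parameters(), lr=1e-3)
lambda_ascent = 0.1                                     # regularization weight (assumed)

for step in range(200):
    x = torch.randn(256, dim, requires_grad=True)       # base samples ~ N(0, I)
    logp = -0.5 * (x ** 2).sum(1) - 0.5 * dim * math.log(2 * math.pi)
    ascent_penalty = torch.zeros(())

    for k in range(n_steps):                            # forward-Euler integration of the CNF
        f = vel(x, k * dt)
        logp = logp - divergence(f, x) * dt             # instantaneous change of variables

        # Hypothetical ascent regularizer: reward velocity components that point
        # along the ascent direction of the target log-density.
        grad_logpi = torch.autograd.grad(log_target(x).sum(), x, create_graph=True)[0]
        ascent_penalty = ascent_penalty - (f * grad_logpi).sum(1).mean() * dt

        x = x + f * dt

    reverse_kl = (logp - log_target(x)).mean()          # E_q[log q - log pi], up to a constant
    loss = reverse_kl + lambda_ascent * ascent_penalty
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The same structure would carry over to the density-estimation case by replacing the reverse-KL term with the negative log-likelihood of data samples pulled back through the flow.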