

Poster

Fast and unified path gradient estimators for normalizing flows

Lorenz Vaitl · Ludwig Winkler · Lorenz Richter · Pan Kessel

Halle B #217

Abstract:

Recent work shows that path gradient estimators for normalizing flows have lower variance than standard estimators, resulting in improved training. However, they are often prohibitively expensive computationally and cannot be applied to maximum likelihood training in a scalable manner, which severely hinders their widespread adoption. In this work, we overcome these crucial limitations. Specifically, we propose a fast path gradient estimator that works for all normalizing flow architectures of practical relevance for sampling from an unnormalized target distribution. We then show that this estimator can also be applied to maximum likelihood training and empirically establish its superior performance for several natural-science applications.
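To make the setting concrete, below is a minimal sketch of the standard (non-fast) path gradient for reverse-KL training of a normalizing flow, i.e. the stop-gradient construction from prior work that the paper improves upon, not the fast unified estimator proposed here. The `sample_with_log_prob` and `log_prob` interface and the `log_p_target` callable are assumptions for illustration only.

```python
import copy
import torch

def path_gradient_reverse_kl_loss(flow, log_p_target, batch_size):
    """Reverse-KL surrogate loss whose gradient is the path gradient.

    Assumes a hypothetical flow interface:
      flow.sample_with_log_prob(n) -> (x, log_q) with x = T_theta(z)
        differentiable w.r.t. theta via reparameterization,
      flow.log_prob(x) -> log q_theta(x).
    `log_p_target` evaluates the (unnormalized) target log density.
    """
    # Draw reparameterized samples x = T_theta(z); keep the path
    # dependence of x on the flow parameters theta.
    x, _ = flow.sample_with_log_prob(batch_size)

    # Evaluate log q_theta(x) with a detached copy of the parameters, so
    # only the dependence through the sample x contributes to the gradient.
    # This drops the score term, which has zero mean but adds variance.
    frozen_flow = copy.deepcopy(flow)
    for p in frozen_flow.parameters():
        p.requires_grad_(False)
    log_q = frozen_flow.log_prob(x)

    # Reverse KL up to the unknown normalization constant of the target.
    return (log_q - log_p_target(x)).mean()
```

Note that the parameter copy and the extra density evaluation illustrate why the naive path gradient is costly in practice, which is the computational overhead the paper's fast estimator is designed to avoid.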
