

Poster

SUMO: Unbiased Estimation of Log Marginal Probability for Latent Variable Models

David Duvenaud · Yucen Luo · Ryan P Adams · Mohammad Norouzi · Tian Qi Chen · Alex Beatson · Jun Zhu


Abstract:

Standard variational lower bounds used to train latent variable models produce biased estimates of most quantities of interest. We introduce an unbiased estimator of the log marginal likelihood and its gradients for latent variable models, based on randomized truncation of infinite series. When parameterized by an encoder-decoder architecture, the parameters of the encoder can be optimized to minimize the variance of this estimator. We show that models trained using our estimator give better test-set likelihoods than a standard importance-sampling-based approach for the same average computational cost. This estimator also allows the use of latent variable models for tasks where unbiased estimators, rather than marginal likelihood lower bounds, are preferred, such as minimizing reverse KL divergences and estimating score functions.
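The core idea behind randomized truncation is the "Russian roulette" trick: to estimate an infinite series without summing it all, draw a random truncation point and reweight each retained term by the inverse probability that the truncation reached it. The sketch below illustrates this generic trick on a toy geometric series; it is not the paper's SUMO estimator itself (which applies the same idea to a sequence of increasingly tight importance-weighted log-likelihood bounds), and the function names and stopping rule here are illustrative assumptions.

```python
import math
import random

def russian_roulette_estimate(delta, p_continue=0.8, rng=random):
    """Unbiased estimate of sum_{k=0}^inf delta(k) via randomized truncation.

    After emitting term k, we continue to term k+1 with probability
    p_continue, so P(truncation index >= k) = p_continue**k, and each
    term delta(k) is reweighted by 1 / p_continue**k. Taking the
    expectation over the random stopping point recovers the full series,
    which is what makes the estimator unbiased.
    """
    total = 0.0
    k = 0
    survival = 1.0  # P(truncation index >= k)
    while True:
        total += delta(k) / survival
        if rng.random() > p_continue:  # stop with probability 1 - p_continue
            return total
        survival *= p_continue
        k += 1

# Toy example (not from the paper): delta(k) = 0.5**k, whose infinite
# sum is 1 / (1 - 0.5) = 2. Terms must shrink faster than the survival
# probability for the estimator's variance to stay finite.
rng = random.Random(0)
n = 100_000
est = sum(
    russian_roulette_estimate(lambda k: 0.5 ** k, rng=rng) for _ in range(n)
) / n
```

Each call touches only a few terms on average (here, 1 / (1 - 0.8) = 5), yet the mean over many calls converges to the exact infinite sum; the trade-off is extra variance from the random truncation, which is what the paper's learned encoder is tuned to reduce.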