

Test Of Time

Auto-Encoding Variational Bayes

Diederik (Durk) Kingma · Max Welling

Halle A 8 - 9
Wed 8 May 5:15 a.m. PDT — 5:45 a.m. PDT

Abstract:

Probabilistic modeling is one of the most fundamental ways in which we reason about the world. This paper spearheaded the integration of deep learning with scalable probabilistic inference (amortized mean-field variational inference via a so-called reparameterization trick), giving rise to the Variational Autoencoder (VAE). The lasting value of this work is rooted in its elegance. The principles used to develop VAEs deepened our understanding of the interplay between deep learning and probabilistic modeling, and sparked the development of many subsequent interesting probabilistic models and encoding approaches.
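Not part of the session page itself, but for readers unfamiliar with the technique the abstract names: below is a minimal sketch of the reparameterization trick and the evidence lower bound (ELBO) for a Gaussian-latent VAE, written in PyTorch. The model class, layer sizes, and data are illustrative assumptions, not the paper's original architecture.

```python
# Minimal illustrative VAE sketch (assumed architecture, not from the paper).
import torch
import torch.nn as nn

class TinyVAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=20, h_dim=200):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.Tanh())
        self.mu = nn.Linear(h_dim, z_dim)       # mean of q(z|x)
        self.logvar = nn.Linear(h_dim, z_dim)   # log-variance of q(z|x)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.Tanh(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I),
        # so the sample stays differentiable w.r.t. the encoder parameters.
        eps = torch.randn_like(mu)
        z = mu + torch.exp(0.5 * logvar) * eps
        logits = self.dec(z)
        # Negative ELBO = reconstruction term + analytic KL(q(z|x) || N(0, I)).
        recon = nn.functional.binary_cross_entropy_with_logits(
            logits, x, reduction="sum")
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return (recon + kl) / x.shape[0]

# Example: one gradient step on random data standing in for binarized images.
model = TinyVAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(32, 784)
loss = model(x)
loss.backward()
opt.step()
```

The key point the sketch illustrates is that sampling noise from a fixed standard normal and transforming it with the encoder's mean and scale lets gradients of the Monte Carlo ELBO estimate flow through the sampling step, which is what makes amortized variational inference trainable with standard backpropagation.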
