

Poster

Preventing Posterior Collapse with delta-VAEs

Ali Razavi · Aaron van den Oord · Ben Poole · Oriol Vinyals

Great Hall BC #3

Keywords: [ autoregressive models ] [ posterior collapse ] [ vae ]


Abstract:

Due to the phenomenon of “posterior collapse,” current latent variable generative models pose a challenging design choice that either weakens the capacity of the decoder or requires altering the training objective. We develop an alternative that utilizes the most powerful generative models as decoders, optimizes the variational lower bound, and ensures that the latent variables preserve and encode useful information. Our proposed δ-VAEs achieve this by constraining the variational family for the posterior to have a minimum distance to the prior. For sequential latent variable models, our approach resembles the classic representation learning approach of slow feature analysis. We demonstrate our method’s efficacy at modeling text on LM1B and modeling images: learning representations, improving sample quality, and achieving state-of-the-art log-likelihood on CIFAR-10 and ImageNet 32 × 32.
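The core mechanism described in the abstract — constraining the variational family so the posterior keeps a minimum KL divergence (a "committed rate" δ) from the prior — can be sketched for a diagonal-Gaussian posterior against a standard-normal prior. This is a minimal illustrative sketch, not the paper's implementation; the bound values `sigma_low`, `sigma_high` (and the δ they induce) are assumptions chosen for demonstration:

```python
import numpy as np

def gaussian_kl(mu, sigma):
    # KL( N(mu, sigma^2) || N(0, 1) ), elementwise.
    return 0.5 * (mu**2 + sigma**2 - 1.0 - np.log(sigma**2))

def delta_constrained_sigma(raw, sigma_low=0.1, sigma_high=0.6):
    # Squash an unconstrained encoder output `raw` into
    # [sigma_low, sigma_high]. With both bounds below 1, the term
    # 0.5*(sigma^2 - 1 - ln sigma^2) is bounded away from 0, so the
    # KL stays >= delta for every mu -- the posterior cannot
    # collapse onto the prior.
    return sigma_low + (sigma_high - sigma_low) / (1.0 + np.exp(-raw))

# The guaranteed committed rate: KL is minimized over mu at mu = 0,
# and over sigma at the boundary closest to 1 (here sigma_high).
delta = gaussian_kl(0.0, 0.6)

# Check the guarantee over a grid of posterior parameters.
mus = np.linspace(-3.0, 3.0, 101)
sigmas = delta_constrained_sigma(np.linspace(-5.0, 5.0, 101))
kls = gaussian_kl(mus[:, None], sigmas[None, :])
assert kls.min() >= delta - 1e-9  # every posterior pays at least delta nats
```

The choice of bounds trades off the guaranteed rate against posterior flexibility: bounds further from 1 force a larger δ but restrict how closely the posterior can track the prior. (The paper's sequential variant instead uses a correlated auto-regressive prior; the sketch above corresponds only to the simple independent-Gaussian case.)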
