Poster

Kernel Implicit Variational Inference

Jiaxin Shi · Shengyang Sun · Jun Zhu

East Meeting level; 1,2,3 #1

Abstract:

Recent progress in variational inference has paid much attention to the flexibility of variational posteriors. One promising direction is to use implicit distributions, i.e., distributions without tractable densities, as the variational posterior. However, existing methods for implicit posteriors still face the challenges of noisy estimation and computational infeasibility when applied to models with high-dimensional latent variables. In this paper, we present a new approach named Kernel Implicit Variational Inference that addresses these challenges. To the best of our knowledge, this is the first time implicit variational inference has been successfully applied to Bayesian neural networks, showing promising results on both regression and classification tasks.
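The central difficulty with an implicit posterior is that its density cannot be evaluated, so the KL term in the variational objective must be estimated from samples alone. A minimal sketch of this idea, using a closed-form least-squares density-ratio estimator (uLSIF) as a stand-in for the paper's kernel-based estimator, with a hypothetical implicit sampler `sample_q`:

```python
import numpy as np

rng = np.random.default_rng(0)

# Implicit variational posterior: sampling is easy, the density is intractable.
# Stand-in sampler: push Gaussian noise through a fixed nonlinear map.
def sample_q(n, d=2):
    eps = rng.normal(size=(n, d))
    return np.tanh(eps @ np.array([[1.5, 0.3], [0.3, 1.5]])) + 0.5 * eps

def sample_p(n, d=2):  # tractable prior N(0, I)
    return rng.normal(size=(n, d))

def rbf(X, C, sigma):
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

# Closed-form least-squares fit of the ratio r(z) ~ q(z)/p(z)
# as a kernel expansion over centres drawn from q-samples.
def fit_ratio(zq, zp, sigma=1.0, lam=1e-3):
    C = zq[:50]                       # kernel centres
    Phi_p = rbf(zp, C, sigma)
    H = Phi_p.T @ Phi_p / len(zp)     # ~ E_p[k(z) k(z)^T]
    h = rbf(zq, C, sigma).mean(0)     # ~ E_q[k(z)]
    alpha = np.linalg.solve(H + lam * np.eye(len(C)), h)
    return lambda z: rbf(z, C, sigma) @ alpha

zq, zp = sample_q(500), sample_p(500)
ratio = fit_ratio(zq, zp)
# Sample-based KL(q || p) estimate: E_q[log r(z)], clipped to keep the log finite.
kl_est = np.mean(np.log(np.clip(ratio(zq), 1e-6, None)))
print(f"estimated KL(q || p): {kl_est:.3f}")
```

The same ratio estimate can be plugged into the evidence lower bound in place of the intractable log-density term; the paper's contribution lies in making this estimation stable and scalable enough for high-dimensional latent variables such as Bayesian neural network weights.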
