Poster
ELBOing Stein: Variational Bayes with Stein Mixture Inference
Ola Rønning · Eric Nalisnick · Christophe Ley · Padhraic Smyth · Thomas Hamelryck
Hall 3 + Hall 2B #397
Stein variational gradient descent (SVGD) (Liu & Wang, 2016) performs approximate Bayesian inference by representing the posterior with a set of particles. However, SVGD suffers from variance collapse, i.e. poor predictions due to underestimating uncertainty (Ba et al., 2021), even for moderately dimensional models such as small Bayesian neural networks (BNNs). To address this issue, we generalize SVGD by letting each particle parameterize a component distribution in a mixture model. Our method, Stein Mixture Inference (SMI), optimizes a lower bound to the evidence (ELBO) and introduces user-specified guides parameterized by particles. SMI extends the Nonlinear SVGD framework (Wang & Liu, 2019) to the case of variational Bayes. SMI effectively avoids variance collapse, judging by a previously described test developed for this purpose, and performs well on standard data sets. In addition, SMI requires considerably fewer particles than SVGD to accurately estimate uncertainty for small BNNs. The synergistic combination of NSVGD, ELBO optimization and user-specified guides establishes a promising approach towards variational Bayesian inference in the case of tall and wide data.
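For intuition, here is a minimal, illustrative JAX sketch of an SMI-style update on a toy one-dimensional Gaussian model. It is not the authors' implementation: the toy model, the function names (log_joint, elbo, smi_step), the fixed kernel bandwidth, and the step size are all illustrative assumptions. What it shows is the core idea of the abstract: each particle holds the parameters (mu, log sigma) of one Gaussian guide component, the usual SVGD driving force grad log p is replaced by each particle's reparameterized ELBO gradient, and the RBF kernel supplies the SVGD-style attraction and repulsion between particles so the mixture components do not collapse onto each other.

import jax
import jax.numpy as jnp

# Toy data: y ~ N(theta, 1) with prior theta ~ N(0, 1).
key = jax.random.PRNGKey(0)
y = jax.random.normal(key, (20,)) + 2.0

def log_joint(theta):
    # Unnormalized log p(theta, y), up to additive constants.
    log_prior = -0.5 * theta**2
    log_lik = -0.5 * jnp.sum((y - theta) ** 2)
    return log_prior + log_lik

def elbo(params, eps):
    # One-sample reparameterized ELBO for a Gaussian guide N(mu, sigma^2).
    mu, log_sigma = params
    theta = mu + jnp.exp(log_sigma) * eps
    # Gaussian entropy: 0.5 * log(2*pi*e) + log_sigma.
    entropy = 0.5 * jnp.log(2 * jnp.pi * jnp.e) + log_sigma
    return log_joint(theta) + entropy

def rbf_kernel(x, ys):
    # RBF kernel with a fixed bandwidth (illustrative; a median
    # heuristic is the common adaptive choice).
    return jnp.exp(-jnp.sum((x - ys) ** 2, axis=-1) / 2.0)

@jax.jit
def smi_step(particles, eps, lr=1e-2):
    # particles: (m, 2) array; each row is (mu, log_sigma) of one component.
    # Driving force: per-particle ELBO gradients instead of grad log p.
    grads = jax.vmap(jax.grad(elbo))(particles, eps)

    def update(xi):
        k = rbf_kernel(xi, particles)  # attraction weights toward high-ELBO particles
        rep = jax.grad(lambda x: jnp.sum(rbf_kernel(x, particles)))(xi)
        # Kernel-weighted ELBO ascent plus repulsion away from neighbors.
        return (k[:, None] * grads).mean(0) - rep / particles.shape[0]

    return particles + lr * jax.vmap(update)(particles)

m = 5  # number of particles, i.e. mixture components
particles = jax.random.normal(jax.random.PRNGKey(1), (m, 2)) * 0.1
for t in range(2000):
    key, sub = jax.random.split(key)
    eps = jax.random.normal(sub, (m,))
    particles = smi_step(particles, eps)

print("component means:", particles[:, 0])        # cluster near the posterior mean
print("component sigmas:", jnp.exp(particles[:, 1]))

In the method proper, the attraction and repulsion terms follow the Nonlinear SVGD functional, guides need not be Gaussian, and several ELBO samples per particle can be used; the sketch fixes all of these for brevity.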