Poster
Scalable Bayesian Learning with posteriors
Samuel Duffield · Kaelan Donatella · Johnathan Chiu · Phoebe Klett · Daniel Simpson
Hall 3 + Hall 2B #419
Although theoretically compelling, Bayesian learning with modern machine learning models is computationally challenging since it requires approximating a high-dimensional posterior distribution. In this work, we (i) introduce posteriors, an easily extensible PyTorch library hosting general-purpose implementations that make Bayesian learning accessible and scalable to large data and parameter regimes; (ii) present a tempered framing of stochastic gradient Markov chain Monte Carlo, as implemented in posteriors, that transitions seamlessly into optimization and unveils a minor modification to deep ensembles to ensure they are asymptotically unbiased for the Bayesian posterior; and (iii) demonstrate and compare the utility of Bayesian approximations through experiments, including an investigation into the cold posterior effect and applications with large language models.

posteriors repository: https://github.com/normal-computing/posteriors
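As a rough illustration of the functional style the abstract describes, the sketch below runs a few steps of stochastic-gradient Langevin dynamics with posteriors. It assumes the build/init/update transform interface shown in the repository README; the toy model, log-posterior, and hyperparameters are placeholders, and exact signatures may differ across library versions.

    # Minimal sketch of the posteriors API (build -> init -> update), assuming
    # the interface from the repository README; details may vary by version.
    import torch
    import posteriors

    # Toy model and parameters; placeholders for illustration only.
    model = torch.nn.Linear(10, 1)
    params = dict(model.named_parameters())

    def log_posterior(params, batch):
        # Unnormalized log posterior = log likelihood + log prior.
        x, y = batch
        pred = torch.func.functional_call(model, params, (x,))
        log_lik = torch.distributions.Normal(pred, 1.0).log_prob(y).sum()
        log_prior = sum(
            torch.distributions.Normal(0.0, 1.0).log_prob(p).sum()
            for p in params.values()
        )
        # posteriors expects (value, aux) from the log-posterior function.
        return log_lik + log_prior, pred

    # Build an SGLD transform; in the paper's tempered framing,
    # temperature=1 targets the Bayesian posterior, while temperature -> 0
    # collapses the sampler into a standard optimizer.
    transform = posteriors.sgmcmc.sgld.build(
        log_posterior, lr=1e-3, temperature=1.0
    )

    state = transform.init(params)
    for _ in range(100):
        batch = (torch.randn(32, 10), torch.randn(32, 1))  # dummy data
        state = transform.update(state, batch)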