Poster
Bayesian Neural Network Priors Revisited
Vincent Fortuin · Adrià Garriga-Alonso · Sebastian Ober · Florian Wenzel · Gunnar Rätsch · Richard E. Turner · Mark van der Wilk · Laurence Aitchison
Keywords: [ bayesian neural networks ] [ bayesian deep learning ]
Isotropic Gaussian priors are the de facto standard for modern Bayesian neural network inference. However, it is unclear whether these priors accurately reflect our true beliefs about the weight distributions or yield optimal performance. To find better priors, we study summary statistics of neural network weights in networks trained using stochastic gradient descent (SGD). We find that convolutional neural network (CNN) and ResNet weights display strong spatial correlations, while fully connected neural networks (FCNNs) display heavy-tailed weight distributions. We show that building these observations into priors can lead to improved performance on a variety of image classification datasets. Surprisingly, these priors mitigate the cold posterior effect in FCNNs, but slightly increase the cold posterior effect in ResNets.
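The heavy-tailed finding can be checked with a simple summary statistic: excess kurtosis, which is 0 for a Gaussian and positive for heavier-tailed distributions. The sketch below is illustrative only and does not use real trained weights; it substitutes a Student-t sample (a hypothetical stand-in for SGD-trained FCNN weights) to show how such a diagnostic behaves.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Stand-ins for weight samples (assumption: real SGD-trained weights
# are not available in this sketch, so we draw synthetic samples).
gaussian_w = rng.normal(size=100_000)          # isotropic Gaussian prior
heavy_w = rng.standard_t(df=5, size=100_000)   # heavy-tailed alternative

# Excess kurtosis: ~0 for Gaussian samples, clearly positive for
# heavy-tailed samples (as the abstract reports for FCNN weights).
k_gauss = stats.kurtosis(gaussian_w)
k_heavy = stats.kurtosis(heavy_w)
print(f"Gaussian excess kurtosis:     {k_gauss:.3f}")
print(f"Heavy-tailed excess kurtosis: {k_heavy:.3f}")
```

Applied to actual trained-network weights, a strongly positive value would motivate replacing the isotropic Gaussian prior with a heavier-tailed family such as a Student-t or Laplace.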