arXiv Analytics

arXiv:2008.05912 [stat.ML]

A statistical theory of cold posteriors in deep neural networks

Laurence Aitchison

Published 2020-08-13 (Version 1)

To get Bayesian neural networks (BNNs) to perform comparably to standard neural networks, it is usually necessary to artificially reduce uncertainty using a "tempered" or "cold" posterior. This is extremely concerning: if the prior is accurate, Bayesian inference/decision theory is optimal, and any artificial change to the posterior should harm performance. While this suggests that the prior may be at fault, here we argue that, in fact, BNNs for image classification use the wrong likelihood. In particular, standard image benchmark datasets such as CIFAR-10 are carefully curated. We develop a generative model describing curation which gives a principled Bayesian account of cold posteriors, because the likelihood under this new generative model closely matches the tempered likelihoods used in past work.
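For context, the "cold posterior" referred to here is usually defined by rescaling the log-posterior with a temperature T < 1; the LaTeX sketch below states this standard definition from the cold-posterior literature (it is assumed background, not spelled out in the abstract itself):

% Standard "cold posterior" (assumed background, not stated in this abstract):
% the log-posterior is scaled by 1/T with temperature T < 1, which sharpens it around its modes.
\[
  p_T(\theta \mid \mathcal{D}) \;\propto\;
  \exp\!\left(\tfrac{1}{T}\,\log p(\theta \mid \mathcal{D})\right)
  \;=\; \bigl[p(\mathcal{D} \mid \theta)\,p(\theta)\bigr]^{1/T},
  \qquad T < 1 .
\]
% A "tempered" posterior instead raises only the likelihood term:
% p(\theta \mid \mathcal{D}) \propto p(\mathcal{D} \mid \theta)^{1/T} \, p(\theta).

The paper's claim is that a likelihood derived from a model of dataset curation closely matches these tempered likelihoods, so the temperature adjustment has a principled Bayesian interpretation rather than being an ad hoc fix.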

Related articles:
arXiv:2309.16314 [stat.ML] (Published 2023-09-28)
A Primer on Bayesian Neural Networks: Review and Debates
arXiv:2008.08400 [stat.ML] (Published 2020-08-19)
Improving predictions of Bayesian neural networks via local linearization
arXiv:1902.02603 [stat.ML] (Published 2019-02-07)
Radial and Directional Posteriors for Bayesian Neural Networks