arXiv:2207.08200 [stat.ML]

Uncertainty Calibration in Bayesian Neural Networks via Distance-Aware Priors

Gianluca Detommaso, Alberto Gasparin, Andrew Wilson, Cedric Archambeau

Published 2022-07-17 (Version 1)

As we move away from the training data, predictive uncertainty should increase, since a wide variety of explanations is consistent with the little available information. We introduce Distance-Aware Prior (DAP) calibration, a method to correct the overconfidence of Bayesian deep learning models outside of the training domain. We define DAPs as prior distributions over the model parameters that depend on the inputs through a measure of their distance from the training set. DAP calibration is agnostic to the posterior inference method, and it can be performed as a post-processing step. We demonstrate its effectiveness against several baselines on a variety of classification and regression problems, including benchmarks designed to test the quality of predictive distributions away from the data.
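The core idea of a distance-aware prior can be illustrated with a minimal sketch. The paper does not specify this exact construction; here we assume, purely for illustration, a prior scale that grows linearly with the Euclidean distance from an input to its nearest training point, so the prior over the parameters widens (and predictive uncertainty grows) off-distribution. The function names and the linear `base_std + scale * d` form are hypothetical choices, not the authors' method:

```python
import numpy as np

def min_distance_to_train(x, X_train):
    """Euclidean distance from input x to its nearest training point."""
    return np.min(np.linalg.norm(X_train - x, axis=1))

def dap_prior_std(x, X_train, base_std=1.0, scale=0.5):
    """Hypothetical distance-aware prior scale: at a training point the
    prior standard deviation equals base_std; it increases linearly with
    the distance of x from the training set, encoding that a greater
    variety of explanations is plausible far from the data."""
    d = min_distance_to_train(x, X_train)
    return base_std + scale * d

# Toy training set in 2D
X_train = np.array([[0.0, 0.0], [1.0, 0.0]])
print(dap_prior_std(np.array([0.0, 0.0]), X_train))  # on-distribution: base_std
print(dap_prior_std(np.array([5.0, 0.0]), X_train))  # far away: wider prior
```

Because the prior depends on the input only through a scalar distance, such a correction can be applied after posterior inference, which is consistent with the abstract's claim that DAP calibration works as a post-processing step.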

Related articles:
arXiv:2304.02595 [stat.ML] (Published 2023-04-02)
Bayesian neural networks via MCMC: a Python-based tutorial
arXiv:2401.00611 [stat.ML] (Published 2023-12-31)
A Compact Representation for Bayesian Neural Networks By Removing Permutation Symmetry
arXiv:1905.06076 [stat.ML] (Published 2019-05-15)
Expressive Priors in Bayesian Neural Networks: Kernel Combinations and Periodic Functions