arXiv Analytics

arXiv:1902.02603 [stat.ML]

Radial and Directional Posteriors for Bayesian Neural Networks

Changyong Oh, Kamil Adamczewski, Mijung Park

Published 2019-02-07, Version 1

We propose a new variational family for Bayesian neural networks. We decompose the variational posterior into two components: a radial component that captures the strength of each neuron in terms of its magnitude, and a directional component that captures the statistical dependencies among the weight parameters. The dependencies learned via the directional density yield better modeling performance than the widely used Gaussian mean-field variational family. In addition, the strengths of input and output neurons learned via the radial density provide a structured way to compress neural networks. Indeed, experiments show that our variational family improves predictive performance and yields compressed networks simultaneously.
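The decomposition at the heart of the abstract can be illustrated directly: each neuron's incoming weight vector factors into a radial magnitude (its "strength") and a unit direction. The sketch below is only an illustration of that factorization, not the paper's variational construction; the example weight values are hypothetical.

```python
import math

# Toy incoming weight vector of one neuron (hypothetical values).
w = [3.0, -4.0, 0.0]

# Radial component: the neuron's magnitude, its "strength".
r = math.sqrt(sum(x * x for x in w))

# Directional component: a unit vector on the sphere, encoding the
# relative dependencies among the neuron's weights.
d = [x / r for x in w]

# The factorization is exact: w == r * d, and d has unit norm.
assert all(math.isclose(r * di, wi) for di, wi in zip(d, w))
assert math.isclose(sum(x * x for x in d), 1.0)

print(r, d)  # 5.0 [0.6, -0.8, 0.0]
```

A small radial magnitude flags a weak neuron, which is why learning a density over these magnitudes gives a structured handle for compression, as the abstract notes.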

Related articles:
arXiv:2008.08044 [stat.ML] (Published 2020-08-18)
Bayesian neural networks and dimensionality reduction
arXiv:2501.11773 [stat.ML] (Published 2025-01-20)
Can Bayesian Neural Networks Make Confident Predictions?
arXiv:2309.16314 [stat.ML] (Published 2023-09-28)
A Primer on Bayesian Neural Networks: Review and Debates