arXiv Analytics

arXiv:2303.02859 [cond-mat.dis-nn]

Bayesian inference with finitely wide neural networks

Chi-Ken Lu

Published 2023-03-06, Version 1

Analytic inference, e.g. a predictive distribution available in closed form, is an appealing benefit for machine learning practitioners who treat wide neural networks as Gaussian processes in a Bayesian setting. Realistic widths, however, are finite and cause weak deviations from the Gaussianity under which partial marginalization of the random variables in a model is straightforward. On the basis of the multivariate Edgeworth expansion, we propose a non-Gaussian distribution in differential form to model a finite set of outputs from a random neural network, and we derive the corresponding marginal and conditional properties. We are thus able to derive the non-Gaussian posterior distribution in a Bayesian regression task. In addition, in bottlenecked deep neural networks, a weight-space representation of the deep Gaussian process, the non-Gaussianity is investigated through the marginal kernel.
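For orientation, the leading finite-width correction in a multivariate Edgeworth expansion around a Gaussian process prior can be sketched as follows; the notation (cumulant tensor \kappa, Hermite tensor h, width n) is a generic textbook form and is not necessarily the parametrization used in the paper:

    p(\mathbf{f}) \;\approx\; \mathcal{N}(\mathbf{f};\,\mathbf{0},\,K)\,\Big[\,1 + \tfrac{1}{4!}\,\kappa_{ijkl}\,h_{ijkl}(\mathbf{f})\,\Big],
    \qquad \kappa_{ijkl} = O(1/n),

where K is the limiting Gaussian-process kernel of the network, h_{ijkl} are the multivariate Hermite polynomials generated by differentiating the Gaussian density (consistent with the "differential form" mentioned in the abstract), and summation over repeated indices is implied. For a zero-mean prior with symmetrically distributed weights the odd cumulants vanish, so the first correction enters at the fourth cumulant and is suppressed by one power of the width n.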

Related articles:
arXiv:0901.1144 [cond-mat.dis-nn] (Published 2009-01-08, updated 2009-10-26)
Bayesian Inference Based on Stationary Fokker-Planck Sampling
arXiv:1911.06509 [cond-mat.dis-nn] (Published 2019-11-15)
Improved algorithm for neuronal ensemble inference by Monte Carlo method
arXiv:2309.17006 [cond-mat.dis-nn] (Published 2023-09-29)
Statistical physics, Bayesian inference and neural information processing