arXiv Analytics

arXiv:2205.00009 [cond-mat.stat-mech]

An intuition for physicists: information gain from experiments

Johannes Buchner

Published 2022-04-29Version 1

How much one has learned from an experiment is quantifiable by the information gain, also known as the Kullback-Leibler divergence. The narrowing of the posterior parameter distribution $P(\theta|D)$ relative to the prior parameter distribution $\pi(\theta)$ is quantified in units of bits as: $ D_{\mathrm{KL}}(P\|\pi)=\int\log_{2}\left(\frac{P(\theta|D)}{\pi(\theta)}\right)\,P(\theta|D)\,d\theta $. This research note gives an intuition for what one bit of information gain means: it corresponds to a Gaussian shrinking its standard deviation by a factor of three.
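The factor-of-three claim can be checked numerically with the closed-form KL divergence between two Gaussians. The sketch below is an assumption-laden illustration (not code from the paper): it takes a prior and posterior with the same mean, shrinks the posterior standard deviation by a factor of three, and evaluates the divergence in bits.

```python
import math

def gaussian_kl_bits(mu_p, sigma_p, mu_q, sigma_q):
    """D_KL(P || Q) for two 1-D Gaussians P and Q, in bits.

    Uses the standard closed form in nats, then divides by ln(2).
    """
    nats = (math.log(sigma_q / sigma_p)
            + (sigma_p**2 + (mu_p - mu_q)**2) / (2 * sigma_q**2)
            - 0.5)
    return nats / math.log(2)

# Posterior std is one third of the prior std, means unchanged
# (hypothetical unit-scale example):
gain = gaussian_kl_bits(0.0, 1.0 / 3.0, 0.0, 1.0)
print(f"{gain:.3f} bits")  # close to 1 bit
```

The result is about 0.94 bits, consistent with the note's statement that a threefold shrinkage of a Gaussian's standard deviation yields roughly one bit of information gain.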
