arXiv:2205.00009 [cond-mat.stat-mech]
An intuition for physicists: information gain from experiments
Published 2022-04-29, Version 1
How much one has learned from an experiment can be quantified by the information gain, also known as the Kullback-Leibler divergence. The narrowing of the posterior parameter distribution $P(\theta|D)$ relative to the prior parameter distribution $\pi(\theta)$ is quantified in units of bits as: $ D_{\mathrm{KL}}(P\,\|\,\pi)=\int\log_{2}\left(\frac{P(\theta|D)}{\pi(\theta)}\right)\,P(\theta|D)\,d\theta $. This research note gives an intuition for what one bit of information gain means: it corresponds to a Gaussian shrinking its standard deviation by a factor of three.
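To make the factor-of-three statement concrete, here is a minimal Python sketch (not from the note itself; the helper name kl_bits_gaussian is ours) that evaluates the closed-form KL divergence between two one-dimensional Gaussians in bits:

```python
import numpy as np

def kl_bits_gaussian(sigma_prior, sigma_post, mu_prior=0.0, mu_post=0.0):
    """Closed-form D_KL(P || pi) in bits, where the posterior is
    P = N(mu_post, sigma_post^2) and the prior is pi = N(mu_prior, sigma_prior^2)."""
    nats = (np.log(sigma_prior / sigma_post)
            + (sigma_post**2 + (mu_post - mu_prior)**2) / (2.0 * sigma_prior**2)
            - 0.5)
    return nats / np.log(2.0)  # convert from nats to bits

# Shrinking the standard deviation threefold (same mean) gives roughly one bit:
print(kl_bits_gaussian(sigma_prior=1.0, sigma_post=1.0 / 3.0))  # ~0.94 bits
```

With the means held fixed, a threefold narrowing yields about 0.94 bits, so "one bit of information gain" matches a factor-of-three shrinkage of the standard deviation to within a few percent.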
Comments: Accepted to RNAAS
Related articles:
- Random Ising model in three dimensions: theory, experiment and simulation - a difficult coexistence
- Martingales for Physicists, arXiv:2210.09983 [cond-mat.stat-mech] (Published 2022-10-18), by Édgar Roldán, Izaak Neri, Raphael Chetrite, Shamik Gupta, Simone Pigolotti, Frank Jülicher, Ken Sekimoto
- Hiking through glassy phases: physics beyond aging