arXiv:2208.06201 [cond-mat.stat-mech]

Equivalence of information production and generalized entropies in complex processes

Rudolf Hanel, Stefan Thurner

Published 2022-08-12 (Version 1)

Complex systems characterized by strong correlations and fat-tailed distribution functions have been argued to be incompatible with the framework of Boltzmann-Gibbs entropy. As an alternative, so-called generalized entropies were proposed and intensively studied. Here we show that this incompatibility is a misconception. For a broad class of processes, Boltzmann entropy, the logarithm of the multiplicity, remains the valid entropy concept; however, for non-i.i.d., non-multinomial, and non-ergodic processes, Boltzmann entropy is not of Shannon form. The correct form of Boltzmann entropy can be shown to be identical with generalized entropies. We derive this result for all processes that can be mapped reversibly to adjoint representations in which the processes are i.i.d. In these representations the information production is given by the Shannon entropy. We prove that over the original sampling space this yields functionals that are identical to generalized entropies. The problem of constructing adequate context-sensitive entropy functionals can therefore be translated into the much simpler problem of finding adjoint representations. The method provides a comprehensive framework for a statistical physics of strongly correlated systems and complex processes.
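The baseline fact the abstract builds on — that for an i.i.d. multinomial process the Boltzmann entropy (the log of the multiplicity) coincides with the Shannon information production, log W ≈ N·H(p) — can be checked numerically. A minimal sketch (the histogram counts below are a hypothetical example, not data from the paper):

```python
import math

def log_multiplicity(counts):
    # log of the multinomial coefficient N! / (k1! k2! ... km!),
    # computed via log-gamma to avoid overflow
    n = sum(counts)
    return math.lgamma(n + 1) - sum(math.lgamma(k + 1) for k in counts)

def shannon_entropy(probs):
    # Shannon entropy in nats
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical histogram of N = 10000 i.i.d. samples over 3 states.
counts = [6000, 3000, 1000]
n = sum(counts)
probs = [k / n for k in counts]

# Boltzmann entropy (log multiplicity) vs. Shannon information production:
# the two agree to leading order in N, with only sub-extensive corrections.
print(log_multiplicity(counts))
print(n * shannon_entropy(probs))
```

For non-multinomial, correlated processes this agreement breaks down, which is precisely the regime where, per the abstract, the Boltzmann entropy is no longer of Shannon form.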

Comments: 14 pages (paper + SI), 1 figure
Subjects: 82C05, 82M99, 82C99, 94A17, I.0, G.3
Related articles:
arXiv:cond-mat/0209319 (Published 2002-09-13)
The additive generalization of the Boltzmann entropy
arXiv:0804.3443 [cond-mat.stat-mech] (Published 2008-04-22)
Geometric variations of the Boltzmann entropy
arXiv:cond-mat/0412683 (Published 2004-12-24)
Boltzmann entropy and the microcanonical ensemble