arXiv Analytics

arXiv:cond-mat/0505107

A step beyond Tsallis and Renyi entropies

Marco Masi

Published 2005-05-04 (Version 1)

Tsallis and Rényi entropy measures are two different possible generalizations of the Boltzmann-Gibbs entropy (or Shannon's information), but neither is a generalization of the other. One possible unification is provided by the Sharma-Mittal measure, which was defined as early as 1975 (B.D. Sharma, D.P. Mittal, J. Math. Sci. 10, 28) but received attention only recently through applications in statistical mechanics (T.D. Frank & A. Daffertshofer, Physica A 285, 351; T.D. Frank & A.R. Plastino, Eur. Phys. J. B 30, 543-549). We show how this generalization, which unifies Rényi and Tsallis entropy in a coherent picture, arises naturally when the q-formalism of generalized logarithm and exponential functions is used; how, alongside Sharma-Mittal's measure, another possible extension emerges that does not, however, obey a pseudo-additive law and lacks other properties relevant for a generalized thermostatistics; and how the relation between all these information measures is best understood when described in terms of a particular logarithmic Kolmogorov-Nagumo average.
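The unification the abstract describes can be illustrated numerically. The sketch below uses the standard textbook forms of the three measures (these formulas are not quoted from the paper itself, and the function names are my own): Tsallis S_q = (1 - Σ p_i^q)/(q - 1), Rényi R_q = ln(Σ p_i^q)/(1 - q), and Sharma-Mittal S_{q,r} = [(Σ p_i^q)^{(1-r)/(1-q)} - 1]/(1 - r), which reduces to Tsallis as r → q and to Rényi as r → 1.

```python
import numpy as np

def tsallis(p, q):
    # Tsallis entropy: S_q = (1 - sum_i p_i^q) / (q - 1)
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def renyi(p, q):
    # Rényi entropy: R_q = ln(sum_i p_i^q) / (1 - q)
    return np.log(np.sum(p**q)) / (1.0 - q)

def sharma_mittal(p, q, r):
    # Sharma-Mittal entropy: S_{q,r} = [ (sum_i p_i^q)^{(1-r)/(1-q)} - 1 ] / (1 - r)
    return (np.sum(p**q) ** ((1.0 - r) / (1.0 - q)) - 1.0) / (1.0 - r)

# An arbitrary example distribution and entropic index.
p = np.array([0.5, 0.3, 0.2])
q = 0.7

# r close to q recovers the Tsallis measure; r close to 1 recovers Rényi.
print(sharma_mittal(p, q, r=q + 1e-9), tsallis(p, q))
print(sharma_mittal(p, q, r=1.0 - 1e-9), renyi(p, q))
```

Both limits are taken numerically (r slightly off the singular values q and 1), since the closed-form expression has removable singularities there; each printed pair should agree to many decimal places.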

Journal: Physics Letters A, Volume 338, Issues 3-5, 2 May 2005, Pages 217-224
Categories: cond-mat.stat-mech
Related articles:
arXiv:1302.2826 [cond-mat.stat-mech] (Published 2013-02-12, updated 2013-04-02)
Renyi entropies as a measure of the complexity of counting problems
arXiv:cond-mat/0308017 (Published 2003-08-01, updated 2006-11-14)
The CTRW in finance: Direct and inverse problems with some generalizations and extensions
arXiv:0911.0383 [cond-mat.stat-mech] (Published 2009-11-02, updated 2012-05-28)
Critique of multinomial coefficients method for evaluating Tsallis and Renyi entropies