arXiv Analytics

arXiv:2401.12119 [quant-ph]

Temperature as Joules per Bit

Charles Alexandre Bédard, Sophie Berthelette, Xavier Coiteux-Roy, Stefan Wolf

Published 2024-01-22, Version 1

Boltzmann's constant reflects a historical misunderstanding of the concept of entropy, whose informational nature is obfuscated when expressed in J/K. We suggest that the development of temperature and energy, historically prior to that of entropy, does not amount to their logical priority: Temperature should be defined in terms of entropy, not vice versa. Following the precepts of information theory, entropy is measured in bits, and coincides with information capacity at thermodynamic equilibrium. Consequently, not only is the temperature of an equilibrated system expressed in J/bit, but it acquires an operational meaning: It is the cost in energy to increase its information capacity by 1 bit. Our proposal also supports the notion of available capacity, analogous to free energy. Finally, it simplifies Landauer's cost and clarifies that it is a cost of displacement, not of erasure.
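The unit conversion behind the abstract's proposal can be made concrete. If entropy is counted in bits rather than J/K, a factor of k_B ln 2 moves from entropy into temperature, so a temperature T (in kelvin) corresponds to k_B T ln 2 joules per bit; at room temperature this reproduces the familiar Landauer energy scale. A minimal sketch (the function name is illustrative, not from the paper):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact by SI definition)

def temperature_joules_per_bit(T_kelvin):
    """Temperature in J/bit: the energy cost of one extra bit of
    information capacity at equilibrium, per the paper's proposal."""
    return k_B * T_kelvin * math.log(2)

# At room temperature (300 K) this is the Landauer energy scale:
E_bit = temperature_joules_per_bit(300.0)
print(f"{E_bit:.3e} J/bit")  # ≈ 2.871e-21 J per bit
```

The same number is usually quoted as the minimum dissipation of Landauer's bound; in the paper's framing it is simply the temperature itself, read in J/bit.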

Related articles: Most relevant | Search more
arXiv:2102.09981 [quant-ph] (Published 2021-02-19)
Revisiting thermodynamics in computation and information theory
arXiv:quant-ph/0309188 (Published 2003-09-25, updated 2004-04-12)
Correlation Functions in Spin Chains and Information Theory
arXiv:quant-ph/0405005 (Published 2004-05-03)
The Physics of Information