arXiv Analytics

arXiv:1705.00612 [cond-mat.stat-mech]

Thermodynamic cost and benefit of data representations

Susanne Still

Published 2017-04-29 (Version 1)

This paper takes a thermodynamic approach to the problem of how to represent data efficiently and meaningfully, a problem at the heart of learning and adaptation in both biological and artificial systems. Thermodynamic analysis of an information engine's cyclic operation reveals information-theoretic quantities that set limits on performance. If run at fixed temperature, dissipation is lower-bounded by a term proportional to irrelevant information. Data representation strategies that are optimal in the sense of minimizing dissipation must therefore strive to retain only relevant information. When an information engine is allowed to exploit a temperature difference, it can produce net work output, for which an upper bound is derived. Maximizing this bound yields a direct derivation of the Information Bottleneck method, a known technique in signal processing and machine learning, used precisely to filter relevant information from irrelevant clutter.
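The Information Bottleneck method mentioned above compresses a source X into a representation T while preserving information about a relevance variable Y, trading off I(X;T) against I(T;Y) at an inverse-temperature-like parameter β. As a rough illustration (independent of the paper's thermodynamic derivation), the standard self-consistent iterative IB updates for a small discrete joint distribution can be sketched as follows; the distribution and parameter values here are arbitrary examples:

```python
import numpy as np

rng = np.random.default_rng(0)

def mutual_information(pxy):
    """I(X;Y) in nats for a discrete joint distribution pxy."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

def information_bottleneck(pxy, n_t, beta, n_iter=300):
    """Iterative IB: soft-cluster X into T while keeping information about Y.

    Self-consistent equations (Tishby, Pereira & Bialek):
      p(t|x) ∝ p(t) exp(-β D_KL[p(y|x) || p(y|t)])
      p(t)   = Σ_x p(t|x) p(x)
      p(y|t) = Σ_x p(y|x) p(x|t)
    Returns (I(X;T), I(T;Y)).
    """
    px = pxy.sum(axis=1)                      # p(x)
    py_x = pxy / px[:, None]                  # p(y|x)
    pt_x = rng.random((n_t, px.size))         # random initial p(t|x)
    pt_x /= pt_x.sum(axis=0, keepdims=True)
    for _ in range(n_iter):
        pt = pt_x @ px                        # marginal p(t)
        px_t = (pt_x * px[None, :]) / (pt[:, None] + 1e-12)
        py_t = px_t @ py_x                    # p(y|t), shape (n_t, ny)
        # D_KL[p(y|x) || p(y|t)] for every (t, x) pair
        log_ratio = np.log(py_x[None, :, :] + 1e-12) - np.log(py_t[:, None, :] + 1e-12)
        dkl = (py_x[None, :, :] * log_ratio).sum(axis=2)
        dkl -= dkl.min(axis=0, keepdims=True)  # stabilize the exponential
        pt_x = pt[:, None] * np.exp(-beta * dkl)
        pt_x /= pt_x.sum(axis=0, keepdims=True)
    pxt = pt_x * px[None, :]                  # joint p(t, x)
    pty = pt_x @ pxy                          # joint p(t, y), using T ⊥ Y | X
    return mutual_information(pxt), mutual_information(pty)
```

At large β the representation retains nearly all relevant information, so I(T;Y) approaches I(X;Y); by the data-processing inequality it can never exceed it, and I(X;T) ≥ I(T;Y) always holds along the Markov chain T–X–Y.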
