arXiv:2004.03604 [cond-mat.stat-mech]

Learning about learning by many-body systems

Weishun Zhong, Jacob M. Gold, Sarah Marzen, Jeremy L. England, Nicole Yunger Halpern

Published 2020-04-07, Version 1

Many-body systems from soap bubbles to suspensions to polymers learn the drives that push them far from equilibrium. This learning has been detected with thermodynamic properties, such as work absorption and strain. We progress beyond these macroscopic properties that were first defined for equilibrium contexts: We quantify statistical mechanical learning with representation learning, a machine-learning model in which information squeezes through a bottleneck. We identify a structural parallel between representation learning and far-from-equilibrium statistical mechanics. Applying this parallel, we measure four facets of many-body systems' learning: classification ability, memory capacity, discrimination ability, and novelty detection. Numerical simulations of a classical spin glass illustrate our technique. This toolkit exposes self-organization that eludes detection by thermodynamic measures. Our toolkit more reliably and more precisely detects and quantifies learning by matter.
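The abstract's central machine-learning ingredient is representation learning, in which information about the inputs is squeezed through a low-dimensional bottleneck. As a minimal illustration of that bottleneck idea only (not the paper's actual model, which is detailed in arXiv:2001.03623), the sketch below uses a linear autoencoder, whose optimal encoder and decoder are given by truncated SVD; reconstruction error shrinks as the bottleneck widens:

```python
import numpy as np

def bottleneck_reconstruct(X, k):
    """Squeeze X (n_samples x n_features) through a k-dim bottleneck.

    Hypothetical illustration: for a linear autoencoder, the optimal
    encoder/decoder pair is the truncated SVD of the centered data.
    """
    mean = X.mean(axis=0)
    Xc = X - mean                                 # center the data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    code = Xc @ Vt[:k].T                          # encoder: n_samples x k
    return code @ Vt[:k] + mean                   # decoder back to features

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))

# Mean-squared reconstruction error for bottlenecks of width 1, 5, and 10.
err = [np.mean((X - bottleneck_reconstruct(X, k)) ** 2) for k in (1, 5, 10)]
```

A wider bottleneck retains more information about the inputs, so the error list decreases monotonically, reaching (numerically) zero once the bottleneck matches the full feature dimension.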

Comments: 4.5 pages, including 4 figures. Short version of the more detailed arXiv:2001.03623