arXiv:2302.14112 [cond-mat.dis-nn]

Injectivity of ReLU networks: perspectives from statistical physics

Antoine Maillard, Afonso S. Bandeira, David Belius, Ivan Dokmanić, Shuta Nakajima

Published 2023-02-27 (Version 1)

When can the input of a ReLU neural network be inferred from its output? In other words, when is the network injective? We consider a single layer, $x \mapsto \mathrm{ReLU}(Wx)$, with a random Gaussian $m \times n$ matrix $W$, in a high-dimensional setting where $n, m \to \infty$. Recent work connects this problem to spherical integral geometry, giving rise to a conjectured sharp injectivity threshold for the aspect ratio $\alpha = \frac{m}{n}$ obtained by studying the expected Euler characteristic of a certain random set. We adopt a different perspective and show that injectivity is equivalent to a property of the ground state of the spherical perceptron, an important spin glass model in statistical physics. Leveraging the (non-rigorous) theory of replica symmetry breaking, we derive analytical equations for the threshold whose solution is at odds with that from the Euler characteristic. Furthermore, we use Gordon's min--max theorem to prove a replica-symmetric upper bound that refutes the Euler characteristic prediction. Along the way, we give a tutorial-style introduction to key ideas from statistical physics, aiming to make the exposition accessible to a broad audience. Our analysis establishes a connection between spin glasses and integral geometry but leaves open the problem of explaining the discrepancy between the two predictions.
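To make the setting concrete, here is a minimal numerical sketch (not from the paper; the function name and parameters are illustrative). For a generic input $x$ with no zero entries in $Wx$, the map $x' \mapsto \mathrm{ReLU}(Wx')$ agrees near $x$ with $x' \mapsto W_A x'$, where $A = \{i : (Wx)_i > 0\}$ is the set of active rows; inverting the map locally therefore requires the active rows to span $\mathbb{R}^n$. Since roughly half of the $m$ rows are active on average, even this local condition already forces $\alpha = \frac{m}{n} > 2$. The sketch estimates how often the local condition holds at various aspect ratios:

import numpy as np

def local_injectivity_rate(alpha, n=200, trials=100, seed=0):
    """Estimate how often the rows of a Gaussian W active at a random
    input x span R^n -- necessary (and locally sufficient) for
    x' -> ReLU(Wx') to be invertible in a neighborhood of x.
    Illustrative sketch only, not the paper's method."""
    rng = np.random.default_rng(seed)
    m = int(alpha * n)
    hits = 0
    for _ in range(trials):
        W = rng.standard_normal((m, n))   # random Gaussian layer
        x = rng.standard_normal(n)        # generic input
        active = W @ x > 0                # coordinates that ReLU keeps
        # near x, ReLU(Wx') = W[active] x', so local inversion
        # needs the active rows to have full column rank n
        if np.count_nonzero(active) >= n and \
           np.linalg.matrix_rank(W[active]) == n:
            hits += 1
    return hits / trials

for alpha in (1.5, 2.0, 3.0, 4.0):
    print(f"alpha = {alpha}: spanning-active-rows rate = "
          f"{local_injectivity_rate(alpha):.2f}")

Note that this probes only local injectivity at a single random point; global injectivity, the question studied in the paper, is at least as demanding, and the conjectured critical values of $\alpha$ sit above this naive counting bound.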
