arXiv:2209.04882 [cond-mat.dis-nn]

Statistical mechanics of deep learning beyond the infinite-width limit

S. Ariosto, R. Pacelli, M. Pastore, F. Ginelli, M. Gherardi, P. Rotondo

Published 2022-09-11 (Version 1)

Modern deep neural networks represent a formidable challenge for theorists: even simple one-hidden-layer fully-connected architectures are in general not analytically tractable with statistical-physics techniques. This fact constitutes a barrier against a comprehensive theoretical understanding of deep learning. Huge simplifications arise in the so-called infinite-width limit, where the size $N_\ell$ of each hidden layer ($\ell = 1, \dots, L$, $L$ being the total number of hidden layers of the architecture) is much larger than the size $P$ of the dataset. Infinite-width neural networks are well understood, since they are equivalent to a particular kernel learning problem. Here we show that the thermodynamics of deep learning with a quadratic loss function, i.e., the calculation of the partition function, is an analytically tractable problem also far away from this limit and, more importantly, at a fixed realization of the training set, at least in the asymptotic limit where the sizes of the hidden layers tend to infinity at fixed ratios $\alpha_\ell = P/N_\ell$. This means that the computation of an extensive number of integrals over the weights of the deep neural network can be reduced to a saddle-point integration over a finite number of order parameters. In the case of one-hidden-layer architectures we are able to prove that our result is correct as long as the covariance matrix of the training data satisfies the assumptions of a generalized central limit theorem due to Breuer and Major. In the general case of $L$ hidden layers we derive an $L$-dimensional effective action for the learning problem, obtained by iteratively integrating out the weights starting from the input layer.
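To make the setup concrete, the partition function referred to above can be written schematically as follows; the inverse temperature $\beta$ and the Gaussian prior $p(W)$ over the weights are standard conventions in this literature, not notation fixed by the abstract itself:

$$ Z = \int \prod_{\ell} dW^{(\ell)}\, p\big(W^{(\ell)}\big)\, \exp\Big[ -\frac{\beta}{2} \sum_{\mu=1}^{P} \big( f(x^\mu; \{W^{(\ell)}\}) - y^\mu \big)^2 \Big], \qquad \alpha_\ell = P/N_\ell \ \text{fixed as } N_\ell, P \to \infty. $$

The claim is that, in this proportional limit, $Z$ reduces to a saddle-point integral over a finite number of order parameters, in the sense of the $L$-dimensional effective action mentioned above, rather than an integral over all the weights.

The kernel equivalence invoked for infinite-width networks can also be illustrated with a minimal numerical sketch: for a one-hidden-layer ReLU network, the corresponding kernel is the arc-cosine (NNGP) kernel of Cho and Saul, and the infinite-width predictor is kernel ridge regression with it. The function name, the $1/d$ input normalization, and the ridge term below are illustrative choices, not taken from the paper.

import numpy as np

def nngp_relu_kernel(X1, X2, sigma_w2=1.0, sigma_b2=0.0):
    # Infinite-width (NNGP) kernel of a one-hidden-layer ReLU network:
    # K(x, x') = sigma_b2 + sigma_w2 * E[relu(u) relu(u')], with (u, u')
    # jointly Gaussian; the expectation is the order-1 arc-cosine kernel.
    d = X1.shape[1]
    k11 = np.sum(X1 * X1, axis=1) / d          # input-layer variances
    k22 = np.sum(X2 * X2, axis=1) / d
    k12 = (X1 @ X2.T) / d                      # input-layer covariances
    norms = np.sqrt(np.outer(k11, k22))
    cos_t = np.clip(k12 / norms, -1.0, 1.0)
    theta = np.arccos(cos_t)
    j1 = (np.sin(theta) + (np.pi - theta) * cos_t) / (2.0 * np.pi)
    return sigma_b2 + sigma_w2 * norms * j1

# Infinite-width prediction = kernel ridge regression with this kernel.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))              # P = 50 training points
y = np.sin(X[:, 0])                            # toy regression targets
X_test = rng.standard_normal((5, 10))
K = nngp_relu_kernel(X, X)
K_star = nngp_relu_kernel(X_test, X)
ridge = 1e-3 * np.eye(len(X))                  # small regularizer
y_pred = K_star @ np.linalg.solve(K + ridge, y)

In this regime the network never appears explicitly: the training data enter only through the kernel matrix, which is the sense in which the infinite-width problem is "a particular kernel learning problem".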

Related articles:
arXiv:2412.05439 [cond-mat.dis-nn] (Published 2024-12-06)
Statistical Mechanics of Support Vector Regression
arXiv:2303.15298 [cond-mat.dis-nn] (Published 2023-03-27)
The percolating cluster is invisible to image recognition with deep learning
arXiv:cond-mat/0103275 (Published 2001-03-13)
Retarded Learning: Rigorous Results from Statistical Mechanics