arXiv Analytics

arXiv:cond-mat/9611027

Finite size scaling in neural networks

Walter Nadler, Wolfgang Fink

Published 1996-11-05Version 1

We demonstrate that the fraction of pattern sets that can be stored in single- and hidden-layer perceptrons exhibits finite size scaling. This feature makes it possible to estimate the critical storage capacity \alpha_c from simulations of relatively small systems. We illustrate this approach by determining \alpha_c, together with the finite size scaling exponent \nu, for storing Gaussian patterns in committee and parity machines with binary couplings and up to K=5 hidden units.
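The estimation procedure the abstract describes can be sketched as follows. Under the finite size scaling ansatz, the storable fraction for system size N obeys P(\alpha, N) = F((\alpha - \alpha_c) N^{1/\nu}), so curves for different N all cross the value 1/2 at \alpha = \alpha_c. The sketch below is a minimal illustration of that idea using synthetic data in place of actual perceptron simulations; the values \alpha_c = 0.83 and \nu = 2.0 and the logistic form of F are placeholders chosen for the demo, not results from the paper.

```python
import numpy as np

def storable_fraction(alpha, N, alpha_c=0.83, nu=2.0):
    # Synthetic stand-in for simulation data: fraction of pattern sets
    # that are storable, generated directly from the finite-size-scaling
    # ansatz P(alpha, N) = F((alpha - alpha_c) * N**(1/nu)),
    # with a logistic scaling function F chosen for illustration.
    x = (alpha - alpha_c) * N ** (1.0 / nu)
    return 1.0 / (1.0 + np.exp(4.0 * x))

def estimate_alpha_c(Ns, alphas):
    # For each system size, locate where P crosses 1/2 by linear
    # interpolation on the alpha grid; in the scaling regime these
    # crossing points converge to alpha_c.
    crossings = []
    for N in Ns:
        P = storable_fraction(alphas, N)
        i = np.argmax(P < 0.5)  # first grid index where P drops below 1/2
        a0, a1, p0, p1 = alphas[i - 1], alphas[i], P[i - 1], P[i]
        crossings.append(a0 + (0.5 - p0) * (a1 - a0) / (p1 - p0))
    return float(np.mean(crossings))

alphas = np.linspace(0.5, 1.1, 601)
alpha_c_est = estimate_alpha_c([20, 50, 100], alphas)
print(alpha_c_est)
```

In a real study the curves for different N cross only approximately, and \alpha_c and \nu are obtained jointly by collapsing all curves onto a single scaling function; the crossing-point estimate above is the simplest version of that analysis.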

Comments: 4 pages, RevTex, 5 figures, uses multicol.sty and psfig.sty
Related articles:
arXiv:2409.13597 [cond-mat.dis-nn] (Published 2024-09-20)
Fluctuation-learning relationship in neural networks
arXiv:cond-mat/9710352 (Published 1997-10-01)
Self-Wiring of Neural Networks
arXiv:2309.09240 [cond-mat.dis-nn] (Published 2023-09-17)
High-dimensional manifold of solutions in neural networks: insights from statistical physics