arXiv:1903.06818 [cond-mat.dis-nn]

Generalization from correlated sets of patterns in the perceptron

Francesco Borra, Marco Cosentino Lagomarsino, Pietro Rotondo, Marco Gherardi

Published 2019-03-15 (Version 1)

Generalization is a central aspect of learning theory. Here, we propose a framework that explores an auxiliary, task-dependent notion of generalization and attempts to answer the following question quantitatively: given two sets of patterns with a prescribed degree of dissimilarity, how easily will a network be able to "unify" their interpretation? We quantify this by the volume of the configurations of synaptic weights that classify the two sets in a similar manner. To show the applicability of our idea in a concrete setting, we compute this quantity for the perceptron, a simple binary classifier, using the classical statistical-physics approach in the replica-symmetric ansatz. In this case, we derive an analytical expression for the "distance-based capacity", the maximum load of patterns sustainable by the network at fixed dissimilarity between patterns and fixed allowed number of errors. This curve indicates that generalization is possible at any distance, but with decreasing capacity. We propose that a distance-based definition of generalization may be useful in numerical experiments with real-world neural networks, and for exploring computationally sub-dominant sets of synaptic solutions.
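As a rough numerical illustration of the weight-space volume described above, the sketch below gives a minimal Monte Carlo proxy, not the replica-symmetric calculation of the paper: it samples random synaptic vectors and estimates the fraction that assign identical perceptron labels sign(w.x) to two pattern sets whose dissimilarity is tuned by a flip fraction d. The function names (correlated_patterns, unifying_fraction) and the parameterization of dissimilarity as a per-component flip probability are illustrative assumptions, not the paper's definitions.

import numpy as np

rng = np.random.default_rng(0)

def correlated_patterns(P, N, d, rng):
    """Set A: P random +/-1 patterns of dimension N; set B: each pattern
    of A with a fraction d of components flipped, so d tunes dissimilarity."""
    A = rng.choice([-1.0, 1.0], size=(P, N))
    flip = rng.random((P, N)) < d
    B = np.where(flip, -A, A)
    return A, B

def unifying_fraction(A, B, n_samples=20000, rng=rng):
    """Monte Carlo proxy for the relative weight-space volume: the fraction
    of random synaptic vectors w that assign identical labels sign(w.x)
    to every paired pattern in A and B."""
    W = rng.standard_normal((n_samples, A.shape[1]))
    agree = np.sign(W @ A.T) == np.sign(W @ B.T)   # shape (n_samples, P)
    return agree.all(axis=1).mean()

N, P = 50, 10
for d in (0.0, 0.05, 0.1, 0.2):
    A, B = correlated_patterns(P, N, d, rng)
    print(f"d = {d:.2f}  unifying fraction ~ {unifying_fraction(A, B):.4f}")

In this toy setting the estimated fraction shrinks as d grows, consistent with the qualitative trend the abstract describes: unifying the interpretation of two sets remains possible at any distance, but the available volume of solutions decreases.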

Related articles:
arXiv:cond-mat/9801200 (Published 1998-01-20)
Generalizing with perceptrons in case of structured phase- and pattern-spaces
arXiv:1607.08814 [cond-mat.dis-nn] (Published 2016-07-29)
Hyperuniformity and its Generalizations
arXiv:cond-mat/9910500 (Published 1999-10-29)
Generalization in the Hopfield Model