arXiv Analytics

arXiv:cond-mat/9801200

Generalizing with perceptrons in case of structured phase- and pattern-spaces

G. Dirscherl, B. Schottky, U. Krey

Published 1998-01-20 (Version 1)

We investigate the influence of different kinds of structure on the learning behaviour of a perceptron performing a classification task defined by a teacher rule. The underlying pattern distribution is permitted to have spatial correlations, and the prior distribution of the teacher coupling vectors itself is assumed to be nonuniform; thus classification tasks of quite different difficulty are included. As learning algorithms we discuss Hebbian learning, Gibbs learning, and Bayesian learning with different priors, using methods from statistics and the replica formalism. We find that the Hebb rule is quite sensitive to the structure of the actual learning problem and fails asymptotically in most cases. In contrast, the behaviour of the more sophisticated Gibbs and Bayes learning methods is influenced by the spatial correlations only in an intermediate regime of $\alpha$, where $\alpha$ specifies the size of the training set. For the Bayesian case we show how enhanced prior knowledge improves the performance.
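The teacher-student setting described above can be illustrated with a minimal numerical sketch. The Python snippet below is not from the paper: it assumes an isotropic Gaussian teacher prior, a simple one-factor model of spatial correlations with strength c, and an illustrative input dimension N = 200. It trains a student perceptron with the Hebb rule and reports the teacher-student overlap R together with the generalization error, using the isotropic-pattern formula $\epsilon = \arccos(R)/\pi$ only as a rough indicator.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200                               # input dimension (illustrative choice)
alphas = [0.5, 1.0, 2.0, 5.0]         # training-set sizes P = alpha * N

# Teacher coupling vector; an isotropic Gaussian prior is assumed here.
B = rng.standard_normal(N)
B /= np.linalg.norm(B)

def correlated_patterns(P, N, c=0.3):
    """Draw P patterns with simple spatial correlations: every pair of
    components shares a common Gaussian factor with weight c."""
    shared = rng.standard_normal((P, 1))
    independent = rng.standard_normal((P, N))
    return np.sqrt(1.0 - c) * independent + np.sqrt(c) * shared

for alpha in alphas:
    P = int(alpha * N)
    xi = correlated_patterns(P, N)
    sigma = np.sign(xi @ B)                        # labels from the teacher rule
    J = (sigma[:, None] * xi).sum(axis=0) / N      # Hebb-rule student couplings
    R = (J @ B) / (np.linalg.norm(J) * np.linalg.norm(B))
    eps = np.arccos(np.clip(R, -1.0, 1.0)) / np.pi # rough generalization error
    print(f"alpha = {alpha:4.1f}: overlap R = {R:.3f}, eps = {eps:.3f}")
```

The correlation model and the parameter choices are placeholders; the paper's actual analysis of Gibbs and Bayes learning relies on the replica formalism rather than simulation.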

Comments: LaTeX, 32 pages with eps-figs, accepted by J Phys A
Categories: cond-mat.dis-nn
Related articles:
arXiv:cond-mat/9809427 (Published 1998-09-30)
Tracer Dispersion in Porous Media with Spatial Correlations
arXiv:cond-mat/0007074 (Published 2000-07-05)
Robust chaos generation by a perceptron
arXiv:cond-mat/9608092 (Published 1996-08-21, updated 1997-01-13)
Multifractality and percolation in the coupling space of perceptrons