arXiv Analytics

arXiv:cond-mat/0310205

Influence of topology on the performance of a neural network

Joaquin J. Torres, Miguel A. Munoz, J. Marro, P. L. Garrido

Published 2003-10-09 (Version 1)

We study the computational properties of an attractor neural network (ANN) with different network topologies. Although fully connected neural networks generally perform well, they are biologically unrealistic, since natural evolution is unlikely to produce such dense connectivity. We demonstrate that, at finite temperature, the capacity to store and retrieve binary patterns is higher for an ANN with scale-free (SF) topology than for a highly diluted random Hopfield network with the same number of synapses. We also show that, at zero temperature, the relative performance of the SF network increases with increasing values of the power-law exponent of the degree distribution. Some consequences and possible applications of our findings are discussed.
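The comparison described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' actual model: it assumes standard Hebbian learning, zero-temperature (deterministic) asynchronous dynamics, a Barabási-Albert graph as the scale-free topology, and arbitrary small sizes (N, P, dilution probability) chosen for illustration only. Both graphs are built with a comparable number of synapses, mirroring the paper's equal-synapse comparison.

```python
import random

random.seed(1)
N, P = 120, 3  # illustrative network size and pattern count (not from the paper)

# Random binary patterns xi^mu_i in {-1, +1} to be stored.
patterns = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(P)]

def diluted_mask(c):
    """Highly diluted random topology: each pair connected with probability c."""
    m = [[False] * N for _ in range(N)]
    for i in range(N):
        for j in range(i + 1, N):
            if random.random() < c:
                m[i][j] = m[j][i] = True
    return m

def ba_mask(m0, m_links):
    """Barabasi-Albert preferential attachment -> scale-free degree distribution."""
    edges, degree = set(), [0] * N
    for i in range(m0):            # fully connected seed of m0 nodes
        for j in range(i + 1, m0):
            edges.add((i, j)); degree[i] += 1; degree[j] += 1
    # Node list weighted by degree, so random.choice does preferential attachment.
    repeated = [n for n in range(m0) for _ in range(degree[n])]
    for new in range(m0, N):
        chosen = set()
        while len(chosen) < m_links:
            chosen.add(random.choice(repeated))
        for t in chosen:
            edges.add((min(new, t), max(new, t)))
            repeated.extend([new, t])
    mask = [[False] * N for _ in range(N)]
    for i, j in edges:
        mask[i][j] = mask[j][i] = True
    return mask

def weights(mask):
    """Hebbian couplings J_ij, restricted to the synapses present in the graph."""
    return [[sum(p[i] * p[j] for p in patterns) / N if mask[i][j] else 0.0
             for j in range(N)] for i in range(N)]

def retrieve(w, cue, sweeps=10):
    """Zero-temperature asynchronous dynamics: flip each spin along its field."""
    s = cue[:]
    for _ in range(sweeps):
        for i in random.sample(range(N), N):
            h = sum(w[i][j] * s[j] for j in range(N))
            if h != 0:
                s[i] = 1 if h > 0 else -1
    return s

def overlap(s, p):
    """Retrieval quality: overlap m = (1/N) sum_i s_i p_i, equal to 1 if perfect."""
    return sum(a * b for a, b in zip(s, p)) / N

# Corrupt pattern 0 by flipping 15% of its bits, then let each network relax.
cue = [(-x if random.random() < 0.15 else x) for x in patterns[0]]
for name, mask in [("diluted", diluted_mask(0.1)), ("scale-free", ba_mask(6, 6))]:
    m = overlap(retrieve(weights(mask), cue), patterns[0])
    print(f"{name}: overlap with stored pattern = {m:+.2f}")
```

With these parameters both topologies carry roughly the same synapse budget (average degree near 12), so any difference in the final overlap reflects topology rather than connectivity count; the paper's actual result concerns the finite-temperature and zero-temperature regimes at much larger scale.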

Comments: 6 pages, 6 EPS figures. To appear in Neurocomputing
Journal: Neurocomputing, vol. 58-60, pp. 229-234 (2004)