arXiv Analytics

arXiv:cond-mat/9805135

Retrieval dynamics of neural networks for sparsely coded sequential patterns

Katsunori Kitano, Toshio Aoyagi

Published 1998-05-12, updated 1998-05-14 (version 2)

It is well known that a sparsely coded network, in which the activity level is extremely low, has intriguing equilibrium properties. In the present work, we study the dynamical properties of a neural network designed to store sparsely coded sequential patterns rather than static ones. Applying the theory of statistical neurodynamics, we derive the dynamical equations governing the retrieval process, described in terms of macroscopic order parameters such as the overlap. We find that our theory provides good predictions for the storage capacity and the basin of attraction obtained through numerical simulations. The results indicate that the nature of the basin of attraction depends on the method of activity control employed. Furthermore, we find that robustness against random synaptic dilution deteriorates slightly with the degree of sparseness.
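The setting described in the abstract can be illustrated with a minimal simulation sketch. The code below is an assumption-laden toy model, not the authors' actual construction: it stores a cyclic sequence of sparse binary patterns with a covariance-type rule mapping each pattern to its successor, retrieves the sequence by parallel threshold dynamics, and monitors the overlap order parameter. The network size, pattern count, activity level `a`, and the fixed threshold `theta` (a crude stand-in for the activity-control schemes the paper compares) are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p, a = 500, 5, 0.1   # neurons, sequence length, activity level (sparseness)

# Sparse binary patterns xi^mu in {0,1}^N with mean activity a.
xi = (rng.random((p, N)) < a).astype(float)

# Sequence-storing covariance rule: pattern mu is mapped to mu+1 (cyclically),
# normalized so that a perfect retrieval state gives local fields of order 1.
W = np.zeros((N, N))
for mu in range(p):
    W += np.outer(xi[(mu + 1) % p] - a, xi[mu] - a)
W /= a * (1 - a) * N

def overlap(s, mu):
    """Macroscopic overlap m^mu between network state s and pattern xi^mu."""
    return ((xi[mu] - a) @ s) / (a * (1 - a) * N)

# Start at pattern 0 and update all neurons in parallel; the fixed threshold
# theta keeps the mean activity near a (a simple form of activity control).
theta = 0.5
s = xi[0].copy()
for t in range(p):
    s = (W @ s > theta).astype(float)
# After p parallel updates the state should have cycled back near xi^0,
# i.e. overlap(s, 0) stays close to 1 well below the storage capacity.
```

At such a low loading (p/N = 0.01) the overlap with the currently retrieved pattern stays near 1 throughout the cycle; pushing p up while tracking where the overlap collapses is the simulation-side estimate of the storage capacity that the paper's statistical-neurodynamics equations predict.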

Comments: 9 pages including 4 EPSF figures, latex209, ref. [21] is modified
Journal: J. Phys. A: Math. Gen. 31, L613-L620 (1998)
Categories: cond-mat.dis-nn, q-bio
Related articles:
arXiv:cond-mat/9704098 (Published 1997-04-11)
Phase Transitions of Neural Networks
arXiv:cond-mat/9705182 (Published 1997-05-19)
Stochastic learning in a neural network with adapting synapses
arXiv:cond-mat/0112463 (Published 2001-12-26)
Gauge Symmetry and Neural Networks