arXiv:cond-mat/0203568
Bistable Gradient Networks in the Thermodynamic Limit
Patrick N. McGraw, Michael Menzinger
Published 2002-03-27, Version 1
We examine the large-network, low-loading behaviour of an attractor neural network, the so-called bistable gradient network (BGN). We use analytical and numerical methods to characterize the attractor states of the network and their basins of attraction. The energy landscape is more complex than that of the Hopfield network and depends on the strength of the coupling among units. At weak coupling, the BGN acts as a highly selective associative memory; the input must be close to one of the stored patterns in order to be recognized. A category of spurious attractors occurs that is not present in the Hopfield network. Stronger coupling results in a transition to a more Hopfield-like regime with large basins of attraction. The basins of attraction for spurious attractors are noticeably suppressed compared to the Hopfield case, even though the Hebbian synaptic structure is the same and there is no stochastic noise.
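To make the setting concrete, the following is a minimal sketch of how a network of this kind might be simulated. It assumes the standard BGN-style dynamics of continuous bistable units, dx_i/dt = x_i - x_i^3 + gamma * sum_j w_ij x_j, with a Hebbian coupling matrix built from stored binary patterns; these equations, the forward-Euler integration, the parameter values, and all function names are illustrative assumptions, since the abstract itself does not give the model equations.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian couplings w_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, zero diagonal (assumed form)."""
    n_patterns, n_units = patterns.shape
    w = patterns.T @ patterns / n_units
    np.fill_diagonal(w, 0.0)
    return w

def simulate_bgn(w, x0, gamma=0.2, dt=0.05, steps=2000):
    """Relax dx_i/dt = x_i - x_i^3 + gamma * sum_j w_ij x_j by forward Euler (assumed dynamics)."""
    x = x0.astype(float).copy()
    for _ in range(steps):
        x += dt * (x - x**3 + gamma * (w @ x))
    return x

def energy(x, w, gamma):
    """On-site double-well terms plus Hebbian coupling term (assumed energy function)."""
    return np.sum(-x**2 / 2 + x**4 / 4) - 0.5 * gamma * (x @ w @ x)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_units, n_patterns = 200, 3            # low loading: number of patterns << N
    patterns = rng.choice([-1.0, 1.0], size=(n_patterns, n_units))
    w = hebbian_weights(patterns)

    # Cue the network with a corrupted copy of the first stored pattern and let it relax.
    cue = patterns[0].copy()
    cue[:10] *= -1
    x_final = simulate_bgn(w, cue, gamma=0.2)

    # Overlap with the cued pattern indicates whether it was recalled as an attractor.
    overlap = patterns[0] @ np.sign(x_final) / n_units
    print(f"overlap with stored pattern: {overlap:.2f}, "
          f"energy: {energy(x_final, w, gamma=0.2):.2f}")
```

Varying `gamma` in this sketch is one way to probe the regimes described above: small values make recall succeed only for cues very close to a stored pattern, while larger values enlarge the basins of attraction in a more Hopfield-like fashion.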