arXiv Analytics

arXiv:1311.4400 [math.PR]

Asymptotic description of neural networks with correlated synaptic weights

Olivier Faugeras, James MacLaurin

Published 2013-11-18, updated 2013-12-13 (Version 3)

We study the asymptotic law of a network of interacting neurons as the number of neurons becomes infinite. For a completely connected network in which the synaptic weights are correlated Gaussian random variables, we describe the limiting law of the network. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network, as well as the law of those trajectories averaged with respect to the synaptic weights. The main result of this article is that the image law through the empirical measure satisfies a large deviation principle with a good rate function, which is shown to have a unique global minimum. Our analysis of the rate function also allows us to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories.
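Schematically, a large deviation principle of the kind asserted above takes the following standard form. This is a hedged sketch, not taken from the paper: here $\hat\mu_N$ denotes the process-level empirical measure of the $N$-neuron trajectories, $\Pi^N$ the averaged law, and $H$ the good rate function; the precise trajectory space, topology, and measurability conditions are those of the paper.

```latex
% Schematic LDP bounds for Borel sets A of measures on trajectory space
% (notation assumed for illustration, not from the source):
-\inf_{\mu \in \operatorname{int} A} H(\mu)
  \;\le\; \liminf_{N\to\infty} \frac{1}{N}\log \Pi^N\!\bigl(\hat\mu_N \in A\bigr)
  \;\le\; \limsup_{N\to\infty} \frac{1}{N}\log \Pi^N\!\bigl(\hat\mu_N \in A\bigr)
  \;\le\; -\inf_{\mu \in \bar A} H(\mu).
```

Because $H$ is a good rate function with a unique global minimum $\mu^\ast$, these bounds imply that $\hat\mu_N$ converges to $\mu^\ast$, the limit measure characterized in the paper.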

Comments: This paper has been withdrawn by the authors; it was intended as a replacement for arXiv:1302.1029, not arXiv:1311.4400. 50 pages. arXiv admin note: substantial text overlap with arXiv:1302.1029.
Categories: math.PR
Related articles:
arXiv:1302.1029 [math.PR] (Published 2013-02-05, updated 2013-05-31)
A large deviation principle for networks of rate neurons with correlated synaptic weights
arXiv:1901.10248 [math.PR] (Published 2019-01-29)
The meanfield limit of a network of Hopfield neurons with correlated synaptic weights
arXiv:1806.11426 [math.PR] (Published 2018-06-29)
Continuity result for the rate function of the simple random walk on supercritical percolation clusters