arXiv:1407.2458 [math.PR]
Asymptotic description of stochastic neural networks. II - Characterization of the limit law
Olivier Faugeras, James MacLaurin
Published 2014-07-09 (Version 1)
We continue the development, started in arXiv:1407.2457, of the asymptotic description of certain stochastic neural networks. We use the Large Deviation Principle (LDP) and the good rate function H announced there to prove that H has a unique minimum mu_e, a stationary measure on the set of trajectories. We characterize this measure by its two marginals: at time 0, and from time 1 to T. The second marginal is a stationary Gaussian measure. With an eye on applications, we show that its mean and covariance operator can be computed inductively. Finally, we use the LDP to establish various convergence results, both averaged and quenched.
Categories: math.PR
Companion paper:
arXiv:1407.2457 [math.PR] (Published 2014-07-09)
Asymptotic description of stochastic neural networks. I - existence of a Large Deviation Principle