arXiv Analytics

arXiv:2009.08889 [cond-mat.dis-nn]

Large Deviation Approach to Random Recurrent Neuronal Networks: Rate Function, Parameter Inference, and Activity Prediction

Alexander van Meegen, Tobias Kühn, Moritz Helias

Published 2020-09-18 (Version 1)

Statistical field theory captures the collective non-equilibrium dynamics of neuronal networks, but it does not address the inverse problem of finding the connectivity that implements a desired dynamics. Here we show, for an analytically solvable network model, that the effective action in statistical field theory is identical to the rate function in large deviation theory; using field-theoretical methods we derive this rate function. It takes the form of a Kullback-Leibler divergence and enables data-driven inference of model parameters and Bayesian prediction of time series.
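As a hedged illustration of what such a statement typically looks like (the notation below is assumed for orientation, not taken from the paper): if $\hat Q$ denotes the empirical statistics of the activity of a network of $N$ units and $Q_\theta$ the statistics predicted by the model with parameters $\theta$, a large-deviation principle with a Kullback-Leibler rate function reads

$$
P(\hat Q) \asymp e^{-N\, I(\hat Q)}, \qquad
I(\hat Q) = D_{\mathrm{KL}}\!\left(\hat Q \,\middle\|\, Q_\theta\right),
$$

so that parameter inference amounts to minimizing the rate function over the parameters, $\hat\theta = \operatorname{arg\,min}_\theta D_{\mathrm{KL}}(\hat Q \,\|\, Q_\theta)$.

A minimal sketch of this inference principle on a toy zero-mean Gaussian model (the Gaussian setting and all names here are illustrative assumptions, not the paper's network model):

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for rate-function minimization: infer the variance g2 of a
# zero-mean Gaussian "activity" model by minimizing the KL divergence
# between the empirically fitted Gaussian and the model Gaussian.
rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.5, size=10_000)  # synthetic activity samples
emp_var = data.var()

def kl_gauss(log_g2):
    """KL( N(0, emp_var) || N(0, g2) ) between zero-mean Gaussians."""
    g2 = np.exp(log_g2)  # optimize in log-space to keep g2 > 0
    return 0.5 * (emp_var / g2 - 1.0 + np.log(g2 / emp_var))

res = minimize(kl_gauss, x0=np.array([0.0]))
print(f"inferred variance: {np.exp(res.x[0]):.3f}, empirical: {emp_var:.3f}")
```

Minimizing a KL divergence of this type coincides with maximum-likelihood estimation; the Bayesian time-series prediction mentioned in the abstract would build a posterior on top of such a likelihood.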

Related articles:
arXiv:1801.03726 [cond-mat.dis-nn] (Published 2018-01-11)
Large deviation theory for diluted Wishart random matrices
arXiv:1901.05235 [cond-mat.dis-nn] (Published 2019-01-16)
Large deviations of the length of the longest increasing subsequence of random permutations and random walks
arXiv:2405.10761 [cond-mat.dis-nn] (Published 2024-05-17)
Critical feature learning in deep neural networks