arXiv Analytics

arXiv:1208.0472 [math.PR]

On the form of the large deviation rate function for the empirical measures of weakly interacting systems

Markus Fischer

Published 2012-08-02, updated 2014-10-16 (version 4)

A basic result of large deviations theory is Sanov's theorem, which states that the sequence of empirical measures of independent and identically distributed samples satisfies the large deviation principle with rate function given by relative entropy with respect to the common distribution. Large deviation principles for the empirical measures are also known to hold for broad classes of weakly interacting systems. When the interaction through the empirical measure corresponds to an absolutely continuous change of measure, the rate function can be expressed as relative entropy of a distribution with respect to the law of the McKean-Vlasov limit with measure-variable frozen at that distribution. We discuss situations, beyond that of tilted distributions, in which a large deviation principle holds with rate function in relative entropy form.
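As orientation (a standard formulation, not quoted from the paper itself), the two rate functions described in the abstract can be written as follows. Sanov's theorem gives, for the empirical measures of i.i.d. samples with common law $\nu$,

```latex
\[
  I_{\mathrm{Sanov}}(\mu) \;=\; R(\mu \,\|\, \nu) \;=\;
  \begin{cases}
    \displaystyle\int \log\frac{d\mu}{d\nu}\, d\mu, & \mu \ll \nu,\\[4pt]
    +\infty, & \text{otherwise,}
  \end{cases}
\]
while in the weakly interacting case the relative entropy form of the rate function reads
\[
  I(\mu) \;=\; R\bigl(\mu \,\|\, P(\mu)\bigr),
\]
where $P(\mu)$ denotes the law of the McKean--Vlasov limit dynamics with the measure variable frozen at $\mu$.
```

The notation $R(\cdot\,\|\,\cdot)$ for relative entropy and $P(\mu)$ for the frozen McKean--Vlasov law is chosen here for illustration and may differ from the symbols used in the paper.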

Comments: Published in Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm), at http://dx.doi.org/10.3150/13-BEJ540
Journal: Bernoulli 2014, Vol. 20, No. 4, 1765-1801
Categories: math.PR
Subjects: 60F10, 60K35, 94A17