arXiv:2506.20623 [cs.LG]

Lost in Retraining: Roaming the Parameter Space of Exponential Families Under Closed-Loop Learning

Fariba Jangjoo, Matteo Marsili, Yasser Roudi

Published 2025-06-25 (Version 1)

Closed-loop learning is the process of repeatedly estimating a model from data generated by the model itself. It has received considerable attention owing to the possibility that large neural network models may, in the future, be trained primarily on data generated by artificial neural networks themselves. We study this process for models belonging to exponential families, deriving the equations of motion that govern the dynamics of the parameters. We show that maximum likelihood estimation of the parameters endows the sufficient statistics with the martingale property and that, as a result, the process converges to absorbing states that amplify initial biases present in the data. However, we show that this outcome can be prevented by polluting the data with an infinitesimal fraction of data points generated from a fixed model, by relying on maximum a posteriori estimation, or by introducing regularisation. Furthermore, we show that the asymptotic behaviour of the dynamics is not reparametrisation invariant.
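As an illustration of the mechanism the abstract describes (this is a minimal sketch, not code from the paper), the snippet below simulates closed-loop maximum-likelihood retraining for a Bernoulli model, the simplest exponential family. Under MLE the parameter, being the mean of the sufficient statistic, is a martingale and is eventually absorbed at 0 or 1; the pseudo-count regulariser `alpha`, `beta` is a hypothetical stand-in for the MAP/regularisation remedies mentioned in the abstract and keeps the dynamics away from the boundaries.

```python
# Illustrative sketch (assumptions, not the paper's method): closed-loop
# retraining of a Bernoulli(theta) model. Each generation, data are sampled
# from the current model and theta is re-estimated from that data alone.
import numpy as np

rng = np.random.default_rng(0)

def closed_loop(theta0, n_samples, n_generations, alpha=0.0, beta=0.0):
    """Iterate model -> data -> re-estimated model.

    alpha, beta are hypothetical pseudo-counts (a Beta-prior-style
    regulariser); with alpha = beta = 0 the update is plain MLE, whose
    martingale property drives absorption at theta = 0 or 1.
    """
    theta = theta0
    trajectory = [theta]
    for _ in range(n_generations):
        k = rng.binomial(n_samples, theta)                # data from current model
        theta = (k + alpha) / (n_samples + alpha + beta)  # MLE or regularised update
        trajectory.append(theta)
    return np.array(trajectory)

mle_run = closed_loop(0.5, n_samples=100, n_generations=2000)
reg_run = closed_loop(0.5, n_samples=100, n_generations=2000, alpha=1.0, beta=1.0)
print("MLE final theta:        ", mle_run[-1])  # typically absorbed at 0.0 or 1.0
print("Regularised final theta:", reg_run[-1])  # keeps fluctuating in (0, 1)
```

Since E[theta_{t+1} | theta_t] = theta_t under the MLE update, the trajectory is a bounded martingale; it converges almost surely, and the only states with zero sampling variance are the absorbing boundaries, which is the bias-amplification effect the abstract refers to.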

Related articles:
arXiv:2302.07384 [cs.LG] (Published 2023-02-14)
The Geometry of Neural Nets' Parameter Spaces Under Reparametrization
arXiv:2002.04632 [cs.LG] (Published 2020-02-11)
Differentiating the Black-Box: Optimization with Local Generative Surrogates
arXiv:1207.4131 [cs.LG] (Published 2012-07-11)
Exponential Families for Conditional Random Fields