arXiv Analytics

arXiv:2302.07384 [cs.LG]

The Geometry of Neural Nets' Parameter Spaces Under Reparametrization

Agustinus Kristiadi, Felix Dangel, Philipp Hennig

Published 2023-02-14 (Version 1)

Model reparametrization -- transforming the parameter space via a bijective differentiable map -- is a popular way to improve the training of neural networks. But reparametrizations have also been problematic since they induce inconsistencies in, e.g., Hessian-based flatness measures, optimization trajectories, and modes of probability density functions. This complicates downstream analyses: e.g., one cannot make a definitive statement about the connection between flatness and generalization. In this work, we study the invariant quantities of neural nets under reparametrization from the perspective of Riemannian geometry. We show that this notion of invariance is an inherent property of any neural net, as long as one acknowledges the assumptions about the metric that is always present, albeit often implicitly, and uses the correct transformation rules under reparametrization. We discuss implications for measuring the flatness of minima, for optimization, and for probability-density maximization, along with applications in studying the biases of optimizers and in Bayesian inference.
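The inconsistency the abstract describes can be illustrated numerically. The following sketch (an assumption-laden toy, not the paper's construction) uses a hypothetical 1-D loss with a minimum at theta = 2 and the bijective reparametrization theta = exp(phi): the raw Hessian at the minimum changes under the map, but once the implicitly assumed Euclidean metric is transformed by the same Jacobian rule, the metric-aware curvature agrees in both coordinate systems.

```python
import numpy as np

# Minimal sketch (hypothetical example, not from the paper): a toy 1-D
# loss whose minimum sits at theta* = 2, and the exponential
# reparametrization theta = exp(phi).
def loss(theta):
    return (theta - 2.0) ** 2

def second_derivative(f, x, eps=1e-4):
    # Central finite-difference approximation of f''(x).
    return (f(x + eps) - 2.0 * f(x) + f(x - eps)) / eps**2

theta_star = 2.0
H_theta = second_derivative(loss, theta_star)  # ~2.0

# The same loss in phi-coordinates; the minimum moves to phi* = log(2).
loss_phi = lambda phi: loss(np.exp(phi))
phi_star = np.log(theta_star)
H_phi = second_derivative(loss_phi, phi_star)  # ~8.0: raw "flatness" changed

# At a minimum the Hessian transforms as H_phi = J^T H_theta J, with
# J = d theta / d phi. Transforming the (implicitly assumed) Euclidean
# metric g_theta = 1 by the same rule gives g_phi = J^2, and the
# metric-aware curvature g^{-1} H agrees in both coordinates.
J = np.exp(phi_star)          # Jacobian of the map at the minimum (= 2.0)
flat_theta = H_theta / 1.0    # ~2.0
flat_phi = H_phi / J**2       # ~2.0 again: invariant once the metric transforms
```

The design point mirrors the abstract: the Hessian alone is not reparametrization-invariant, but the pair (metric, Hessian) with the correct transformation rules is.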

Related articles:
arXiv:2403.07379 [cs.LG] (Published 2024-03-12)
Hallmarks of Optimization Trajectories in Neural Networks and LLMs: The Lengths, Bends, and Dead Ends
arXiv:2506.20623 [cs.LG] (Published 2025-06-25)
Lost in Retraining: Roaming the Parameter Space of Exponential Families Under Closed-Loop Learning
arXiv:2002.04632 [cs.LG] (Published 2020-02-11)
Differentiating the Black-Box: Optimization with Local Generative Surrogates