arXiv Analytics

arXiv:2409.07137 [cs.LG]

Combined Optimization of Dynamics and Assimilation with End-to-End Learning on Sparse Observations

Vadim Zinchenko, David S. Greenberg

Published 2024-09-11 (Version 1)

Fitting nonlinear dynamical models to sparse and noisy observations is fundamentally challenging. Identifying dynamics requires data assimilation (DA) to estimate system states, but DA in turn requires an accurate dynamical model. To break this deadlock we present CODA, an end-to-end optimization scheme for jointly learning dynamics and DA directly from sparse and noisy observations. A neural network is trained to carry out accurate, efficient and parallel-in-time DA, while free parameters of the dynamical system are simultaneously optimized. We carry out end-to-end learning directly on observation data, introducing a novel learning objective that combines unrolled auto-regressive dynamics with the data- and self-consistency terms of weak-constraint 4DVar DA. By taking into account interactions between new and existing simulation components over multiple time steps, CODA can recover initial conditions, fit unknown dynamical parameters and learn neural network-based PDE terms to match both the available observations and the self-consistency constraints. In addition to facilitating end-to-end learning of dynamics and providing fast, amortized, non-sequential DA, CODA offers greater robustness to model misspecification than classical DA approaches.
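The weak-constraint 4DVar-style objective described above can be sketched in a few lines. The sketch below is illustrative, not the paper's implementation: the function name `coda_style_loss`, the plain NumPy trajectory representation, and the single weight `lam` on the self-consistency term are all assumptions; in CODA the trajectory would be proposed by a trained DA network and the loss minimized jointly over network and dynamical parameters.

```python
import numpy as np

def coda_style_loss(states, obs, obs_mask, step, theta, lam=1.0):
    """Illustrative weak-constraint 4DVar-style objective.

    states:   (T+1, d) candidate trajectory (in CODA, proposed by a NN)
    obs:      (T+1, d) observations (undefined where unobserved)
    obs_mask: (T+1, d) boolean, True where an observation exists (sparse)
    step:     callable (x_t, theta) -> x_{t+1}, the dynamical model
    theta:    free dynamical parameters being fitted
    lam:      weight of the self-consistency (model-error) term
    """
    # Data-consistency term: misfit to the sparse, noisy observations.
    data_term = np.sum((states[obs_mask] - obs[obs_mask]) ** 2)
    # Self-consistency term: each state should follow from the previous
    # one under the (partially unknown) dynamics.
    preds = np.array([step(x, theta) for x in states[:-1]])
    model_term = np.sum((states[1:] - preds) ** 2)
    return data_term + lam * model_term

# Toy usage: linear dynamics x_{t+1} = theta * x_t, observed every other step.
def step(x, theta):
    return theta * x

T, d, theta_true = 5, 3, 0.9
states = np.zeros((T + 1, d))
states[0] = np.ones(d)
for t in range(T):
    states[t + 1] = step(states[t], theta_true)
obs_mask = np.zeros((T + 1, d), dtype=bool)
obs_mask[::2] = True                      # sparse observations
obs = np.where(obs_mask, states, np.nan)  # unobserved entries never read
loss_true = coda_style_loss(states, obs, obs_mask, step, theta_true)
loss_bad = coda_style_loss(states, obs, obs_mask, step, 0.5)
```

With noise-free observations and the true parameter, `loss_true` is zero, while a wrong parameter inflates the self-consistency term; in the paper's setting both the trajectory and the parameters are optimized end-to-end against exactly this kind of combined objective.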

Comments: Submitted to Journal of Advances in Modeling Earth Systems (JAMES)
Categories: cs.LG, physics.ao-ph