arXiv:2109.13359 [cs.LG]

Lyapunov-Net: A Deep Neural Network Architecture for Lyapunov Function Approximation

Nathan Gaby, Fumin Zhang, Xiaojing Ye

Published 2021-09-27, updated 2022-08-17 (version 2)

We develop a versatile deep neural network architecture, called Lyapunov-Net, to approximate Lyapunov functions of dynamical systems in high dimensions. Lyapunov-Net guarantees positive definiteness by construction, so it can be trained to satisfy the negative orbital derivative condition alone, and the empirical risk function in practice contains only a single term. This significantly reduces the number of hyperparameters compared to existing methods. We also provide theoretical justification of the approximation power of Lyapunov-Net and its complexity bounds. We demonstrate the efficiency of the proposed method on nonlinear dynamical systems involving up to 30-dimensional state spaces, and show that the proposed approach significantly outperforms state-of-the-art methods.
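To make the idea concrete, here is a minimal sketch of how a candidate that is positive definite by construction can be trained with a single-term risk on the orbital derivative, assuming PyTorch. The base network phi, the augmentation term delta * ||x - x_eq||, the margin, and the toy dynamics f(x) = -x are illustrative assumptions, not the paper's exact architecture or training setup.

```python
import torch
import torch.nn as nn

class LyapunovNetSketch(nn.Module):
    # Illustrative positive-definite candidate (hypothetical form, not
    # necessarily the paper's exact construction):
    #   V(x) = |phi(x) - phi(x_eq)|^2 + delta * ||x - x_eq||
    # so V(x_eq) = 0 and V(x) > 0 for all x != x_eq by construction.
    def __init__(self, dim, width=64, delta=0.1):
        super().__init__()
        self.phi = nn.Sequential(
            nn.Linear(dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )
        self.delta = delta
        self.register_buffer("x_eq", torch.zeros(dim))  # assume equilibrium at origin

    def forward(self, x):
        r = self.phi(x) - self.phi(self.x_eq.unsqueeze(0))
        return (r ** 2).sum(-1) + self.delta * (x - self.x_eq).norm(dim=-1)

def empirical_risk(V, f, x, margin=0.0):
    # Single-term risk: penalize points where the orbital derivative
    # dV/dt = grad V(x) . f(x) fails to be sufficiently negative.
    x = x.detach().requires_grad_(True)
    v = V(x)
    grad_v = torch.autograd.grad(v.sum(), x, create_graph=True)[0]
    orbital = (grad_v * f(x)).sum(-1)
    return torch.relu(orbital + margin).mean()

# Usage on a toy 2-D system dx/dt = -x (hypothetical example):
if __name__ == "__main__":
    torch.manual_seed(0)
    V = LyapunovNetSketch(dim=2)
    f = lambda x: -x
    opt = torch.optim.Adam(V.parameters(), lr=1e-3)
    for _ in range(200):
        x = 4 * torch.rand(256, 2) - 2  # sample the region of interest
        loss = empirical_risk(V, f, x, margin=0.01)
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Because V is nonnegative and vanishes only at the assumed equilibrium by construction, the risk above needs no separate positivity penalty, which is the hyperparameter reduction the abstract refers to.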

Related articles
arXiv:1912.10382 [cs.LG] (Published 2019-12-22)
Deep Learning via Dynamical Systems: An Approximation Perspective
arXiv:1906.09088 [cs.LG] (Published 2019-06-21)
Meta-Model Framework for Surrogate-Based Parameter Estimation in Dynamical Systems
arXiv:2408.06465 [cs.LG] (Published 2024-08-12)
Kernel Sum of Squares for Data Adapted Kernel Learning of Dynamical Systems from Data: A global optimization approach