arXiv Analytics

arXiv:2006.02250 [cs.LG]

dynoNet: a neural network architecture for learning dynamical systems

Marco Forgione, Dario Piga

Published 2020-06-03 (Version 1)

This paper introduces a network architecture, called dynoNet, utilizing linear dynamical operators as elementary building blocks. Owing to the dynamical nature of these blocks, dynoNet networks are tailored for sequence modeling and system identification. The back-propagation behavior of the linear dynamical operator with respect to both its parameters and its input sequence is defined. This enables end-to-end training of structured networks containing linear dynamical operators and other differentiable units, exploiting existing deep learning software. Examples show the effectiveness of the proposed approach on well-known system identification benchmarks.
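
The elementary block is a linear dynamical operator, i.e. a rational transfer function G(q) = b(q)/a(q) with learnable numerator and denominator coefficients, combined with other differentiable units such as static nonlinearities. The following is a minimal, illustrative PyTorch sketch of such a block, not the authors' dynoNet library: the IIR recursion is simply unrolled so that standard autograd back-propagates through both the coefficients and the input sequence, whereas the paper derives a dedicated, more efficient back-propagation rule for the operator. The class name LinearDynamicalOperator and the toy Wiener-style composition below are assumptions made for illustration.

import torch
import torch.nn as nn


class LinearDynamicalOperator(nn.Module):
    # Illustrative SISO IIR filter (not the paper's implementation):
    # y[t] = sum_k b[k] * u[t-k] - sum_j a[j+1] * y[t-1-j], with a monic
    # denominator (a_0 = 1 implicit).
    def __init__(self, n_b=3, n_a=2):
        super().__init__()
        self.b = nn.Parameter(0.01 * torch.randn(n_b))  # numerator coefficients b_0 ... b_{n_b-1}
        self.a = nn.Parameter(0.01 * torch.randn(n_a))  # denominator coefficients a_1 ... a_{n_a}

    def forward(self, u):
        # u: (batch, seq_len) input sequences
        batch, T = u.shape
        y = []
        for t in range(T):
            # moving-average part: b_0 u[t] + b_1 u[t-1] + ...
            acc = sum(self.b[k] * u[:, t - k] for k in range(self.b.numel()) if t - k >= 0)
            # autoregressive part: - a_1 y[t-1] - a_2 y[t-2] - ...
            acc = acc - sum(self.a[j] * y[t - 1 - j] for j in range(self.a.numel()) if t - 1 - j >= 0)
            y.append(acc)
        return torch.stack(y, dim=1)


# Illustrative Wiener-style composition: linear dynamics followed by a static nonlinearity.
G = LinearDynamicalOperator(n_b=3, n_a=2)
F = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))

u = torch.randn(8, 100)                      # batch of 8 input sequences, length 100
y_lin = G(u)                                 # filtered sequences, shape (8, 100)
y_hat = F(y_lin.unsqueeze(-1)).squeeze(-1)   # static nonlinearity applied sample-wise

loss = y_hat.pow(2).mean()                   # dummy loss, standing in for a fit criterion
loss.backward()                              # gradients reach b, a and the weights of F

In practice the per-sample Python loop would be replaced by a batched filtering routine, which is precisely where the operator-level forward and backward computations described in the paper become advantageous.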

Related articles:
arXiv:2106.04546 [cs.LG] (Published 2021-06-08)
LEADS: Learning Dynamical Systems that Generalize Across Environments
arXiv:2205.09459 [cs.LG] (Published 2022-05-19)
Neural Network Architecture Beyond Width and Depth
arXiv:1905.08300 [cs.LG] (Published 2019-05-20)
A Neural Network Architecture for Learning Word-Referent Associations in Multiple Contexts