arXiv:1701.00831 [cs.LG]

Collapsing of dimensionality

Marco Gori, Marco Maggini, Alessandro Rossi

Published 2017-01-03Version 1

We analyze a new approach to machine learning that modifies classical regularization networks by casting the process in the time dimension, leading to a sort of collapse of dimensionality in the problem of learning the model parameters. This approach allows the definition of an online learning algorithm that progressively accumulates the knowledge provided in the input trajectory. The regularization principle leads to a solution based on a dynamical system, paired with a procedure that develops a graph structure storing the input regularities acquired from the temporal evolution. We report an extensive experimental exploration of the behavior of the parameters of the proposed model and an evaluation on artificial datasets.
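The paper's actual equations are not reproduced on this page, but the core idea of casting regularized learning in the time dimension can be illustrated with a generic sketch: discretize the gradient flow of a regularized loss with Euler steps, so each sample of the input trajectory updates the parameters online instead of entering a batch optimization. The update rule, names, and constants below are assumptions for illustration, not the authors' model.

```python
# Hypothetical sketch: online learning as an Euler-discretized gradient flow
# of a regularized squared loss (NOT the paper's actual dynamical system).

def online_step(w, x, y, lr=0.01, lam=0.1):
    """One Euler step of dw/dt = -(loss gradient + lam * w) on one sample."""
    pred = sum(wi * xi for wi, xi in zip(w, x))   # linear model prediction
    err = pred - y
    grad = [err * xi + lam * wi for xi, wi in zip(x, w)]  # loss + L2 term
    return [wi - lr * gi for wi, gi in zip(w, grad)]

# Process an input trajectory one sample at a time: knowledge accumulates
# in the weights as the trajectory unfolds, with no batch problem to solve.
w = [0.0, 0.0]
trajectory = [([1.0, 0.0], 1.0), ([0.0, 1.0], -1.0)] * 50
for x, y in trajectory:
    w = online_step(w, x, y)
```

After traversing the trajectory, the weights have moved toward the targets associated with each direction, while the regularization term keeps them bounded.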

Related articles: Most relevant | Search more
arXiv:1301.2269 [cs.LG] (Published 2013-01-10)
Learning the Dimensionality of Hidden Variables
arXiv:2007.11604 [cs.LG] (Published 2020-07-22)
Understanding the temporal evolution of COVID-19 research through machine learning and natural language processing
arXiv:1802.10123 [cs.LG] (Published 2018-02-27)
Latent-space Physics: Towards Learning the Temporal Evolution of Fluid Flow