arXiv Analytics

arXiv:2001.03813 [cs.LG]

Fundamental Limits of Online Learning: An Entropic-Innovations Viewpoint

Song Fang, Quanyan Zhu

Published 2020-01-12 (Version 1)

In this paper, we examine the fundamental performance limitations of online machine learning by viewing the online learning problem as a prediction problem with causal side information. To this end, we combine the entropic analysis from information theory with the innovations approach from prediction theory to derive generic lower bounds on the prediction errors, as well as the conditions under which the bounds are achieved. In general, no restrictions need to be imposed on the learning algorithms or on the distributions of the data points for the performance bounds to be valid. In particular, supervised learning, semi-supervised learning, and unsupervised learning can all be analyzed within this framework. We also investigate the implications of these results for the fundamental limits of generalization.
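To give the flavor of such entropic bounds, here is a minimal sketch assuming squared-error loss; the notation (x for the quantity to be predicted, \hat{x} for the learner's prediction, \mathcal{I} for the causal side information, e.g., past data points) is introduced here for illustration and the exact statement in the paper may differ. The classical entropy-power argument gives, for any predictor,

\[
\mathbb{E}\big[(x - \hat{x})^2\big] \;\ge\; \frac{1}{2\pi e}\, 2^{2 h(x \mid \mathcal{I})},
\]

where h(x | \mathcal{I}) is the conditional differential entropy of x given \mathcal{I} in bits. The bound holds regardless of the algorithm producing \hat{x}, and is tight when the innovation x - \hat{x} is Gaussian with variance independent of \mathcal{I}, which matches the abstract's claim that the bounds require no restrictions on the learning algorithm or the data distribution.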

Related articles:
arXiv:2201.12700 [cs.LG] (Published 2022-01-30)
Coordinated Attacks against Contextual Bandits: Fundamental Limits and Defense Mechanisms
arXiv:2208.03567 [cs.LG] (Published 2022-08-06)
On the Fundamental Limits of Formally (Dis)Proving Robustness in Proof-of-Learning
Congyu Fang et al.
arXiv:1311.3494 [cs.LG] (Published 2013-11-14, updated 2014-10-28)
Fundamental Limits of Online and Distributed Algorithms for Statistical Learning and Estimation