arXiv Analytics

arXiv:1703.05060 [cs.LG]

Online Learning for Distribution-Free Prediction

Dave Zachariah, Petre Stoica, Thomas B. Schön

Published 2017-03-15, Version 1

We develop an online learning method for prediction, which is important in problems with large and/or streaming data sets. We formulate the learning approach using a covariance-fitting methodology, and show that the resulting predictor has desirable computational and distribution-free properties: It is implemented online with a runtime that scales linearly in the number of samples; has a constant memory requirement; avoids local minima problems; and prunes away redundant feature dimensions without relying on restrictive assumptions on the data distribution. In conjunction with the split conformal approach, it also produces distribution-free prediction confidence intervals in a computationally efficient manner. The method is demonstrated on both real and synthetic datasets.
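To make the two computational claims concrete, the sketch below is a minimal, hypothetical Python illustration: a generic recursive ridge regressor (a stand-in, not the paper's covariance-fitting predictor) maintains fixed-size sufficient statistics, so memory is constant and each update costs O(p^2), giving a total runtime linear in the number of samples; a split conformal step then turns held-out residuals into a distribution-free prediction interval. All class names, parameters, and data are illustrative assumptions.

```python
import numpy as np

# Generic online ridge regression used as a stand-in predictor
# (NOT the paper's covariance-fitting method). Memory is constant:
# only a p x p matrix and a length-p vector are stored.
class OnlineRidge:
    def __init__(self, p, lam=1.0):
        self.A = lam * np.eye(p)   # running X^T X + lam * I
        self.b = np.zeros(p)       # running X^T y

    def update(self, x, y):
        # O(p^2) per sample, independent of how many samples were seen.
        self.A += np.outer(x, x)
        self.b += y * x

    def predict(self, x):
        w = np.linalg.solve(self.A, self.b)
        return x @ w

rng = np.random.default_rng(0)
p, n = 5, 1000
w_true = rng.normal(size=p)

model = OnlineRidge(p)
cal_residuals = []

for t in range(n):
    x = rng.normal(size=p)
    y = x @ w_true + rng.normal(scale=0.5)
    if t < n // 2:
        model.update(x, y)                               # first half: fit online
    else:
        cal_residuals.append(abs(y - model.predict(x)))  # second half: calibrate

# Split conformal interval: the ceil((m+1)(1-alpha))-th smallest calibration
# residual gives a half-width with finite-sample marginal coverage,
# regardless of the data distribution.
alpha = 0.1
m = len(cal_residuals)
k = min(int(np.ceil((m + 1) * (1 - alpha))), m)
q = np.sort(cal_residuals)[k - 1]

x_new = rng.normal(size=p)
y_hat = model.predict(x_new)
print(f"{1 - alpha:.0%} prediction interval: [{y_hat - q:.3f}, {y_hat + q:.3f}]")
```

The design point the abstract emphasizes is that the conformal half-width q is computed from held-out residuals only, so the coverage guarantee does not depend on the form of the predictor or on distributional assumptions about the data.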

Related articles:
arXiv:1508.00842 [cs.LG] (Published 2015-08-04)
Perceptron like Algorithms for Online Learning to Rank
arXiv:1711.03343 [cs.LG] (Published 2017-11-09)
Analysis of Dropout in Online Learning
arXiv:1810.09666 [cs.LG] (Published 2018-10-23)
Online learning with feedback graphs and switching costs