arXiv:cs/0512059 [cs.LG]

Competing with wild prediction rules

Vladimir Vovk

Published 2005-12-14, updated 2006-01-25 (version 2)

We consider the problem of on-line prediction competitive with a benchmark class of continuous but highly irregular prediction rules. It is known that if the benchmark class is a reproducing kernel Hilbert space, there exists a prediction algorithm whose average loss over the first N examples does not exceed the average loss of any prediction rule in the class plus a "regret term" of O(N^(-1/2)). The elements of some natural benchmark classes, however, are so irregular that these classes are not Hilbert spaces. In this paper we develop Banach-space methods to construct a prediction algorithm with a regret term of O(N^(-1/p)), where p is in [2,infinity) and p-2 reflects the degree to which the benchmark class fails to be a Hilbert space.
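
As a rough sketch of the guarantee described above (the symbols L_N, \mathcal{F}, and c_D below are illustrative and are not taken from the paper), the bound has the form

\[
  \frac{1}{N}\,L_N(\text{algorithm})
    \;\le\;
  \frac{1}{N}\,L_N(D) \;+\; c_D\,N^{-1/p}
  \qquad \text{for every prediction rule } D \in \mathcal{F},\quad p \in [2,\infty),
\]

where L_N denotes cumulative loss over the first N examples; taking p = 2 recovers the O(N^{-1/2}) rate available when \mathcal{F} is a reproducing kernel Hilbert space.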

Comments: 28 pages, 3 figures
Categories: cs.LG
Subjects: I.2.6
Related articles:
arXiv:cs/0506041 [cs.LG] (Published 2005-06-11, updated 2005-09-02)
Competitive on-line learning with a convex loss function
arXiv:1804.04503 [cs.LG] (Published 2018-04-11)
When optimizing nonlinear objectives is no harder than linear objectives
arXiv:1801.09821 [cs.LG] (Published 2018-01-30)
Learning to Emulate an Expert Projective Cone Scheduler