arXiv:2006.10925 [cs.LG]

Gradient Descent in RKHS with Importance Labeling

Tomoya Murata, Taiji Suzuki

Published 2020-06-19 (Version 1)

Labeling cost is often expensive and is a fundamental limitation of supervised learning. In this paper, we study the importance labeling problem, in which we are given many unlabeled data points, select a limited number of them to be labeled, and then execute a learning algorithm on the selected subset. We propose a new importance labeling scheme and analyse the generalization error of gradient descent combined with our labeling scheme for least squares regression in Reproducing Kernel Hilbert Spaces (RKHS). We show that the proposed importance labeling leads to much better generalization ability than uniform labeling under near interpolation settings. Numerical experiments verify our theoretical findings.
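
Since the abstract only sketches the setup, the following is a minimal illustrative sketch of the pipeline it describes: sample a labeling budget from an unlabeled pool according to an importance distribution, then run gradient descent on the resulting kernel least squares problem. The Gaussian kernel, the ridge-leverage-score importance weights, and all parameter values here are assumptions made for illustration, not the scheme proposed in the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=10.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def importance_weights(K, lam=1e-3):
    """Illustrative importance scores: ridge leverage scores
    diag(K (K + lam I)^{-1}); a stand-in for the paper's scheme."""
    n = K.shape[0]
    return np.diag(K @ np.linalg.inv(K + lam * np.eye(n)))

rng = np.random.default_rng(0)
n_pool, budget = 500, 60                         # unlabeled pool, labeling budget
X_pool = rng.uniform(-1.0, 1.0, (n_pool, 1))
f_star = lambda x: np.sin(3 * np.pi * x[:, 0])   # unknown target function

# Select which points to label, sampling proportionally to importance.
scores = importance_weights(rbf_kernel(X_pool, X_pool))
p = scores / scores.sum()
labeled = rng.choice(n_pool, size=budget, replace=False, p=p)
X, y = X_pool[labeled], f_star(X_pool[labeled])  # only these points get labels

# Gradient descent on the kernel least squares objective
#   min_alpha  (1/2n) * ||K alpha - y||^2
K = rbf_kernel(X, X)
alpha = np.zeros(budget)
eta = budget / np.linalg.norm(K, 2) ** 2         # step size 1/L for this objective
for _ in range(2000):
    alpha -= eta * (K @ (K @ alpha - y)) / budget

# Generalization error, estimated on the full pool (noise-free here).
y_hat = rbf_kernel(X_pool, X) @ alpha
print("mean squared error on pool:", np.mean((y_hat - f_star(X_pool)) ** 2))
```

Swapping p for the uniform distribution (p = np.full(n_pool, 1 / n_pool)) reproduces the uniform labeling baseline that the abstract compares against.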

Related articles:
arXiv:2204.08809 [cs.LG] (Published 2022-04-19)
Making Progress Based on False Discoveries
arXiv:1905.05843 [cs.LG] (Published 2019-05-14)
Task-Driven Data Verification via Gradient Descent
arXiv:1903.11680 [cs.LG] (Published 2019-03-27)
Gradient Descent with Early Stopping is Provably Robust to Label Noise for Overparameterized Neural Networks