arXiv:2007.06827 [stat.ML]

Early stopping and polynomial smoothing in regression with reproducing kernels

Yaroslav Averyanov, Alain Celisse

Published 2020-07-14 (Version 1)

In this paper, we study the problem of early stopping for iterative learning algorithms in a reproducing kernel Hilbert space (RKHS) within the nonparametric regression framework. In particular, we work with gradient descent and (iterative) kernel ridge regression. We present a data-driven rule, based on the so-called minimum discrepancy principle, that performs early stopping without a validation set. The method relies on a single assumption on the regression function: that it belongs to the RKHS. The proposed rule is proved to be minimax optimal over different types of kernel spaces, including finite-rank kernels and Sobolev smoothness classes. The proof rests on a fixed-point analysis of localized Rademacher complexities, a standard technique for obtaining optimal rates in the nonparametric regression literature. In addition, we present simulation results on artificial datasets showing that the designed rule performs comparably to other stopping rules, such as the one determined by V-fold cross-validation.
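
Since the abstract describes the stopping rule only in words, the following minimal Python sketch illustrates the general idea of the minimum discrepancy principle for early-stopped kernel gradient descent: iterate, and stop the first time the empirical residual drops to the noise level. This is not the paper's implementation; the Gaussian kernel, bandwidth, step size, toy data, and the noise variance sigma2 supplied by the caller are illustrative assumptions.

# Minimum discrepancy principle for kernel gradient descent (illustrative sketch).
import numpy as np


def gaussian_kernel(X, Z, bandwidth=0.5):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))


def kernel_gd_minimum_discrepancy(X, y, sigma2, step=None, max_iter=50000):
    """Kernel gradient descent f_{t+1} = f_t + step * K (y - f_t), stopped at the
    first iteration where the empirical residual ||y - f_t||_n^2 <= sigma2."""
    n = len(y)
    K = gaussian_kernel(X, X) / n                 # empirically normalized Gram matrix
    if step is None:
        step = 1.0 / np.linalg.eigvalsh(K).max()  # constant step size <= 1 / lambda_max
    alpha = np.zeros(n)                           # f_t(x_i) = (K alpha)_i
    for t in range(1, max_iter + 1):
        residual = y - K @ alpha
        if residual @ residual / n <= sigma2:     # discrepancy has reached the noise level
            break
        alpha += step * residual                  # gradient step in the RKHS geometry
    return alpha, t


# Toy usage with a known noise standard deviation of 0.3.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = np.sin(3.0 * X[:, 0]) + 0.3 * rng.standard_normal(50)
alpha, t_stop = kernel_gd_minimum_discrepancy(X, y, sigma2=0.3 ** 2)
print("stopped at iteration", t_stop)

In practice the noise level is usually unknown and must itself be estimated from data; the sketch side-steps this by taking sigma2 as an input.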

Related articles: Most relevant | Search more
arXiv:2301.11556 [stat.ML] (Published 2023-01-27)
Conformal inference is (almost) free for neural networks trained with early stopping
arXiv:2207.08406 [stat.ML] (Published 2022-07-18)
Kullback-Leibler and Renyi divergences in reproducing kernel Hilbert space and Gaussian process settings
arXiv:2402.04613 [stat.ML] (Published 2024-02-07)
Wasserstein Gradient Flows for Moreau Envelopes of f-Divergences in Reproducing Kernel Hilbert Spaces