
arXiv:2205.06798 [cs.LG]

Sharp Asymptotics of Kernel Ridge Regression Beyond the Linear Regime

Hong Hu, Yue M. Lu

Published 2022-05-13 (Version 1)

The generalization performance of kernel ridge regression (KRR) exhibits a multi-phase pattern that crucially depends on the scaling relationship between the sample size $n$ and the underlying dimension $d$. This phenomenon arises because KRR sequentially learns functions of increasing complexity as the sample size grows: when $d^{k-1}\ll n\ll d^{k}$, only polynomials of degree less than $k$ are learned. In this paper, we present a sharp asymptotic characterization of the performance of KRR in the critical transition regions where $n \asymp d^k$, for $k\in\mathbb{Z}^{+}$. Our asymptotic characterization provides a precise picture of the whole learning process and clarifies the impact of various parameters (including the choice of the kernel function) on the generalization performance. In particular, we show that the learning curves of KRR can exhibit a delicate "double descent" behavior due to specific bias-variance trade-offs at different polynomial scaling regimes.
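To make the staircase picture above concrete, here is a minimal numerical sketch, not the paper's setup or code: plain KRR with a smooth inner-product kernel $k(x,z)=e^{\langle x,z\rangle}$ on the sphere, fit to a degree-2 polynomial target. Under these assumptions the test error should drop substantially only once $n$ becomes comparable to $d^2$, consistent with the $d^{k-1}\ll n\ll d^{k}$ regimes described in the abstract. The kernel choice, ridge level `lam`, dimension `d`, and all function names are illustrative assumptions.

```python
import numpy as np

# Hypothetical illustration of the polynomial-regime ("staircase") behavior
# described in the abstract. Small d keeps the experiment fast; the kernel
# and scalings are assumptions for illustration, not the paper's setup.

rng = np.random.default_rng(0)

def sphere_samples(n, d):
    """Draw n points uniformly on the unit sphere S^{d-1}."""
    x = rng.standard_normal((n, d))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def kernel(X, Z):
    """A smooth inner-product kernel k(x, z) = exp(<x, z>)."""
    return np.exp(X @ Z.T)

def krr_test_error(n, d, target, lam=1e-3, n_test=2000):
    """Relative test MSE of KRR with n training points in dimension d."""
    X, Xt = sphere_samples(n, d), sphere_samples(n_test, d)
    y, yt = target(X), target(Xt)
    # Closed-form KRR: alpha = (K + lam I)^{-1} y, f(x) = k(x, X) alpha.
    alpha = np.linalg.solve(kernel(X, X) + lam * np.eye(n), y)
    pred = kernel(Xt, X) @ alpha
    return np.mean((pred - yt) ** 2) / np.mean(yt ** 2)

d = 20
quadratic = lambda X: d * X[:, 0] * X[:, 1]  # degree-2 target, O(1) variance
for n in [d, 2 * d, d**2 // 2, d**2, 2 * d**2]:
    err = krr_test_error(n, d, quadratic)
    print(f"n = {n:5d} (n/d^2 = {n / d**2:4.2f}): relative test MSE = {err:.3f}")
```

In this sketch the relative error stays near 1 while $n$ is on the order of $d$ (the degree-2 component is not yet learnable) and decreases as $n$ approaches and passes $d^2$; the paper's results characterize exactly what happens at the transition $n \asymp d^k$.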

Related articles:
arXiv:2009.09136 [cs.LG] (Published 2020-09-19)
Kernel Ridge Regression Using Importance Sampling with Application to Seismic Response Prediction
arXiv:2306.07737 [cs.LG] (Published 2023-06-13)
Robustness and Generalization Performance of Deep Learning Models on Cyber-Physical Systems: A Comparative Study
arXiv:2401.01270 [cs.LG] (Published 2024-01-02)
Optimal Rates of Kernel Ridge Regression under Source Condition in Large Dimensions