arXiv Analytics

arXiv:2406.03171 [stat.ML]

High-Dimensional Kernel Methods under Covariate Shift: Data-Dependent Implicit Regularization

Yihang Chen, Fanghui Liu, Taiji Suzuki, Volkan Cevher

Published 2024-06-05 (Version 1)

This paper studies kernel ridge regression in high dimensions under covariate shift and analyzes the role of importance re-weighting. We first derive the asymptotic expansion of high-dimensional kernels under covariate shift. Via a bias-variance decomposition, we theoretically demonstrate that the re-weighting strategy reduces the variance. For the bias, we analyze regularization at arbitrary or well-chosen scales, showing that the bias can behave very differently across regularization scales. In our analysis, the bias and variance are characterized by the spectral decay of a data-dependent regularized kernel: the original kernel matrix combined with an additional re-weighting matrix. The re-weighting strategy can thus be regarded as a form of data-dependent regularization, which aids interpretation. In addition, our analysis provides an asymptotic expansion of kernel functions/vectors under covariate shift, which is of independent interest.
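
To make the re-weighting concrete, here is a minimal numerical sketch in Python with NumPy (the Gaussian source/target setup, the known density ratio, and all names are illustrative assumptions, not the paper's code). It fits importance-weighted kernel ridge regression and shows where the "data-dependent regularized kernel" appears: the weighted estimator solves a linear system in K + lambda * W^{-1}, i.e., the original kernel matrix combined with a re-weighting matrix, as described in the abstract.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative covariate shift: source P and target Q are 1-D Gaussians
    # with different means; the density ratio w(x) = q(x)/p(x) is known here.
    n, d = 200, 1
    mu_p, mu_q, sigma = 0.0, 1.0, 1.0
    X = rng.normal(mu_p, sigma, size=(n, d))             # source covariates
    y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=n)   # noisy labels

    def density_ratio(x):
        # Importance weights w(x) = q(x) / p(x) for the Gaussian pair above.
        log_q = -0.5 * ((x - mu_q) / sigma) ** 2
        log_p = -0.5 * ((x - mu_p) / sigma) ** 2
        return np.exp(log_q - log_p)

    def rbf_kernel(A, B, gamma=1.0):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)

    lam = 1e-2
    K = rbf_kernel(X, X)
    w = density_ratio(X[:, 0])

    # Importance-weighted KRR: minimize (K a - y)^T W (K a - y) + lam a^T K a
    # with W = diag(w). Stationarity gives (W K + lam I) a = W y, or
    # equivalently a = (K + lam W^{-1})^{-1} y: the weights enter as a
    # data-dependent modification of the regularized kernel K + lam I.
    alpha = np.linalg.solve(K + lam * np.diag(1.0 / w), y)

    X_test = rng.normal(mu_q, sigma, size=(50, d))       # test points from target Q
    y_pred = rbf_kernel(X_test, X) @ alpha
    print("target-domain MSE:", np.mean((y_pred - np.sin(2 * X_test[:, 0])) ** 2))

Setting all weights to 1 recovers ordinary kernel ridge regression, so the sketch also makes it easy to compare the weighted and unweighted estimators on target-domain test points.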

Related articles:
arXiv:1809.08159 [stat.ML] (Published 2018-09-21)
Intractable Likelihood Regression for Covariate Shift by Kernel Mean Embedding
arXiv:2502.09047 [stat.ML] (Published 2025-02-13)
Optimal Algorithms in Linear Regression under Covariate Shift: On the Importance of Precondition
arXiv:2405.16594 [stat.ML] (Published 2024-05-26)
Training-Conditional Coverage Bounds under Covariate Shift