arXiv:2405.16594 [stat.ML]

Training-Conditional Coverage Bounds under Covariate Shift

Mehrdad Pournaderi, Yu Xiang

Published 2024-05-26 (Version 1)

Training-conditional coverage guarantees in conformal prediction concern the concentration of the error distribution, conditional on the training data, below some nominal level. The conformal prediction methodology has recently been generalized to the covariate shift setting, in which the covariate distribution changes between the training and test data. In this paper, we study the training-conditional coverage properties of a range of conformal prediction methods under covariate shift via a weighted version of the Dvoretzky-Kiefer-Wolfowitz (DKW) inequality tailored to distribution shift. The result for the split conformal method is almost assumption-free, while the results for the full conformal and jackknife+ methods rely on strong assumptions, including uniform stability of the training algorithm.
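To fix ideas, the covariate-shift setting studied here builds on the weighted split conformal procedure (Tibshirani et al., 2019), where calibration scores are reweighted by the likelihood ratio between test and training covariate distributions before taking a quantile. The sketch below illustrates that quantile computation; it is a minimal illustration of the setting, not the paper's coverage bounds, and the function name and interface are illustrative assumptions.

```python
import numpy as np

def weighted_split_conformal_threshold(scores, weights_cal, weight_test, alpha=0.1):
    """Weighted (1 - alpha) quantile of calibration nonconformity scores.

    scores      : nonconformity scores on the held-out calibration set
    weights_cal : likelihood ratios w(X_i) = dP_test/dP_train at calibration points
    weight_test : likelihood ratio w(x) at the test covariate
    Returns the threshold q; the prediction set is {y : score(x, y) <= q}.
    """
    scores = np.asarray(scores, dtype=float)
    # Normalized weights; the test point's mass is (conceptually) placed at +inf
    p = np.append(np.asarray(weights_cal, dtype=float), float(weight_test))
    p = p / p.sum()
    order = np.argsort(scores)
    sorted_scores = scores[order]
    cum = np.cumsum(p[:-1][order])
    # Smallest calibration score whose cumulative weight reaches 1 - alpha
    idx = np.searchsorted(cum, 1 - alpha)
    if idx >= len(sorted_scores):
        return np.inf  # the weighted quantile exceeds every calibration score
    return sorted_scores[idx]
```

With uniform weights (no shift), this reduces to the ordinary split conformal quantile, the ceil((n+1)(1-alpha))-th smallest calibration score. The training-conditional question addressed in the paper is how the resulting miscoverage, conditional on the calibration data, concentrates around alpha.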

Related articles:
arXiv:1809.08159 [stat.ML] (Published 2018-09-21)
Intractable Likelihood Regression for Covariate Shift by Kernel Mean Embedding
arXiv:2502.09047 [stat.ML] (Published 2025-02-13)
Optimal Algorithms in Linear Regression under Covariate Shift: On the Importance of Precondition
arXiv:2406.03171 [stat.ML] (Published 2024-06-05)
High-Dimensional Kernel Methods under Covariate Shift: Data-Dependent Implicit Regularization