arXiv:2305.12686 [stat.ML]

Conformal Inference for Invariant Risk Minimization

Wenlu Tang, Zicheng Liu

Published 2023-05-22, Version 1

The application of machine learning models can be significantly impeded by distributional shifts, since the assumption, standard in machine learning and statistics, that training and test samples come from the same population may not hold in practice. One way to tackle this problem is invariant learning, such as invariant risk minimization (IRM), which learns an invariant representation that aids generalization under distributional shifts. This paper develops methods for obtaining distribution-free prediction regions that quantify uncertainty for invariant representations, accounting for the distribution shifts of data from different environments. Our approach involves a weighted conformity score that adapts to the specific environment in which the test sample is situated. We construct an adaptive conformal interval using the weighted conformity score and prove its conditional average coverage under certain conditions. To demonstrate the effectiveness of our approach, we conduct several numerical experiments, including simulation studies and a practical example using real-world data.
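For intuition, here is a minimal sketch of a weighted split conformal interval in the spirit of the weighted conformity score described above, following the standard weighted-quantile construction from covariate-shift conformal prediction. The predictor mu_hat (e.g., a model fit on an invariant representation), the environment weight function w, and all names below are illustrative assumptions, not the authors' exact construction.

    import numpy as np

    def weighted_quantile(scores, weights, test_weight, alpha):
        # (1 - alpha) quantile of the weighted empirical distribution of
        # calibration scores, with an extra point mass at +infinity standing
        # in for the unseen test score (the weighted conformal convention).
        order = np.argsort(scores)
        s, w = scores[order], weights[order]
        cum = np.cumsum(w) / (w.sum() + test_weight)
        idx = np.searchsorted(cum, 1.0 - alpha, side="left")
        return np.inf if idx == len(s) else s[idx]

    def weighted_conformal_interval(x_test, X_cal, y_cal, mu_hat, w, alpha=0.1):
        # Absolute-residual conformity scores on a held-out calibration set;
        # the weights let the score distribution adapt to the test environment.
        scores = np.abs(y_cal - mu_hat(X_cal))
        q = weighted_quantile(scores, w(X_cal), w(x_test), alpha)
        center = mu_hat(x_test)
        return center - q, center + q

With uniform weights (w identically 1) this reduces to ordinary split conformal prediction; the environment-adaptive weighting is what targets the conditional coverage statement in the abstract.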

Related articles:
arXiv:2310.08209 [stat.ML] (Published 2023-10-12)
Conformal inference for regression on Riemannian Manifolds
arXiv:2004.05007 [stat.ML] (Published 2020-04-10)
An Empirical Study of Invariant Risk Minimization
arXiv:2307.11972 [stat.ML] (Published 2023-07-22)
Out-of-Distribution Optimality of Invariant Risk Minimization