arXiv:2505.06531 [stat.ML]

High-Dimensional Importance-Weighted Information Criteria: Theory and Optimality

Yong-Syun Cao, Shinpei Imori, Ching-Kang Ing

Published 2025-05-10Version 1

Imori and Ing (2025) proposed the importance-weighted orthogonal greedy algorithm (IWOGA) for model selection in high-dimensional misspecified regression models under covariate shift. To determine the number of IWOGA iterations, they introduced the high-dimensional importance-weighted information criterion (HDIWIC). They argued that the combined procedure, IWOGA + HDIWIC, achieves an optimal trade-off between variance and squared bias, leading to optimal convergence rates in terms of conditional mean squared prediction error. In this article, we provide a theoretical justification for this claim by establishing the optimality of IWOGA + HDIWIC under a set of reasonable assumptions.
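To make the setting concrete, the following is a minimal illustrative sketch of an importance-weighted orthogonal greedy algorithm, not the authors' exact procedure: importance weights (which under covariate shift would come from an estimated density ratio between test and training covariate distributions) are absorbed into a weighted least-squares problem, and at each iteration the predictor most correlated with the current residual is added and the fit is re-orthogonalized. All names (`iw_oga`, `n_iter`) and the toy data are assumptions for illustration.

```python
import numpy as np

def iw_oga(X, y, w, n_iter):
    """Illustrative importance-weighted orthogonal greedy algorithm.

    X : (n, p) design matrix; y : (n,) response;
    w : (n,) importance weights (e.g., an estimated density ratio);
    n_iter : number of greedy iterations.
    """
    sw = np.sqrt(w)
    # Absorb the weights so the problem becomes ordinary least squares.
    Xw, yw = X * sw[:, None], y * sw
    selected, resid = [], yw.copy()
    for _ in range(n_iter):
        # Pick the column most correlated (in the weighted norm) with the residual.
        scores = np.abs(Xw.T @ resid) / (np.linalg.norm(Xw, axis=0) + 1e-12)
        j = int(np.argmax(scores))
        if j not in selected:
            selected.append(j)
        # Orthogonal step: refit weighted least squares on all selected columns.
        beta, *_ = np.linalg.lstsq(Xw[:, selected], yw, rcond=None)
        resid = yw - Xw[:, selected] @ beta
    return selected, resid

# Toy usage: with uniform weights this reduces to plain OGA.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
y = 2.0 * X[:, 3] + 0.1 * rng.standard_normal(100)
w = np.ones(100)
sel, r = iw_oga(X, y, w, 3)
```

Choosing `n_iter` is where an information criterion such as HDIWIC enters: one would stop at the iteration minimizing a penalized weighted residual sum of squares, trading variance against squared bias.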

Related articles:
arXiv:1912.00458 [stat.ML] (Published 2019-12-01)
On the optimality of kernels for high-dimensional clustering
arXiv:1407.2724 [stat.ML] (Published 2014-07-10, updated 2015-06-13)
On the Optimality of Averaging in Distributed Statistical Learning
arXiv:1709.08148 [stat.ML] (Published 2017-09-24)
On the Optimality of Kernel-Embedding Based Goodness-of-Fit Tests