arXiv Analytics

arXiv:1706.00098 [stat.ML]

Bayesian $l_0$ Regularized Least Squares

Nicholas G. Polson, Lei Sun

Published 2017-05-31 (Version 1)

Bayesian $l_0$-regularized least squares provides a variable selection technique for high-dimensional predictors. The challenge in $l_0$ regularization is optimizing a non-convex objective function via a search over the model space of all possible predictor combinations, an NP-hard task. Spike-and-slab (a.k.a. Bernoulli-Gaussian, BG) priors are the gold standard for Bayesian variable selection, with the caveat of computational speed and scalability. We show that the Single Best Replacement (SBR) algorithm is a fast, scalable alternative. Although SBR calculates a sparse posterior mode, we show that it possesses a number of the equivalences and optimality properties of a posterior mean. To illustrate our methodology, we provide simulation evidence and a real-data example comparing the statistical properties and computational efficiency of SBR with direct posterior sampling under spike-and-slab priors. Finally, we conclude with directions for future research.
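The generic single-best-replacement greedy scheme referred to in the abstract can be sketched as follows. This is an illustrative Python sketch only, not the authors' implementation: `lam` stands in for the $l_0$ penalty weight, and the objective minimized is $\tfrac12\lVert y - X_S \beta_S\rVert^2 + \lambda |S|$, where at each iteration a single support element is flipped (one variable inserted or removed, whichever flip most decreases the objective).

```python
import numpy as np

def sbr(X, y, lam, max_iter=100):
    """Illustrative Single Best Replacement sketch (hypothetical helper):
    greedily minimize 0.5*||y - X[:,S] b||^2 + lam*|S| by flipping one
    support element per iteration."""
    n, p = X.shape
    support = np.zeros(p, dtype=bool)

    def objective(s):
        # Empty model: residual is y itself.
        if not s.any():
            return 0.5 * (y @ y)
        b, *_ = np.linalg.lstsq(X[:, s], y, rcond=None)
        r = y - X[:, s] @ b
        return 0.5 * (r @ r) + lam * s.sum()

    best = objective(support)
    for _ in range(max_iter):
        cand_best, cand_j = best, None
        for j in range(p):
            s = support.copy()
            s[j] = not s[j]  # flip: insert variable j if absent, remove if present
            val = objective(s)
            if val < cand_best - 1e-12:
                cand_best, cand_j = val, j
        if cand_j is None:  # no single flip improves the objective: stop
            break
        support[cand_j] = not support[cand_j]
        best = cand_best

    # Refit least squares on the final support.
    beta = np.zeros(p)
    if support.any():
        beta[support], *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
    return beta, support
```

Each iteration costs one least-squares solve per candidate flip, which is what makes the greedy scheme fast relative to posterior sampling over all $2^p$ models.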

Comments: 21 pages, 6 figures, 1 table
Categories: stat.ML, stat.CO
Subjects: 62-04