{ "id": "1706.00098", "version": "v1", "published": "2017-05-31T21:29:40.000Z", "updated": "2017-05-31T21:29:40.000Z", "title": "Bayesian $l_0$ Regularized Least Squares", "authors": [ "Nicholas G. Polson", "Lei Sun" ], "comment": "21 pages, 6 figures, 1 table", "categories": [ "stat.ML", "stat.CO" ], "abstract": "Bayesian $l_0$-regularized least squares provides a variable selection technique for high-dimensional predictors. The challenge in $l_0$ regularization is optimizing a non-convex objective function via a search over the model space of all possible predictor combinations, an NP-hard task. Spike-and-slab (a.k.a. Bernoulli-Gaussian, BG) priors are the gold standard for Bayesian variable selection, with the caveat of limited computational speed and scalability. We show that the Single Best Replacement (SBR) algorithm is a fast, scalable alternative. Although SBR calculates a sparse posterior mode, we show that it possesses a number of the equivalences and optimality properties of a posterior mean. To illustrate our methodology, we provide simulation evidence and a real-data example comparing the statistical properties and computational efficiency of SBR with direct posterior sampling using spike-and-slab priors. Finally, we conclude with directions for future research.", "revisions": [ { "version": "v1", "updated": "2017-05-31T21:29:40.000Z" } ], "analyses": { "subjects": [ "62-04" ], "keywords": [ "real data example", "sparse posterior mode", "high dimensional predictors", "single best replacement", "sbr calculates" ], "note": { "typesetting": "TeX", "pages": 21, "language": "en", "license": "arXiv", "status": "editable" } } }