arXiv Analytics

arXiv:2303.15588 [math.OC]

Square Root LASSO: well-posedness, Lipschitz stability and the tuning trade off

Aaron Berk, Simone Brugiapaglia, Tim Hoheisel

Published 2023-03-27 (Version 1)

This paper studies well-posedness and parameter sensitivity of the Square Root LASSO (SR-LASSO), an optimization model for recovering sparse solutions to linear inverse problems in finite dimension. An advantage of the SR-LASSO (e.g., over the standard LASSO) is that the optimal tuning of the regularization parameter is robust with respect to measurement noise. This paper provides three point-based regularity conditions at a solution of the SR-LASSO: the weak, intermediate, and strong assumptions. It is shown that the weak assumption implies uniqueness of the solution in question. The intermediate assumption yields a directionally differentiable and locally Lipschitz solution map (with explicit Lipschitz bounds), whereas the strong assumption gives continuous differentiability of said map around the point in question. Our analysis leads to new theoretical insights on the comparison between SR-LASSO and LASSO from the viewpoint of tuning parameter sensitivity: noise-robust optimal parameter choice for SR-LASSO comes at the "price" of elevated tuning parameter sensitivity. Numerical results support and showcase the theoretical findings.
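For readers unfamiliar with the model, the SR-LASSO is commonly written as the convex program (the notation $A$, $b$, $\lambda$ below is generic and not taken from the paper itself)

$\min_{x \in \mathbb{R}^n} \|Ax - b\|_2 + \lambda \|x\|_1$ (SR-LASSO),

in contrast with the standard LASSO,

$\min_{x \in \mathbb{R}^n} \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \|x\|_1$ (LASSO).

The unsquared residual term is what allows the optimal choice of $\lambda$ for the SR-LASSO to be made independently of the noise level, whereas the squared residual in the LASSO ties the optimal $\lambda$ to the noise magnitude.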

Related articles:
arXiv:2402.05215 [math.OC] (Published 2024-02-07)
Geometric characterizations of Lipschitz stability for convex optimization problems
arXiv:2409.13118 [math.OC] (Published 2024-09-19)
Lipschitz stability of least-squares problems regularized by functions with $\mathcal{C}^2$-cone reducible conjugates
arXiv:2101.06711 [math.OC] (Published 2021-01-17)
Generalized Differentiation of Expected-Integral Mappings with Applications to Stochastic Programming, II: Leibniz Rules and Lipschitz Stability