arXiv Analytics

arXiv:1810.08775 [math.NA]

Tikhonov regularization with l^0-term complementing a convex penalty: l^1 convergence under sparsity constraints

Wei Wang, Shuai Lu, Bernd Hofmann, Jin Cheng

Published 2018-10-20, Version 1

Measuring the error in an l^1-norm, we analyze under sparsity assumptions an l^0-regularization approach in which the penalty in the Tikhonov functional is complemented by a general stabilizing convex functional. In this context, ill-posed operator equations Ax = y with an injective and bounded linear operator A mapping from l^2 to a Banach space Y are regularized. For sparse solutions, error estimates as well as linear and sublinear convergence rates are derived based on a variational inequality approach, with the regularization parameter chosen either a priori in an appropriate way or a posteriori by the sequential discrepancy principle. To further illustrate the balance between the l^0-term and the complementing convex penalty, the important special case of the squared l^2-norm penalty is investigated, showing the explicit dependence between the two terms. Finally, numerical experiments verify and illustrate the sparsity-promoting properties of the corresponding regularized solutions.
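For orientation, the following LaTeX sketch shows the type of functional the abstract describes; the notation T_\alpha^\delta, y^\delta, R, \delta, \tau, \alpha_0 and q is introduced here purely for illustration, and the exact exponents and weightings may differ from those used in the paper.

\[
  T_\alpha^\delta(x) \;=\; \|Ax - y^\delta\|_Y \;+\; \alpha \bigl( \|x\|_{\ell^0} + R(x) \bigr),
  \qquad x \in \ell^2,
\]
% \|x\|_{\ell^0} = \#\{k : x_k \neq 0\} counts the nonzero components of x,
% R is the complementing convex stabilizer (R(x) = \|x\|_{\ell^2}^2 in the special case mentioned above),
% and y^\delta denotes noisy data with \|y - y^\delta\|_Y \le \delta.

In the usual formulation of the sequential discrepancy principle, the a posteriori parameter is taken from a geometric grid and chosen, roughly, as the largest grid value whose regularized residual stays below a fixed multiple of the noise level:

\[
  \alpha_* \;=\; \max\bigl\{ \alpha_j = \alpha_0 q^j,\ j \in \mathbb{N}_0 \;:\;
  \|A x_{\alpha_j}^\delta - y^\delta\|_Y \le \tau\delta \bigr\},
  \qquad 0 < q < 1,\ \tau > 1,
\]
% where x_{\alpha_j}^\delta denotes a minimizer of T_{\alpha_j}^\delta.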

Related articles:
arXiv:1307.0334 [math.NA] (Published 2013-07-01)
Embedded techniques for choosing the parameter in Tikhonov regularization
arXiv:2012.14875 [math.NA] (Published 2020-12-29)
Estimating solution smoothness and data noise with Tikhonov regularization
arXiv:1401.0435 [math.NA] (Published 2014-01-02, updated 2014-05-16)
A global minimization algorithm for Tikhonov functionals with sparsity constraints