arXiv Analytics

arXiv:1401.0435 [math.NA]

A global minimization algorithm for Tikhonov functionals with sparsity constraints

Wei Wang, Stephan W. Anzengruber, Ronny Ramlau, Bo Han

Published 2014-01-02, updated 2014-05-16 (Version 2)

In this paper we present a globally convergent algorithm for computing a minimizer of the Tikhonov functional with a sparsity-promoting penalty term for nonlinear forward operators in Banach spaces. The dual TIGRA method uses a gradient descent iteration in the dual space at decreasing values of the regularization parameter $\alpha_j$, where the approximation obtained with $\alpha_j$ serves as the starting value for the dual iteration with parameter $\alpha_{j+1}$. With the discrepancy principle as a global stopping rule, the method further yields an automatic parameter choice. We prove convergence of the algorithm under suitable step-size selection and stopping rules, and illustrate our theoretical results with numerical experiments for the nonlinear autoconvolution problem.
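The continuation strategy described in the abstract — warm-started inner iterations over a decreasing sequence $\alpha_j$, with the discrepancy principle as a global stopping rule — can be sketched in a much simpler setting than the paper treats: a linear operator on $\ell^2$ with ISTA (soft-thresholded gradient descent) as the inner solver, rather than the authors' dual-space Banach method. All function names and parameter values below are illustrative choices, not from the paper:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal map of t * ||.||_1 (promotes sparsity).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def tikhonov_continuation(A, y, delta, alpha0=1.0, q=0.5, tau=1.1,
                          n_inner=200, n_outer=30):
    """Warm-started ISTA over the decreasing sequence alpha_j = alpha0 * q**j,
    stopped globally by the discrepancy principle ||A x - y|| <= tau * delta,
    where delta is the noise level."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
    step = 1.0 / L
    alpha = alpha0
    for _ in range(n_outer):
        for _ in range(n_inner):        # inner gradient (ISTA) iteration
            grad = A.T @ (A @ x - y)
            x = soft_threshold(x - step * grad, step * alpha)
        if np.linalg.norm(A @ x - y) <= tau * delta:
            break                       # discrepancy principle satisfied
        alpha *= q                      # decrease alpha; x is the warm start
    return x, alpha
```

Each outer step reuses the previous minimizer as the starting point for the smaller regularization parameter, which is the warm-start idea the abstract describes; the stopping rule also selects $\alpha$ automatically.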

Related articles: Most relevant | Search more
arXiv:2001.02991 [math.NA] (Published 2020-01-09)
A note on the minimization of a Tikhonov functional with $\ell^1$-penalty
arXiv:1810.08775 [math.NA] (Published 2018-10-20)
Tikhonov regularization with $\ell^0$-term complementing a convex penalty: $\ell^1$ convergence under sparsity constraints
arXiv:2209.10531 [math.NA] (Published 2022-09-21)
Autocorrelation analysis for cryo-EM with sparsity constraints: Improved sample complexity and projection-based algorithms