arXiv:0912.1660 [math.OC]
Gradient-based methods for sparse recovery
William Hager, Dzung Phan, Hongchao Zhang
Published 2009-12-09, Version 1
The convergence rate is analyzed for the SpaRSA algorithm (Sparse Reconstruction by Separable Approximation) for minimizing a sum $f(\mathbf{x}) + \psi(\mathbf{x})$ where $f$ is smooth and $\psi$ is convex, but possibly nonsmooth. It is shown that if $f$ is convex, then the error in the objective function at iteration $k$, for $k$ sufficiently large, is bounded by $a/(b+k)$ for suitable choices of $a$ and $b$. Moreover, if the objective function is strongly convex, then the convergence is $R$-linear. An improved version of the algorithm, based on a cyclic version of the Barzilai-Borwein (BB) iteration and an adaptive line search, is given. The performance of the algorithm is investigated using applications in signal processing and image reconstruction.
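As context for the iteration the abstract refers to, the following is a minimal Python sketch of a SpaRSA-style proximal-gradient step for the common $\ell_1$-regularized least-squares case, using a single (non-cyclic) BB step and no adaptive line search. The function names, parameters, and safeguards here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparsa_l1(A, b, lam, x0=None, max_iter=200, tol=1e-8):
    """Sketch of a SpaRSA-style iteration for
        min_x  0.5*||A x - b||^2 + lam*||x||_1,
    where f(x) = 0.5*||A x - b||^2 is the smooth part and
    psi(x) = lam*||x||_1 is the convex nonsmooth part.
    Hypothetical illustration: plain BB step, no line search."""
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.copy()
    grad = A.T @ (A @ x - b)        # gradient of the smooth part f
    alpha = 1.0                      # proximal parameter (inverse step size)
    for _ in range(max_iter):
        # Proximal-gradient step: x_{k+1} = prox_{psi/alpha}(x_k - grad/alpha)
        x_new = soft_threshold(x - grad / alpha, lam / alpha)
        grad_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, grad_new - grad
        if np.linalg.norm(s) < tol * max(1.0, np.linalg.norm(x)):
            x = x_new
            break
        # BB choice alpha_k = (s^T y) / (s^T s), kept positive and bounded
        alpha = np.clip((s @ y) / (s @ s), 1e-8, 1e8)
        x, grad = x_new, grad_new
    return x
```

The paper's improved variant replaces this plain BB update with a cyclic BB rule (reusing a BB step for several iterations) and adds an adaptive line search; the sketch above omits both for brevity.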