arXiv:0912.1660 [math.OC]

Gradient-based methods for sparse recovery

William Hager, Dzung Phan, Hongchao Zhang

Published 2009-12-09 (Version 1)

The convergence rate is analyzed for the SpaRSA algorithm (Sparse Reconstruction by Separable Approximation) for minimizing a sum $f(\mathbf{x}) + \psi(\mathbf{x})$, where $f$ is smooth and $\psi$ is convex but possibly nonsmooth. It is shown that if $f$ is convex, then the error in the objective function at iteration $k$ is bounded by $a/(b+k)$ for $k$ sufficiently large and for suitable choices of $a$ and $b$. Moreover, if the objective function is strongly convex, then the convergence is $R$-linear. An improved version of the algorithm, based on a cyclic version of the BB iteration and an adaptive line search, is given. The performance of the algorithm is investigated through applications in signal processing and image reconstruction.
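
To make the setting concrete, below is a minimal Python sketch (not the authors' implementation) of a SpaRSA-style iteration for the special case $f(\mathbf{x}) = \frac{1}{2}\|A\mathbf{x}-\mathbf{b}\|_2^2$ and $\psi(\mathbf{x}) = \lambda\|\mathbf{x}\|_1$: each step solves the separable quadratic subproblem in closed form via soft thresholding, and the quadratic scaling $\alpha_k$ is updated with a plain Barzilai-Borwein rule. The cyclic BB variant and the adaptive line search proposed in the paper are omitted, and all names (sparsa_l1, soft_threshold, A, b, lam) are illustrative.

    import numpy as np

    def soft_threshold(z, t):
        # Closed-form proximal operator of t * ||.||_1.
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def sparsa_l1(A, b, lam, max_iter=500, tol=1e-8):
        # SpaRSA-style iteration for min 0.5*||Ax - b||^2 + lam*||x||_1
        # (basic BB scaling only; no cyclic BB rule or line search).
        x = np.zeros(A.shape[1])
        g = A.T @ (A @ x - b)      # gradient of the smooth term f
        alpha = 1.0                # BB scaling of the quadratic model
        for _ in range(max_iter):
            # Minimize f(x) + g's + (alpha/2)||s||^2 + psi(x + s) over s,
            # which reduces to soft thresholding for the l1 penalty.
            x_new = soft_threshold(x - g / alpha, lam / alpha)
            s = x_new - x
            if np.linalg.norm(s) <= tol * max(1.0, np.linalg.norm(x)):
                break
            g_new = A.T @ (A @ x_new - b)
            y = g_new - g
            sy = s @ y
            # Barzilai-Borwein update alpha = s'y / s's, safeguarded.
            alpha = np.clip(sy / (s @ s), 1e-10, 1e10) if sy > 0 else 1.0
            x, g = x_new, g_new
        return x

A typical call would be x_hat = sparsa_l1(A, b, lam=0.1), with lam controlling the sparsity of the recovered signal; for the convergence guarantees stated above, the safeguarded line search of the paper would have to be added on top of this sketch.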

Comments: 16 pages, submitted to SIAM Journal on Imaging Sciences
Categories: math.OC
Subjects: 90C06, 90C25, 65Y20, 94A08
Related articles:
arXiv:1710.06612 [math.OC] (Published 2017-10-18)
Mirror Descent and Convex Optimization Problems With Non-Smooth Inequality Constraints
arXiv:1802.01062 [math.OC] (Published 2018-02-04)
How to Characterize the Worst-Case Performance of Algorithms for Nonconvex Optimization
arXiv:1506.08247 [math.OC] (Published 2015-06-27)
First order constrained optimization algorithms with feasibility updates