arXiv Analytics

arXiv:1402.7291 [math.OC]

Optimal subgradient algorithms with application to large-scale linear inverse problems

Masoud Ahookhosh

Published 2014-02-28, updated 2014-05-27 (version 3)

This study addresses some algorithms for solving structured unconstrained convex optimization problems using first-order information, where the underlying function involves high-dimensional data. The primary aim is to develop an implementable algorithmic framework for solving problems with multi-term composite objective functions involving linear mappings using the optimal subgradient algorithm, OSGA, proposed by Neumaier in [49]. To this end, we propose some prox-functions for which the corresponding subproblem of OSGA is solved in a closed form. Considering various inverse problems arising in signal and image processing, machine learning, and statistics, we report extensive numerical results and comparisons with several state-of-the-art solvers, showing the favourable performance of our algorithm. We also compare with the most widely used optimal first-order methods for some smooth and nonsmooth convex problems. Surprisingly, when some Nesterov-type optimal methods originally proposed for smooth problems are adapted to nonsmooth problems by simply passing a subgradient instead of the gradient, the resulting subgradient-based algorithms remain competitive for solving nonsmooth problems. Finally, the OSGA software package is available.
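The "subgradient instead of gradient" adaptation mentioned above can be illustrated with a minimal sketch. This is not the OSGA package or the paper's implementation; it is an assumed FISTA-style accelerated scheme applied to the standard nonsmooth test problem min 0.5*||Ax - b||^2 + lam*||x||_1, where the usual gradient step is replaced by a step along a subgradient of the full (nonsmooth) objective. All function names and step-size choices here are illustrative.

```python
import numpy as np

def subgrad(A, b, lam, x):
    # A valid subgradient of f(x) = 0.5*||Ax - b||^2 + lam*||x||_1:
    # smooth part has gradient A^T (Ax - b); for |x_i| we take sign(x_i),
    # with 0 chosen at x_i = 0 (any value in [-1, 1] would be valid there).
    return A.T @ (A @ x - b) + lam * np.sign(x)

def accel_subgradient(A, b, lam, L, iters=500):
    # Nesterov/FISTA-style momentum, but with the gradient of the smooth
    # part replaced by a subgradient of the whole nonsmooth objective.
    # L is a (hypothetical) Lipschitz-type constant used for the step 1/L.
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(iters):
        x_new = y - subgrad(A, b, lam, y) / L
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

Note that, unlike a proximal method, this sketch carries no convergence guarantee for the nonsmooth term; the abstract's observation is precisely that such naive subgradient substitutions nevertheless perform competitively in practice.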

Related articles:
arXiv:math/0004064 [math.OC] (Published 2000-04-11)
The fractional-order controllers: Methods for their synthesis and application
arXiv:1403.2816 [math.OC] (Published 2014-03-12, updated 2015-04-17)
S-Lemma with Equality and Its Applications
arXiv:1511.02204 [math.OC] (Published 2015-11-06)
An Extended Frank-Wolfe Method with "In-Face" Directions, and its Application to Low-Rank Matrix Completion