arXiv Analytics


arXiv:1207.3254 [math.OC]

A variable smoothing algorithm for solving convex optimization problems

Radu Ioan Bot, Christopher Hendrich

Published 2012-07-13, Version 1

In this article we propose a method for solving unconstrained optimization problems with convex and Lipschitz continuous objective functions. Using the Moreau envelopes of the functions occurring in the objective, we smooth the latter to a convex, differentiable function with Lipschitz continuous gradient, employing both variable and constant smoothing parameters. The resulting problem is solved via an accelerated first-order method, which allows us to approximately recover optimal solutions of the initial optimization problem with a convergence rate of order $\mathcal{O}(\tfrac{\ln k}{k})$ for variable smoothing and of order $\mathcal{O}(\tfrac{1}{k})$ for constant smoothing. Numerical experiments applying the variable smoothing method to image processing and supervised learning classification are also presented.
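To illustrate the idea, here is a minimal sketch of variable smoothing on a standard test problem, $\min_x \tfrac{1}{2}\|Ax-b\|^2 + \|x\|_1$. The $\ell_1$ term is replaced by its Moreau envelope (the Huber function) with a smoothing parameter $\mu_k \sim \mu_0/k$ that shrinks over the iterations, and an accelerated (Nesterov-type) gradient step is applied to the smoothed objective. This is a simplified reconstruction for intuition, not the authors' exact scheme; the step-size rule, the choice $\mu_k = \mu_0/k$, and the momentum update are standard ingredients assumed here for concreteness.

```python
import numpy as np

def huber_grad(x, mu):
    """Gradient of the Moreau envelope of |.| with parameter mu (the Huber function)."""
    return np.where(np.abs(x) <= mu, x / mu, np.sign(x))

def variable_smoothing(A, b, iters=1000, mu0=1.0):
    """Sketch: minimize 0.5||Ax-b||^2 + ||x||_1 by smoothing the l1 term via its
    Moreau envelope and running accelerated gradient steps with a decreasing
    smoothing parameter mu_k = mu0/k (the 'variable smoothing' regime)."""
    n = A.shape[1]
    x = y = np.zeros(n)
    t = 1.0
    L_A = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of grad of the quadratic part
    for k in range(1, iters + 1):
        mu = mu0 / k                       # variable smoothing parameter
        L = L_A + 1.0 / mu                 # gradient Lipschitz constant of the smoothed objective
        grad = A.T @ (A @ y - b) + huber_grad(y, mu)
        x_new = y - grad / L               # gradient step on the smoothed objective
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # Nesterov momentum
        x, t = x_new, t_new
    return x
```

With $A = I$ the problem decouples and the exact minimizer is the soft-thresholding of $b$, so the iterates can be checked against a closed-form solution; as $\mu_k \to 0$ the smoothed minimizers converge to it, consistent with the $\mathcal{O}(\tfrac{\ln k}{k})$ rate quoted above.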

Related articles:
arXiv:1705.06164 [math.OC] (Published 2017-05-17)
A general framework for solving convex optimization problems involving the sum of three convex functions
arXiv:1511.02974 [math.OC] (Published 2015-11-10)
New Computational Guarantees for Solving Convex Optimization Problems with First Order Methods, via a Function Growth Condition Measure
arXiv:2209.12467 [math.OC] (Published 2022-09-26)
Convergence rate of the (1+1)-evolution strategy on locally strongly convex functions with Lipschitz continuous gradient and their monotonic transformations