arXiv:1912.07972 [math.OC]

Contracting Proximal Methods for Smooth Convex Optimization

Nikita Doikov, Yurii Nesterov

Published 2019-12-17 (Version 1)

In this paper, we propose new accelerated methods for smooth Convex Optimization, called Contracting Proximal Methods. At every step of these methods, we need to minimize a contracted version of the objective function, augmented by a regularization term in the form of a Bregman divergence. We provide a global convergence analysis for a general scheme that admits inexactness in solving the auxiliary subproblem. When high-order Tensor Methods are used for this purpose, we demonstrate an acceleration effect for both convex and uniformly convex composite objective functions. Thus, our construction explains acceleration for methods of any order, starting from one. The increase in the number of oracle calls caused by computing the contracted proximal steps is limited to a logarithmic factor in the worst-case complexity bound.
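The outer loop described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's exact algorithm: the Bregman divergence is taken as the squared Euclidean norm, the contraction coefficients a_k = k + 1 are one simple choice, and the auxiliary subproblem is solved inexactly by a few plain gradient-descent steps (the general scheme allows such inexactness). The function name `contracting_proximal` and all parameter choices are illustrative assumptions.

```python
import numpy as np

def contracting_proximal(grad_f, x0, L, iters=100, inner_steps=50):
    """Sketch of a contracting proximal scheme (illustrative choices only).

    Outer step k approximately minimizes the contracted subproblem
        h(x) = A' * f((a * x + A * y) / A') + 0.5 * ||x - v||^2,
    where A' = A + a, i.e. the contracted objective plus a Euclidean
    Bregman regularizer centered at the proximal sequence v.
    """
    y = np.array(x0, dtype=float)  # main iterate sequence
    v = y.copy()                   # proximal sequence
    A = 0.0
    for k in range(iters):
        a = k + 1.0                # contraction coefficients: A_k ~ k^2 / 2
        A_next = A + a

        def grad_h(x):
            # chain rule through the contraction u = (a*x + A*y) / A'
            u = (a * x + A * y) / A_next
            return a * grad_f(u) + (x - v)

        # safe inner step size: grad_h is Lipschitz with constant
        # 1 + (a^2 / A') * L for L-smooth f
        step = 1.0 / (1.0 + (a * a / A_next) * L)
        x = v.copy()
        for _ in range(inner_steps):   # inexact inner solve
            x = x - step * grad_h(x)

        v = x                          # approximate subproblem solution
        y = (a * v + A * y) / A_next   # contracted update of the iterate
        A = A_next
    return y

# usage: minimize f(x) = 0.5 * ||x - b||^2, so grad_f(x) = x - b and L = 1
b = np.array([1.0, -2.0, 3.0])
sol = contracting_proximal(lambda x: x - b, np.zeros(3), L=1.0)
```

Since the coefficients A_k grow like k^2, the accelerated O(1/k^2) rate in the function gap follows from the standard estimate, and on this toy quadratic the iterate `sol` ends up close to the minimizer `b`.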

Journal: CORE Discussion Papers, 2019/27 (2019), 24 pages. http://hdl.handle.net/2078.1/223949
Categories: math.OC
Related articles:
arXiv:2011.12341 [math.OC] (Published 2020-11-24)
Sequential convergence of AdaGrad algorithm for smooth convex optimization
arXiv:1606.01327 [math.OC] (Published 2016-06-04)
Bridging Nonsmooth and Smooth Convex Optimization
arXiv:1809.00382 [math.OC] (Published 2018-09-02)
The global rate of convergence for optimal tensor methods in smooth convex optimization