arXiv Analytics

arXiv:2202.04545 [math.OC]

Lower Complexity Bounds for Minimizing Regularized Functions

Nikita Doikov

Published 2022-02-09 (Version 1)

In this paper, we establish lower bounds on the oracle complexity of first-order methods for minimizing regularized convex functions. We consider a composite representation of the objective. The smooth part has a Hölder continuous gradient of degree $\nu \in [0, 1]$ and is accessible only through a black-box local oracle. The composite part is a power of a norm. We prove that, in the large-scale setting with Euclidean norms, the best possible rate for first-order methods is of the order $O(k^{- p(1 + 3\nu) / (2(p - 1 - \nu))})$ for the functional residual, where $k$ is the iteration counter and $p$ is the power of the regularization. Our formulation covers several cases, including the computation of the Cubically regularized Newton step by first-order gradient methods, for which the rate becomes $O(k^{-6})$. This rate is attained by the Fast Gradient Method, which our result therefore proves to be optimal. We also establish lower complexity bounds for non-Euclidean norms.
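As a concrete illustration of the case highlighted in the abstract, the sketch below applies an accelerated proximal gradient scheme (a FISTA-type method, standing in for the Fast Gradient Method) to a cubically regularized quadratic model $\min_x \, g^\top x + \tfrac12 x^\top A x + \tfrac{M}{3}\|x\|^3$, i.e. $p = 3$ with a Lipschitz-smooth part ($\nu = 1$), for which the optimal rate above evaluates to $O(k^{-6})$. The problem data `g`, `A`, `M` and the helper names are illustrative assumptions, not from the paper; the closed-form proximal step for the cubic term follows from its first-order optimality condition $x(1 + c\|x\|) = y$.

```python
import numpy as np

def prox_cubic(y, c):
    # Closed-form prox of x -> (c/3)||x||^3 (Euclidean norm):
    # the minimizer of (c/3)||x||^3 + (1/2)||x - y||^2 satisfies
    # x(1 + c||x||) = y, so ||x|| solves c r^2 + r - ||y|| = 0.
    ny = np.linalg.norm(y)
    if ny == 0.0:
        return y
    r = (-1.0 + np.sqrt(1.0 + 4.0 * c * ny)) / (2.0 * c)  # r = ||x|| >= 0
    return y * (r / ny)

def accel_cubic_step(g, A, M, iters=2000):
    # Accelerated proximal gradient method for
    #   min_x  g^T x + 0.5 x^T A x + (M/3)||x||^3   (A symmetric PSD).
    L = np.linalg.norm(A, 2)          # Lipschitz constant of the smooth gradient
    x = np.zeros_like(g)
    y, t = x.copy(), 1.0
    for _ in range(iters):
        # Gradient step on the smooth quadratic, then prox of the cubic term.
        x_new = prox_cubic(y - (g + A @ y) / L, M / L)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x
```

A quick sanity check is that the returned point nearly satisfies the first-order condition of the full objective, $g + Ax + M\|x\|x \approx 0$.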

Related articles:
arXiv:2003.03910 [math.OC] (Published 2020-03-09)
Geometry of First-Order Methods and Adaptive Acceleration
arXiv:1610.01349 [math.OC] (Published 2016-10-05)
A Fast Gradient Method for Nonnegative Sparse Regression with Self Dictionary
arXiv:1908.08394 [math.OC] (Published 2019-08-22)
A General Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization