arXiv Analytics

arXiv:2409.19200 [math.OC]

Faster Acceleration for Steepest Descent

Site Bai, Brian Bullins

Published 2024-09-28, updated 2025-02-06 (version 2)

Recent advances (Sherman, 2017; Sidford and Tian, 2018; Cohen et al., 2021) have overcome the fundamental barrier of dimension dependence in the iteration complexity of solving $\ell_\infty$ regression with first-order methods. Yet it remains unclear to what extent such acceleration can be achieved for general $\ell_p$ smooth functions. In this paper, we propose a new accelerated first-order method for convex optimization under non-Euclidean smoothness assumptions. In contrast to standard acceleration techniques, our approach uses primal-dual iterate sequences taken with respect to $\textit{differing}$ norms, which are then coupled using an $\textit{implicitly}$ determined interpolation parameter. For $\ell_p$ norm smooth problems in $d$ dimensions, our method provides an iteration complexity improvement of up to $O(d^{1-\frac{2}{p}})$ in terms of calls to a first-order oracle, thereby allowing us to circumvent long-standing barriers in accelerated non-Euclidean steepest descent.
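For context, the unaccelerated baseline the abstract refers to is steepest descent with respect to a non-Euclidean norm. A minimal sketch of the $\ell_\infty$ case is below: minimizing the smoothness upper model $\langle g, d\rangle + \frac{L}{2}\|d\|_\infty^2$ in closed form yields a sign-gradient step of magnitude $\|g\|_1/L$. The test function, dimension, and smoothness constant here are illustrative assumptions, not taken from the paper, and this sketch does not implement the paper's accelerated method.

```python
import numpy as np

def linf_steepest_descent_step(x, grad, L):
    """One steepest-descent step w.r.t. the l_inf norm.

    Minimizes the upper model <g, d> + (L/2) * ||d||_inf^2 over d,
    whose closed-form solution is d = -(||g||_1 / L) * sign(g).
    """
    g = grad(x)
    t = np.linalg.norm(g, 1) / L  # optimal step magnitude
    return x - t * np.sign(g)

# Toy usage (illustrative assumption): f(x) = 0.5 * ||x||_2^2,
# whose gradient is x. Since ||x - y||_1 <= d * ||x - y||_inf,
# f is L-smooth w.r.t. l_inf with L = d = 3 in this example.
x = np.array([1.0, -2.0, 0.5])
grad = lambda z: z
for _ in range(50):
    x = linf_steepest_descent_step(x, grad, L=3.0)
```

The per-step progress guarantee $f(x^+) \le f(x) - \|\nabla f(x)\|_1^2/(2L)$ mirrors the Euclidean case with the dual ($\ell_1$) norm of the gradient, which is where the dimension dependence discussed in the abstract enters.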

Related articles: Most relevant | Search more
arXiv:2301.02268 [math.OC] (Published 2023-01-05)
Restarts subject to approximate sharpness: A parameter-free and optimal scheme for first-order methods
arXiv:2510.01168 [math.OC] (Published 2025-10-01)
A first-order method for constrained nonconvex--nonconcave minimax problems under a local Kurdyka-Łojasiewicz condition
arXiv:2507.01932 [math.OC] (Published 2025-07-02)
A first-order method for nonconvex-nonconcave minimax problems under a local Kurdyka-Łojasiewicz condition