arXiv:2205.09647 [math.OC]

The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization

Dmitry Kovalev, Alexander Gasnikov

Published 2022-05-19 (version 1)

In this paper, we study the fundamental open question of finding the optimal high-order algorithm for solving smooth convex minimization problems. Arjevani et al. (2019) established the lower bound $\Omega\left(\epsilon^{-2/(3p+1)}\right)$ on the number of $p$-th order oracle calls required by an algorithm to find an $\epsilon$-accurate solution to the problem, where the $p$-th order oracle returns the objective function value and its derivatives up to order $p$. However, the existing state-of-the-art high-order methods of Gasnikov et al. (2019b), Bubeck et al. (2019), and Jiang et al. (2019) achieve the oracle complexity $\mathcal{O}\left(\epsilon^{-2/(3p+1)} \log (1/\epsilon)\right)$, which does not match the lower bound: these algorithms rely on a complex binary search procedure, which makes them neither optimal nor practical. We fix this fundamental issue by providing the first algorithm with $\mathcal{O}\left(\epsilon^{-2/(3p+1)}\right)$ $p$-th order oracle complexity, matching the lower bound.
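For a sense of scale, instantiating the optimal exponent $2/(3p+1)$ for small $p$ gives

$$p=1:\; \mathcal{O}\left(\epsilon^{-1/2}\right), \qquad p=2:\; \mathcal{O}\left(\epsilon^{-2/7}\right), \qquad p=3:\; \mathcal{O}\left(\epsilon^{-1/5}\right).$$

The $p=1$ case recovers the classical complexity of Nesterov's accelerated gradient method, and each additional oracle order yields a strictly faster rate.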

Related articles:
arXiv:2011.12341 [math.OC] (Published 2020-11-24): Sequential convergence of AdaGrad algorithm for smooth convex optimization
arXiv:2001.07999 [math.OC] (Published 2020-01-22): Curiosities and counterexamples in smooth convex optimization
arXiv:1606.01327 [math.OC] (Published 2016-06-04): Bridging Nonsmooth and Smooth Convex Optimization