arXiv Analytics

arXiv:1910.10879 [math.OC]

Convergence Rates of Subgradient Methods for Quasi-convex Optimization Problems

Yaohua Hu, Jiawen Li, Carisa Kwok Wai Yu

Published 2019-10-24, Version 1

Quasi-convex optimization plays a pivotal role in many fields, including economics and finance; the subgradient method is an effective iterative algorithm for solving large-scale quasi-convex optimization problems. In this paper, we investigate the iteration complexity and convergence rates of various subgradient methods for solving quasi-convex optimization in a unified framework. In particular, we consider a sequence satisfying a general (inexact) basic inequality and establish a global convergence theorem and the iteration complexity under constant, diminishing, or dynamic stepsize rules. More importantly, we establish linear (or sublinear) convergence rates of the sequence under an additional assumption of weak sharp minima of H\"{o}lderian order and upper bounded noise. These convergence theorems are applied to derive the iteration complexity and convergence rates of several subgradient methods, including the standard/inexact/conditional subgradient methods, for solving quasi-convex optimization problems under the assumptions of the H\"{o}lder condition and/or the weak sharp minima of H\"{o}lderian order.
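For readers unfamiliar with the method the abstract analyzes, the following is a minimal sketch of a projected subgradient iteration with a unit-normalized quasi-subgradient, which is the standard form of the update for quasi-convex problems. The diminishing stepsize 1/sqrt(k+1), the toy objective f(x) = sqrt(|x1| + |x2|), and the box feasible set are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def quasi_subgradient_method(grad, project, x0, num_iters=500,
                             stepsize=lambda k: 1.0 / np.sqrt(k + 1)):
    """Sketch of a projected subgradient method for quasi-convex minimization.

    grad(x)    -- returns a quasi-subgradient direction at x (for a smooth
                  quasi-convex f, the gradient works away from minimizers)
    project(x) -- Euclidean projection onto the feasible set
    stepsize   -- k -> v_k; a diminishing rule 1/sqrt(k+1) is used here
    """
    x = np.asarray(x0, dtype=float)
    iterates = [x.copy()]
    for k in range(num_iters):
        g = np.asarray(grad(x), dtype=float)
        norm = np.linalg.norm(g)
        if norm == 0.0:                              # no descent direction available
            break
        x = project(x - stepsize(k) * g / norm)      # unit-normalized subgradient step
        iterates.append(x.copy())
    return x, iterates

# Illustrative (assumed) example: minimize the quasi-convex function
# f(x) = sqrt(|x1| + |x2|) over the box [-1, 2] x [-1, 2]; the minimizer is the origin.
if __name__ == "__main__":
    f = lambda x: np.sqrt(np.abs(x).sum())
    # Away from the origin, sign(x) is parallel to the gradient of f, so it
    # serves as a quasi-subgradient direction; the method normalizes it anyway.
    grad = lambda x: np.sign(x)
    project = lambda x: np.clip(x, -1.0, 2.0)
    x_star, _ = quasi_subgradient_method(grad, project, x0=[2.0, 2.0])
    print(x_star, f(x_star))
```

The diminishing rule shown is only one of the stepsize choices covered by the abstract; constant and dynamic rules plug into the same loop by changing the `stepsize` callable.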

Related articles: Most relevant | Search more
arXiv:2302.04099 [math.OC] (Published 2023-02-08)
Extragradient-Type Methods with $\mathcal{O}(1/k)$ Convergence Rates for Co-Hypomonotone Inclusions
arXiv:2109.11516 [math.OC] (Published 2021-09-23)
Weak sharp minima for interval-valued functions and its primal-dual characterizations using generalized Hukuhara subdifferentiability
arXiv:2211.10234 [math.OC] (Published 2022-11-18)
Iteration Complexity of Fixed-Step-Momentum Methods for Convex Quadratic Functions