{ "id": "1910.10879", "version": "v1", "published": "2019-10-24T02:01:07.000Z", "updated": "2019-10-24T02:01:07.000Z", "title": "Convergence Rates of Subgradient Methods for Quasi-convex Optimization Problems", "authors": [ "Yaohua Hu", "Jiawen Li", "Carisa Kwok Wai Yu" ], "comment": "28 pages", "categories": [ "math.OC" ], "abstract": "Quasi-convex optimization plays a pivotal role in many fields, including economics and finance; the subgradient method is an effective iterative algorithm for solving large-scale quasi-convex optimization problems. In this paper, we investigate the iteration complexity and convergence rates of various subgradient methods for solving quasi-convex optimization problems in a unified framework. In particular, we consider a sequence satisfying a general (inexact) basic inequality, and establish a global convergence theorem and the iteration complexity under constant, diminishing, or dynamic stepsize rules. More importantly, we establish linear (or sublinear) convergence rates of the sequence under an additional assumption of weak sharp minima of H\\\"{o}lderian order and upper-bounded noise. These convergence theorems are applied to establish the iteration complexity and convergence rates of several subgradient methods, including the standard/inexact/conditional subgradient methods, for solving quasi-convex optimization problems under the assumptions of the H\\\"{o}lder condition and/or the weak sharp minima of H\\\"{o}lderian order.", "revisions": [ { "version": "v1", "updated": "2019-10-24T02:01:07.000Z" } ], "analyses": { "subjects": [ "65K05", "90C26" ], "keywords": [ "convergence rates", "iteration complexity", "weak sharp minima", "solving large-scale quasi-convex optimization problems", "convergence theorem" ], "note": { "typesetting": "TeX", "pages": 28, "language": "en", "license": "arXiv", "status": "editable" } } }