{ "id": "1907.10494", "version": "v1", "published": "2019-07-22T09:58:15.000Z", "updated": "2019-07-22T09:58:15.000Z", "title": "An Improved Gradient Method with Approximately Optimal Stepsize Based on Conic model for Unconstrained Optimization", "authors": [ "Zexian Liu", "Hongwei Liu" ], "comment": "arXiv admin note: text overlap with arXiv:1907.01794", "categories": [ "math.OC" ], "abstract": "A new type of stepsize, which was recently introduced by Liu and Liu (Optimization, 67(3), 427-440, 2018), is called approximately optimal stepsize and is quit efficient for gradient method. Interestingly, all gradient methods can be regarded as gradient methods with approximately optimal stepsizes. In this paper, based on the work (Numer. Algorithms 78(1), 21-39, 2018), we present an improved gradient method with approximately optimal stepsize based on conic model for unconstrained optimization. If the objective function $ f $ is not close to a quadratic on the line segment between the current and latest iterates, we construct a conic model to generate approximately optimal stepsize for gradient method if the conic model can be used; otherwise, we construct some quadratic models to generate approximately optimal stepsizes for gradient method. The convergence of the proposed method is analyzed under suitable conditions. Numerical comparisons with some well-known conjugate gradient software packages such as CG$ \\_ $DESCENT (SIAM J. Optim. 16(1), 170-192, 2005) and CGOPT (SIAM J. Optim. 23(1), 296-320, 2013) indicate the proposed method is very promising.", "revisions": [ { "version": "v1", "updated": "2019-07-22T09:58:15.000Z" } ], "analyses": { "keywords": [ "gradient method", "conic model", "unconstrained optimization", "generate approximately optimal stepsize", "well-known conjugate gradient software packages" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }