arXiv:1907.10494 [math.OC]

An Improved Gradient Method with Approximately Optimal Stepsize Based on Conic Model for Unconstrained Optimization

Zexian Liu, Hongwei Liu

Published 2019-07-22 (Version 1)

A new type of stepsize, called the approximately optimal stepsize, was recently introduced by Liu and Liu (Optimization 67(3), 427-440, 2018) and is quite efficient for gradient methods. Interestingly, all gradient methods can be regarded as gradient methods with approximately optimal stepsizes. In this paper, building on the work in Numer. Algorithms 78(1), 21-39, 2018, we present an improved gradient method with approximately optimal stepsize based on a conic model for unconstrained optimization. If the objective function $f$ is not close to a quadratic on the line segment between the current and latest iterates and the conic model is usable, we construct a conic model to generate the approximately optimal stepsize for the gradient method; otherwise, we construct quadratic models to generate approximately optimal stepsizes. The convergence of the proposed method is analyzed under suitable conditions. Numerical comparisons with well-known conjugate gradient software packages such as CG_DESCENT (SIAM J. Optim. 16(1), 170-192, 2005) and CGOPT (SIAM J. Optim. 23(1), 296-320, 2013) indicate that the proposed method is very promising.
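
As a rough illustration of the idea described in the abstract, the sketch below runs a gradient iteration whose trial stepsize approximately minimizes a simple model of $f$ along the steepest-descent direction, switching between a quadratic (Barzilai-Borwein-type) stepsize and a crude conic-flavoured rescaling depending on how close $f$ looks to a quadratic between the two latest iterates. The function name, the closeness test, the rescaling factor, and the Armijo backtracking safeguard are illustrative assumptions, not the formulas used in the paper.

```python
import numpy as np

def gm_aos_sketch(f, grad, x0, tol=1e-6, max_iter=1000):
    """Illustrative sketch (NOT the paper's exact method): a gradient
    iteration x_{k+1} = x_k - alpha_k * g_k whose trial stepsize alpha_k
    approximately minimizes a simple model of f along -g_k, switching
    between a quadratic (BB-type) stepsize and a crude conic-flavoured
    rescaling, with Armijo backtracking as a globalization safeguard."""
    x = np.asarray(x0, dtype=float)
    fx, g = f(x), grad(x)
    x_prev = g_prev = f_prev = None
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if x_prev is None:
            alpha = 1.0 / max(np.linalg.norm(g), 1.0)      # conservative first step
        else:
            s, y = x - x_prev, g - g_prev                  # latest step / gradient change
            sty, yty = s @ y, y @ y
            # Crude test of how close f is to a quadratic on [x_prev, x]:
            # a quadratic satisfies f(x_prev) = f(x) - g.s + 0.5*s.y exactly.
            dev = f_prev - (fx - g @ s + 0.5 * sty)
            if sty <= 0 or yty <= 0 or abs(dev) <= 1e-2 * (abs(fx) + 1e-8):
                alpha = sty / yty if yty > 0 else 1.0      # BB-type quadratic-model stepsize
            else:
                # Conic-flavoured rescaling (placeholder, not the paper's formula):
                # shrink or grow the BB stepsize by the measured deviation from a quadratic.
                rho = (f_prev - fx + g @ s) / (0.5 * sty)
                alpha = (sty / yty) / min(max(rho, 0.5), 2.0)
        # Simple Armijo backtracking stands in for the paper's globalization strategy.
        while f(x - alpha * g) > fx - 1e-4 * alpha * (g @ g):
            alpha *= 0.5
        x_prev, g_prev, f_prev = x, g, fx
        x = x - alpha * g
        fx, g = f(x), grad(x)
    return x, fx

# Toy usage on a smooth strictly convex test function.
if __name__ == "__main__":
    f = lambda x: 0.25 * np.sum(x**4) + 0.5 * np.sum(x**2) - np.sum(x)
    grad = lambda x: x**3 + x - 1.0
    xstar, fstar = gm_aos_sketch(f, grad, np.array([3.0, -2.0, 0.5]))
    print(xstar, fstar)
```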
