arXiv:2207.09788 [math.OC]
Incremental Quasi-Newton Algorithms for Solving Nonconvex, Nonsmooth, Finite-Sum Optimization Problems
Gulcin Dinc Yalcin, Frank E. Curtis
Published 2022-07-20 (Version 1)
Algorithms for solving nonconvex, nonsmooth, finite-sum optimization problems are proposed and tested. In particular, the algorithms are designed and evaluated in the context of an optimization problem formulation arising in semi-supervised machine learning. The common feature of all algorithms is that they employ an incremental quasi-Newton (IQN) strategy, specifically an incremental BFGS (IBFGS) strategy. One algorithm applies the IBFGS strategy to the problem directly, whereas the others apply it to a difference-of-convex reformulation, a smoothed approximation, or a (strongly) convex local approximation. Experiments show that all IBFGS approaches fare well in practice, and all outperform a state-of-the-art bundle method.
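As background for the abstract above, the following is a minimal sketch of a generic incremental BFGS loop for a finite-sum objective, not the authors' specific algorithms: the cyclic component selection, the averaging of per-component Hessian approximations and gradients, the fixed step size, and the curvature-skipping safeguard are all illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of a Hessian approximation B given step s and
    gradient change y; skips the update if the curvature condition fails."""
    sy = s @ y
    if sy <= 1e-12:
        return B
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy

def incremental_bfgs(grads, x0, steps=100, alpha=0.1):
    """Generic incremental BFGS sketch for min (1/n) sum_i f_i(x).
    `grads[i](x)` returns the gradient of component f_i at x.
    Each iteration refreshes the stored data of one component and steps
    along the aggregated quasi-Newton direction."""
    n, d = len(grads), x0.size
    x = x0.copy()
    B = [np.eye(d) for _ in range(n)]          # per-component Hessian approximations
    z = [x0.copy() for _ in range(n)]          # per-component reference points
    g = [grads[i](x0) for i in range(n)]       # per-component stored gradients
    for k in range(steps):
        i = k % n                              # cyclic component selection (assumed rule)
        s = x - z[i]
        y = grads[i](x) - g[i]
        B[i] = bfgs_update(B[i], s, y)
        z[i], g[i] = x.copy(), grads[i](x)
        B_agg = sum(B) / n                     # aggregate curvature information
        g_agg = sum(g) / n                     # aggregate stored gradients
        x = x - alpha * np.linalg.solve(B_agg, g_agg)
    return x
```

This kind of loop stores one Hessian approximation and one gradient per component, so each iteration touches only a single component's data while the step still uses second-order information aggregated across all components; handling nonsmoothness, as in the reformulations and approximations the abstract mentions, requires machinery beyond this smooth sketch.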