arXiv:2207.09788 [math.OC]

Incremental Quasi-Newton Algorithms for Solving Nonconvex, Nonsmooth, Finite-Sum Optimization Problems

Gulcin Dinc Yalcin, Frank E. Curtis

Published 2022-07-20 (Version 1)

Algorithms for solving nonconvex, nonsmooth, finite-sum optimization problems are proposed and tested. In particular, the algorithms are proposed and tested in the context of an optimization problem formulation arising in semi-supervised machine learning. The common feature of all algorithms is that they employ an incremental quasi-Newton (IQN) strategy, specifically an incremental BFGS (IBFGS) strategy. One applies an IBFGS strategy to the problem directly, whereas the others apply an IBFGS strategy to a difference-of-convex reformulation, smoothed approximation, or (strongly) convex local approximation. Experiments show that all IBFGS approaches fare well in practice, and all outperform a state-of-the-art bundle method.
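To make the incremental quasi-Newton idea concrete, the following is a minimal sketch of an incremental BFGS iteration on a smooth, strongly convex finite-sum toy problem (in the style of standard IQN methods: per-component Hessian approximations refreshed one at a time). The quadratic components, the cyclic component selection, and all variable names are illustrative assumptions, not the authors' algorithm, which additionally handles nonconvexity and nonsmoothness.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of B with pair (s, y); skipped when the
    curvature condition s^T y > 0 fails (a common safeguard)."""
    sy = s @ y
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return B
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy

# Hypothetical toy problem: f(x) = (1/n) sum_i f_i(x) with quadratic
# components f_i(x) = 0.5 x^T A_i x - b_i^T x (each A_i positive definite).
rng = np.random.default_rng(0)
n, d = 5, 3
As = [M @ M.T + np.eye(d) for M in rng.standard_normal((n, d, d))]
bs = rng.standard_normal((n, d))
grad = lambda i, x: As[i] @ x - bs[i]

x = np.zeros(d)
z = [x.copy() for _ in range(n)]    # last point at which each f_i was visited
g = [grad(i, x) for i in range(n)]  # stored component gradients
B = [np.eye(d) for _ in range(n)]   # per-component Hessian approximations

for k in range(200):
    i = k % n                       # visit one component per iteration (cyclic)
    g_new = grad(i, x)
    B[i] = bfgs_update(B[i], x - z[i], g_new - g[i])
    z[i], g[i] = x.copy(), g_new
    # Aggregated quasi-Newton step built from all stored component models
    x = np.linalg.solve(sum(B), sum(B[j] @ z[j] - g[j] for j in range(n)))

# Compare against the exact minimizer of the average quadratic
x_star = np.linalg.solve(sum(As), sum(bs))
```

The key property on display is that each iteration touches only one component's gradient and Hessian approximation, yet the step uses curvature information aggregated across all components.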

Related articles:
arXiv:2103.08280 [math.OC] (Published 2021-03-15)
Lower Complexity Bounds of Finite-Sum Optimization Problems: The Results and Construction
arXiv:2403.07806 [math.OC] (Published 2024-03-12)
A Stochastic GDA Method With Backtracking For Solving Nonconvex (Strongly) Concave Minimax Problems
arXiv:1611.04982 [math.OC] (Published 2016-11-15)
Oracle Complexity of Second-Order Methods for Finite-Sum Problems