arXiv:2407.02146 [math.OC]

Coderivative-Based Newton Methods with Wolfe Linesearch for Nonsmooth Optimization

Miantao Chao, Boris S. Mordukhovich, Zijian Shi, Jin Zhang

Published 2024-07-02 (Version 1)

This paper introduces and develops novel coderivative-based Newton methods with Wolfe linesearch conditions to solve various classes of problems in nonsmooth optimization. We first propose a generalized regularized Newton method with Wolfe linesearch (GRNM-W) for unconstrained $C^{1,1}$ minimization problems (which are second-order nonsmooth) and establish global as well as local superlinear convergence of its iterates. To deal with convex composite minimization problems (which are first-order nonsmooth and can be constrained), we combine the proposed GRNM-W with two algorithmic frameworks, the forward-backward envelope and the augmented Lagrangian method, resulting in two new algorithms called CNFB and CNAL, respectively. Finally, we present numerical results for Lasso and support vector machine problems arising in, e.g., machine learning and statistics, which demonstrate the efficiency of the proposed algorithms.
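For a concrete picture of the Newton-plus-Wolfe template, the following is a minimal Python sketch of a generic regularized Newton iteration with a weak Wolfe linesearch, applied to a smooth least-squares test problem. A stepsize $t$ is accepted once it satisfies the Wolfe conditions $f(x_k + t d_k) \le f(x_k) + c_1 t \nabla f(x_k)^\top d_k$ and $\nabla f(x_k + t d_k)^\top d_k \ge c_2 \nabla f(x_k)^\top d_k$. The regularization weight $\mu_k \sim \|\nabla f(x_k)\|^{1/2}$ and the names wolfe_linesearch and regularized_newton are illustrative assumptions; this is not the paper's coderivative-based construction, which operates in the nonsmooth $C^{1,1}$ setting.

import numpy as np

def wolfe_linesearch(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    # Bisection search for a stepsize t satisfying the weak Wolfe conditions:
    # sufficient decrease (Armijo) and the curvature condition.
    fx, slope = f(x), grad(x) @ d   # slope < 0 for a descent direction
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * slope:       # Armijo fails: step too long
            hi = t
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < c2 * slope:       # curvature fails: step too short
            lo = t
            t = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return t                                 # both Wolfe conditions hold
    return t

def regularized_newton(f, grad, hess, x0, tol=1e-8, max_iter=100):
    # Generic regularized Newton iteration: solve (H_k + mu_k I) d = -g_k,
    # then take a Wolfe-accepted step along d.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        mu = np.sqrt(np.linalg.norm(g))              # illustrative weight mu_k ~ ||g_k||^{1/2}
        d = np.linalg.solve(hess(x) + mu * np.eye(x.size), -g)
        x = x + wolfe_linesearch(f, grad, x, d) * d
    return x

# Usage on a smooth least-squares test problem f(x) = ||Ax - b||^2 / 2.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
grad = lambda x: A.T @ (A @ x - b)
hess = lambda x: A.T @ A
x_star = regularized_newton(f, grad, hess, np.zeros(5))
print(np.linalg.norm(grad(x_star)))                  # gradient norm near zero at the solution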

Related articles:
arXiv:1610.03446 [math.OC] (Published 2016-10-11)
Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
arXiv:2505.07143 [math.OC] (Published 2025-05-11)
Subgradient Regularization: A Descent-Oriented Subgradient Method for Nonsmooth Optimization
arXiv:1311.2367 [math.OC] (Published 2013-11-11)
Potentialities of Nonsmooth Optimization