arXiv Analytics

arXiv:1608.08586 [math.OC]

The role of convexity on saddle-point dynamics: Lyapunov function and robustness

Ashish Cherukuri, Enrique Mallada, Steven Low, Jorge Cortes

Published 2016-08-30, Version 1

This paper studies the projected saddle-point dynamics associated with a convex-concave function, which we term the saddle function. The dynamics consists of gradient descent of the saddle function in the variables corresponding to convexity and (projected) gradient ascent in the variables corresponding to concavity. Under the assumption that the saddle function is twice continuously differentiable, we provide a novel characterization of the omega-limit set of the trajectories of this dynamics in terms of the diagonal blocks of the Hessian. Using this characterization, we establish global asymptotic convergence of the dynamics under local strong convexity-concavity of the saddle function. When strong convexity-concavity holds globally, we establish three results. First, we identify a Lyapunov function for the projected saddle-point dynamics when the saddle function corresponds to the Lagrangian of a general constrained optimization problem. Second, when the saddle function is the Lagrangian of an optimization problem with equality constraints, we show input-to-state stability of the saddle-point dynamics by providing an ISS Lyapunov function. Third, we design an opportunistic state-triggered implementation of the dynamics. Various examples illustrate our results.
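To illustrate the dynamics described in the abstract, here is a minimal sketch (not taken from the paper): a forward-Euler simulation of the saddle-point dynamics for the Lagrangian of a toy equality-constrained problem, minimize (1/2)x^2 subject to x = 1, with L(x, lam) = (1/2)x^2 + lam(x - 1). The problem, step size, and variable names are illustrative assumptions; the saddle point of this L is (x*, lam*) = (1, -1).

```python
# Sketch: Euler discretization of the saddle-point dynamics
#   xdot   = -grad_x L   (descent in the convex variable)
#   lamdot = +grad_lam L (ascent in the concave variable)
# for the hypothetical example L(x, lam) = 0.5*x**2 + lam*(x - 1),
# the Lagrangian of: minimize 0.5*x**2 subject to x = 1.

def saddle_point_dynamics(x0, lam0, step=0.01, iters=5000):
    x, lam = x0, lam0
    for _ in range(iters):
        gx = x + lam        # grad_x L = x + lam
        glam = x - 1.0      # grad_lam L = x - 1
        x -= step * gx      # gradient descent in x
        lam += step * glam  # gradient ascent in lam
    return x, lam

x, lam = saddle_point_dynamics(0.0, 0.0)
print(round(x, 3), round(lam, 3))  # approaches the saddle point (1, -1)
```

Since this saddle function is strongly convex in x, the trajectory converges to the saddle point, consistent with the global-asymptotic-convergence results the abstract describes; the projection step of the paper's projected dynamics is omitted here because this toy problem has no inequality constraints.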

Related articles:
arXiv:1806.08620 [math.OC] (Published 2018-06-22)
On the Robustness and Scalability of Semidefinite Relaxation for Optimal Power Flow Problems
arXiv:2206.04020 [math.OC] (Published 2022-06-08)
Penalty methods to compute stationary solutions for constrained optimization problems
arXiv:1209.4433 [math.OC] (Published 2012-09-20, updated 2013-03-18)
Transverse Contraction Criteria for Existence, Stability, and Robustness of a Limit Cycle