arXiv Analytics

arXiv:1804.05098 [math.OC]

On the Differentiability of the Solution to Convex Optimization Problems

Shane Barratt

Published 2018-04-13 (Version 1)

In this paper, we provide conditions under which one can take derivatives of the solution to a convex optimization problem with respect to the problem data. These conditions are that Slater's condition holds, that the functions involved are twice differentiable, and that a certain Jacobian is nonsingular. The derivation applies the implicit function theorem to the KKT system, whose satisfaction is necessary and sufficient for optimality.
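
The following is a minimal numerical sketch of the idea described in the abstract, not the paper's own code. It treats the simplest special case, an equality-constrained quadratic program (so Slater's condition and complementarity concerns are moot), and differentiates the solution with respect to the linear-term data by applying the implicit function theorem to the KKT residual G(z, theta) = 0, i.e. dz/dtheta = -(dG/dz)^{-1} dG/dtheta. All names (G, z, theta) are illustrative assumptions.

```python
# Problem: minimize (1/2)||x||^2 - theta^T x  subject to  A x = b.
# KKT conditions:
#   x - theta + A^T nu = 0   (stationarity)
#   A x - b = 0              (primal feasibility)
# The implicit function theorem on this system gives
#   d(x, nu)/dtheta = -(dG/dz)^{-1} dG/dtheta,
# where dG/dz is the KKT matrix and dG/dtheta = [[-I], [0]].
import numpy as np

n = 4
A = np.ones((1, n))                         # single constraint: sum(x) = 1
b = np.array([1.0])
theta = np.array([0.3, -1.2, 0.7, 2.0])     # problem data we differentiate w.r.t.

# Solve the (linear) KKT system for the primal-dual solution z = (x, nu).
K = np.block([[np.eye(n), A.T],
              [A, np.zeros((1, 1))]])
z = np.linalg.solve(K, np.concatenate([theta, b]))
x, nu = z[:n], z[n:]

# Implicit differentiation through the KKT conditions.
dG_dtheta = np.vstack([-np.eye(n), np.zeros((1, n))])
dz_dtheta = -np.linalg.solve(K, dG_dtheta)
dx_dtheta = dz_dtheta[:n, :]

# For this problem the Jacobian is known in closed form: I - (1/n) 1 1^T.
assert np.allclose(dx_dtheta, np.eye(n) - np.ones((n, n)) / n)
print(dx_dtheta)
```

For general convex problems with inequality constraints, the same recipe applies to the full KKT residual (stationarity, primal feasibility, and complementary slackness), which is where the nonsingular-Jacobian condition mentioned in the abstract comes in.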

Related articles:
arXiv:2406.09786 [math.OC] (Published 2024-06-14)
Convergence analysis of a regularized Newton method with generalized regularization terms for convex optimization problems
arXiv:2505.09030 [math.OC] (Published 2025-05-13)
Aging-Aware Battery Control via Convex Optimization
arXiv:1302.1056 [math.OC] (Published 2013-02-05, updated 2014-12-23)
A generalization of Löwner-John's ellipsoid theorem