arXiv Analytics

arXiv:1704.03535 [math.OC]

On the Pervasiveness of Difference-Convexity in Optimization and Statistics

Maher Nouiehed, Jong-Shi Pang, Meisam Razaviyayn

Published 2017-04-11 (Version 1)

With increasing interest in applying difference-of-convex (dc) optimization to diverse problems in engineering and statistics, this paper establishes the dc property of many well-known functions not previously known to belong to this class. Motivated by a quadratic-programming-based recourse function in two-stage stochastic programming, we show that the (optimal) value function of a copositive (hence not necessarily convex) quadratic program is dc on the domain of finiteness of the program, provided that the matrix in the objective's quadratic term and the constraint matrix are fixed. The proof of this result is based on a dc decomposition of a piecewise LC^1 function (i.e., a function with a Lipschitz gradient). Combining these new results with known properties of dc functions in the literature, we show that many composite statistical functions in risk analysis, including the value-at-risk (VaR), the conditional value-at-risk (CVaR), and the expectation-based, VaR-based, and CVaR-based random deviation functions, are all dc. Together with the known class of dc surrogate sparsity functions employed as approximations of the l_0 function in statistical learning, our results significantly expand the family of dc functions and position them for fruitful applications.
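As an illustrative sketch (not taken from the paper), the dc property of a nonconvex quadratic form can be made explicit by splitting the matrix into positive-semidefinite parts via its eigendecomposition: for symmetric Q, writing Q = Q_plus - Q_minus gives x^T Q x as a difference of two convex quadratics. The function and variable names below are assumptions chosen for the example.

```python
import numpy as np

def dc_split(Q):
    """Split a symmetric matrix Q into positive-semidefinite parts
    Q_plus and Q_minus such that Q = Q_plus - Q_minus.

    Then g(x) = x^T Q_plus x and h(x) = x^T Q_minus x are both convex,
    and the (possibly nonconvex) quadratic x^T Q x equals g(x) - h(x),
    exhibiting an explicit dc decomposition.
    """
    w, V = np.linalg.eigh(Q)                     # eigenvalues w, orthonormal eigenvectors V
    Q_plus = (V * np.maximum(w, 0.0)) @ V.T      # keep the nonnegative eigenvalues
    Q_minus = (V * np.maximum(-w, 0.0)) @ V.T    # keep (negated) negative eigenvalues
    return Q_plus, Q_minus

# Example with an indefinite Q, so x^T Q x is neither convex nor concave.
Q = np.array([[1.0, 2.0],
              [2.0, -3.0]])
Qp, Qm = dc_split(Q)

x = np.array([0.7, -1.2])
f = x @ Q @ x      # original quadratic
g = x @ Qp @ x     # convex part
h = x @ Qm @ x     # convex part
assert np.isclose(f, g - h)
```

The same shift-by-a-convex-quadratic idea underlies the standard fact that any function with a Lipschitz gradient is dc, which is the setting of the piecewise LC^1 decomposition used in the paper.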
