arXiv:1705.06164 [math.OC]
A general framework for solving convex optimization problems involving the sum of three convex functions
Published 2017-05-17, Version 1
In this paper, we consider a class of convex optimization problems that minimize the sum of three convex functions, $f(x)+g(x)+h(Bx)$, where $f$ is differentiable with a Lipschitz continuous gradient, $g$ and $h$ have proximity operators with closed-form expressions, and $B$ is a bounded linear operator. This type of optimization problem has wide applications in signal and image processing. To make full use of the differentiability of $f$, we take advantage of two operator splitting methods: the forward-backward splitting method and the three-operator splitting method. The iteration schemes derived from these two splitting methods require computing the proximity operators of $g+h \circ B$ and $h \circ B$, respectively. Although these proximity operators do not have closed-form solutions, they can be computed very efficiently. We employ two different approaches to computing them: one dual and the other primal-dual. In this way, we find that three existing iterative algorithms, namely Condat and Vu's algorithm, the primal-dual fixed point (PDFP) algorithm, and the primal-dual three-operator (PD3O) algorithm, are special cases of our proposed iterative algorithms. Moreover, we discover a new iterative algorithm for the considered optimization problem that is not covered by the existing ones. Numerical experiments on the fused Lasso problem and on constrained total variation regularization in computed tomography image reconstruction demonstrate the effectiveness and efficiency of the proposed iterative algorithms.
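To illustrate the forward-backward idea the abstract builds on, the sketch below applies a generic proximal gradient iteration to a simplified instance with $h = 0$ and $B = I$: minimize $\tfrac{1}{2}\|Ax-b\|^2 + \lambda\|x\|_1$, where the smooth term plays the role of $f$ and the $\ell_1$ penalty the role of $g$. This is not the paper's proposed algorithm; the function names `prox_l1` and `forward_backward` and the fixed step size are illustrative choices for this sketch.

```python
import numpy as np

def prox_l1(v, t):
    """Proximity operator of t * ||.||_1, i.e. soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, step, n_iter=500):
    """Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Each iteration takes a gradient (forward) step on the smooth term
    f(x) = 0.5*||Ax - b||^2, then a proximal (backward) step on the
    nonsmooth term g(x) = lam*||x||_1. The step size should satisfy
    step < 2/L, where L is the largest eigenvalue of A^T A.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                   # forward (gradient) step
        x = prox_l1(x - step * grad, step * lam)   # backward (proximal) step
    return x
```

When $A = I$ the iteration reaches the exact solution, soft-thresholding of $b$, in a single step, which makes the scheme easy to sanity-check before moving to nontrivial operators.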