arXiv Analytics

arXiv:1410.8479 [math.OC]

Metric Selection in Douglas-Rachford Splitting and ADMM

Pontus Giselsson, Stephen Boyd

Published 2014-10-30 (Version 1)

The performance of Douglas-Rachford splitting and the alternating direction method of multipliers (ADMM) is sensitive to the choice of algorithm parameters and to the conditioning of the problem data. For a restricted class of problems that enjoy a linear rate of convergence, we show in this paper how to select the algorithm parameters to optimize a bound on the convergence rate. We show that our rate improves and/or generalizes previously known linear convergence rate bounds for these algorithms. For smooth and strongly convex finite-dimensional problems, we show that the linear convergence rate bounds depend on the metric that is used in the algorithm, and we show how to select a metric for which this bound is optimized. Since most real-world problems are not both smooth and strongly convex, we also propose heuristic metric and parameter selection methods to improve the performance of a much wider class of problems. These heuristic methods can be applied to problems arising, e.g., in compressed sensing, statistical estimation, model predictive control, and medical imaging. The efficiency of the proposed heuristics is confirmed in a numerical example on a model predictive control problem, where improvements of more than one order of magnitude are observed.
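To make the parameter-selection question concrete, the sketch below runs Douglas-Rachford splitting on a toy smooth, strongly convex problem. The problem data, the quadratic choice of f and g, and the 200-iteration budget are illustrative assumptions, not from the paper; the step size γ = 1/√(μL) is a standard heuristic for this problem class (μ and L being the strong convexity and smoothness constants of f), used here only to show the kind of parameter tuning the abstract refers to.

```python
import numpy as np

# Toy problem: minimize f(x) + g(x) with
#   f(x) = 0.5 x^T Q x        (mu-strongly convex, L-smooth)
#   g(x) = 0.5 ||x - b||^2
# All problem data below is illustrative, not from the paper.
Q = np.diag([1.0, 10.0, 100.0])   # ill-conditioned: mu = 1, L = 100
b = np.array([1.0, 1.0, 1.0])
mu, L = 1.0, 100.0

# Heuristic step size gamma = 1/sqrt(mu*L): a standard choice that
# optimizes the linear rate bound for this problem class.
gamma = 1.0 / np.sqrt(mu * L)

def prox_f(v, gamma):
    # prox of 0.5 x^T Q x: solve (I + gamma*Q) x = v
    return np.linalg.solve(np.eye(len(v)) + gamma * Q, v)

def prox_g(v, gamma):
    # prox of 0.5 ||x - b||^2: (v + gamma*b) / (1 + gamma)
    return (v + gamma * b) / (1.0 + gamma)

# Douglas-Rachford iteration: x converges to the minimizer of f + g.
z = np.zeros(3)
for _ in range(200):
    x = prox_f(z, gamma)
    y = prox_g(2 * x - z, gamma)
    z = z + (y - x)

# Compare against the closed-form minimizer of f + g: (Q + I) x = b.
x_star = np.linalg.solve(Q + np.eye(3), b)
err = np.linalg.norm(x - x_star)
```

With a poorly chosen γ (e.g. far from 1/√(μL)), the same iteration still converges but the contraction factor degrades sharply as the condition number L/μ grows, which is the sensitivity the abstract addresses.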

Related articles:
arXiv:1904.01415 [math.OC] (Published 2019-03-29)
Synthesis of model predictive control based on data-driven learning
arXiv:1906.05112 [math.OC] (Published 2019-06-12)
Model Predictive Control, Cost Controllability, and Homogeneity
arXiv:1709.05747 [math.OC] (Published 2017-09-18)
Douglas-Rachford splitting and ADMM for nonconvex optimization: new convergence results and accelerated versions