arXiv Analytics

arXiv:1705.02502 [math.OC]

Linearized ADMM for Non-convex Non-smooth Optimization with Convergence Analysis

Qinghua Liu, Xinyue Shen, Yuantao Gu

Published 2017-05-06, Version 1

The linearized alternating direction method of multipliers (ADMM), an extension of ADMM, has been widely used to solve linearly constrained problems in signal processing, machine learning, communications, and many other fields. Despite its broad application to non-convex optimization, for a large class of non-convex and non-smooth objective functions its theoretical convergence guarantee remains an open problem. In this paper, we study the convergence of an existing two-block linearized ADMM and a newly proposed multi-block parallel linearized ADMM for problems with non-convex and non-smooth objectives. We show that both algorithms converge for a broader class of objective functions under weaker assumptions than in previous works. Moreover, the proposed algorithm can update coupled variables in parallel and works for general non-convex problems where the traditional ADMM may have difficulty solving the subproblems.
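To illustrate the basic mechanism the abstract refers to, here is a minimal sketch of two-block linearized ADMM on a convex lasso instance, min (1/2)||Mx − b||² + λ||z||₁ subject to x = z. This is not the paper's algorithm or its non-convex setting; it is a standard textbook-style example, and all names (`linearized_admm_lasso`, the step parameters `mu`, `rho`) are my own. The key point it shows: the smooth term is linearized at the current iterate and a proximal term (μ/2)||x − xₖ||² is added, so the x-update becomes closed-form instead of requiring a linear-system solve, which is what makes the method attractive when exact subproblems are hard.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linearized_admm_lasso(M, b, lam, rho=1.0, iters=1000):
    """Sketch of linearized ADMM for min 0.5||Mx-b||^2 + lam||z||_1, s.t. x = z.

    The x-subproblem is replaced by
        argmin_x <grad_f(x_k), x> + (mu/2)||x - x_k||^2 + (rho/2)||x - z_k + u_k||^2,
    which has the closed-form solution used below. Choosing mu above the
    Lipschitz constant of grad_f (here ||M||_2^2) is a common sufficient
    condition for convergence in the convex case.
    """
    n = M.shape[1]
    mu = 1.1 * np.linalg.norm(M, 2) ** 2  # > Lipschitz constant of grad f
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    for _ in range(iters):
        grad = M.T @ (M @ x - b)
        # linearized x-update: closed form, no linear solve needed
        x = (mu * x - grad + rho * (z - u)) / (mu + rho)
        # z-update: prox of the l1 term
        z = soft_threshold(x + u, lam / rho)
        # dual ascent on the constraint x = z
        u = u + x - z
    return x, z
```

After enough iterations, x and z should nearly agree (primal feasibility) and z should be a sparse approximate minimizer. In the multi-block parallel variant studied in the paper, several coupled blocks are updated simultaneously with the same linearize-plus-proximal idea.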

Related articles: Most relevant | Search more
arXiv:1801.07766 [math.OC] (Published 2018-01-23)
A convergence analysis of the method of codifferential descent
arXiv:1508.03899 [math.OC] (Published 2015-08-17)
Convergence Analysis of Algorithms for DC Programming
arXiv:1702.05142 [math.OC] (Published 2017-02-16)
Exact Diffusion for Distributed Optimization and Learning --- Part II: Convergence Analysis