{ "id": "1703.05038", "version": "v1", "published": "2017-03-15T09:26:52.000Z", "updated": "2017-03-15T09:26:52.000Z", "title": "Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery", "authors": [ "Christian Kümmerle", "Juliane Sigl" ], "comment": "46 pages, 4 figures", "categories": [ "math.NA" ], "abstract": "We propose a new iteratively reweighted least squares (IRLS) algorithm for the recovery of a matrix $X \\in \\mathbb{C}^{d_1\\times d_2}$ of rank $r \\ll\\min(d_1,d_2)$ from incomplete linear observations, solving a sequence of low complexity linear problems. The easily implementable algorithm, which we call harmonic mean iteratively reweighted least squares (HM-IRLS), optimizes a non-convex Schatten-$p$ quasi-norm penalization to promote low-rankness and carries three major strengths, in particular for the matrix completion setting. First, the algorithm converges globally to the low-rank matrix for relevant, interesting cases, for which any other (non-)convex state-of-the-art optimization approach fails the recovery. Secondly, HM-IRLS exhibits an empirical recovery probability close to $100\\%$ even for a number of measurements very close to the theoretical lower bound $r (d_1 +d_2 -r)$, i.e., already for significantly fewer linear observations than any other tractable approach in the literature. Thirdly, HM-IRLS exhibits a locally superlinear rate of convergence (of order $2-p$) if the linear observations fulfill a suitable null space property. While for the first two properties we have so far only strong empirical evidence, we prove the third property as our main theoretical result.", "revisions": [ { "version": "v1", "updated": "2017-03-15T09:26:52.000Z" } ], "analyses": { "keywords": [ "low-rank matrix recovery", "harmonic mean", "convex state-of-the-art optimization approach fails", "low complexity linear problems", "incomplete linear observations" ], "note": { "typesetting": "TeX", "pages": 46, "language": "en", "license": "arXiv", "status": "editable" } } }