{ "id": "2007.15263", "version": "v1", "published": "2020-07-30T07:03:37.000Z", "updated": "2020-07-30T07:03:37.000Z", "title": "A projected gradient method for $α\\ell_{1}-β\\ell_{2}$ sparsity regularization", "authors": [ "Liang Ding", "Weimin Han" ], "comment": "30 pages; 8 figures", "categories": [ "math.NA", "cs.NA" ], "abstract": "The non-convex $\\alpha\\|\\cdot\\|_{\\ell_1}-\\beta\\| \\cdot\\|_{\\ell_2}$ $(\\alpha\\ge\\beta\\geq0)$ regularization has attracted attention in the field of sparse recovery. One way to obtain a minimizer of this regularization is the ST-($\\alpha\\ell_1-\\beta\\ell_2$) algorithm which is similar to the classical iterative soft thresholding algorithm (ISTA). It is known that ISTA converges quite slowly, and a faster alternative to ISTA is the projected gradient (PG) method. However, the conventional PG method is limited to the classical $\\ell_1$ sparsity regularization. In this paper, we present two accelerated alternatives to the ST-($\\alpha\\ell_1-\\beta\\ell_2$) algorithm by extending the PG method to the non-convex $\\alpha\\ell_1-\\beta\\ell_2$ sparsity regularization. Moreover, we discuss a strategy to determine the radius $R$ of the $\\ell_1$-ball constraint by Morozov's discrepancy principle. Numerical results are reported to illustrate the efficiency of the proposed approach.", "revisions": [ { "version": "v1", "updated": "2020-07-30T07:03:37.000Z" } ], "analyses": { "subjects": [ "65K10", "G.1.6" ], "keywords": [ "sparsity regularization", "projected gradient method", "iterative soft thresholding algorithm", "morozovs discrepancy principle", "conventional pg method" ], "note": { "typesetting": "TeX", "pages": 30, "language": "en", "license": "arXiv", "status": "editable" } } }