arXiv:1610.01349 [math.OC]

A Fast Gradient Method for Nonnegative Sparse Regression with Self Dictionary

Nicolas Gillis, Robert Luce

Published 2016-10-05 (Version 1)

Nonnegative matrix factorization (NMF) can be computed efficiently under the separability assumption, which asserts that all the columns of the input data matrix belong to the convex cone generated by only a few of its columns. The provably most robust methods to identify these basis columns are based on nonnegative sparse regression with a self dictionary, and require the solution of large-scale convex optimization problems. In this paper we study a particular nonnegative sparse regression model with self dictionary. As opposed to previously proposed models, it is a smooth optimization problem in which sparsity is enforced through appropriate linear constraints. We show that the Euclidean projection onto the set defined by these constraints can be computed efficiently, and propose a fast gradient method to solve our model. We demonstrate the effectiveness of the approach compared to state-of-the-art methods on several synthetic data sets and real-world hyperspectral images.
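
For intuition, the sketch below shows the general shape of a fast (Nesterov-accelerated) projected gradient method applied to the self-dictionary objective f(X) = ½‖M − MX‖_F². This is an illustrative stand-in, not the paper's algorithm: the simple [0, 1] box projection below replaces the paper's sparsity-enforcing linear constraints and their efficient projection, and the function name `fast_projected_gradient` is hypothetical.

```python
# Minimal sketch of a fast (accelerated) projected gradient method for
#   min_X  f(X) = 0.5 * ||M - M X||_F^2   subject to  X in C,
# where M ~ M X is a nonnegative sparse regression with self dictionary.
# ASSUMPTION: the projection onto C is replaced here by a simple clip to
# [0, 1]; the paper's actual linear constraints are not reproduced.

import numpy as np

def fast_projected_gradient(M, X0, n_iter=500):
    """Nesterov-type accelerated projected gradient on f(X) = 0.5*||M - M X||_F^2."""
    MtM = M.T @ M
    L = np.linalg.norm(MtM, 2)            # Lipschitz constant of grad f
    X = X0.copy()
    Y = X0.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = MtM @ Y - MtM               # grad f(Y) = M^T (M Y - M)
        X_new = np.clip(Y - grad / L, 0.0, 1.0)   # stand-in projection
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Y = X_new + ((t - 1.0) / t_new) * (X_new - X)  # momentum step
        X, t = X_new, t_new
    return X

# Usage on synthetic separable-like data: M's columns lie in the cone
# generated by a few columns, so M ~ M X admits a sparse solution.
rng = np.random.default_rng(0)
W = rng.random((20, 3))
H = rng.random((3, 50))
M = W @ H
X = fast_projected_gradient(M, np.zeros((M.shape[1], M.shape[1])))
```

The momentum sequence t and the extrapolated point Y are the standard ingredients of Nesterov's fast gradient method; only the projection step would change when using the paper's constraint set.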

Related articles:
arXiv:1805.09077 [math.OC] (Published 2018-05-23)
Accelerating the Fast Gradient Method
arXiv:2003.05667 [math.OC] (Published 2020-03-12)
Fast Gradient Method for Model Predictive Control with Input Rate and Amplitude Constraints
arXiv:2012.15361 [math.OC] (Published 2020-12-30)
Frank-Wolfe Methods with an Unbounded Feasible Region and Applications to Structured Learning