arXiv Analytics

arXiv:2007.10626 [stat.ML]

Sparse Nonnegative Tensor Factorization and Completion with Noisy Observations

Xiongjun Zhang, Michael K. Ng

Published 2020-07-21, Version 1

In this paper, we study the sparse nonnegative tensor factorization and completion problem from partial and noisy observations for third-order tensors. Owing to sparsity and nonnegativity, the underlying tensor is decomposed into the tensor-tensor product of one sparse nonnegative tensor and one nonnegative tensor. We propose to minimize the sum of the maximum likelihood estimate for the observations, with nonnegativity constraints, and the tensor $\ell_0$ norm of the sparse factor. We establish error bounds for the estimator of the proposed model under general noisy observations, and derive detailed error bounds for specific noise distributions, including additive Gaussian noise, additive Laplace noise, and Poisson observations. Moreover, minimax lower bounds are shown to match the established upper bounds up to a logarithmic factor in the sizes of the underlying tensor. These theoretical results for tensors are better than those obtained for matrices, which illustrates the advantage of nonnegative sparse tensor models for completion and denoising. Numerical experiments are provided to validate the superiority of the proposed tensor-based method over the matrix-based approach.
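The decomposition above relies on the tensor-tensor product (t-product) for third-order tensors. As a minimal sketch (not the paper's implementation), the t-product is commonly computed by taking the FFT along the third mode, multiplying the frontal slices facewise, and inverting the transform; the function name `t_product` below is illustrative:

```python
import numpy as np

def t_product(A, B):
    """t-product of third-order tensors.

    A has shape (n1, n2, n3) and B has shape (n2, n4, n3); the result
    has shape (n1, n4, n3). Computed facewise in the Fourier domain
    along the third mode, per the standard t-product definition.
    """
    n3 = A.shape[2]
    Ah = np.fft.fft(A, axis=2)          # transform tubes of A
    Bh = np.fft.fft(B, axis=2)          # transform tubes of B
    Ch = np.empty((A.shape[0], B.shape[1], n3), dtype=complex)
    for k in range(n3):
        # ordinary matrix product of the k-th frontal slices
        Ch[:, :, k] = Ah[:, :, k] @ Bh[:, :, k]
    # inverse transform; the result is real for real inputs
    return np.real(np.fft.ifft(Ch, axis=2))
```

In the paper's model, one factor in this product is constrained to be sparse and nonnegative and the other nonnegative; the sketch only shows the product itself, which equals ordinary matrix multiplication when n3 = 1.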

Related articles:
arXiv:2001.06270 [stat.ML] (Published 2020-01-17)
Bayesian inference of dynamics from partial and noisy observations using data assimilation and machine learning
arXiv:1504.00052 [stat.ML] (Published 2015-03-31)
Improved Error Bounds Based on Worst Likely Assignments
arXiv:2408.09004 [stat.ML] (Published 2024-08-16)
Error Bounds for Learning Fourier Linear Operators