arXiv Analytics

arXiv:2110.12595 [stat.ML]

Fast Rank-1 NMF for Missing Data with KL Divergence

Kazu Ghalamkari, Mahito Sugiyama

Published 2021-10-25, updated 2022-02-18 (version 2)

We propose a fast non-gradient-based method of rank-1 non-negative matrix factorization (NMF) for missing data, called A1GM, that minimizes the KL divergence from an input matrix to the reconstructed rank-1 matrix. Our method is based on our new finding of an analytical closed-form solution for the best rank-1 non-negative multiple matrix factorization (NMMF), a variant of NMF. NMMF is known to exactly solve NMF for missing data if the positions of the missing values satisfy a certain condition, and A1GM transforms a given matrix so that the analytical solution to NMMF can be applied. We empirically show that A1GM is more efficient than a gradient-based method while achieving competitive reconstruction errors.
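The abstract's central ingredient, a closed-form best rank-1 approximation under KL divergence, can be illustrated in the complete-data case: the minimizer of KL(X || X_hat) over non-negative rank-1 matrices is the outer product of the row sums and column sums divided by the grand total. The sketch below is a minimal illustration of that well-known formula only, not of A1GM itself; the paper's handling of missing entries via the NMMF closed form is not reproduced, and the function name is a placeholder.

```python
import numpy as np

def best_rank1_kl(X):
    """Best rank-1 approximation of a non-negative matrix X under
    KL divergence D(X || X_hat), assuming no missing entries.

    Uses the classical closed form: outer product of row and column
    sums divided by the grand total. (A1GM's treatment of missing
    values via NMMF is not implemented here.)
    """
    row = X.sum(axis=1)    # row marginals
    col = X.sum(axis=0)    # column marginals
    total = X.sum()        # grand total
    return np.outer(row, col) / total

# Example: a small non-negative matrix
X = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 1.0, 2.0]])
X_hat = best_rank1_kl(X)
print(np.linalg.matrix_rank(X_hat))  # 1
```

Because the formula involves only row sums, column sums, and the grand total, the reconstruction is obtained in a single pass over the matrix, which is the source of the speed advantage over iterative gradient updates claimed in the abstract.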

Comments: 16 pages, 5 figures, accepted to the 25th International Conference on Artificial Intelligence and Statistics (AISTATS 2022)
Categories: stat.ML, cs.LG
ACM classes: I.2.6
Related articles:
arXiv:2205.03820 [stat.ML] (Published 2022-05-08)
Some performance considerations when using multi-armed bandit algorithms in the presence of missing data
arXiv:2201.12020 [stat.ML] (Published 2022-01-28)
A Robust and Flexible EM Algorithm for Mixtures of Elliptical Distributions with Missing Data
arXiv:2104.03158 [stat.ML] (Published 2021-04-07)
Prediction with Missing Data