arXiv Analytics

arXiv:2005.12263 [cs.LG]

Principal Component Analysis Based on T$\ell_1$-norm Maximization

Xiang-Fei Yang, Yuan-Hai Shao, Chun-Na Li, Li-Ming Liu, Nai-Yang Deng

Published 2020-05-23, Version 1

Classical principal component analysis (PCA) may suffer from sensitivity to outliers and noise. Therefore, PCA based on the $\ell_1$-norm and the $\ell_p$-norm ($0 < p < 1$) has been studied. Among these variants, the ones based on the $\ell_p$-norm seem most interesting from the robustness point of view; however, their numerical performance is not satisfactory. Note that, although the T$\ell_1$-norm is similar to the $\ell_p$-norm ($0 < p < 1$) in some sense, it has a stronger suppression effect on outliers and better continuity. We therefore propose PCA based on T$\ell_1$-norm maximization in this paper. Our numerical experiments show that its performance is clearly superior to that of PCA-$\ell_p$ and $\ell_p$SPCA, as well as PCA and PCA-$\ell_1$.
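The outlier-suppression contrast between the T$\ell_1$-norm and the $\ell_p$ quasi-norm can be illustrated numerically. The sketch below assumes the commonly used transformed-$\ell_1$ form $\rho_a(x) = \sum_i (a+1)|x_i|/(a+|x_i|)$; the exact parameterization in the paper may differ, so treat `tl1_norm` and its parameter `a` as illustrative assumptions, not the authors' definition.

```python
import numpy as np

def tl1_norm(x, a=1.0):
    # Transformed l1 (T-ell_1) penalty, summed over entries.
    # Each term (a+1)|x_i| / (a + |x_i|) is bounded above by (a+1),
    # so a single outlier can contribute at most (a+1).
    x = np.abs(np.asarray(x, dtype=float))
    return np.sum((a + 1.0) * x / (a + x))

def lp_power(x, p=0.5):
    # ell_p quasi-norm raised to the p-th power: sum_i |x_i|^p.
    # An outlier's contribution |x_i|^p grows without bound.
    return np.sum(np.abs(np.asarray(x, dtype=float)) ** p)

clean = np.array([1.0, -0.5, 0.2])
dirty = np.array([1.0, -0.5, 100.0])  # one large outlier

tl1_increase = tl1_norm(dirty) - tl1_norm(clean)
lp_increase = lp_power(dirty) - lp_power(clean)
print(tl1_increase)  # stays below a + 1 = 2
print(lp_increase)   # roughly 100**0.5 - 0.2**0.5, an order of magnitude larger
```

The bounded per-entry contribution is what the abstract calls the "stronger suppression effect" on outliers; unlike $|x|^p$, the T$\ell_1$ term also has a bounded derivative away from zero, giving better continuity properties.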

Related articles:
arXiv:2002.10043 [cs.LG] (Published 2020-02-24)
Complete Dictionary Learning via $\ell_p$-norm Maximization
arXiv:2211.03628 [cs.LG] (Published 2022-11-07)
Decentralized Complete Dictionary Learning via $\ell^{4}$-Norm Maximization
arXiv:1906.02435 [cs.LG] (Published 2019-06-06)
Complete Dictionary Learning via $\ell^4$-Norm Maximization over the Orthogonal Group