arXiv:1508.06388 [stat.ML]

Gaussian Mixture Models with Component Means Constrained in Pre-selected Subspaces

Mu Qiao, Jia Li

Published 2015-08-26, Version 1

We investigate a Gaussian mixture model (GMM) with component means constrained in a pre-selected subspace, and explore applications to classification and clustering. An EM-type estimation algorithm is derived. We prove that the subspace containing the component means of a GMM with a common covariance matrix also contains the modes of the density and the class means. This motivates us to find a subspace by applying weighted principal component analysis to the modes of a kernel density and the class means. To circumvent the difficulty of choosing the kernel bandwidth, we acquire multiple subspaces from kernel densities based on a sequence of bandwidths. The GMM constrained by each subspace is estimated, and the model yielding the maximum likelihood is chosen. A dimension reduction property is proved in the sense of being informative for classification or clustering. Experiments on real and simulated data sets examine several ways of determining the subspace and compare our approach with reduced rank mixture discriminant analysis (MDA). Our new method, with the simple technique of spanning the subspace only by the class means, often outperforms reduced rank MDA when the subspace dimension is very low, making it particularly appealing for visualization.
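The "simple technique" the abstract highlights, spanning the subspace only by the class means and then fitting a GMM whose component means are constrained to that subspace, can be sketched as follows. This is a hedged illustration, not the paper's algorithm: it uses synthetic data, a spherical (rather than general common) covariance, and the simplification that, for a spherical covariance, the constrained M-step reduces to projecting the unconstrained mean update onto the subspace. All variable names (`V`, `em_constrained_gmm`, etc.) are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: three classes in 5-D (illustrative only; not the paper's data).
true_means = np.array([[3, 0, 0, 0, 0],
                       [0, 3, 0, 0, 0],
                       [0, 0, 3, 0, 0]], dtype=float)
X = np.vstack([rng.normal(m, 1.0, size=(100, 5)) for m in true_means])
y = np.repeat([0, 1, 2], 100)

# Subspace spanned by the centered class means -- the "simple technique"
# mentioned in the abstract.  Columns of V form an orthonormal basis;
# three centered class means span at most a 2-D subspace.
mu = X.mean(axis=0)
M = np.array([X[y == k].mean(axis=0) - mu for k in range(3)])
V = np.linalg.svd(M.T, full_matrices=False)[0][:, :2]

def em_constrained_gmm(X, V, mu, K=3, n_iter=50):
    """EM for a spherical GMM whose component means are constrained to the
    affine subspace mu + span(V).  With a spherical covariance the
    constrained M-step is a projection of the unconstrained mean update
    onto the subspace (a simplification of the paper's EM algorithm)."""
    n, p = X.shape
    P = V @ V.T                                   # projector onto span(V)
    centers = X[np.random.default_rng(1).choice(n, K, replace=False)]
    centers = mu + (centers - mu) @ P             # start inside the subspace
    w = np.full(K, 1.0 / K)
    sigma2 = 1.0
    for _ in range(n_iter):
        # E-step: responsibilities under K spherical Gaussians.
        d2 = ((X[:, None, :] - centers[None]) ** 2).sum(axis=-1)
        logr = np.log(w) - 0.5 * d2 / sigma2 - 0.5 * p * np.log(sigma2)
        logr -= logr.max(axis=1, keepdims=True)
        r = np.exp(logr)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: unconstrained updates, then enforce the subspace constraint.
        nk = r.sum(axis=0)
        w = nk / n
        raw = (r.T @ X) / nk[:, None]
        centers = mu + (raw - mu) @ P             # subspace constraint
        d2 = ((X[:, None, :] - centers[None]) ** 2).sum(axis=-1)
        sigma2 = (r * d2).sum() / (n * p)
    return centers, w, sigma2

centers, w, sigma2 = em_constrained_gmm(X, V, mu)
# The fitted component means lie in the affine subspace mu + span(V):
resid = (centers - mu) - (centers - mu) @ (V @ V.T)
print(np.abs(resid).max())
```

For a general common covariance, as in the paper, the constrained M-step is no longer a plain orthogonal projection; the sketch only conveys the structure of the constraint and why a low-dimensional subspace (here 2-D) suffices for visualization.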

Related articles:
arXiv:2411.05591 [stat.ML] (Published 2024-11-08)
Network EM Algorithm for Gaussian Mixture Model in Decentralized Federated Learning
arXiv:2310.10843 [stat.ML] (Published 2023-10-16)
Probabilistic Classification by Density Estimation Using Gaussian Mixture Model and Masked Autoregressive Flow
arXiv:2209.15224 [stat.ML] (Published 2022-09-30)
Unsupervised Multi-task and Transfer Learning on Gaussian Mixture Models