arXiv:1911.04285 [stat.ML]

Maximum a-Posteriori Estimation for the Gaussian Mixture Model via Mixed Integer Nonlinear Programming

Patrick Flaherty, Pitchaya Wiratchotisatian, Ji Ah Lee, Andrew C. Trapp

Published 2019-11-08 (Version 1)

We present a global optimization approach to the classical maximum a-posteriori (MAP) estimation problem for the Gaussian mixture model. Our approach formulates MAP estimation as a mixed-integer nonlinear program (MINLP). The method provides a certificate of global optimality, can accommodate side constraints, and extends to other finite mixture models. We propose an approximation to the MINLP that transforms it into a mixed-integer quadratic program (MIQP), which preserves global optimality within a desired accuracy while improving computational performance. Numerical experiments compare our method to standard estimation approaches and show that it finds the globally optimal MAP estimate on some standard data sets, providing a benchmark for comparing estimation methods.
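To make the combinatorial core of such a formulation concrete, here is a minimal, hypothetical sketch (not the authors' MINLP, which would be solved with a dedicated solver): the discrete decision in MAP estimation for a mixture model is which component each point is assigned to, encoded in the MINLP by binary variables. For a toy 1-D instance with unit variances and uniform weights, one can enumerate every assignment, solve the inner continuous problem in closed form (cluster means), and recover the globally optimal hard assignment by brute force.

```python
import itertools
import math

def complete_loglik(x, z, k):
    # Given hard assignments z, the inner continuous optimum sets each
    # component mean to its cluster average; score the complete-data
    # log-likelihood assuming unit variances and uniform weights.
    means = []
    for j in range(k):
        members = [xi for xi, zi in zip(x, z) if zi == j]
        if not members:  # empty component: treat as infeasible
            return -math.inf, None
        means.append(sum(members) / len(members))
    ll = sum(-0.5 * (xi - means[zi]) ** 2 for xi, zi in zip(x, z))
    return ll, means

def global_map_assignment(x, k):
    # Exhaustive search over the binary assignment variables -- the same
    # discrete choice a MINLP formulation encodes, but brute-forced here,
    # so it only scales to toy instances (k**n assignments).
    best_ll, best_z, best_means = -math.inf, None, None
    for z in itertools.product(range(k), repeat=len(x)):
        ll, means = complete_loglik(x, z, k)
        if ll > best_ll:
            best_ll, best_z, best_means = ll, z, means
    return best_ll, best_z, best_means

x = [0.0, 0.2, 5.0, 5.3]  # two well-separated toy clusters
ll, z, means = global_map_assignment(x, k=2)
```

Unlike EM, which can stall at a local optimum depending on initialization, this enumeration certifies global optimality by construction; the MINLP/MIQP machinery in the paper obtains the same guarantee without enumerating all assignments.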

Related articles:
arXiv:2209.15224 [stat.ML] (Published 2022-09-30)
Unsupervised Multi-task and Transfer Learning on Gaussian Mixture Models
arXiv:1508.06388 [stat.ML] (Published 2015-08-26)
Gaussian Mixture Models with Component Means Constrained in Pre-selected Subspaces
arXiv:2411.05591 [stat.ML] (Published 2024-11-08)
Network EM Algorithm for Gaussian Mixture Model in Decentralized Federated Learning