arXiv Analytics

arXiv:1811.10689 [stat.ML]

Sequence Alignment with Dirichlet Process Mixtures

Ieva Kazlauskaite, Ivan Ustyuzhaninov, Carl Henrik Ek, Neill D. F. Campbell

Published 2018-11-26, Version 1

We present a probabilistic model for unsupervised alignment of high-dimensional time-warped sequences based on the Dirichlet Process Mixture Model (DPMM). We follow the approach introduced in (Kazlauskaite, 2018) of representing each data sequence as a composition of a true underlying function with a monotonic time warp, both modelled using Gaussian processes (GPs) (Rasmussen, 2005), and aligning the underlying functions using an unsupervised alignment method. In (Kazlauskaite, 2018) the alignment is performed using the GP latent variable model (GP-LVM) (Lawrence, 2005) as a model of sequences; our main contribution is to replace the GP-LVM with a DPMM, which allows us to align the sequences temporally and cluster them at the same time. We show that the DPMM achieves results competitive with the GP-LVM on synthetic and real-world data sets, and discuss the differing properties of the estimated underlying functions and the time warps favoured by the two models.
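The generative assumption underlying this family of models can be sketched in a few lines: each observed sequence is a shared underlying function evaluated through a per-sequence monotonic time warp, with both the function and the warp drawn from GP priors. The following is a minimal numpy illustration of that forward model only (not the authors' inference code); all function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=0.3, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    d = x[:, None] - y[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gp(t, rng, lengthscale=0.3):
    """Draw one sample path from a zero-mean GP with an RBF kernel at inputs t."""
    K = rbf_kernel(t, t, lengthscale) + 1e-8 * np.eye(len(t))  # jitter for stability
    return rng.multivariate_normal(np.zeros(len(t)), K)

def monotone_warp(t, rng):
    """Build a monotonic warp of [0, 1] by integrating a positive 'speed' function."""
    g = np.exp(sample_gp(t, rng, lengthscale=0.5))  # exponentiate to ensure positivity
    w = np.cumsum(g)
    return (w - w[0]) / (w[-1] - w[0])              # normalise endpoints to 0 and 1

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)

f_true = sample_gp(t, rng)   # the shared underlying function
sequences = []
for _ in range(3):
    w = monotone_warp(t, rng)                        # per-sequence time warp
    y = np.interp(w, t, f_true)                      # compose: f(w(t))
    y += 0.05 * rng.standard_normal(len(t))          # observation noise
    sequences.append(y)
```

Inference then inverts this process: it recovers the warps that best align the estimated underlying functions, with the DPMM additionally assigning each aligned function to a cluster.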

Comments: 6 pages, 3 figures, "All Of Bayesian Nonparametrics" Workshop at the 32nd Annual Conference on Neural Information Processing Systems (BNP@NeurIPS2018)
Categories: stat.ML, cs.LG