arXiv:2308.01578 [cs.LG]

Unsupervised Representation Learning for Time Series: A Review

Qianwen Meng, Hangwei Qian, Yong Liu, Yonghui Xu, Zhiqi Shen, Lizhen Cui

Published 2023-08-03 (Version 1)

Unsupervised representation learning approaches aim to learn discriminative feature representations from unlabeled data, without requiring every sample to be annotated. Such approaches are particularly important for time series, which face a distinct annotation bottleneck: their complex characteristics and lack of visual cues make labeling far harder than in other data modalities. Although unsupervised representation learning techniques have advanced rapidly across many domains in recent years, there has been no systematic analysis of these approaches for time series. To fill this gap, we conduct a comprehensive literature review of this rapidly evolving body of work. We also develop a unified and standardized library, named ULTS (Unsupervised Learning for Time Series), to facilitate fast implementation and unified evaluation of different models. With ULTS, we empirically evaluate state-of-the-art approaches, especially the rapidly evolving contrastive learning methods, on nine diverse real-world datasets. We further discuss practical considerations as well as open research challenges in unsupervised representation learning for time series to guide future research in this field.
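Contrastive learning, the family of methods the review benchmarks most heavily, trains an encoder so that two augmented views of the same series map to nearby embeddings while other series in the batch are pushed apart. The sketch below illustrates this idea with a toy PyTorch encoder and an InfoNCE (NT-Xent) loss; it is a minimal illustration under assumed names and hyperparameters, not the ULTS API or any particular method from the survey.

```python
# Minimal sketch (NOT the ULTS API): contrastive (InfoNCE / NT-Xent) pretraining
# for time series. All module names, augmentations, and hyperparameters are
# illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TSEncoder(nn.Module):
    """Toy 1D-CNN encoder mapping (batch, channels, length) to an embedding."""
    def __init__(self, in_channels: int = 1, emb_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # global temporal pooling
        )
        self.proj = nn.Linear(64, emb_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.net(x).squeeze(-1)    # (batch, 64)
        return self.proj(h)            # (batch, emb_dim)

def jitter(x: torch.Tensor, sigma: float = 0.05) -> torch.Tensor:
    """A simple time-series augmentation: additive Gaussian noise."""
    return x + sigma * torch.randn_like(x)

def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.1) -> torch.Tensor:
    """NT-Xent loss: the two views of each sample are positives,
    all other samples in the batch serve as negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau                 # (batch, batch) cosine similarities
    labels = torch.arange(z1.size(0))          # positive pairs lie on the diagonal
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.t(), labels))

if __name__ == "__main__":
    encoder = TSEncoder()
    opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
    x = torch.randn(16, 1, 128)   # unlabeled batch: 16 univariate series of length 128
    loss = info_nce(encoder(jitter(x)), encoder(jitter(x)))
    loss.backward()
    opt.step()
    print(f"contrastive loss: {loss.item():.4f}")
```

In practice, the choice of augmentations (jittering, scaling, cropping, masking) is a central design decision for time series, since, unlike images, they offer few visually obvious invariances to exploit.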

Related articles:
arXiv:2104.08556 [cs.LG] (Published 2021-04-17)
Recursive input and state estimation: A general framework for learning from time series with missing data
arXiv:2310.01720 [cs.LG] (Published 2023-10-03)
PrACTiS: Perceiver-Attentional Copulas for Time Series
arXiv:2105.08179 [cs.LG] (Published 2021-05-17, updated 2021-05-21)
Learning Disentangled Representations for Time Series