arXiv:2310.01720 [cs.LG]

PrACTiS: Perceiver-Attentional Copulas for Time Series

Cat P. Le, Chris Cannella, Ali Hasan, Yuting Ng, Vahid Tarokh

Published 2023-10-03 (Version 1)

Transformers incorporating copula structures have demonstrated remarkable performance in time series prediction. However, their heavy reliance on self-attention mechanisms demands substantial computational resources, limiting their practical utility across a wide range of tasks. In this work, we present a model that combines the perceiver architecture with a copula structure to enhance time series forecasting. Using the perceiver as the encoder, we efficiently transform complex, high-dimensional, multimodal data into a compact latent space, thereby significantly reducing computational demands. To further reduce complexity, we introduce midpoint inference and local attention mechanisms, enabling the model to effectively capture dependencies within imputed samples. We then apply copula-based attention and an output variance testing mechanism to capture the joint distribution of the missing data while mitigating error propagation during prediction. Our experimental results on unimodal and multimodal benchmarks show a consistent 20% improvement over state-of-the-art methods while using less than half of the available memory.
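To make the encoder/decoder split in the abstract concrete, here is a minimal PyTorch sketch (not the authors' code) of the general pattern described: a perceiver-style encoder that cross-attends a small set of learned latent vectors to a long input series, followed by a decoder that attends from the positions to be forecast back to those latents, so attention cost scales with the latent count rather than the input length. All names, sizes, and the plain Gaussian output head are illustrative assumptions; the paper's midpoint inference, local attention, and copula-based attention components are more involved and are not reproduced here.

```python
# Minimal sketch of a perceiver-style forecaster. Hypothetical names and
# hyperparameters throughout; the Gaussian (mean, log-scale) head stands in
# for the paper's copula-based output mechanism.
import torch
import torch.nn as nn


class PerceiverForecaster(nn.Module):
    def __init__(self, d_model=64, n_latents=16, n_heads=4):
        super().__init__()
        # Learned latent array: the input is compressed into n_latents
        # vectors, so downstream attention scales with n_latents instead
        # of the (possibly long) observed-series length.
        self.latents = nn.Parameter(torch.randn(n_latents, d_model) * 0.02)
        self.embed = nn.Linear(1, d_model)        # scalar series -> d_model
        self.encode = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.decode = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.query_embed = nn.Linear(1, d_model)  # embeds target time stamps
        self.head = nn.Linear(d_model, 2)         # predicts (mean, log-scale)

    def forward(self, observed, target_times):
        # observed:     (batch, T_obs, 1) past values
        # target_times: (batch, T_tgt, 1) normalized times to forecast
        b = observed.shape[0]
        lat = self.latents.expand(b, -1, -1)
        x = self.embed(observed)
        # Perceiver step: latents (queries) cross-attend to the input.
        lat, _ = self.encode(lat, x, x)
        # Decoder step: forecast queries cross-attend to the latents.
        q = self.query_embed(target_times)
        h, _ = self.decode(q, lat, lat)
        mean, log_scale = self.head(h).chunk(2, dim=-1)
        return mean, log_scale


# Usage: forecast 8 future points from 96 observed ones.
model = PerceiverForecaster()
past = torch.randn(2, 96, 1)
times = torch.linspace(0, 1, 8).view(1, 8, 1).expand(2, -1, -1)
mean, log_scale = model(past, times)
print(mean.shape)  # torch.Size([2, 8, 1])
```

The design point this illustrates is the one the abstract emphasizes: because the encoder writes the series into a fixed-size latent array, decoding cost no longer grows quadratically with sequence length, which is the source of the memory savings claimed over fully self-attentive copula transformers.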
