arXiv:2406.02578 [cs.LG]

Pretrained Mobility Transformer: A Foundation Model for Human Mobility

Xinhua Wu, Haoyu He, Yanchao Wang, Qi Wang

Published 2024-05-29 (Version 1)

Ubiquitous mobile devices are generating vast amounts of location-based service data that reveal how individuals navigate and utilize urban spaces in detail. In this study, we utilize these extensive, unlabeled sequences of user trajectories to develop a foundation model for understanding urban space and human mobility. We introduce the Pretrained Mobility Transformer (PMT), which leverages the transformer architecture to process user trajectories in an autoregressive manner, converting geographical areas into tokens and embedding spatial and temporal information within these representations. Experiments conducted in three U.S. metropolitan areas over a two-month period demonstrate PMT's ability to capture the underlying geographic and socio-demographic characteristics of regions. The proposed PMT excels across various downstream tasks, including next-location prediction, trajectory imputation, and trajectory generation. These results demonstrate PMT's effectiveness in decoding complex patterns of human mobility, offering new insights into urban spatial functionality and individual mobility preferences.
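The abstract outlines the core recipe: geographic areas become discrete tokens, spatial and temporal context is embedded into the token representations, and a transformer is trained autoregressively to predict the next location. The paper's actual tokenization, embedding scheme, and hyperparameters are not given here, so the following is only a minimal, hypothetical PyTorch sketch of that general idea; every name and size (e.g. `n_areas`, `n_time_slots`, `MobilityTransformerSketch`) is an illustrative assumption, not the authors' implementation.

```python
import torch
import torch.nn as nn

class MobilityTransformerSketch(nn.Module):
    """Hypothetical autoregressive transformer over location tokens.

    Sketch of the idea described in the abstract: geographic areas are
    tokens; temporal context (here, a time-of-day slot) and sequence
    position are added as embeddings; causal attention enables
    next-location prediction. All sizes are illustrative assumptions.
    """

    def __init__(self, n_areas=10_000, n_time_slots=48, d_model=256,
                 n_heads=8, n_layers=6, max_len=512):
        super().__init__()
        self.area_emb = nn.Embedding(n_areas, d_model)       # geographic-area tokens
        self.time_emb = nn.Embedding(n_time_slots, d_model)  # e.g. 30-minute slots of day
        self.pos_emb = nn.Embedding(max_len, d_model)        # sequence position
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_areas)              # logits over next area

    def forward(self, areas, times):
        # areas, times: (batch, seq_len) integer token ids
        seq_len = areas.size(1)
        pos = torch.arange(seq_len, device=areas.device)
        x = self.area_emb(areas) + self.time_emb(times) + self.pos_emb(pos)
        # Causal mask so each step attends only to earlier locations.
        mask = nn.Transformer.generate_square_subsequent_mask(seq_len).to(areas.device)
        h = self.encoder(x, mask=mask)
        return self.head(h)                                  # (batch, seq_len, n_areas)

# Usage: standard next-token pretraining with cross-entropy.
model = MobilityTransformerSketch()
areas = torch.randint(0, 10_000, (2, 16))
times = torch.randint(0, 48, (2, 16))
logits = model(areas, times)
loss = nn.functional.cross_entropy(
    logits[:, :-1].reshape(-1, 10_000), areas[:, 1:].reshape(-1))
```

Under this reading, next-location prediction is the pretraining objective itself, while trajectory imputation and generation would reuse the same pretrained backbone; how the paper adapts the model for those downstream tasks is not specified in the abstract.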

Related articles:
arXiv:2407.17880 [cs.LG] (Published 2024-07-25)
DAM: Towards A Foundation Model for Time Series Forecasting
Luke Darlow et al.
arXiv:2503.07851 [cs.LG] (Published 2025-03-10, updated 2025-05-16)
TwinTURBO: Semi-Supervised Fine-Tuning of Foundation Models via Mutual Information Decompositions for Downstream Task and Latent Spaces
arXiv:2502.05505 [cs.LG] (Published 2025-02-08)
Differentially Private Synthetic Data via APIs 3: Using Simulators Instead of Foundation Model