Title: T-Rep: Representation Learning for Time Series using Time-Embeddings

Abstract: Multivariate time series present challenges to standard machine learning techniques, as they are often unlabeled, high dimensional, noisy, and contain missing data. To address this, we propose T-Rep, a self-supervised method to learn time series representations at a timestep granularity. T-Rep learns vector embeddings of time alongside its feature extractor, to extract temporal features such as trend, periodicity, or distribution shifts from the signal. These time-embeddings are leveraged in pretext tasks to incorporate smooth and fine-grained temporal dependencies in the representations, as well as to reinforce robustness to missing data. We evaluate T-Rep on downstream classification, forecasting, and anomaly detection tasks, comparing it to existing self-supervised algorithms for time series, which it outperforms on all three tasks. We also test T-Rep in missing-data regimes, where it proves more resilient than its counterparts. Finally, we provide latent space visualisation experiments, highlighting the interpretability of the learned representations.
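The central idea of the abstract, learning a vector embedding of time jointly with a per-timestep feature encoder, can be illustrated with a minimal sketch. The code below is a hypothetical PyTorch illustration rather than the authors' T-Rep implementation: the TimeEmbedding MLP, the convolutional encoder, and all dimensions are assumptions made for clarity.

```python
# Minimal sketch: a learned time-embedding concatenated with the input and
# fed to a per-timestep encoder. Illustrative only, not the T-Rep codebase.
import torch
import torch.nn as nn


class TimeEmbedding(nn.Module):
    """Maps a scalar timestep t to a learned embedding vector."""

    def __init__(self, dim: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, dim), nn.ReLU(), nn.Linear(dim, dim)
        )

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (batch, seq_len) -> (batch, seq_len, dim)
        return self.net(t.unsqueeze(-1))


class TimestepEncoder(nn.Module):
    """Produces one representation per timestep, conditioned on the time-embedding."""

    def __init__(self, in_channels: int, repr_dim: int = 64, time_dim: int = 16):
        super().__init__()
        self.time_emb = TimeEmbedding(time_dim)
        self.conv = nn.Conv1d(in_channels + time_dim, repr_dim, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, in_channels), t: (batch, seq_len)
        te = self.time_emb(t)              # (batch, seq_len, time_dim)
        h = torch.cat([x, te], dim=-1)     # append time features at each step
        h = self.conv(h.transpose(1, 2))   # Conv1d expects (batch, channels, length)
        return h.transpose(1, 2)           # (batch, seq_len, repr_dim)


if __name__ == "__main__":
    batch, seq_len, channels = 8, 128, 5
    x = torch.randn(batch, seq_len, channels)
    t = torch.arange(seq_len, dtype=torch.float32).repeat(batch, 1) / seq_len
    z = TimestepEncoder(in_channels=channels)(x, t)
    print(z.shape)  # torch.Size([8, 128, 64])
```

In a self-supervised setup of this kind, the per-timestep representations z would then be trained with pretext tasks (the paper's specific tasks are not reproduced here).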
Comments: Accepted at ICLR 2024
Subjects: Machine Learning (cs.LG); Artificial Intelligence (cs.AI)
Cite as: arXiv:2310.04486 [cs.LG]
  (or arXiv:2310.04486v3 [cs.LG] for this version)

Submission history

From: Adrien Bennetot [view email]
[v1] Fri, 6 Oct 2023 15:45:28 GMT (4489kb,D)
[v2] Tue, 28 Nov 2023 17:02:31 GMT (5370kb,D)
[v3] Thu, 9 May 2024 10:11:23 GMT (5487kb,D)
