TS2Vec: Towards Universal Representation of Time Series

@inproceedings{Yue2021TS2VecTU,
  title={TS2Vec: Towards Universal Representation of Time Series},
  author={Zhihan Yue and Yujing Wang and Juanyong Duan and Tianmeng Yang and Congrui Huang and Yu Tong and Bixiong Xu},
  booktitle={AAAI Conference on Artificial Intelligence},
  year={2022},
  url={https://api.semanticscholar.org/CorpusID:237497421}
}
TS2Vec performs contrastive learning in a hierarchical way over augmented context views, which yields a robust contextual representation for each timestamp and achieves significant improvements over existing state-of-the-art methods for unsupervised time series representation.
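
A minimal sketch of the hierarchical temporal contrasting idea, assuming per-timestamp embeddings z1 and z2 produced by an encoder from two augmented context views; the function names are illustrative, and the instance-wise contrastive term of the full TS2Vec objective is omitted:

import torch
import torch.nn.functional as F

def temporal_contrastive_loss(z1, z2):
    # z1, z2: (batch, time, dim) per-timestamp embeddings of two augmented
    # context views of the same series.  The same timestamp across views is
    # the positive pair; all other timestamps act as negatives.
    B, T, _ = z1.shape
    z = torch.cat([z1, z2], dim=1)                        # (B, 2T, D)
    sim = torch.matmul(z, z.transpose(1, 2))              # (B, 2T, 2T)
    sim = sim - 1e9 * torch.eye(2 * T, device=z.device)   # mask self-similarity
    targets = torch.cat([torch.arange(T, 2 * T), torch.arange(T)]).to(z.device)
    return F.cross_entropy(sim.reshape(B * 2 * T, 2 * T), targets.repeat(B))

def hierarchical_loss(z1, z2, depth=3):
    # Contrast at multiple temporal scales by max-pooling along the time
    # axis, mirroring the hierarchical contrasting (T must be >= 2**depth).
    loss = 0.0
    for _ in range(depth):
        loss = loss + temporal_contrastive_loss(z1, z2)
        z1 = F.max_pool1d(z1.transpose(1, 2), 2).transpose(1, 2)
        z2 = F.max_pool1d(z2.transpose(1, 2), 2).transpose(1, 2)
    return loss / depth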

Series2Vec: Similarity-based Self-supervised Representation Learning for Time Series Classification

This work argues that time series analysis is fundamentally different in nature from both vision and natural language processing with respect to the forms of meaningful self-supervised learning tasks that can be defined, and introduces a novel similarity-based approach called Series2Vec for self-supervised representation learning.

Dynamic Contrastive Learning for Time Series Representation

DynaCL, an unsupervised contrastive representation learning framework for time series, is proposed; it uses temporally adjacent steps to define positive pairs and is shown to embed instances from time series into semantically meaningful clusters, enabling superior performance on downstream tasks across a variety of public time series datasets.
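
A hedged sketch of the temporal-adjacency idea in PyTorch: positives are neighbouring timesteps of the same series, with the remaining steps as in-sequence negatives (the names and this particular InfoNCE formulation are illustrative, not DynaCL's released code):

import torch
import torch.nn.functional as F

def adjacent_step_infonce(z, temperature=0.1):
    # z: (time, dim) embeddings of consecutive timesteps of one series.
    # Each step t and its neighbour t+1 form a positive pair; all other
    # steps in the same sequence serve as negatives.
    z = F.normalize(z, dim=-1)
    sim = (z @ z.t()) / temperature                       # (T, T)
    eye = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(eye, float('-inf'))             # exclude self-pairs
    targets = torch.arange(1, len(z), device=z.device)    # positive of t is t+1
    return F.cross_entropy(sim[:-1], targets)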

T-Rep: Representation Learning for Time Series using Time-Embeddings

T-Rep, a self-supervised method for learning time series representations at a timestep granularity, is proposed and compared to existing self-supervised algorithms for time series, which it outperforms across all three evaluated downstream tasks (classification, forecasting, and anomaly detection).

Contrastive Representation Learning for Time Series via Compound Consistency and Hierarchical Contrasting

The results show that a linear regressor trained on the representations learned by the proposed model outperforms existing time series prediction models in both prediction accuracy and transferability.

An NCDE-based Framework for Universal Representation Learning of Time Series

The proposed CTRL, a framework for universal time series representation learning, employs a Neural Controlled Differential Equation (NCDE) as its backbone, which captures the underlying continuous processes and exhibits robustness to missing data.
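
A minimal explicit-Euler discretization of a neural CDE, dz = f_theta(z) dX, in plain PyTorch; CTRL's actual vector field, interpolation scheme, and solver may differ, so treat this as a sketch of the backbone idea only:

import torch
import torch.nn as nn

class NeuralCDE(nn.Module):
    # dz = f_theta(z) dX: the learned vector field maps the hidden state to
    # a (hidden_dim x input_dim) matrix, contracted with increments of the
    # observed path X.
    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.field = nn.Sequential(
            nn.Linear(hidden_dim, 64), nn.Tanh(),
            nn.Linear(64, hidden_dim * input_dim),
        )
        self.input_dim, self.hidden_dim = input_dim, hidden_dim

    def forward(self, z0, x):
        # z0: (batch, hidden_dim) initial state; x: (batch, time, input_dim)
        z = z0
        for t in range(1, x.size(1)):
            dx = x[:, t] - x[:, t - 1]                          # path increment
            f = self.field(z).view(-1, self.hidden_dim, self.input_dim)
            z = z + torch.bmm(f, dx.unsqueeze(-1)).squeeze(-1)  # Euler step
        return z   # final state as the series representation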

TOTEM: TOkenized Time Series EMbeddings for General Time Series Analysis

The method, TOkenized Time Series EMbeddings (TOTEM), matches or outperforms existing state-of-the-art models in both the canonical specialist setting and the generalist setting, demonstrating the efficacy of tokenization for general time series analysis.
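
The core of tokenization is a nearest-neighbour codebook lookup with a straight-through gradient, sketched below under illustrative shapes; this is the generic VQ-VAE quantization step, not TOTEM's released implementation:

import torch

def quantize(latents, codebook):
    # latents:  (batch, time, dim) continuous encoder outputs
    # codebook: (num_codes, dim) learned embedding vectors
    # Each latent is replaced by its nearest codebook vector, yielding a
    # discrete token id per timestep (the "tokenized" time series).
    B = latents.size(0)
    dists = torch.cdist(latents, codebook.unsqueeze(0).expand(B, -1, -1))
    token_ids = dists.argmin(dim=-1)              # (batch, time)
    quantized = codebook[token_ids]               # (batch, time, dim)
    # straight-through estimator so gradients flow back to the encoder
    quantized = latents + (quantized - latents).detach()
    return token_ids, quantized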

Capturing Temporal Components for Time Series Classification

This work introduces a compositional representation learning approach trained on statistically coherent components extracted from sequential data using a multi-scale change space, and demonstrates its effectiveness through extensive experiments on publicly available time series classification benchmarks.

NuTime: Numerically Multi-Scaled Embedding for Large-Scale Time Series Pretraining

This work makes key technical contributions tailored to the numerical properties of time-series data that allow the model to scale to large datasets, and establishes a new state of the art, even compared with domain-specific non-learning-based methods.

Abstracted Shapes as Tokens -- A Generalizable and Interpretable Model for Time-series Classification

This paper presents VQShape, a pre-trained, generalizable, and interpretable model for time-series representation learning and classification, and shows that VQShape's representations can be used to build interpretable classifiers that achieve performance comparable to specialist models.

Time Series Representation Learning with Supervised Contrastive Temporal Transformer

This work develops a simple yet novel fusion model, the Supervised COntrastive Temporal Transformer (SCOTT), and investigates its ability to address a real-world task, online Change Point Detection (CPD); on two datasets, SCOTT performs with high reliability and efficiency.
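
The "supervised contrastive" component presumably builds on the SupCon objective, in which all same-class pairs in a batch are positives so that label information shapes the embedding space directly; a generic sketch (not the authors' code):

import torch
import torch.nn.functional as F

def supervised_contrastive_loss(z, labels, temperature=0.1):
    # z: (batch, dim) embeddings; labels: (batch,) class ids.
    z = F.normalize(z, dim=-1)
    sim = (z @ z.t()) / temperature
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask
    # log-softmax over each row, excluding the anchor itself
    denom = torch.logsumexp(sim.masked_fill(self_mask, float('-inf')),
                            dim=1, keepdim=True)
    log_prob = sim - denom
    # average log-likelihood of the positives for each anchor
    pos_counts = pos_mask.sum(1).clamp(min=1)
    return -(log_prob * pos_mask).sum(1).div(pos_counts).mean()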
...

Time-Series Representation Learning via Temporal and Contextual Contrasting

An unsupervised Time-Series representation learning framework via Temporal and Contextual Contrasting (TS-TCC) is proposed to learn time-series representations from unlabeled data; it shows high efficiency in few-labeled-data and transfer-learning scenarios.
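
A sketch of the cross-view temporal contrasting task: a context vector summarizing one augmentation predicts future per-step features of the other, with in-batch negatives. The per-step linear heads below stand in for TS-TCC's autoregressive transformer and are purely illustrative:

import torch
import torch.nn.functional as F

def temporal_contrasting_loss(ctx_strong, z_weak, predictors):
    # ctx_strong: (batch, dim) context of the strong augmentation up to t;
    # z_weak: (batch, horizon, dim) future per-step features of the weak
    # augmentation; predictors: one nn.Linear head per future step.
    loss = 0.0
    for k, head in enumerate(predictors):
        pred = head(ctx_strong)                   # (batch, dim)
        logits = pred @ z_weak[:, k].t()          # in-batch negatives
        labels = torch.arange(len(pred), device=pred.device)
        loss = loss + F.cross_entropy(logits, labels)
    return loss / len(predictors)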

TimeNet: Pre-trained deep recurrent neural network for time series classification

TimeNet, a deep recurrent neural network trained on diverse time series in an unsupervised manner using sequence-to-sequence (seq2seq) models to extract features, attempts to generalize time series representations across domains by ingesting time series from several domains simultaneously.
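
A minimal GRU sequence-to-sequence autoencoder in this spirit: the encoder's final hidden state is the fixed-length feature vector, and the decoder is trained to reconstruct the reversed input (a common seq2seq choice; the architecture details are illustrative, not TimeNet's exact configuration):

import torch
import torch.nn as nn

class Seq2SeqAutoencoder(nn.Module):
    # The encoder compresses a series into its final hidden state, which
    # doubles as the fixed-length feature vector; the decoder reconstructs
    # the reversed input from that state (teacher forcing, for simplicity).
    def __init__(self, input_dim=1, hidden_dim=64):
        super().__init__()
        self.encoder = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.decoder = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, input_dim)

    def forward(self, x):
        # x: (batch, time, input_dim)
        _, h = self.encoder(x)                     # h: (1, batch, hidden_dim)
        dec_in = torch.flip(x, dims=[1])           # reconstruction target too
        recon, _ = self.decoder(dec_in, h)
        return self.out(recon), h.squeeze(0)       # reconstruction, embedding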

Unsupervised Representation Learning for Time Series with Temporal Neighborhood Coding

A self-supervised framework for learning generalizable representations for non-stationary time series, called Temporal Neighborhood Coding (TNC), takes advantage of the local smoothness of a signal's generative process to define neighborhoods in time with stationary properties.
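
A rough sketch of the neighborhood idea, assuming an encoder that maps fixed-length windows to vectors: windows near the anchor are scored as neighbors, distant windows as non-neighbors. TNC's positive-unlabeled debiasing of the distant samples is omitted for brevity:

import torch
import torch.nn.functional as F

def tnc_style_loss(encoder, x, anchor_t, window=16, eta=3):
    # x: (batch, channels, length).  Windows whose start falls within
    # eta*window of the anchor count as neighbors; a far-away window
    # serves as the non-neighbor sample.
    B, _, L = x.shape
    crop = lambda t: x[:, :, t:t + window]
    near_t = anchor_t + torch.randint(-eta * window, eta * window + 1, (1,)).item()
    near_t = max(0, min(L - window, near_t))
    far_t = (anchor_t + L // 2) % (L - window)
    z_a, z_n, z_f = encoder(crop(anchor_t)), encoder(crop(near_t)), encoder(crop(far_t))
    pos = (z_a * z_n).sum(-1)   # similarity scores as discriminator logits
    neg = (z_a * z_f).sum(-1)
    return F.binary_cross_entropy_with_logits(
        torch.cat([pos, neg]),
        torch.cat([torch.ones_like(pos), torch.zeros_like(neg)]))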

A Transformer-based Framework for Multivariate Time Series Representation Learning

A novel framework for multivariate time series representation learning based on the transformer encoder architecture, which can offer substantial performance benefits over fully supervised learning on downstream tasks, even without leveraging additional unlabeled data, i.e., by reusing the existing data samples.
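
Pretraining here denoises masked input values with a transformer encoder; a compact sketch under illustrative hyperparameters (the paper's masking scheme is more structured than the uniform mask used below):

import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedTSPretrainer(nn.Module):
    # Pretrain by reconstructing randomly masked input values from the
    # unmasked context, in the spirit of denoising-style pretraining for
    # multivariate time series.
    def __init__(self, n_vars, d_model=64, n_heads=4, n_layers=3):
        super().__init__()
        self.proj = nn.Linear(n_vars, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_vars)

    def forward(self, x, mask_ratio=0.15):
        # x: (batch, time, n_vars); zero out masked positions, then train
        # to reconstruct the original values there.
        mask = torch.rand(x.shape[:2], device=x.device) < mask_ratio
        x_masked = x.masked_fill(mask.unsqueeze(-1), 0.0)
        rec = self.head(self.encoder(self.proj(x_masked)))
        return F.mse_loss(rec[mask], x[mask])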

Similarity Preserving Representation Learning for Time Series Clustering

An efficient representation learning framework that converts a set of time series of various lengths into an instance-feature matrix while guaranteeing that pairwise similarities between time series are well preserved; the learned feature representation is therefore particularly suitable for the time series clustering task.

DTW-D: time series semi-supervised learning from a single example

It is argued that the availability of the UCR Archive has isolated much of the research community from the following reality: labeled time series data is often very difficult to obtain.
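
DTW-D ranks candidates by the ratio of DTW distance to Euclidean distance, rewarding series that benefit disproportionately from warping; a small NumPy sketch for equal-length, z-normalized series (the epsilon guard is added for illustration):

import numpy as np

def dtw(a, b):
    # classic O(len(a) * len(b)) dynamic-programming DTW distance
    D = np.full((len(a) + 1, len(b) + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return np.sqrt(D[-1, -1])

def dtw_d(a, b, eps=1e-12):
    # DTW distance normalized by Euclidean distance: small values flag
    # pairs that look dissimilar pointwise but align well under warping.
    return dtw(a, b) / (np.linalg.norm(a - b) + eps)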

Unsupervised Scalable Representation Learning for Multivariate Time Series

This paper combines an encoder based on causal dilated convolutions with a novel triplet loss employing time-based negative sampling, obtaining general-purpose representations for variable-length and multivariate time series.
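
A condensed sketch of the time-based triplet loss: the positive is a subseries contained in the anchor, and negatives come from other series in the batch (here approximated by rolling the batch). The encoder is assumed to pool variable-length inputs to fixed-size vectors, as the paper's causal dilated CNN does:

import torch
import torch.nn.functional as F

def time_based_triplet_loss(encoder, x):
    # x: (batch, channels, length)
    B, _, L = x.shape
    a_len = L // 2
    a_start = torch.randint(0, L - a_len + 1, (1,)).item()
    p_len = a_len // 2
    p_start = a_start + torch.randint(0, a_len - p_len + 1, (1,)).item()
    z_anchor = encoder(x[:, :, a_start:a_start + a_len])
    z_pos = encoder(x[:, :, p_start:p_start + p_len])   # inside the anchor
    z_neg = z_pos.roll(1, dims=0)                       # other series as negatives
    pos_term = -F.logsigmoid((z_anchor * z_pos).sum(-1))
    neg_term = -F.logsigmoid(-(z_anchor * z_neg).sum(-1))
    return (pos_term + neg_term).mean()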

N-BEATS: Neural basis expansion analysis for interpretable time series forecasting

The proposed deep neural architecture based on backward and forward residual links and a very deep stack of fully-connected layers has a number of desirable properties, being interpretable, applicable without modification to a wide array of target domains, and fast to train.
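
A compact sketch of the doubly residual stacking with a generic (unconstrained) basis; the interpretable variant instead constrains the backcast and forecast projections to polynomial and Fourier bases:

import torch
import torch.nn as nn

class NBeatsBlock(nn.Module):
    # One fully connected block emitting a "backcast" (subtracted from the
    # residual input) and a "forecast" (added to the running prediction).
    def __init__(self, backcast_len, forecast_len, width=256):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(backcast_len, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
        )
        self.backcast = nn.Linear(width, backcast_len)
        self.forecast = nn.Linear(width, forecast_len)

    def forward(self, x):
        h = self.fc(x)
        return self.backcast(h), self.forecast(h)

class NBeatsStack(nn.Module):
    # Doubly residual stacking: each block explains part of the input and
    # contributes part of the forecast.
    def __init__(self, backcast_len, forecast_len, n_blocks=4):
        super().__init__()
        self.blocks = nn.ModuleList(
            NBeatsBlock(backcast_len, forecast_len) for _ in range(n_blocks))

    def forward(self, x):
        forecast = 0.0
        for block in self.blocks:
            back, fore = block(x)
            x = x - back                # backward residual link
            forecast = forecast + fore  # forward residual link
        return forecast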

Spectral Temporal Graph Neural Network for Multivariate Time-series Forecasting

Spectral Temporal Graph Neural Network (StemGNN) is proposed to further improve the accuracy of multivariate time-series forecasting and learns inter-series correlations automatically from the data without using pre-defined priors.

wav2vec 2.0: A Framework for Self-Supervised Learning of Speech Representations

We show for the first time that learning powerful representations from speech audio alone, followed by fine-tuning on transcribed speech, can outperform the best semi-supervised methods while being conceptually simpler.
...