
Virtual presentation / poster accept

Out-of-distribution Representation Learning for Time Series Classification

Wang Lu · Jindong Wang · Xinwei Sun · Yiqiang Chen · Xing Xie

Keywords: [ time series classification ] [ out-of-distribution generalization ] [ domain generalization ] [ Deep Learning and representational learning ]


Abstract:

Time series classification is an important problem in the real world. Due to its non-stationary property, i.e., the distribution changes over time, it remains challenging to build models that generalize to unseen distributions. In this paper, we propose to view time series classification from the distribution perspective. We argue that the temporal complexity of a time series dataset can be attributed to unknown latent distributions that need to be characterized. To this end, we propose DIVERSIFY for out-of-distribution (OOD) representation learning on the dynamic distributions of time series. DIVERSIFY follows an iterative process: it first obtains the 'worst-case' latent distribution scenario via adversarial training, then reduces the gap between these latent distributions. We further show that this algorithm is theoretically supported. Extensive experiments are conducted on seven datasets with different OOD settings across gesture recognition, speech commands recognition, wearable stress and affect detection, and sensor-based human activity recognition. Qualitative and quantitative results demonstrate that DIVERSIFY significantly outperforms other baselines and effectively characterizes the latent distributions. Code is available at https://github.com/microsoft/robustlearn.
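To make the iterative min-max idea in the abstract concrete, below is a minimal illustrative sketch of the "gap reduction" step as a domain-adversarial training loop with a gradient reversal layer. This is not the authors' implementation (see the linked repository for that): the encoder architecture, the head sizes, the number of latent distributions, and the function names (Encoder, training_step) are all hypothetical, and the adversarial re-estimation of the pseudo domain labels d (the "worst-case" step) is left out for brevity.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GradReverse(torch.autograd.Function):
    """Gradient reversal: identity in the forward pass, negated
    (and scaled) gradient in the backward pass."""
    @staticmethod
    def forward(ctx, x, lamb):
        ctx.lamb = lamb
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lamb * grad_output, None

class Encoder(nn.Module):
    """Hypothetical 1-D CNN feature extractor for input of shape
    (batch, channels, time)."""
    def __init__(self, in_ch=3, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_ch, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

enc = Encoder()
cls_head = nn.Linear(64, 6)   # e.g., gesture / activity classes
dom_head = nn.Linear(64, 4)   # K hypothetical latent distributions
opt = torch.optim.Adam(
    list(enc.parameters()) + list(cls_head.parameters())
    + list(dom_head.parameters()), lr=1e-3)

def training_step(x, y, d, lamb=0.1):
    """One gap-reduction step: classify labels y while making the
    features indistinguishable across the current pseudo latent
    distribution labels d. In the paper, d itself is re-estimated
    adversarially each round; this sketch keeps d fixed."""
    z = enc(x)
    loss_cls = F.cross_entropy(cls_head(z), y)
    # dom_head learns to tell latent distributions apart, while the
    # reversed gradient pushes the encoder to erase that information.
    loss_dom = F.cross_entropy(dom_head(GradReverse.apply(z, lamb)), d)
    opt.zero_grad()
    (loss_cls + loss_dom).backward()
    opt.step()
    return loss_cls.item(), loss_dom.item()

# Smoke test on random data: 8 sequences, 3 channels, 128 time steps.
x = torch.randn(8, 3, 128)
y = torch.randint(0, 6, (8,))
d = torch.randint(0, 4, (8,))
print(training_step(x, y, d))

The gradient reversal trick lets one optimizer serve both sides of the min-max game: the domain head minimizes its classification loss, while the encoder receives the negated gradient and is driven to confuse it, shrinking the gap between the latent distributions.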