Poster
in
Workshop: Workshop on Learning from Time Series for Health

Conditional Diffusion Models as Self-supervised Learning Backbone for Irregular Time Series

Hamed Shirzad · Ruizhi Deng · He Zhao · Frederick Tung

Keywords: [ time series ] [ Irregular Time Series Classification ] [ Health Data ] [ Diffusion Models ]


Abstract:

Irregular time series are ubiquitous in healthcare, with applications ranging from predicting patient health conditions to imputing missing values. Recent developments in conditional diffusion models, which predict missing values based on observed data, have shown significant promise for imputing regular time series. Conditional diffusion also generalizes the self-supervised learning task of masked reconstruction, replacing partial masking with the injection of noise at variable scales, and has shown competitive results on image recognition. Despite the growing interest in diffusion models, their potential for irregular time series data, particularly in downstream tasks, remains underexplored. We propose a conditional diffusion model designed as a self-supervised learning backbone for such data, integrating a learnable time embedding and a cross-dimensional attention mechanism to address the data's complex temporal dynamics. This model not only naturally suits conditional generation tasks but also learns hidden states beneficial for discriminative tasks. Empirical evidence demonstrates our model's superiority in both imputation and classification tasks.
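The key idea that conditional diffusion generalizes masked reconstruction can be sketched in a few lines: instead of zeroing out target entries, noise of a chosen scale is injected only into the unobserved positions while observed entries are kept clean as the conditioning signal. The sketch below is illustrative and hedged: the function name `conditional_noising` and the array shapes are our own assumptions, not the paper's actual implementation.

```python
import numpy as np


def conditional_noising(x, observed_mask, sigma, rng):
    """Inject Gaussian noise of scale sigma only into unobserved entries.

    x             : (T, D) array of series values (missing steps pre-filled, e.g. with 0)
    observed_mask : (T, D) boolean, True where the model may condition on clean data
    sigma         : noise scale; as sigma grows, targets approach pure noise,
                    recovering full mask-out reconstruction as a limiting case
    rng           : numpy random Generator

    Returns the partially noised array and the noise (the denoiser's target).
    """
    noise = rng.normal(size=x.shape)
    # Observed entries pass through unchanged; targets are noised.
    x_noised = np.where(observed_mask, x, x + sigma * noise)
    return x_noised, noise


rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))                    # toy 5-step, 3-dimensional series
mask = rng.random((5, 3)) < 0.6                # ~60% of entries observed
x_t, eps = conditional_noising(x, mask, sigma=0.5, rng=rng)
```

Varying `sigma` per training example (rather than using a single fixed masking ratio) is what turns this into a family of self-supervised tasks rather than one.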
