Transformer-Modulated Diffusion Models for Probabilistic Multivariate Time Series Forecasting

Yuxin Li · Wenchao Chen · Xinyue Hu · Bo Chen · Baolin Sun · Mingyuan Zhou

Halle B #63
Tue 7 May 1:45 a.m. PDT — 3:45 a.m. PDT


Transformers have gained widespread use in multivariate time series (MTS) forecasting, delivering impressive performance. Nonetheless, existing transformer-based methods often neglect an essential aspect: incorporating uncertainty into the predicted series, which holds significant value for decision-making. In this paper, we introduce the Transformer-Modulated Diffusion Model (TMDM), uniting the conditional diffusion generative process with transformers in a unified framework to enable precise distribution forecasting for MTS. TMDM harnesses the power of transformers to extract essential insights from historical time series data. This information is then utilized as prior knowledge, capturing covariate dependence in both the forward and reverse processes within the diffusion model. Furthermore, we seamlessly integrate well-designed transformer-based forecasting methods into TMDM to enhance its overall performance. Additionally, we introduce two novel metrics for evaluating uncertainty estimation performance. Through extensive experiments on six datasets using four evaluation metrics, we establish the effectiveness of TMDM in probabilistic MTS forecasting.
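The idea of using a transformer-derived summary of the history as prior knowledge in the forward diffusion process can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's implementation): the noisy sample is pulled toward a condition vector `cond`, standing in for the transformer's output, instead of toward a zero mean, so the diffusion endpoint encodes historical information.

```python
import numpy as np

def forward_diffuse(x0, cond, t, alpha_bar, rng):
    """Hypothetical covariate-dependent forward step: interpolate between the
    clean future window x0 and the condition `cond` (a stand-in for the
    transformer's summary of the history), then add Gaussian noise."""
    a = alpha_bar[t]
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(a) * x0 + (1.0 - np.sqrt(a)) * cond + np.sqrt(1.0 - a) * eps

rng = np.random.default_rng(0)
T = 100
betas = np.linspace(1e-4, 0.02, T)          # standard linear noise schedule
alpha_bar = np.cumprod(1.0 - betas)          # cumulative signal-retention factor

x0 = rng.standard_normal((8, 4))             # future window: 8 steps, 4 variables
cond = rng.standard_normal((8, 4))           # transformer output (illustrative)
xT = forward_diffuse(x0, cond, T - 1, alpha_bar, rng)
```

At the final step `alpha_bar[T-1]` is small, so `xT` is approximately `cond` plus unit-variance noise: the "prior" of the reverse process carries the historical information rather than being pure noise.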
