In-Person Poster presentation / poster accept

Scaleformer: Iterative Multi-scale Refining Transformers for Time Series Forecasting

Mohammad Amin Shabani · Amir Abdi · Lili Meng · Tristan Sylvain

MH1-2-3-4 #47

Keywords: [ Applications ] [ Time-series forecasting ] [ transformers ]


Abstract:

The performance of time series forecasting has recently been greatly improved by the introduction of transformers. In this paper, we propose a general multi-scale framework that can be applied to state-of-the-art transformer-based time series forecasting models (FEDformer, Autoformer, etc.). By iteratively refining a forecasted time series at multiple scales with shared weights, architecture adaptations, and a specially-designed normalization scheme, we are able to achieve significant performance improvements with minimal additional computational overhead. Via detailed ablation studies, we demonstrate the effectiveness of our proposed architectural and methodological innovations. Furthermore, our experiments on various public datasets demonstrate that the proposed method outperforms the corresponding baselines. Depending on the choice of transformer architecture, our multi-scale framework results in mean squared error reductions ranging from 5.5% to 38.5%. Our code is publicly available at https://github.com/BorealisAI/scaleformer.
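To make the coarse-to-fine refinement loop described in the abstract concrete, here is a minimal PyTorch sketch. It is based only on the abstract, not the paper or the linked repository: the scale factors, the zero-initialized coarsest draft, the mean-subtraction normalization, and the `forecaster(history, draft)` interface are all illustrative assumptions standing in for the authors' actual design.

```python
# Illustrative sketch of iterative multi-scale refinement (assumptions:
# scale factors, zero-initialized draft, mean-subtraction normalization,
# and the forecaster interface are not taken from the paper).
import torch
import torch.nn.functional as F

def multi_scale_forecast(forecaster, x, horizon, scales=(8, 4, 2, 1)):
    """Forecast `horizon` steps by refining from coarse to fine scales.

    forecaster: a shared-weight model mapping a (B, L', C) history and a
                (B, H', C) draft forecast to a refined (B, H', C) forecast.
    x:          input series of shape (B, L, C); `horizon` and L should be
                divisible by every scale factor.
    """
    B, L, C = x.shape
    # Start from an all-zero draft at the coarsest scale (an assumption).
    forecast = torch.zeros(B, horizon // scales[0], C)
    for s in scales:
        # Downsample the history to the current scale via average pooling.
        hist = F.avg_pool1d(x.transpose(1, 2), kernel_size=s).transpose(1, 2)
        # Upsample the previous, coarser forecast to the current scale.
        forecast = F.interpolate(forecast.transpose(1, 2),
                                 size=horizon // s,
                                 mode='linear').transpose(1, 2)
        # Shift history and draft by the history mean so all scales see
        # comparable statistics (a stand-in for the paper's
        # specially-designed normalization scheme).
        mean = hist.mean(dim=1, keepdim=True)
        # One refinement pass with the same shared-weight forecaster.
        forecast = forecaster(hist - mean, forecast - mean) + mean
    return forecast
```

As a shape check, `multi_scale_forecast(lambda h, f: f, torch.randn(2, 96, 7), horizon=48, scales=(4, 2, 1))` returns a `(2, 48, 7)` tensor; in the framework's intended use, `forecaster` would be a transformer such as FEDformer or Autoformer, invoked once per scale with shared weights.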
