
Non-stationary Transformers

This is the codebase for the paper: Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting, NeurIPS 2022. [Slides], [Poster].

🚩 News (2023.02) Non-stationary Transformer has been included in [Time-Series-Library], which covers long- and short-term forecasting, imputation, anomaly detection, and classification.

Discussions

There are already several discussions about our paper; we greatly appreciate the valuable comments and efforts: [Official], [OpenReview], [Zhihu].

Architecture

(Figure: overall architecture of the Non-stationary Transformers framework)

Series Stationarization

Series Stationarization unifies the statistics of each input window and restores them to the output, making the series more predictable for the base model.

(Figure: Series Stationarization module)
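A minimal PyTorch sketch of this idea (per-window normalization over the time dimension, then restoring the statistics to the forecast); the function names and tensor shapes are illustrative, not the repo's exact implementation:

import torch

def stationarize(x, eps=1e-5):
    # Normalize each series by its own statistics (per window, per variate).
    # x: [batch, seq_len, n_vars]; also return the statistics needed to
    # restore the forecast later.
    mean = x.mean(dim=1, keepdim=True)                                  # [batch, 1, n_vars]
    std = torch.sqrt(x.var(dim=1, keepdim=True, unbiased=False) + eps)  # [batch, 1, n_vars]
    return (x - mean) / std, mean, std

def destationarize(y, mean, std):
    # Restore the statistics removed at the input to the model output.
    # y: [batch, pred_len, n_vars] forecast made on the stationarized series.
    return y * std + mean

# Usage sketch: x_norm, mean, std = stationarize(x)
#               y_norm = model(x_norm)
#               y = destationarize(y_norm, mean, std)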

De-stationary Attention

De-stationary Attention is devised to recover the intrinsic non-stationary information in temporal dependencies by approximating the distinguishable attention that would be learned from the non-stationarized series.

(Figure: De-stationary Attention mechanism)
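A minimal sketch of the rescaled attention, assuming the de-stationary factors tau (a positive scalar per sample) and delta (a vector over key positions) are predicted by small projector networks from the statistics of the raw, unnormalized series, as described in the paper; names and shapes are illustrative, not the repo's exact code:

import math
import torch

def de_stationary_attention(q, k, v, tau, delta):
    # q, k, v: [batch, heads, length, d_k], computed from the stationarized series.
    # tau:   [batch, 1, 1, 1]      scaling factor learned from the raw statistics.
    # delta: [batch, 1, 1, length] shift term learned from the raw statistics.
    # Rescaling and shifting the logits approximates the attention that would
    # have been learned directly from the non-stationarized series.
    d_k = q.size(-1)
    logits = torch.matmul(q, k.transpose(-2, -1))      # [batch, heads, L, L]
    logits = (logits * tau + delta) / math.sqrt(d_k)
    attn = torch.softmax(logits, dim=-1)
    return torch.matmul(attn, v)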

Showcases

(Figure: forecasting showcases)

Preparation

  1. Install Python 3.7 and the necessary dependencies.
pip install -r requirements.txt
  2. All six benchmark datasets can be obtained from Google Drive or Tsinghua Cloud.

Training scripts

Non-stationary Transformer

We provide the Non-stationary Transformer experiment scripts and hyperparameters for all benchmark datasets under the folder ./scripts.

# Transformer with our framework
bash ./scripts/ECL_script/ns_Transformer.sh
bash ./scripts/Traffic_script/ns_Transformer.sh
bash ./scripts/Weather_script/ns_Transformer.sh
bash ./scripts/ILI_script/ns_Transformer.sh
bash ./scripts/Exchange_script/ns_Transformer.sh
bash ./scripts/ETT_script/ns_Transformer.sh
# Transformer baseline
bash ./scripts/ECL_script/Transformer.sh
bash ./scripts/Traffic_script/Transformer.sh
bash ./scripts/Weather_script/Transformer.sh
bash ./scripts/ILI_script/Transformer.sh
bash ./scripts/Exchange_script/Transformer.sh
bash ./scripts/ETT_script/Transformer.sh

Non-stationary framework to promote other Attention-based models

We also provide the scripts for other Attention-based models (Informer, Autoformer), for example:

# Informer promoted by our Non-stationary framework
bash ./scripts/Exchange_script/Informer.sh
bash ./scripts/Exchange_script/ns_Informer.sh

# Autoformer promoted by our Non-stationary framework
bash ./scripts/Weather_script/Autoformer.sh
bash ./scripts/Weather_script/ns_Autoformer.sh

Experiment Results

Main Results

For multivariate forecasting, the vanilla Transformer equipped with our framework consistently achieves state-of-the-art performance across all six benchmarks and prediction lengths.

(Figure: main multivariate forecasting results)

Model Promotion

We apply our framework to six mainstream Attention-based models, and our method consistently improves their forecasting ability. Overall, it achieves an average promotion of 49.43% on Transformer, 47.34% on Informer, 46.89% on Reformer, 10.57% on Autoformer, 5.17% on ETSformer, and 4.51% on FEDformer, making each of them surpass the previous state-of-the-art.

(Figure: promotion results on Attention-based models)

Future Work

We will keep equipping the following models with our proposed Non-stationary Transformers framework:

  • Transformer
  • iTransformer
  • Informer
  • Autoformer
  • FEDformer
  • Crossformer
  • Reformer
  • ......

Note: Series Stationarization, as an architecture-free module, has been widely applied for addressing non-stationarity in time series. Please refer to time-series-library for the implementation details.

Citation

If you find this repo useful, please cite our paper.

@inproceedings{liu2022non,
  title={Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting},
  author={Liu, Yong and Wu, Haixu and Wang, Jianmin and Long, Mingsheng},
  booktitle={Advances in Neural Information Processing Systems},
  year={2022}
}

Contact

If you have any questions or want to use the code, please contact liuyong21@mails.tsinghua.edu.cn.

Acknowledgement

This repo is built on the Autoformer repo; we appreciate the authors for their valuable code and efforts.
