🎉 Getting Started | 💡 Overall Design
📦 Dataset | 🛠️ Scaler | 🧠 Model | 📉 Metrics | 🏃‍♂️ Runner | 📜 Config | 📜 Baselines
If you find this project helpful, please don't forget to give it a ⭐ Star to show your support. Thank you!
On one hand, BasicTS provides a unified and standardized pipeline, offering a fair and comprehensive platform for reproducing and comparing popular models.
On the other hand, BasicTS offers a user-friendly and easily extensible interface, enabling quick design and evaluation of new models. Users only need to define their model structure; BasicTS handles the remaining basic operations (data loading, training, evaluation, and logging).
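For instance, a new model is just a plain PyTorch module. The sketch below is a minimal illustration only; the `[batch, length, nodes, channels]` tensor layout and the forward-signature argument names are assumptions based on the conventions described in the Getting Started tutorial, not verbatim BasicTS API.

```python
import torch
import torch.nn as nn

class MultiLayerPerceptron(nn.Module):
    """Toy forecasting model illustrating the kind of module a user defines.

    Assumptions: inputs are laid out as [batch, length, nodes, channels], and the
    forward signature mirrors the convention described in the tutorials.
    """

    def __init__(self, history_len: int, horizon: int, hidden_dim: int = 64):
        super().__init__()
        self.encoder = nn.Linear(history_len, hidden_dim)
        self.decoder = nn.Linear(hidden_dim, horizon)

    def forward(self, history_data: torch.Tensor, future_data: torch.Tensor,
                batch_seen: int, epoch: int, train: bool, **kwargs) -> torch.Tensor:
        x = history_data[..., 0]                  # [B, L_in, N]: keep the target channel
        x = x.transpose(1, 2)                     # [B, N, L_in]
        hidden = torch.relu(self.encoder(x))      # [B, N, hidden]
        out = self.decoder(hidden)                # [B, N, L_out]
        return out.transpose(1, 2).unsqueeze(-1)  # [B, L_out, N, 1]
```

The training loop, evaluation, and logging around such a module are then driven entirely by the config file (see the highlighted features and the config sketch below).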
You can find detailed tutorials in Getting Started. Additionally, we are collecting ToDo and HowTo items. If you need more features (e.g., additional datasets or benchmark models) or tutorials, feel free to open an issue or leave a comment here.
Important
If you find this repository helpful for your work, please consider citing the following benchmarking paper:
@article{shao2023exploring,
title={Exploring Progress in Multivariate Time Series Forecasting: Comprehensive Benchmarking and Heterogeneity Analysis},
author={Shao, Zezhi and Wang, Fei and Xu, Yongjun and Wei, Wei and Yu, Chengqing and Zhang, Zhao and Yao, Di and Jin, Guangyin and Cao, Xin and Cong, Gao and others},
journal={arXiv preprint arXiv:2310.06119},
year={2023}
}
🔥🔥🔥 The paper has been accepted by IEEE TKDE! You can check it out here. 🔥🔥🔥
**Fair Performance Review**: Users can compare the performance of different models on arbitrary datasets fairly and exhaustively based on a unified and comprehensive pipeline.

**Minimal Code**: Users only need to implement key code, such as the model architecture and data pre/post-processing, to build their own deep learning projects.

**Everything Based on Config**: Users can control every detail of the pipeline through a config file, such as dataloader hyperparameters, optimization settings, and other tricks (*e.g.*, curriculum learning). A hedged config sketch follows this list.

**Support All Devices**: BasicTS supports CPU, GPU, and distributed GPU training (both single-node multi-GPU and multi-node), thanks to using EasyTorch as the backend. Users can switch devices by setting configuration parameters, without modifying any code.

**Save Training Log**: BasicTS supports the `logging` module and `Tensorboard`, wrapped in a unified interface, so users can save customized training logs by calling simple interfaces.

For detailed instructions, please refer to the Getting Started tutorial.
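To make the "Everything Based on Config" idea concrete, here is a hedged sketch of what such a config might look like. The field names (`CFG.MODEL.NAME`, `CFG.TRAIN.OPTIM`, ...) are illustrative assumptions, not the exact BasicTS schema; see the config files under the `baselines/` directory for real examples.

```python
from easydict import EasyDict

# Illustrative sketch only: the real field names and structure are defined by
# the config files shipped under baselines/; the names below are assumptions.
CFG = EasyDict()

CFG.MODEL = EasyDict()
CFG.MODEL.NAME = "MultiLayerPerceptron"          # model class to instantiate
CFG.MODEL.PARAM = {"history_len": 12, "horizon": 12, "hidden_dim": 64}

CFG.TRAIN = EasyDict()
CFG.TRAIN.NUM_EPOCHS = 100
CFG.TRAIN.OPTIM = EasyDict()
CFG.TRAIN.OPTIM.TYPE = "Adam"                    # optimizer type
CFG.TRAIN.OPTIM.PARAM = {"lr": 0.002, "weight_decay": 1.0e-5}

CFG.TRAIN.DATA = EasyDict()
CFG.TRAIN.DATA.BATCH_SIZE = 64                   # dataloader hyperparameter
```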
BasicTS implements a wealth of models, including classic models, spatial-temporal forecasting models, and long-term time series forecasting models:
You can find the implementations of these models in the baselines directory.
The code links (💻Code) in the tables below point to the official implementations from these papers. Many thanks to the authors for open-sourcing their work!
📊Baseline | 📝Title | 📄Paper | 💻Code | 🏛Venue | 🎯Task |
---|---|---|---|---|---|
BigST | Linear Complexity Spatio-Temporal Graph Neural Network for Traffic Forecasting on Large-Scale Road Networks | Link | Link | VLDB'24 | STF |
STDMAE | Spatio-Temporal-Decoupled Masked Pre-training for Traffic Forecasting | Link | Link | IJCAI'24 | STF |
STWave | When Spatio-Temporal Meet Wavelets: Disentangled Traffic Forecasting via Efficient Spectral Graph Attention Networks | Link | Link | ICDE'23 | STF |
STAEformer | Spatio-Temporal Adaptive Embedding Makes Vanilla Transformer SOTA for Traffic Forecasting | Link | Link | CIKM'23 | STF |
MegaCRN | Spatio-Temporal Meta-Graph Learning for Traffic Forecasting | Link | Link | AAAI'23 | STF |
DGCRN | Dynamic Graph Convolutional Recurrent Network for Traffic Prediction: Benchmark and Solution | Link | Link | ACM TKDD'23 | STF |
STID | Spatial-Temporal Identity: A Simple yet Effective Baseline for Multivariate Time Series Forecasting | Link | Link | CIKM'22 | STF |
STEP | Pretraining Enhanced Spatial-temporal Graph Neural Network for Multivariate Time Series Forecasting | Link | Link | SIGKDD'22 | STF |
D2STGNN | Decoupled Dynamic Spatial-Temporal Graph Neural Network for Traffic Forecasting | Link | Link | VLDB'22 | STF |
STNorm | Spatial and Temporal Normalization for Multi-variate Time Series Forecasting | Link | Link | SIGKDD'21 | STF |
STGODE | Spatial-Temporal Graph ODE Networks for Traffic Flow Forecasting | Link | Link | SIGKDD'21 | STF |
GTS | Discrete Graph Structure Learning for Forecasting Multiple Time Series | Link | Link | ICLR'21 | STF |
StemGNN | Spectral Temporal Graph Neural Network for Multivariate Time-series Forecasting | Link | Link | NeurIPS'20 | STF |
MTGNN | Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks | Link | Link | SIGKDD'20 | STF |
AGCRN | Adaptive Graph Convolutional Recurrent Network for Traffic Forecasting | Link | Link | NeurIPS'20 | STF |
GWNet | Graph WaveNet for Deep Spatial-Temporal Graph Modeling | Link | Link | IJCAI'19 | STF |
STGCN | Spatio-Temporal Graph Convolutional Networks: A Deep Learning Framework for Traffic Forecasting | Link | Link | IJCAI'18 | STF |
DCRNN | Diffusion Convolutional Recurrent Neural Network: Data-Driven Traffic Forecasting | Link | Link1, Link2 | ICLR'18 | STF |
📊Baseline | 📝Title | 📄Paper | 💻Code | 🏛Venue | 🎯Task |
---|---|---|---|---|---|
CATS | Are Self-Attentions Effective for Time Series Forecasting? | Link | Link | NeurIPS'24 | LTSF |
Sumba | Structured Matrix Basis for Multivariate Time Series Forecasting with Interpretable Dynamics | Link | Link | NeurIPS'24 | LTSF |
GLAFF | Rethinking the Power of Timestamps for Robust Time Series Forecasting: A Global-Local Fusion Perspective | Link | Link | NeurIPS'24 | LTSF |
CycleNet | CycleNet: Enhancing Time Series Forecasting through Modeling Periodic Patterns | Link | Link | NeurIPS'24 | LTSF |
Fredformer | Fredformer: Frequency Debiased Transformer for Time Series Forecasting | Link | Link | KDD'24 | LTSF |
UMixer | An Unet-Mixer Architecture with Stationarity Correction for Time Series Forecasting | Link | Link | AAAI'24 | LTSF |
TimeMixer | Decomposable Multiscale Mixing for Time Series Forecasting | Link | Link | ICLR'24 | LTSF |
Time-LLM | Time-LLM: Time Series Forecasting by Reprogramming Large Language Models | Link | Link | ICLR'24 | LTSF |
SparseTSF | Modeling LTSF with 1k Parameters | Link | Link | ICML'24 | LTSF |
iTransformer | Inverted Transformers Are Effective for Time Series Forecasting | Link | Link | ICLR'24 | LTSF |
Koopa | Learning Non-stationary Time Series Dynamics with Koopman Predictors | Link | Link | NeurIPS'23 | LTSF |
CrossGNN | CrossGNN: Confronting Noisy Multivariate Time Series Via Cross Interaction Refinement | Link | Link | NeurIPS'23 | LTSF |
NLinear | Are Transformers Effective for Time Series Forecasting? | Link | Link | AAAI'23 | LTSF |
Crossformer | Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting | Link | Link | ICLR'23 | LTSF |
DLinear | Are Transformers Effective for Time Series Forecasting? | Link | Link | AAAI'23 | LTSF |
DSformer | A Double Sampling Transformer for Multivariate Time Series Long-term Prediction | Link | Link | CIKM'23 | LTSF |
SegRNN | Segment Recurrent Neural Network for Long-Term Time Series Forecasting | Link | Link | arXiv | LTSF |
MTS-Mixers | Multivariate Time Series Forecasting via Factorized Temporal and Channel Mixing | Link | Link | arXiv | LTSF |
LightTS | Fast Multivariate Time Series Forecasting with Light Sampling-oriented MLP | Link | Link | arXiv | LTSF |
ETSformer | Exponential Smoothing Transformers for Time-series Forecasting | Link | Link | arXiv | LTSF |
NHiTS | Neural Hierarchical Interpolation for Time Series Forecasting | Link | Link | AAAI'23 | LTSF |
PatchTST | A Time Series is Worth 64 Words: Long-term Forecasting with Transformers | Link | Link | ICLR'23 | LTSF |
TiDE | Long-term Forecasting with TiDE: Time-series Dense Encoder | Link | Link | TMLR'23 | LTSF |
TimesNet | Temporal 2D-Variation Modeling for General Time Series Analysis | Link | Link | ICLR'23 | LTSF |
Triformer | Triangular, Variable-Specific Attentions for Long Sequence Multivariate Time Series Forecasting | Link | Link | IJCAI'22 | LTSF |
NSformer | Exploring the Stationarity in Time Series Forecasting | Link | Link | NeurIPS'22 | LTSF |
FiLM | Frequency improved Legendre Memory Model for LTSF | Link | Link | NeurIPS'22 | LTSF |
FEDformer | Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting | Link | Link | ICML'22 | LTSF |
Pyraformer | Low complexity pyramidal Attention For Long-range Time Series Modeling and Forecasting | Link | Link | ICLR'22 | LTSF |
HI | Historical Inertia: A Powerful Baseline for Long Sequence Time-series Forecasting | Link | None | CIKM'21 | LTSF |
Autoformer | Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting | Link | Link | NeurIPS'21 | LTSF |
Informer | Beyond Efficient Transformer for Long Sequence Time-Series Forecasting | Link | Link | AAAI'21 | LTSF |
📊Baseline | 📝Title | 📄Paper | 💻Code | 🏛Venue | 🎯Task |
---|---|---|---|---|---|
LightGBM | LightGBM: A Highly Efficient Gradient Boosting Decision Tree | Link | Link | NeurIPS'17 | Machine Learning |
NBeats | Neural basis expansion analysis for interpretable time series forecasting | Link | Link1, Link2 | ICLR'20 | Deep Time Series Forecasting |
DeepAR | Probabilistic Forecasting with Autoregressive Recurrent Networks | Link | Link1, Link2, Link3 | Int. J. Forecast'20 | Probabilistic Time Series Forecasting |
WaveNet | WaveNet: A Generative Model for Raw Audio | Link | Link1, Link2 | arXiv | Audio |
BasicTS supports a variety of datasets, covering spatial-temporal forecasting, long-term time series forecasting, and large-scale scenarios (a generic windowing sketch follows the tables below).
🏷️Name | 🌐Domain | 📏Length | 📊Time Series Count | 🔄Graph | ⏱️Freq. (m) | 🎯Task |
---|---|---|---|---|---|---|
METR-LA | Traffic Speed | 34272 | 207 | True | 5 | STF |
PEMS-BAY | Traffic Speed | 52116 | 325 | True | 5 | STF |
PEMS03 | Traffic Flow | 26208 | 358 | True | 5 | STF |
PEMS04 | Traffic Flow | 16992 | 307 | True | 5 | STF |
PEMS07 | Traffic Flow | 28224 | 883 | True | 5 | STF |
PEMS08 | Traffic Flow | 17856 | 170 | True | 5 | STF |
🏷️Name | 🌐Domain | 📏Length | 📊Time Series Count | 🔄Graph | ⏱️Freq. (m) | 🎯Task |
---|---|---|---|---|---|---|
BeijingAirQuality | Beijing Air Quality | 36000 | 7 | False | 60 | LTSF |
ETTh1 | Electricity Transformer Temperature | 14400 | 7 | False | 60 | LTSF |
ETTh2 | Electricity Transformer Temperature | 14400 | 7 | False | 60 | LTSF |
ETTm1 | Electricity Transformer Temperature | 57600 | 7 | False | 15 | LTSF |
ETTm2 | Electricity Transformer Temperature | 57600 | 7 | False | 15 | LTSF |
Electricity | Electricity Consumption | 26304 | 321 | False | 60 | LTSF |
ExchangeRate | Exchange Rate | 7588 | 8 | False | 1440 | LTSF |
Illness | Illness Data | 966 | 7 | False | 10080 | LTSF |
Traffic | Road Occupancy Rates | 17544 | 862 | False | 60 | LTSF |
Weather | Weather | 52696 | 21 | False | 10 | LTSF |
🏷️Name | 🌐Domain | 📏Length | 📊Time Series Count | 🔄Graph | ⏱️Freq. (m) | 🎯Task |
---|---|---|---|---|---|---|
CA | Traffic Flow | 35040 | 8600 | True | 15 | Large Scale |
GBA | Traffic Flow | 35040 | 2352 | True | 15 | Large Scale |
GLA | Traffic Flow | 35040 | 3834 | True | 15 | Large Scale |
SD | Traffic Flow | 35040 | 716 | True | 15 | Large Scale |
See the paper Exploring Progress in Multivariate Time Series Forecasting: Comprehensive Benchmarking and Heterogeneity Analysis.
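As a concrete reading of the 📏Length and 📊Time Series Count columns above, the sketch below shows how a raw multivariate series is typically sliced into (history, future) window pairs before being fed to a forecasting model. This is a generic illustration, not BasicTS's actual preprocessing code; the 12-step windows and the PEMS08-like shape are assumptions for the example.

```python
import numpy as np

# Sketch (not BasicTS code): slice a series of shape [length, num_series],
# e.g. PEMS08 with length=17856 and num_series=170, into (history, future)
# pairs using 12-step input and output windows.
def make_windows(data: np.ndarray, history_len: int = 12, horizon: int = 12):
    inputs, targets = [], []
    for t in range(history_len, data.shape[0] - horizon + 1):
        inputs.append(data[t - history_len:t])   # [history_len, num_series]
        targets.append(data[t:t + horizon])      # [horizon, num_series]
    return np.stack(inputs), np.stack(targets)

data = np.random.rand(17856, 170)                # stand-in for PEMS08 traffic flow
x, y = make_windows(data)
print(x.shape, y.shape)                          # (17833, 12, 170) (17833, 12, 170)
```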
Thanks go to these wonderful people (emoji key):
S22 🚧 💻 🐛 |
blisky-li 💻 |
LMissher 💻 🐛 |
CNStark 🚇 |
Azusa 🐛 |
Yannick Wölker 🐛 |
hlhang9527 🐛 |
Chengqing Yu 💻 |
Reborn14 📖 💻 |
TensorPulse 🐛 |
superarthurlx 💻 🐛 |
Yisong Fu 💻 |
Xubin 📖 |
DU YIFAN 💻 |
This project follows the all-contributors specification. Contributions of any kind are welcome!
BasicTS is developed based on EasyTorch, an easy-to-use and powerful open-source neural network training framework.
Official Discord Server:
Official WeChat Group: