GCformer

GCformer combines a structured global convolutional branch for processing long input sequences with a local Transformer-based branch for capturing short, recent signals. Experiments demonstrate that GCformer outperforms state-of-the-art methods, reducing MSE on multivariate time series benchmarks by 4.38% and model parameters by 61.92%. In particular, the global convolutional branch can serve as a plug-in block that enhances other models, including various recently published Transformer-based models, by an average of 31.93%.
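The two-branch idea described above can be sketched in a few lines of PyTorch: a learned kernel as long as the input is applied per channel via FFT convolution (the global branch), while a small Transformer encoder attends only to the most recent steps (the local branch). All module names, sizes, and the fusion-by-concatenation step here are illustrative assumptions, not the GCformer implementation.

```python
import torch
import torch.nn as nn


class TwoBranchSketch(nn.Module):
    """Illustrative two-branch forecaster: global convolution over the full
    input plus local attention over recent steps. Sizes are assumptions."""

    def __init__(self, d_model=64, seq_len=336, local_len=48, pred_len=96):
        super().__init__()
        self.local_len = local_len
        # Global branch: one kernel per channel, as long as the input.
        self.global_kernel = nn.Parameter(torch.randn(d_model, seq_len) * 0.02)
        # Local branch: a standard Transformer encoder on recent signals.
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.local = nn.TransformerEncoder(layer, num_layers=1)
        self.head = nn.Linear(seq_len + local_len, pred_len)

    def forward(self, x):  # x: (batch, seq_len, d_model)
        L = x.size(1)
        # Global (circular) convolution via FFT: O(L log L) in sequence length.
        xf = torch.fft.rfft(x.transpose(1, 2), n=L)   # (B, D, L//2+1)
        kf = torch.fft.rfft(self.global_kernel, n=L)  # (D, L//2+1)
        g = torch.fft.irfft(xf * kf, n=L)             # (B, D, L)
        # Local attention restricted to the last `local_len` steps.
        loc = self.local(x[:, -self.local_len:, :]).transpose(1, 2)
        out = self.head(torch.cat([g, loc], dim=-1))  # (B, D, pred_len)
        return out.transpose(1, 2)                    # (B, pred_len, D)
```

The FFT route is what makes the global branch cheap enough to cover the whole input, while the quadratic cost of attention is confined to the short local window.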

Method

Figure 1. GCformer overall framework

Figure 2. Different parameterization methods of the global convolution kernel
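One way to parameterize a very long kernel without learning every weight directly, in the spirit of the SGConv repo credited in the Acknowledgements, is to learn a few short sub-kernels, interpolate each to a doubling span, and scale them by a decay factor so distant history contributes less. The function below is a sketch under those assumptions, not the exact GCformer scheme.

```python
import torch
import torch.nn.functional as F


def multiscale_kernel(subkernels, target_len, decay=0.5):
    """Build one long convolution kernel from short learned sub-kernels.

    Each sub-kernel is upsampled to double the previous span and scaled by
    `decay**i`, giving a kernel whose far-past weights shrink geometrically.
    subkernels: list of tensors of shape (channels, base_len).
    """
    pieces, span = [], subkernels[0].size(-1)
    for i, k in enumerate(subkernels):
        up = F.interpolate(k.unsqueeze(1), size=span, mode="linear",
                           align_corners=False).squeeze(1)  # (C, span)
        pieces.append(up * (decay ** i))
        span *= 2
    kernel = torch.cat(pieces, dim=-1)  # (C, base*(2**n - 1))
    return kernel[..., :target_len]
```

This keeps the parameter count proportional to the number and size of the sub-kernels rather than to the sequence length, which is one route to the parameter reduction the abstract reports.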

Main Results

Boosting results and the full benchmark table are provided as images in the repository.

Get Started

  1. Install Python 3.6 and PyTorch 1.11.0.
  2. Download the data. You can obtain all six benchmarks from [FEDformer] or [Autoformer].
  3. Train the model. We provide experiment scripts for all benchmarks under the folder ./scripts/GCformer. For instance, you can reproduce the results on the illness dataset by running:
bash ./scripts/GCformer/illness.sh

Citation

Contact

Acknowledgement

We appreciate the following GitHub repos for their valuable code bases and datasets:

https://github.com/yuqinie98/PatchTST

https://github.com/MAZiqing/FEDformer

https://github.com/ctlllll/SGConv

https://github.com/thuml/Autoformer