KDD Cup 2022 - Baidu Spatial Dynamic Wind Power Forecasting

This is the 3rd place solution in Baidu KDD Cup 2022 (Spatial Dynamic Wind Power Forecasting). The task is to predict each wind turbine's active power for the next 48 hours at 10-minute intervals, i.e. 288 future time steps.


Solution summary

  • A single Transformer / BERT-style model, built with the tfts library; follow its latest development in that repository
  • A sliding window over the raw series is used to generate more training samples (see the sketch after this list)
  • Only two raw features are used: wind speed and wind direction
  • A daily fluctuation is added in post-processing so that the predictions follow the daily periodicity of the power signal
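
The two data-related points above can be made concrete with a small sketch. This is illustrative only, not the repository's preprocessing code: the window lengths, stride, column order, and the blending weight `alpha` are assumptions.

```python
import numpy as np

def sliding_window_samples(series, input_len=144, output_len=288, stride=6):
    """Cut (input, target) pairs out of one turbine's time series.

    series: array of shape (T, 3) holding wind speed, wind direction and
    active power at 10-minute resolution (column order assumed here).
    """
    inputs, targets = [], []
    last_start = len(series) - input_len - output_len
    for start in range(0, last_start + 1, stride):
        x = series[start:start + input_len, :2]                          # speed, direction
        y = series[start + input_len:start + input_len + output_len, 2]  # power
        inputs.append(x)
        targets.append(y)
    return np.stack(inputs), np.stack(targets)

def add_daily_fluctuation(pred, history, period=144, alpha=0.3):
    """Blend an average daily profile from recent history into the raw prediction.

    pred: (288,) model output; history: observed power whose length is a
    multiple of one day (144 steps at 10-minute resolution);
    alpha: assumed blending weight.
    """
    daily_profile = history.reshape(-1, period).mean(axis=0)              # average day shape
    tiled = np.tile(daily_profile, int(np.ceil(len(pred) / period)))[:len(pred)]
    return (1 - alpha) * pred + alpha * tiled
```

The shapes match the task setup: 288 output steps cover 48 hours at 10-minute resolution, and one day spans 144 steps.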

How to use it

  1. Prepare the TensorFlow environment

     pip install -r requirements.txt

  2. Download the data from Baidu AI Studio and put it in ./data/raw
  3. Train the model; the file result.zip in ./weights/ can be used for submission (a rough sketch of what the model looks like follows below)

     cd src/train
     python nn_train.py
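
For orientation, the sketch below shows a minimal encoder-only (BERT-style) forecaster of the kind described in the solution summary. It is not the repository's nn_train.py and does not use the tfts API; the input length, layer sizes, and loss function are assumptions chosen only to illustrate the shape of the problem (a window of past wind speed and direction in, 288 future power values out).

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_encoder_forecaster(input_len=144, n_features=2, output_len=288,
                             d_model=64, num_heads=4, num_layers=2):
    """Encoder-only forecaster: past wind speed/direction goes in,
    48 h x 6 steps/h = 288 future power values come out. Sizes are illustrative."""
    inputs = tf.keras.Input(shape=(input_len, n_features))
    x = layers.Dense(d_model)(inputs)  # project the 2 raw features to model width
    # Positional information is omitted here for brevity; a real model adds it.
    for _ in range(num_layers):
        attn = layers.MultiHeadAttention(num_heads=num_heads, key_dim=d_model)(x, x)
        x = layers.LayerNormalization()(x + attn)               # residual + norm
        ffn = layers.Dense(d_model * 4, activation="relu")(x)
        ffn = layers.Dense(d_model)(ffn)
        x = layers.LayerNormalization()(x + ffn)                # residual + norm
    x = layers.GlobalAveragePooling1D()(x)                      # pool over time
    outputs = layers.Dense(output_len)(x)                       # one value per future step
    return tf.keras.Model(inputs, outputs)

model = build_encoder_forecaster()
model.compile(optimizer="adam", loss="mae")                     # loss choice is an assumption
```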

Citation

If you find it useful in your research, please consider citing it:

@article{tan2023application,
  title={Application of BERT in Wind Power Forecasting-Teletraan's Solution in Baidu KDD Cup 2022},
  author={Tan, Longxing and Yue, Hongying},
  journal={arXiv preprint arXiv:2307.09248},
  year={2023}
}

Reference

[1] Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2018. Bert: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018).
[2] Haixu Wu, Jiehui Xu, Jianmin Wang, and Mingsheng Long. 2021. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting. Advances in Neural Information Processing Systems 34 (2021), 22419–22430.
[3] Jingbo Zhou, Shuangli Li, Liang Huang, Haoyi Xiong, Fan Wang, Tong Xu, Hui Xiong, and Dejing Dou. 2020. Distance-aware molecule graph attention network for drug-target binding affinity prediction. arXiv preprint arXiv:2012.09624 (2020).
[4] Haoyi Zhou, Shanghang Zhang, Jieqi Peng, Shuai Zhang, Jianxin Li, Hui Xiong, and Wancai Zhang. 2021. Informer: Beyond efficient transformer for long sequence time-series forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 35. 11106–11115.
