This is the 3rd place solution in Baidu KDD Cup 2022. The task is to predict the wind farm's active power for the next 48 hours at 10-minute intervals.
- A single Transformer/BERT model is built with the tfts library; follow its latest development here. A minimal model sketch appears after this list.
- A sliding window is used to generate more training samples (see the sketch below)
- Only 2 raw features are used: wind speed and wind direction
- Daily fluctuation is added in post-processing so the predicted result follows the daily periodicity (see the sketch below)
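The model bullet maps to only a few lines of tfts. The sketch below follows the tfts quick-start, but the `AutoModel`/`KerasTrainer` argument names have shifted across tfts versions, so treat the exact signatures as assumptions; the 288-step horizon comes from the task (48 hours at 10-minute intervals), while the 144-step input window and the dummy data are purely illustrative.

```python
import numpy as np
from tfts import AutoModel, KerasTrainer  # per the tfts quick-start; exact signatures may vary by version

train_length = 144    # illustrative: one day of 10-minute history as input
predict_length = 288  # 48 hours at 10-minute intervals, fixed by the task

# A single BERT-style (encoder-only Transformer) forecaster
model = AutoModel("bert", predict_length=predict_length)

# Dummy tensors with the expected shapes: 2 raw features (wind speed, direction)
x_train = np.random.rand(64, train_length, 2).astype("float32")
y_train = np.random.rand(64, predict_length, 1).astype("float32")

trainer = KerasTrainer(model)
trainer.train((x_train, y_train), (x_train, y_train), n_epochs=1)
```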
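The sliding window itself is plain array slicing. The helper below is a minimal sketch, not code from this repo; the window length and stride values are illustrative.

```python
import numpy as np

def sliding_window(features, target, input_len=144, output_len=288, stride=6):
    """Slice aligned (time, ...) arrays into overlapping (x, y) training pairs.

    Moving the window by `stride` steps instead of a full window turns one
    long series into many samples.
    """
    x, y = [], []
    for start in range(0, len(features) - input_len - output_len + 1, stride):
        x.append(features[start : start + input_len])
        y.append(target[start + input_len : start + input_len + output_len])
    return np.array(x), np.array(y)

# Example: 30 days of 10-minute data, 2 features (wind speed, direction)
features = np.random.rand(30 * 144, 2).astype("float32")
power = np.random.rand(30 * 144, 1).astype("float32")  # target: active power
x_train, y_train = sliding_window(features, power)
print(x_train.shape, y_train.shape)  # (649, 144, 2) (649, 288, 1)
```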
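The README does not spell out the post-processing step, so the following is one plausible reading of it: estimate the average slot-of-day power profile from history and nudge the forecast toward it. The helper name, the blend weight `alpha`, and the slot encoding are all hypothetical.

```python
import numpy as np

STEPS_PER_DAY = 144  # 10-minute intervals per day

def add_daily_fluctuation(pred, history, history_slots, pred_slots, alpha=0.3):
    """Nudge a forecast toward the average daily power profile.

    pred: (horizon,) model output; history: (T,) observed power;
    *_slots: slot-of-day index (0..143) per point; alpha: blend weight
    (its value here is illustrative).
    """
    # Mean power in each 10-minute slot of the day, estimated from history
    profile = np.array([history[history_slots == s].mean() for s in range(STEPS_PER_DAY)])
    # Add each slot's deviation from the overall mean, scaled by alpha
    return pred + alpha * (profile[pred_slots] - profile.mean())

# Example with synthetic data
rng = np.random.default_rng(0)
history = rng.random(30 * STEPS_PER_DAY)                  # 30 days of observed power
history_slots = np.arange(history.size) % STEPS_PER_DAY
pred = np.full(288, history.mean())                       # a flat 48-hour forecast
pred_slots = np.arange(288) % STEPS_PER_DAY               # assumes forecast starts at midnight
adjusted = add_daily_fluctuation(pred, history, history_slots, pred_slots)
```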
- Prepare the TensorFlow environment:

```shell
pip install -r requirements.txt
```
- Download the data from Baidu AI Studio and put it in `./data/raw`
- Train the model; the resulting file `result.zip` in `./weights/` can be used for submission:

```shell
cd src/train
python nn_train.py
```
If you find this solution useful in your research, please consider citing:
```bibtex
@article{tan2023application,
  title={Application of BERT in Wind Power Forecasting-Teletraan's Solution in Baidu KDD Cup 2022},
  author={Tan, Longxing and Yue, Hongying},
  journal={arXiv preprint arXiv:2307.09248},
  year={2023}
}
```