This is the MXNet implementation of ST-MetaNet in the following paper:
- Zheyi Pan, Yuxuan Liang, Weifeng Wang, Yong Yu, Yu Zheng, and Junbo Zhang. Urban Traffic Prediction from Spatio-Temporal Data Using Deep Meta Learning. 2019. In The 25th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD'19), August 4–8, 2019, Anchorage, AK, USA.
- System: Ubuntu 16.04
- Language: Python 3.5.2
- Devices: a single GTX 1080 GPU
- scipy == 1.2.1
- numpy == 1.16.3
- pandas == 0.24.2
- mxnet-cu90 == 1.5.0b20190108
- dgl == 0.2
- tables == 3.5.1
- pyaml
- h5py
Dependencies can be installed using the following command:

```bash
pip install -r requirements.txt
```
After that, set the backend deep learning framework for dgl to MXNet:

```bash
echo 'export DGLBACKEND=mxnet' >> ~/.bashrc
. ~/.bashrc
```
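DGL selects its deep learning backend from the `DGLBACKEND` environment variable when it is first imported, so the variable must be visible to Python before training starts. A minimal sanity check (a hypothetical helper, not part of the repository) can be sketched as:

```python
import os

def dgl_backend_is_mxnet(env):
    """Return True when DGLBACKEND is set to 'mxnet' in the given mapping.

    DGL reads this variable at import time, so it must be exported
    before `import dgl` runs anywhere in the process.
    """
    return env.get("DGLBACKEND") == "mxnet"

# Typical usage: check the real environment before launching training.
# assert dgl_backend_is_mxnet(os.environ)
```

If the check fails, re-run the `export` command above (or start a new shell) before running the training scripts.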
Unzip the data files:

- flow-prediction/data.zip
- traffic-prediction/data.zip

(The traffic data is pre-processed as in the DCRNN repository.)
The flow data is collected from a 32x32 grid map of Beijing.
- `BJ_FEATURE.h5`: the shape of the data is (32, 32, 989), indexed as (row, column, feature_id). This file stores the node features of each grid cell, consisting of the POI features and road features saved in `BJ_POI.h5` and `BJ_ROAD.h5`, respectively.
- `BJ_FLOW.h5`: the shape of the data is (150, 24, 32, 32, 2), indexed as (date, hour, row, column, flow_type). The `flow_type` axis distinguishes the inflow and outflow of each region.
- `BJ_GRAPH.h5`: the shape is (1024, 1024, 32), indexed as (grid_1, grid_2, feature_id). This file stores the edge features from grid_1 to grid_2.
- `BJ_POI.h5`: POI features of each grid cell. This file is an intermediate output of the preprocessing stage (not used in model training & testing).
- `BJ_ROAD.h5`: road features of each grid cell. This file is an intermediate output of the preprocessing stage (not used in model training & testing).
For the description of the traffic data, please refer to the DCRNN repository.
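To make the tensor layout concrete, the indexing of the flow data can be sketched with a synthetic array of the same shape as `BJ_FLOW.h5` (loading the real file with h5py is analogous; the 0 = inflow, 1 = outflow ordering below is an assumption for illustration):

```python
import numpy as np

# Synthetic stand-in for BJ_FLOW.h5: (date, hour, row, column, flow_type).
flow = np.zeros((150, 24, 32, 32, 2), dtype=np.float32)

# Pretend 120 people flowed in and 95 flowed out of grid cell (5, 7)
# at 8:00 on day 10.
flow[10, 8, 5, 7] = [120.0, 95.0]

inflow = flow[10, 8, 5, 7, 0]   # inflow of one cell at one hour
outflow = flow[10, 8, 5, 7, 1]  # outflow of the same cell and hour

# Total inflow over the whole 32x32 grid map for day 10:
daily_inflow = flow[10, :, :, :, 0].sum()
```

The same axis-by-axis reading applies to `BJ_FEATURE.h5` and `BJ_GRAPH.h5`.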
Take the flow prediction task as an example (the traffic prediction task works in exactly the same way):

```bash
cd flow-prediction/
```
- The settings of the models are in the folder `src/model_setting`, saved in YAML format. Three models are provided: `seq2seq`, `gat-seq2seq`, and `st-metanet`. For the other baselines, refer to the DCRNN and ST-ResNet repositories, respectively.
- All trained models are saved in `param/`. There are two types of files in this folder:
  - `model.yaml`: the model training log (the result on the evaluation dataset after each epoch). This file records the number of the best epoch for the model.
  - `model-xxxx.params`: the saved model parameters of the best evaluation epoch, where `xxxx` is the epoch number.
- Running the code:

  ```bash
  cd src/
  python train.py --file model_setting/[model_name].yaml --gpus [gpu_ids] --epochs [num_epoch]
  ```

  The code first loads the best epoch from `param/`, and then trains the model for `[num_epoch]` further epochs. The code supports multi-GPU training; for example, `[gpu_ids]` would be `0,1,2,3` if you have four GPUs. However, we recommend training and evaluating on a single GPU if possible (our current implementation of meta graph attention, which uses the DGL library, is not efficient on multiple GPUs).
- Training from the beginning:
  - Remove the model records in `param/`; otherwise the code resumes training from the best pre-trained (saved) model.
  - Train the model (examples):
    - Single GPU: `python train.py --file model_setting/st-metanet.yaml --gpus 0 --epochs 200`
    - Multiple GPUs: `python train.py --file model_setting/st-metanet.yaml --gpus 0,1 --epochs 200`
- Testing the model (example):

  ```bash
  python train.py --file model_setting/st-metanet.yaml --gpus 0 --epochs 0
  ```

  With `--epochs 0`, the code directly reports evaluation results on the evaluation and test datasets, without training.
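The resume behavior described above relies on the epoch number `xxxx` embedded in each `model-xxxx.params` filename. A simplified version of that bookkeeping (a hypothetical helper, not the repository's actual code, which reads the best epoch from `model.yaml`) could scan the checkpoint filenames directly:

```python
import re

def latest_checkpoint_epoch(filenames):
    """Return the largest epoch number among 'model-xxxx.params' files,
    or None when no checkpoint is present (i.e., training starts fresh)."""
    epochs = []
    for name in filenames:
        m = re.fullmatch(r"model-(\d+)\.params", name)
        if m:
            epochs.append(int(m.group(1)))
    return max(epochs) if epochs else None
```

This also explains why removing the files in `param/` forces training from scratch: with no matching `model-xxxx.params` files, there is no epoch to resume from.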
If you find this repository (e.g., the code and the datasets) useful in your research, please cite the following paper:
- Zheyi Pan, Yuxuan Liang, Weifeng Wang, Yong Yu, Yu Zheng, and Junbo Zhang. Urban Traffic Prediction from Spatio-Temporal Data Using Deep Meta Learning. 2019. In The 25th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD'19), August 4–8, 2019, Anchorage, AK, USA.
ST-MetaNet is released under the MIT License (refer to the LICENSE file for details).