This is the PyTorch implementation of the paper:
Baoyu Jing, Hanghang Tong and Yada Zhu, Network of Tensor Time Series, WWW'2021
- numpy>=1.19.5
- scipy>=1.5.4
- PyYAML>=5.4.1
- tensorly>=0.5.1
- tqdm>=4.59.0
- pandas>=1.1.5
- torch>=1.6.0
- torchvision>=0.7.0
Packages can be installed via `pip install -r requirements.txt`.
- Formulation. Formulate the co-evolving time series (or multivariate time series) as a tensor time series. Each temporal snapshot should be an M-dimensional tensor. Note that vectors and matrices are special cases of tensors.
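As a concrete illustration of the formulation step, the sketch below builds a toy tensor time series with NumPy (the sizes and names are made up for illustration):

```python
import numpy as np

# Toy example: T = 8 timesteps, each snapshot a 2-D tensor (M = 2)
# over 3 locations x 4 metrics. A multivariate time series of shape
# (T, N) is the special case M = 1 (vector snapshots), and a single
# scalar series is M = 0.
T, n_locations, n_metrics = 8, 3, 4
rng = np.random.default_rng(0)
tensor_series = rng.normal(size=(T, n_locations, n_metrics))

# Each temporal snapshot is one M-dimensional tensor.
snapshot = tensor_series[0]
print(snapshot.ndim)  # 2, i.e. M = 2
```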
- Normalization. For each individual time series within the tensor time series, normalize the values with the z-score computed on the training split.
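A minimal sketch of this normalization, assuming time is the first axis and `train_idx` marks the training timesteps (both names are illustrative, not taken from the repository):

```python
import numpy as np

rng = np.random.default_rng(0)
values = rng.normal(loc=5.0, scale=2.0, size=(100, 3, 4))  # (T, dim1, dim2)
train_idx = np.arange(70)  # hypothetical: first 70 steps are the training split

# Per-series mean/std are computed on the training split only, so the
# validation and test splits do not leak into the normalization.
mean = values[train_idx].mean(axis=0, keepdims=True)
std = values[train_idx].std(axis=0, keepdims=True)
normalized = (values - mean) / std
```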
- Graph construction. The m-th dimension of the tensor can be associated with a graph, which is represented by an adjacency matrix. The adjacency matrix should be normalized (see the paper for the normalization). Note that if a dimension is not associated with a network, use the identity matrix instead.
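The paper specifies the exact normalization; as a non-authoritative sketch, the helper below applies the common GCN-style symmetric normalization D^{-1/2}(A + I)D^{-1/2}, with the identity matrix used for dimensions that have no graph:

```python
import numpy as np

def normalize_adj(adj: np.ndarray) -> np.ndarray:
    """GCN-style symmetric normalization D^{-1/2} (A + I) D^{-1/2}.

    Shown as a common choice, not a confirmed detail of this repository.
    """
    a = adj + np.eye(adj.shape[0])          # add self-loops
    d = a.sum(axis=1)                       # degree vector
    d_inv_sqrt = np.zeros_like(d)
    np.power(d, -0.5, out=d_inv_sqrt, where=d > 0)
    return (a * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]

# Dimension 1 has a fully connected 3-node graph; dimension 2 has no
# associated network, so it simply gets the identity matrix.
networks = [normalize_adj(np.ones((3, 3)) - np.eye(3)),
            np.eye(4)]
```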
- Store the values of the tensor time series and the adjacency matrices in `values.pkl` and `networks.pkl`. Store the indicators for training, validation and testing in `train_idx.pkl`, `val_idx.pkl` and `test.pkl`.
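The data-preparation steps above can be sketched as follows; the array shapes and index conventions are assumptions for illustration, only the file names come from the text:

```python
import pickle
import numpy as np

# Hypothetical prepared inputs (shapes are illustrative).
values = np.random.default_rng(0).normal(size=(100, 3, 4))
networks = [np.eye(3), np.eye(4)]
T = values.shape[0]
train_idx = np.arange(0, 70)
val_idx = np.arange(70, 85)
test_idx = np.arange(85, T)

# Write each object to the file name the repository expects.
for name, obj in [("values.pkl", values), ("networks.pkl", networks),
                  ("train_idx.pkl", train_idx), ("val_idx.pkl", val_idx),
                  ("test.pkl", test_idx)]:
    with open(name, "wb") as f:
        pickle.dump(obj, f)
```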
- Specify the mode for training: `train` (training only) or `train-eval` (evaluate the model after each epoch).
- Specify the task: `missing` (missing value recovery) or `future` (future value prediction).
- Specify the paths of the configuration files for the model and training.
python main.py -cm ./configs/model.yml -cr ./configs/run_missing.yml -m train -t missing
- Specify the mode: `eval`.
- Specify the task: `missing` (missing value recovery) or `future` (future value prediction).
- Specify the paths of the configuration files for the model and evaluation.
python main.py -cm ./configs/model.yml -cr ./configs/run_missing.yml -m eval -t missing
Please cite the following paper if you find the repository or the paper useful.
Baoyu Jing, Hanghang Tong and Yada Zhu, Network of Tensor Time Series, WWW'2021
@article{jing2021network,
title={Network of Tensor Time Series},
author={Jing, Baoyu and Tong, Hanghang and Zhu, Yada},
journal={arXiv preprint arXiv:2102.07736},
year={2021}
}