PyTorch code for our paper "Learning Temporal Attention in Dynamic Graphs with Bilinear Interactions".
Updates
- added the `--bilinear_enc` flag in addition to `--bilinear` to control where the bilinear layer is used
- added the `--model` flag to run baseline Graph Convolutional/Attention Network (GCN/GAT) models
- improved logging and added the `--verbose` flag to control how much information is printed
When using the Social Evolution dataset, you must comply with the conditions specified here.
Option 1:
The original data can be accessed here. Once you download their zip file, unpack it to the `SocialEvolution` folder, then inside that folder unpack `Proximity.csv.bz2`, e.g. by running `bzip2 -d Proximity.csv.bz2`. You can then run our code and it will generate a preprocessed `data_prob0.8.pkl` file that will be reused every time you run our code.
Option 2:
Instead of using the original data, you can directly download `data_prob0.8.pkl` from here and put it in the `SocialEvolution` folder.
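
If you want to sanity-check the preprocessed file before training, the sketch below simply loads it and prints a generic summary. The internal structure of `data_prob0.8.pkl` is produced by our preprocessing code and is not assumed here.

```python
import os
import pickle

# Path assuming the folder layout described above (Option 1 or Option 2)
pkl_path = os.path.join('SocialEvolution', 'data_prob0.8.pkl')

with open(pkl_path, 'rb') as f:
    data = pickle.load(f)

# Only a generic summary is printed; no assumptions are made about
# the exact contents of the pickle.
print(type(data))
if isinstance(data, dict):
    print(list(data.keys()))
```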
The original GitHub data can be accessed here. When using this dataset, you must comply with the licenses specified here.
In this repo we extract a subnetwork of 284 users with relatively dense interactions between each other: each user initiated at least 200 communication events and 7 association events during 2013. "Follow" events in 2011-2012 are treated as initial associations. Communication events include Watch, Star, Fork, Push, Issues, IssueComment, PullRequest and Commit. This results in a dataset of 284 nodes with around 10k training events (December 2012 to August 2013) and 8k test events (September to December 2013).
We provide the preprocessed pkl files in the `Github` folder so that you do not need to access the original data to run our code.
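
For illustration only, the user-filtering criterion described above could be sketched roughly as follows. The file name and column names (`github_events.csv`, `user`, `event_type`, `created_at`) are hypothetical placeholders, not the format of the original data, and treating "Follow" as the association event type is also an inference from the description above; the provided pkl files were produced by our own preprocessing.

```python
import pandas as pd

# Hypothetical sketch of the user-filtering criterion described above.
# File and column names below are placeholders for illustration.
COMM_EVENTS = {'Watch', 'Star', 'Fork', 'Push', 'Issues',
               'IssueComment', 'PullRequest', 'Commit'}

events = pd.read_csv('github_events.csv', parse_dates=['created_at'])  # placeholder file
events_2013 = events[events['created_at'].dt.year == 2013]

# Count communication and association ("Follow") events initiated by each user in 2013
comm = events_2013[events_2013['event_type'].isin(COMM_EVENTS)].groupby('user').size()
assoc = events_2013[events_2013['event_type'] == 'Follow'].groupby('user').size()

# Keep users with at least 200 communication and 7 association events
keep = comm[comm >= 200].index.intersection(assoc[assoc >= 7].index)
print(len(keep), 'users retained')
```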
Running the baseline DyRep model [1] on Social Evolution: `python main.py --log_interval 300 --data_dir ./SocialEvolution/`
Running our latent dynamic graph (LDG) model with a learned graph, sparse prior and bilinear interactions: `python main.py --log_interval 300 --data_dir ./SocialEvolution/ --encoder mlp --soft_attn --bilinear --bilinear_enc --sparse`
Note that on Social Evolution our default option is to filter Proximity events by their probability: `--prob 0.8`. In the DyRep paper, they use all events, i.e. without this probability filtering. When we compare results in our paper, we use the same `--prob 0.8` for all methods.
To run GitHub experiments, use the same arguments, but add `--dataset github --data_dir ./Github`.
To use the Frequency bias, add the `--freq` flag.
I provide the base class in `data_loader.py`, which shows the class attributes and functions that must be implemented to train our model on other datasets. I also added `example_data_loader.py`, showing a minimal example of using the base class.
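
As a rough illustration of the kind of information such a loader needs to expose (number of nodes, event types, and a time-ordered list of events), here is a self-contained toy sketch. All names below are placeholders; the authoritative interface is the one defined in `data_loader.py` and demonstrated in `example_data_loader.py`.

```python
import datetime
import torch.utils.data


class ToyEventsDataset(torch.utils.data.Dataset):
    """Toy illustration only: attribute and method names are placeholders,
    not the interface required by data_loader.py."""

    def __init__(self):
        self.N_nodes = 3                    # number of nodes in the dynamic graph
        self.event_types = ['communicate']  # names of temporal event types
        # Each event: (node u, node v, event type index, timestamp), ordered in time
        self.all_events = [
            (0, 1, 0, datetime.datetime(2013, 1, 1)),
            (1, 2, 0, datetime.datetime(2013, 1, 2)),
            (0, 2, 0, datetime.datetime(2013, 1, 3)),
        ]

    def __len__(self):
        return len(self.all_events)

    def __getitem__(self, index):
        return self.all_events[index]
```

A standard `torch.utils.data.DataLoader` can iterate over such a dataset, but consult `example_data_loader.py` for the attributes our training code actually reads.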
If you make use of this code, we would appreciate it if you cite our paper as follows:
@ARTICLE{knyazev2019learning,
title = "Learning Temporal Attention in Dynamic Graphs with Bilinear Interactions",
author = "Knyazev, Boris and Augusta, Carolyn and Taylor, Graham W",
month = sep,
year = 2019,
archivePrefix = "arXiv",
primaryClass = "stat.ML",
eprint = "1909.10367"
}