Mind the truncation gap: challenges of learning on dynamic graphs

Official code for the paper Mind the truncation gap: challenges of learning on dynamic graphs with recurrent architectures.

[Figure: Batch truncation in dynamic graphs]

This repository includes our toy-task implementation, together with the Python scripts required to train and evaluate a GRNN model on this task using both full and truncated backpropagation.

It also includes Python scripts to reproduce the experiments on public dynamic graph datasets, comparing full and truncated backpropagation.

Using the synthetic task

Our proposed synthetic link regression task can be used as a simple benchmark to test an algorithm's ability to learn longer-term dependencies.

[Figure: Toy task]

The following example instantiates the synthetic task with 100 nodes and a memory of 3 steps, then uses it to generate edges:

import jax
from jax import lax
from tgap.data.buffer_task import get_sampler_link_regression

init_data, step_data = get_sampler_link_regression(num_nodes=100, delay=3)

# initialize the task state
rng = jax.random.PRNGKey(123)
initial_data_state = init_data(rng)

# run a single step; step_data follows the lax.scan step signature (carry, x),
# so it also takes an (unused) per-step input
new_data_state, (edge_src, edge_dst, edge_feat, edge_target) = step_data(initial_data_state, None)

# run 1000 steps; the generated edges are stacked along the leading axis
new_data_state, (edges_src, edges_dst, edges_feat, edges_target) = lax.scan(step_data, initial_data_state, None, 1000)
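
For reference, truncated backpropagation through time (TBPTT) processes such a stream in fixed-length windows and blocks gradients at the window boundaries. The following is a minimal sketch of that pattern in JAX; the recurrent cell, window length, and shapes are illustrative assumptions, not the repository's GRNN:

import jax
import jax.numpy as jnp
from jax import lax

def toy_cell(state, x):
    # stand-in recurrent update; the repository trains a GRNN instead
    new_state = jnp.tanh(state + x)
    return new_state, new_state

def truncated_window(state, window_xs):
    # gradients flow freely within a window...
    state, outputs = lax.scan(toy_cell, state, window_xs)
    # ...but are cut at the boundary: this is the truncation
    return lax.stop_gradient(state), outputs

# split a stream of 1000 4-dimensional inputs into 100 windows of 10 steps
xs = jax.random.normal(jax.random.PRNGKey(0), (1000, 4)).reshape(100, 10, 4)
final_state, outputs = lax.scan(truncated_window, jnp.zeros(4), xs)

Full backpropagation corresponds to a single lax.scan over the entire stream, with no stop_gradient on the carried state.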

Running the synthetic task experiments

Install the dependencies (requires Python >= 3.9):

pip install -r requirements.txt

Train with full or truncated backpropagation:

# Full Backprop
python run_toy_task.py --method FBPTT

# Truncated Backprop
python run_toy_task.py --method TBPTT
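
The gap between the two methods shows up directly in the gradients: any dependency longer than the truncation window contributes nothing to the truncated gradient. Below is a self-contained toy illustration of this effect; the loss and recurrent cell are hypothetical stand-ins, not the repository's training loop:

import jax
import jax.numpy as jnp
from jax import lax

def toy_cell(state, x):
    # stand-in recurrent update
    return jnp.tanh(state + x), None

def loss(state0, xs, window=None):
    if window is None:
        # full BPTT: a single scan over the whole stream
        state, _ = lax.scan(toy_cell, state0, xs)
    else:
        # TBPTT: stop the gradient at every window boundary
        def step(state, window_xs):
            state, _ = lax.scan(toy_cell, state, window_xs)
            return lax.stop_gradient(state), None
        state, _ = lax.scan(step, state0, xs.reshape(-1, window, xs.shape[-1]))
    return jnp.sum(state ** 2)

xs = jax.random.normal(jax.random.PRNGKey(0), (100, 4))
g_full = jax.grad(loss)(jnp.zeros(4), xs)              # nonzero in general
g_trunc = jax.grad(loss)(jnp.zeros(4), xs, window=10)  # exactly zero

With truncation, the gradient of the final loss with respect to the initial state is identically zero, because every path from the initial state to the loss crosses a stop_gradient.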
