
# KGPool: Dynamic Knowledge Graph Context Selection for Relation Extraction

This is an implementation of the paper *KGPool: Dynamic Knowledge Graph Context Selection for Relation Extraction*, published in Findings of ACL-IJCNLP 2021.


```bibtex
@inproceedings{nadgeri-etal-2021-kgpool,
    title = "{KGP}ool: Dynamic Knowledge Graph Context Selection for Relation Extraction",
    author = "Nadgeri, Abhishek  and
      Bastos, Anson  and
      Singh, Kuldeep  and
      Mulang{'}, Isaiah Onando  and
      Hoffart, Johannes  and
      Shekarpour, Saeedeh  and
      Saraswat, Vijay",
    booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
    month = aug,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.findings-acl.48",
    doi = "10.18653/v1/2021.findings-acl.48",
    pages = "535--548",
}
```


## Acknowledgment

The codebase is built upon prior work, notably GPGNN and RECON (referenced below).

## Requirements

- Python 3
- `torch>=1.8`
- `torch-geometric>=1.8`

## Data

We use the same dataset and entity attributes as the previous baseline, RECON.
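
As a quick illustration, here is one way to peek at a downloaded sample. The file path and field names here are assumptions based on the GPGNN/RECON-style Wikidata JSON layout; check the downloaded files for the exact schema:

```python
import json

# Assumed path and schema (GPGNN/RECON-style Wikidata JSON); verify
# against the downloaded files before relying on these field names.
with open("data/train.json") as f:
    samples = json.load(f)

sample = samples[0]
print(sample["tokens"])     # sentence tokens
print(sample["vertexSet"])  # entity mentions with Wikidata IDs
print(sample["edgeSet"])    # relations between entity pairs
```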

## Usage

First, download the datasets.

Train and evaluate the model (make sure the GloVe embeddings are ready before training):

```
wget http://nlp.stanford.edu/data/glove.6B.zip
unzip glove.6B.zip
```
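
The unzipped archive contains vectors at several dimensions (50d, 100d, 200d, 300d). Below is a minimal sketch for loading one of these files into a Python dict; the choice of the 50d file is only an example, since the dimension the training script expects is not stated above:

```python
import numpy as np

def load_glove(path="glove.6B.50d.txt"):
    """Load GloVe vectors into a {word: vector} dict."""
    embeddings = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            # Each line is: word followed by space-separated floats.
            word, *values = line.rstrip().split(" ")
            embeddings[word] = np.asarray(values, dtype=np.float32)
    return embeddings

vectors = load_glove()
print(vectors["the"].shape)  # (50,)
```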

### Train

```
python Context-Aggregator/train.py
```

### Testing

```
python Context-Aggregator/test.py
```
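
For reference, relation-extraction predictions are typically scored with precision, recall, and F1 over the relation labels. The toy sketch below (not part of this repo; labels and the "NA" convention are illustrative) shows the idea using scikit-learn:

```python
from sklearn.metrics import precision_recall_fscore_support

# Gold and predicted relation labels per entity pair (toy example).
gold = ["P17", "P131", "NA", "P17"]
pred = ["P17", "NA",   "NA", "P17"]

# Score only the real relations, excluding the no-relation class "NA".
p, r, f1, _ = precision_recall_fscore_support(
    gold, pred, average="micro", labels=["P17", "P131"])
print(f"P={p:.2f} R={r:.2f} F1={f1:.2f}")
```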

## Directory structure

- `Context-Aggregator/`: The same folder structure as that of GPGNN.
- `KGPool/data_loader.py`: Data pre-processing and gathering.
- `KGPool/networks.py`: Sets up the graph architecture.
- `KGPool/layers.py`: Performs dynamic pooling on nodes (a sketch of the idea follows this list).
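
To give a feel for what score-based dynamic pooling on graph nodes looks like, here is a minimal, self-contained sketch in the style of top-k / self-attention graph pooling. It is an illustration only, not KGPool's actual layer; the class and parameter names are made up:

```python
import torch
from torch_geometric.nn import GCNConv
from torch_geometric.utils import subgraph

class TopKNodePool(torch.nn.Module):
    """Illustrative top-k pooling: score nodes with a GCN layer,
    keep the highest-scoring fraction, and re-index the edges.
    Not KGPool's actual layer; a generic sketch of the technique."""

    def __init__(self, in_channels, ratio=0.5):
        super().__init__()
        self.ratio = ratio
        self.score_layer = GCNConv(in_channels, 1)

    def forward(self, x, edge_index):
        num_nodes = x.size(0)
        score = self.score_layer(x, edge_index).squeeze(-1)  # one score per node
        k = max(1, int(self.ratio * num_nodes))
        perm = score.topk(k).indices                         # indices of kept nodes
        x = x[perm] * torch.tanh(score[perm]).unsqueeze(-1)  # gate kept features
        edge_index, _ = subgraph(perm, edge_index,
                                 relabel_nodes=True, num_nodes=num_nodes)
        return x, edge_index, perm
```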

## Hyperparameters

- For the Context-Aggregator we use the same hyperparameters as the baseline.
- For KGPool we use the default parameters in `Context-Aggregator/train.py`.
- No systematic hyperparameter tuning was performed.