
PMD

In this paper, we propose a progressive distillation method based on masked generation features for the knowledge graph completion (KGC) task, aiming to significantly reduce the complexity of pre-trained models.
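
As a rough, hypothetical sketch of the core mechanism (not the code in this repository), the PyTorch snippet below distills a teacher encoder into a smaller student by masking input tokens and training the student to reproduce the teacher's features at the masked positions. All names and hyperparameters here (TinyEncoder, MASK_ID, the 30% mask ratio) are illustrative assumptions, not values from the paper.

import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    """Stand-in for a Transformer encoder (teacher: large, student: small)."""
    def __init__(self, vocab_size=1000, hidden_dim=64, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_dim)
        layer = nn.TransformerEncoderLayer(hidden_dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, token_ids):
        return self.encoder(self.embed(token_ids))  # (batch, seq, hidden)

MASK_ID = 0                                  # assumed id of the [MASK] token
teacher = TinyEncoder(num_layers=4).eval()   # frozen pre-trained model
student = TinyEncoder(num_layers=1)          # compact model being trained

tokens = torch.randint(1, 1000, (8, 32))     # toy batch of textual triples
mask = torch.rand(tokens.shape) < 0.3        # mask roughly 30% of positions

with torch.no_grad():
    t_feats = teacher(tokens)                # teacher sees the full input
s_feats = student(tokens.masked_fill(mask, MASK_ID))  # student sees masked input

# Distillation loss: the student must generate the teacher's features at the
# masked positions (MSE restricted to masked tokens).
loss = ((s_feats - t_feats) ** 2)[mask].mean()
loss.backward()

A progressive schedule would then repeat this distillation step across a sequence of successively smaller students, each learning from the previous stage rather than directly from the original pre-trained model.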

Baseline

@inproceedings{wang-etal-2022-simkgc,
    title = "{S}im{KGC}: Simple Contrastive Knowledge Graph Completion with Pre-trained Language Models",
    author = "Wang, Liang  and
      Zhao, Wei  and
      Wei, Zhuoyu  and
      Liu, Jingming",
    booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = may,
    year = "2022",
    address = "Dublin, Ireland",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.acl-long.295",
    pages = "4281--4294",
}
