BERT+CRF

This code repository contains implementations of BERT+Softmax, BERT+Span, and BERT+CRF for named entity recognition. The original BERT paper (NAACL-HLT, 2019) is cited below.
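
As a rough illustration of the BERT+CRF variant, the sketch below combines a BERT encoder with a CRF layer for token tagging. It is a minimal example assuming PyTorch, the Hugging Face transformers package, and pytorch-crf; the class name BertCrfTagger, the bert-base-cased checkpoint, and the hyperparameters are placeholders, not this repository's actual implementation.

```python
# Minimal, illustrative BERT+CRF tagger -- NOT this repository's implementation.
# Assumes: torch, transformers, and pytorch-crf are installed.
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF


class BertCrfTagger(nn.Module):
    def __init__(self, num_tags: int, bert_name: str = "bert-base-cased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        self.dropout = nn.Dropout(0.1)
        # Project BERT hidden states to per-token tag emission scores.
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, labels=None):
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        emissions = self.classifier(self.dropout(hidden))
        mask = attention_mask.bool()
        if labels is not None:
            # The CRF returns a log-likelihood; negate it to use as a loss.
            return -self.crf(emissions, labels, mask=mask, reduction="mean")
        # Viterbi decoding: best tag sequence for each sentence in the batch.
        return self.crf.decode(emissions, mask=mask)
```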

Train From Scratch

If you want to train the model from scratch, please follow these steps:

1. Prepare the pretrained models.

2. Modify the paths in the model config (demo/ner/BERT/bert_softmax(span/crf).py), including the pretrained model paths, dataset paths, work space, etc. (see the hypothetical config excerpt after these steps).

3. Run demo/ner/BERT/dist_train.sh.
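
For step 2, the snippet below is a hypothetical excerpt showing the kind of paths such a config typically contains; the variable names and paths are assumptions, so check the actual fields in demo/ner/BERT/bert_softmax(span/crf).py before editing.

```python
# Hypothetical config excerpt -- variable names and paths are placeholders,
# not the repository's real schema. Replace them with your own locations.
pretrained_model = "/path/to/pretrained/bert-base-uncased"  # pretrained BERT weights
data_root = "/path/to/datasets/conll2003"                   # training/validation data
work_dir = "/path/to/work_space/bert_crf"                   # checkpoints and logs
```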

Test

Given a trained model, run demo/ner/BERT/test.sh to test it.

Trained Model Download

All of the models are re-implemented and trained on the open-source framework mmdetection, so the results might differ slightly from those reported in the original papers.

Results on the datasets and the trained models can be downloaded as follows:

Model                  CoNLL2003 (F1)   Resume (F1)   Links
BERT+Softmax (paper)   92.4             -             -
BERT+Softmax           92.1             96.4          config, pth (Access Code: agh1)
BERT+Span              92.1             96.2          config, pth (Access Code: 308S)
BERT+CRF               92.6             96.7          config, pth (Access Code: 1075)

Citation

@inproceedings{devlin2019bert,
  author    = {Jacob Devlin and
               Ming{-}Wei Chang and
               Kenton Lee and
               Kristina Toutanova},
  title     = {{BERT:} Pre-training of Deep Bidirectional Transformers for Language
               Understanding},
  booktitle = {NAACL-HLT},
  pages     = {4171--4186},
  year      = {2019},
}

License

This project is released under the Apache 2.0 license.

Contact

If you have any suggestions or problems, please feel free to contact the author at qiaoliang6@hikvision.com.