# bert_tfv1

BERT with TensorFlow 1.x, for text classification and sequence labelling.

## Pretrained model ckpt

Pretrained model for Chinese:

Chinese Simplified and Traditional, 12-layer, 768-hidden, 12-heads, 110M parameters

The download link for the model can be found in Google's open-source BERT code on GitHub.

Unzipping the downloaded archive yields five files:

- The files whose names start with `bert_model.ckpt` are the checkpoint, used to load the model variables.
- `vocab.txt` is the Chinese vocabulary used during training.
- `bert_config.json` contains the adjustable hyperparameters used when training BERT; see the loading sketch below.
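
A minimal sketch of how these files fit together, assuming the `modeling.py` and `tokenization.py` modules from Google's open-source BERT repository are importable and the archive was unzipped to `chinese_L-12_H-768_A-12/`; the directory name and the sequence length of 128 are illustrative, not part of this repo:

```python
import tensorflow as tf  # TensorFlow 1.x
import modeling          # from google-research/bert
import tokenization      # from google-research/bert

BERT_DIR = "chinese_L-12_H-768_A-12"  # assumed unzip location

# bert_config.json -> model hyperparameters
bert_config = modeling.BertConfig.from_json_file(BERT_DIR + "/bert_config.json")

# vocab.txt -> tokenizer for Chinese text
tokenizer = tokenization.FullTokenizer(
    vocab_file=BERT_DIR + "/vocab.txt", do_lower_case=True)

# Build the graph, then initialize its variables from bert_model.ckpt
input_ids = tf.placeholder(tf.int32, shape=[None, 128], name="input_ids")
model = modeling.BertModel(
    config=bert_config,
    is_training=False,
    input_ids=input_ids)

tvars = tf.trainable_variables()
assignment_map, _ = modeling.get_assignment_map_from_checkpoint(
    tvars, BERT_DIR + "/bert_model.ckpt")
tf.train.init_from_checkpoint(BERT_DIR + "/bert_model.ckpt", assignment_map)
```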

## Acknowledgement