Create README.md (#8751)
* Create README.md

* Update model_cards/Cinnamon/electra-small-japanese-generator/README.md

Co-authored-by: Julien Chaumond <chaumond@gmail.com>
joangines and julien-c authored Dec 11, 2020
1 parent 76df559 commit c615df7
Showing 1 changed file with 18 additions and 0 deletions: model_cards/Cinnamon/electra-small-japanese-generator/README.md
---
language: ja
---
## Japanese ELECTRA-small

We provide a Japanese **ELECTRA-Small** model, as described in [ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators](https://openreview.net/pdf?id=r1xMH1BtvB).

Our pretraining process employs subword units derived from the [Japanese Wikipedia](https://dumps.wikimedia.org/jawiki/latest), using the [Byte-Pair Encoding](https://www.aclweb.org/anthology/P16-1162.pdf) method and building on an initial tokenization with [mecab-ipadic-NEologd](https://github.com/neologd/mecab-ipadic-neologd). For optimal performance, please take care to set your MeCab dictionary appropriately.

```python
# ELECTRA-small generator usage
from transformers import BertJapaneseTokenizer, ElectraForMaskedLM

# Point MeCab at the mecab-ipadic-NEologd dictionary (adjust the path to your installation)
tokenizer = BertJapaneseTokenizer.from_pretrained(
    'Cinnamon/electra-small-japanese-generator',
    mecab_kwargs={"mecab_option": "-d /usr/lib/x86_64-linux-gnu/mecab/dic/mecab-ipadic-neologd"},
)
model = ElectraForMaskedLM.from_pretrained('Cinnamon/electra-small-japanese-generator')
```
