* Create README.md
* Update model_cards/Cinnamon/electra-small-japanese-generator/README.md

Co-authored-by: Julien Chaumond <chaumond@gmail.com>
Showing 1 changed file with 18 additions and 0 deletions.
model_cards/Cinnamon/electra-small-japanese-generator/README.md (18 additions, 0 deletions)
---
language: ja
---

## Japanese ELECTRA-small

We provide a Japanese **ELECTRA-Small** model, as described in [ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators](https://openreview.net/pdf?id=r1xMH1BtvB).

Our pretraining process employs subword units derived from the [Japanese Wikipedia](https://dumps.wikimedia.org/jawiki/latest), using the [Byte-Pair Encoding](https://www.aclweb.org/anthology/P16-1162.pdf) method and building on an initial tokenization with [mecab-ipadic-NEologd](https://github.com/neologd/mecab-ipadic-neologd). For optimal performance, please take care to set your MeCab dictionary appropriately.

```python
# ELECTRA-small generator usage
from transformers import BertJapaneseTokenizer, ElectraForMaskedLM

tokenizer = BertJapaneseTokenizer.from_pretrained(
    'Cinnamon/electra-small-japanese-generator',
    mecab_kwargs={"mecab_option": "-d /usr/lib/x86_64-linux-gnu/mecab/dic/mecab-ipadic-neologd"}
)
model = ElectraForMaskedLM.from_pretrained('Cinnamon/electra-small-japanese-generator')
```
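
If it helps, here is a minimal sketch of running the loaded generator as a masked language model. The example sentence, the top-5 readout, and the assumption that a literal `[MASK]` in the input maps to the tokenizer's mask token are illustrative additions, not part of the original model card.

```python
import torch
from transformers import BertJapaneseTokenizer, ElectraForMaskedLM

tokenizer = BertJapaneseTokenizer.from_pretrained(
    'Cinnamon/electra-small-japanese-generator',
    mecab_kwargs={"mecab_option": "-d /usr/lib/x86_64-linux-gnu/mecab/dic/mecab-ipadic-neologd"}
)
model = ElectraForMaskedLM.from_pretrained('Cinnamon/electra-small-japanese-generator')
model.eval()

# Mask a single token and inspect the generator's predictions.
# The sentence below is an illustrative example, not from the model card.
text = "東京は日本の[MASK]です。"
inputs = tokenizer(text, return_tensors="pt")
mask_positions = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]

with torch.no_grad():
    logits = model(**inputs).logits

# Top-5 candidate tokens for the first masked position.
top_ids = logits[0, mask_positions[0]].topk(5).indices
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```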