From 40b53070c7b21b8a77712706b55f413e8ee13fb4 Mon Sep 17 00:00:00 2001
From: Philip May
Date: Tue, 18 Aug 2020 07:45:02 +0200
Subject: [PATCH 1/2] Update README.md

---
 .../german-nlp-group/electra-base-german-uncased/README.md | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/model_cards/german-nlp-group/electra-base-german-uncased/README.md b/model_cards/german-nlp-group/electra-base-german-uncased/README.md
index 624b784ec50546..0633403bec18f6 100644
--- a/model_cards/german-nlp-group/electra-base-german-uncased/README.md
+++ b/model_cards/german-nlp-group/electra-base-german-uncased/README.md
@@ -20,7 +20,7 @@ This Model is suitable for Training on many downstream tasks in German (Q&A, Sen
 
 It can be used as a drop-in Replacement for **BERT** in most down-stream tasks (**ELECTRA** is even implemented as an extended **BERT** Class).
 
-On the time of the realse (August 2020) this Model is the best performing publicly available German NLP Model on various German Evaluation Metrics (CONLL, GermEval19 Coarse, GermEval19 Fine).
+On the time of the realse (August 2020) this Model is the best performing publicly available German NLP Model on various German Evaluation Metrics (CONLL03-DE, GermEval18 Coarse, GermEval18 Fine). For GermEval18 Coarse results see below. More will be published soon.
 
 ## Installation
 
@@ -159,4 +159,3 @@ We tried the following approaches which we found had no positive influence:
 
 - **Increased Vocab Size**: Leads to more parameters and thus reduced examples/sec while no visible Performance gains were measured
 - **Decreased Batch-Size**: The original Electra was trained with a Batch Size per TPU Core of 16 whereas this Model was trained with 32 BS / TPU Core. We found out that 32 BS leads to better results when you compare metrics over computation time
-

From b2a76992bfcb11dea0bccdc0b833f302125a1593 Mon Sep 17 00:00:00 2001
From: Julien Chaumond
Date: Tue, 18 Aug 2020 14:21:42 +0200
Subject: [PATCH 2/2] Update model_cards/german-nlp-group/electra-base-german-uncased/README.md

---
 .../german-nlp-group/electra-base-german-uncased/README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/model_cards/german-nlp-group/electra-base-german-uncased/README.md b/model_cards/german-nlp-group/electra-base-german-uncased/README.md
index 0633403bec18f6..6e10e7375c10d5 100644
--- a/model_cards/german-nlp-group/electra-base-german-uncased/README.md
+++ b/model_cards/german-nlp-group/electra-base-german-uncased/README.md
@@ -20,7 +20,8 @@ This Model is suitable for Training on many downstream tasks in German (Q&A, Sen
 
 It can be used as a drop-in Replacement for **BERT** in most down-stream tasks (**ELECTRA** is even implemented as an extended **BERT** Class).
 
-On the time of the realse (August 2020) this Model is the best performing publicly available German NLP Model on various German Evaluation Metrics (CONLL03-DE, GermEval18 Coarse, GermEval18 Fine). For GermEval18 Coarse results see below. More will be published soon.
+At the time of release (August 2020) this Model is the best performing publicly available German NLP Model on various German Evaluation Metrics (CONLL03-DE, GermEval18 Coarse, GermEval18 Fine). For GermEval18 Coarse results see below. More will be published soon.
+
 
 ## Installation
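
For readers of the model card being patched: the drop-in-replacement claim in the diff means the model loads through the same `transformers` interfaces as BERT. A minimal sketch of that usage follows (illustrative only, not part of the patch; it assumes the standard `transformers` auto classes and a hub model ID matching the model card path):

```python
# Minimal sketch, not part of the patch above: load the ELECTRA model the
# README describes through the same auto classes one would use for BERT.
# Assumes the hub model ID "german-nlp-group/electra-base-german-uncased"
# (taken from the model card path) and that `transformers` and `torch`
# are installed.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "german-nlp-group/electra-base-german-uncased"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Because ELECTRA is implemented as an extension of the BERT class, the same
# task heads apply; here, a hypothetical 2-label sentiment-style classifier.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

inputs = tokenizer("Das ist ein Beispielsatz.", return_tensors="pt")
logits = model(**inputs).logits
print(logits.shape)  # torch.Size([1, 2])
```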