Add Model Card for electra-base-german-uncased #6496

Merged · 2 commits merged into huggingface:master on Aug 17, 2020

Conversation

@PhilipMay (Contributor) commented on Aug 15, 2020:

This adds the model card for electra-base-german-uncased.

Could you please also have a look at #6495, because something went wrong with the upload.

Thanks,
Philip

@julien-c added the "model card" label (Related to pretrained model cards) on Aug 15, 2020.
@codecov (bot) commented on Aug 15, 2020:

Codecov Report

Merging #6496 into master will increase coverage by 0.06%.
The diff coverage is 0.00%.

@@            Coverage Diff             @@
##           master    #6496      +/-   ##
==========================================
+ Coverage   80.37%   80.44%   +0.06%     
==========================================
  Files         156      156              
  Lines       28058    28058              
==========================================
+ Hits        22552    22571      +19     
+ Misses       5506     5487      -19     
| Impacted Files | Coverage Δ |
| --- | --- |
| src/transformers/configuration_utils.py | 96.59% <ø> (ø) |
| src/transformers/generation_tf_utils.py | 86.71% <ø> (+0.25%) ⬆️ |
| src/transformers/generation_utils.py | 96.94% <ø> (ø) |
| src/transformers/trainer.py | 37.84% <0.00%> (ø) |
| src/transformers/modeling_tf_distilbert.py | 64.47% <0.00%> (-34.36%) ⬇️ |
| src/transformers/modeling_bert.py | 88.42% <0.00%> (+0.16%) ⬆️ |
| src/transformers/modeling_tf_bert.py | 98.38% <0.00%> (+29.31%) ⬆️ |


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Last update 24107c2...a9ce8ff.

---

# German Electra Uncased
<img width="300px" src="https://raw.githubusercontent.com/German-NLP-Group/german-transformer-training/master/model_cards/german-electra-logo.png">
A contributor commented (review comment on the logo image):

Why Palpatine? Any specific reason for this visualization? (asking out of curiosity)

@PhilipMay (author) replied:

Because of "Electra" and the lightning.

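For context only, since the card itself is not shown in full here: a minimal usage sketch of loading the model with the transformers library. The Hub model ID is an assumption derived from the model name in the card; adjust it if the published ID differs. It assumes a recent transformers release.

```python
# Minimal usage sketch (not part of this PR's diff). The Hub model ID below is
# an assumption based on the model name in the card.
from transformers import AutoModel, AutoTokenizer

model_id = "german-nlp-group/electra-base-german-uncased"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a German sentence and run a forward pass to get contextual embeddings.
inputs = tokenizer("Die Sonne scheint heute.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```

For a base-sized ELECTRA discriminator, the hidden size is typically 768, so the printed shape would be (1, sequence_length, 768).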
@JetRunner merged commit 3c72f55 into huggingface:master on Aug 17, 2020.
Zigur pushed a commit to Zigur/transformers that referenced this pull request on Oct 26, 2020:

* Add Model Card for electra-base-german-uncased
* Update README.md

Co-authored-by: Kevin Canwen Xu <canwenxu@126.com>

fabiocapsouza added a commit to fabiocapsouza/transformers that referenced this pull request on Nov 15, 2020.
Labels: model card (Related to pretrained model cards)
Projects: None yet
Participants: 3