* Adds adapter support to DistilBERT models (via mixins in adapter_distilbert.py)
* Adds a flex-head model for DistilBERT (DistilBertModelWithHeads)
* Moves invertible adapters to a separate InvertibleAdaptersMixin to improve modularity
* Adjusts the BERT adapters implementation to allow partial reuse for DistilBERT
Showing 16 changed files with 439 additions and 98 deletions.
@@ -0,0 +1,69 @@
DistilBERT
===========

The DistilBERT model was proposed in the blog post
`Smaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT <https://medium.com/huggingface/distilbert-8cf3380435b5>`__,
and the paper `DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter <https://arxiv.org/abs/1910.01108>`__.
DistilBERT is a small, fast, cheap and light Transformer model trained by distilling BERT base. It has 40% fewer
parameters than `bert-base-uncased` and runs 60% faster while preserving over 95% of BERT's performance as measured on
the GLUE language understanding benchmark.

.. note::
    This class is nearly identical to the PyTorch implementation of DistilBERT in Huggingface Transformers.
    For more information, visit `the corresponding section in their documentation <https://huggingface.co/transformers/model_doc/distilbert.html>`_.

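The classes documented below follow the standard Transformers API. As a quick orientation, here is a minimal usage
sketch (the checkpoint name and example sentence are purely illustrative):

.. code-block:: python

    from transformers import DistilBertModel, DistilBertTokenizer

    # Illustrative checkpoint; any pretrained DistilBERT checkpoint works the same way.
    tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
    model = DistilBertModel.from_pretrained("distilbert-base-uncased")

    # Encode a sample sentence and run a forward pass.
    input_ids = tokenizer.encode("Hello, my dog is cute", return_tensors="pt")
    outputs = model(input_ids)
    last_hidden_state = outputs[0]  # shape: (batch_size, sequence_length, hidden_size)
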
DistilBertConfig
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertConfig
    :members:


DistilBertTokenizer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertTokenizer
    :members:


DistilBertTokenizerFast
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertTokenizerFast
    :members:


DistilBertModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertModel
    :members:


DistilBertModelWithHeads
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertModelWithHeads
    :members:

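``DistilBertModelWithHeads`` is the flex-head model added together with the DistilBERT adapter support: it allows
attaching adapters and prediction heads to a shared DistilBERT encoder. The following is only an assumed usage
sketch based on the adapter mixin API; the method names (``add_adapter``, ``add_classification_head``,
``set_active_adapters``) and the adapter name are illustrative and may differ between library versions:

.. code-block:: python

    from transformers import DistilBertModelWithHeads

    # Illustrative checkpoint and adapter/head name.
    model = DistilBertModelWithHeads.from_pretrained("distilbert-base-uncased")

    # Add a task adapter and a matching classification head (assumed mixin methods).
    model.add_adapter("example-task")
    model.add_classification_head("example-task", num_labels=2)

    # Activate the adapter and its head for subsequent forward passes.
    model.set_active_adapters("example-task")
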
DistilBertForMaskedLM
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertForMaskedLM
    :members:


DistilBertForSequenceClassification
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertForSequenceClassification
    :members:


DistilBertForQuestionAnswering
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.DistilBertForQuestionAnswering
    :members: