
Update transformers requirement from <3.6,>=3.4 to >=3.4,<4.3 #7

Closed

Conversation


@dependabot dependabot bot commented on behalf of github Jan 14, 2021

Updates the requirements on [transformers](https://github.com/huggingface/transformers) to permit the latest version ([release notes](https://github.com/huggingface/transformers/releases), commits: huggingface/transformers@v3.4.0...v4.2.0).
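For context, the effect of the new constraint can be checked with the `packaging` library (this check is illustrative, not part of the PR itself):

```python
from packaging.specifiers import SpecifierSet

# The updated constraint from this PR: allow transformers 3.4.x through 4.2.x
spec = SpecifierSet(">=3.4,<4.3")

print(spec.contains("4.2.0"))  # True: the latest release is now permitted
print(spec.contains("3.6.0"))  # True: previously excluded by the old <3.6 bound
print(spec.contains("4.3.0"))  # False: excluded by the new upper bound
```

The old specifier `>=3.4,<3.6` rejected everything from 3.6 upward; the new one only caps at 4.3.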

Release notes

Sourced from transformers' releases.

v4.2.0: LED from AllenAI, Generation Scores, TensorFlow 2x speedup, faster import


LED from AllenAI (@patrickvonplaten)

Four new models are released as part of the LED implementation: LEDModel, LEDForConditionalGeneration, LEDForSequenceClassification, and LEDForQuestionAnswering, all in PyTorch. The first two also have TensorFlow versions.

LED is the encoder-decoder variant of the Longformer model by AllenAI.

The LED model was proposed in Longformer: The Long-Document Transformer by Iz Beltagy, Matthew E. Peters, Arman Cohan.

Compatible checkpoints can be found on the Hub: https://huggingface.co/models?filter=led
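The new LED classes can be exercised without downloading a Hub checkpoint by building a randomly initialized model from a tiny config (the config values below are illustrative, chosen only to keep the model small — real checkpoints like those on the Hub use much larger settings):

```python
import torch
from transformers import LEDConfig, LEDForConditionalGeneration

# Tiny, randomly initialized LED model — no checkpoint download needed
config = LEDConfig(
    vocab_size=64, d_model=16,
    encoder_layers=1, decoder_layers=1,
    encoder_attention_heads=2, decoder_attention_heads=2,
    encoder_ffn_dim=32, decoder_ffn_dim=32,
    max_encoder_position_embeddings=128,
    max_decoder_position_embeddings=128,
    attention_window=8,  # Longformer-style local attention window
)
model = LEDForConditionalGeneration(config)

input_ids = torch.tensor([[2, 5, 7, 9, 11, 13, 17, 19]])  # encoder input
decoder_input_ids = torch.tensor([[2, 5, 7]])

out = model(input_ids=input_ids, decoder_input_ids=decoder_input_ids)
print(out.logits.shape)  # one logit vector over the vocab per decoder position
```

Swapping the config for `LEDForConditionalGeneration.from_pretrained("allenai/led-base-16384")` loads one of the Hub checkpoints linked above.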


Generation Scores & other outputs (@patrickvonplaten)

The PyTorch generation function can now return:

  • scores - the logits generated at each step
  • attentions - all attention weights at each generation step
  • hidden_states - all hidden states at each generation step

simply by setting return_dict_in_generate=True in the config or passing it as an argument to .generate().
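The new structured output can be tried without downloading weights by generating from a randomly initialized model built from a small config (the tiny GPT-2 config below is illustrative, not from the release notes):

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

torch.manual_seed(0)

# Tiny, randomly initialized GPT-2 — no checkpoint download needed
config = GPT2Config(vocab_size=50, n_positions=32, n_embd=16, n_layer=1, n_head=2)
model = GPT2LMHeadModel(config)

input_ids = torch.tensor([[1, 2, 3]])  # a 3-token "prompt"
out = model.generate(
    input_ids,
    max_length=8,
    return_dict_in_generate=True,  # return a structured output instead of a bare tensor
    output_scores=True,            # include the logits for each generated step
)

print(out.sequences.shape)  # prompt plus generated tokens
print(len(out.scores))      # one score tensor per generation step
```

Passing `output_attentions=True` or `output_hidden_states=True` to `.generate()` likewise populates the corresponding fields of the returned object.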

PR:

  • Add flags to return scores, hidden states and / or attention weights in GenerationMixin #9150 (@SBrandeis)

TensorFlow improvements

TensorFlow BERT-like model improvements (@jplu)

The TensorFlow versions of the BERT-like models have been updated and are now twice as fast as the previous versions.

  • Improve BERT-like models performance with better self attention #9124 (@jplu)

... (truncated)


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot from recreating it. You can achieve the same result by closing it manually


Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added the `dependencies` label (Pull requests that update a dependency file) on Jan 14, 2021

dependabot bot commented on behalf of github Feb 9, 2021

Superseded by #10.

@dependabot dependabot bot closed this Feb 9, 2021
@dependabot dependabot bot deleted the dependabot/pip/transformers-gte-3.4-and-lt-4.3 branch February 9, 2021 13:06