Release: adapter-transformers v2.1.0
calpt committed Jul 8, 2021
1 parent 574d090 commit cf56b39
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion setup.py
@@ -337,7 +337,7 @@ def run(self):

setup(
name="adapter-transformers",
version="2.1.0a0",
version="2.1.0",
author="Jonas Pfeiffer, Andreas Rücklé, Clifton Poth, Hannah Sterz, based on work by Thomas Wolf, Lysandre Debut, Victor Sanh, Julien Chaumond, Sam Shleifer, Patrick von Platen, Sylvain Gugger, Suraj Patil, Stas Bekman, Google AI Language Team Authors, Open AI team Authors, Facebook AI Authors, Carnegie Mellon University Authors",
author_email="pfeiffer@ukp.tu-darmstadt.de",
description="A friendly fork of Huggingface's Transformers, adding Adapters to PyTorch language models",
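For context, the `version` field bumped above is the distribution version that packaging tools report once this release is installed. A minimal sketch of checking it via the standard library, assuming a build of this setup.py is installed:

```python
# Minimal sketch: query the installed distribution's version via the standard
# library (assumes a build of this setup.py, i.e. release 2.1.0, is installed).
from importlib.metadata import version

print(version("adapter-transformers"))  # expected to print "2.1.0" for this release
```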
2 changes: 1 addition & 1 deletion src/transformers/__init__.py
@@ -22,7 +22,7 @@
# to defer the actual importing for when the objects are requested. This way `import transformers` provides the names
# in the namespace without actually importing anything (and especially none of the backends).

__version__ = "2.1.0a0"
__version__ = "2.1.0"
__hf_version__ = "4.8.2"

# Work around to update TensorFlow's absl.logging threshold which alters the
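Since the fork keeps the `transformers` import name (the file lives at src/transformers/__init__.py), the module-level attributes changed above can be read directly at runtime. A short sketch, assuming the 2.1.0 release is installed:

```python
# Short sketch: read the version attributes set in src/transformers/__init__.py
# (assumes the adapter-transformers 2.1.0 release is installed).
import transformers

print(transformers.__version__)     # adapter-transformers release, expected "2.1.0"
print(transformers.__hf_version__)  # bundled HuggingFace Transformers version, expected "4.8.2"
```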
