Add methods to PreTrainedModel to use PyTorch's BetterTransformer #21259
Conversation
The documentation is not available anymore as the PR was closed or merged.
as a side note, since in the previous
Yes we should probably force the next optimum version.
Thanks for adding those! I'd also make sure to document the methods and add something in the optimization guides we have :-)
src/transformers/modeling_utils.py (outdated):

```python
if not is_optimum_available():
    raise ImportError("The package `optimum` is required to use BetterTransformer.")

from optimum.bettertransformer import BetterTransformer
```
I think a version check on optimum with a clear error message would be good? Also can the transform be applied twice? If not there should be a check and a clear error message as well.
> Also can the transform be applied twice? If not there should be a check and a clear error message as well.

I think there's currently no check for this @younesbelkada. I will add it on the Optimum side, so as to keep the transformers side as lightweight as possible.
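For illustration, a minimal sketch of what such a double-conversion guard could look like (the `use_bettertransformer` flag and the method body are hypothetical; as noted above, the real check was added on the Optimum side):

```python
from optimum.bettertransformer import BetterTransformer


def to_bettertransformer(self):
    # Hypothetical guard: refuse to apply the transform a second time.
    # As discussed above, the actual check lives in Optimum, not here.
    if getattr(self, "use_bettertransformer", False):
        raise ValueError(
            "This model has already been converted with BetterTransformer; "
            "applying the transform twice is not supported."
        )
    model = BetterTransformer.transform(self)
    model.use_bettertransformer = True
    return model
```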
Force-pushed from 66366ea to c6dd4b9.
Should be ready @sgugger, the documentation has been extended in https://moon-ci-docs.huggingface.co/docs/transformers/pr_21259/en/perf_infer_gpu_one. Let me know if I should add a test - in which case optimum should be added in the setup.py, I guess.
@fxmarty there should be no need to add `optimum` to the setup.py.
I very much agree that we should add tests, especially to test accelerate compatibility. Happy to help you on this - let me know if you need help!
Thanks, will do!
Isn't this already tested on the Optimum side?
Yes, but those tests are run on GPU, and are therefore not run on any of the runners on this repository.
There are daily tests on GPU in Optimum, for example https://github.com/huggingface/optimum/blob/main/.github/workflows/test_onnxruntime_train.yml and https://github.com/huggingface/optimum/blob/main/.github/workflows/test_onnxruntime_gpu.yml. In my opinion, thorough tests should be added in Optimum, not Transformers. The test I was thinking of in Transformers was only an integration one, to check that there's no error.
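A minimal sketch of what such an integration test could look like (the tiny checkpoint name and the test layout are assumptions, not the test that was eventually merged):

```python
import unittest

import torch

from transformers import AutoModel


class BetterTransformerIntegrationTest(unittest.TestCase):
    def test_conversion_smoke(self):
        # Smoke test only: convert a tiny model and check that a forward
        # pass raises no error. Thorough numerical tests belong in Optimum.
        model = AutoModel.from_pretrained("hf-internal-testing/tiny-random-bert")
        model = model.to_bettertransformer()
        input_ids = torch.randint(0, model.config.vocab_size, (1, 8))
        with torch.no_grad():
            model(input_ids)
```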
There is an issue with
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
not stale
If you want this PR included in the next release, you should finish the work and have it merged sooner rather than later :-)
Thanks for the heads-up!
There is substantial work left in Optimum before this should be merged. Marking as draft for now!
OK, so this won't be in the next release of Transformers (probably this week in preparation for PyTorch 2.0).
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Hey @fxmarty and @younesbelkada, are there standing PRs in `optimum` that this is waiting on?
Hey @LysandreJik @sgugger
@sgugger @LysandreJik this is now ready for review!
Thanks! Just have one comment on the `is_optimum_available` function, but the rest looks fine!
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
Thanks!
Left a few nits.
LGTM!
```python
if not is_optimum_available():
    raise ImportError("The package `optimum` is required to use Better Transformer.")

from optimum.version import __version__ as optimum_version

if version.parse(optimum_version) < version.parse("1.7.0"):
    raise ImportError(
        f"Please install optimum>=1.7.0 to use Better Transformer. The version {optimum_version} was found."
    )
```
Maybe factor all of this into an `is_bettertransformer_available`?
Hmm I would say this is too specific, maybe let's keep it as it is
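For reference, the suggested helper might have looked roughly like this (hypothetical, since the suggestion was not adopted; `is_optimum_available` is the existing check from the snippet above):

```python
from packaging import version


def is_bettertransformer_available() -> bool:
    # Combine the optimum availability and minimum-version checks in one place.
    if not is_optimum_available():
        return False
    from optimum.version import __version__ as optimum_version

    return version.parse(optimum_version) >= version.parse("1.7.0")
```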
tests/bettertransformer/__init__.py (outdated, new empty file)
Is it wanted?
Co-authored-by: Michael Benayoun <mickbenayoun@gmail.com>
Add methods to PreTrainedModel to use PyTorch's BetterTransformer (huggingface#21259)

* fix mess
* better documentation
* typo
* fix doc
* update
* add test
* fix test
* more tests
* Update src/transformers/modeling_utils.py (Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>)
* move to utils
* Apply suggestions from code review (Co-authored-by: Michael Benayoun <mickbenayoun@gmail.com>)
* nit

Co-authored-by: younesbelkada <younesbelkada@gmail.com>
Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
Co-authored-by: Michael Benayoun <mickbenayoun@gmail.com>
As per title.
Should be merged only on the next Optimum release that will include huggingface/optimum#676.
Before submitting
Tests are still to be done.
Who can review?
@younesbelkada @sgugger
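For context, a minimal usage sketch of the two methods this PR adds, assuming `optimum>=1.7.0` is installed (the checkpoint name is only an example):

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("gpt2")

# Convert the model to use PyTorch's BetterTransformer fastpath kernels.
model = model.to_bettertransformer()

# ... run inference ...

# Convert back to the canonical transformers implementation, e.g. before saving.
model = model.reverse_bettertransformer()
```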