Adds TRANSFORMERS_TEST_BACKEND
#25655
Conversation
Allows specifying an arbitrary additional import following the first `import torch`. This is useful for some custom backends that require additional imports to trigger backend registration with upstream torch. See pytorch/benchmark#1805 for a similar change in `torchbench`.
cc @ydshieh
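For context, a minimal sketch of how such a variable might be consumed (the exact names and error wording are assumptions for illustration, not the merged implementation):

```python
import importlib
import os

import torch  # the extra import must come after the first `import torch`

# Hypothetical sketch: import the module named by TRANSFORMERS_TEST_BACKEND so
# that the custom backend can register its device type with upstream torch.
backend = os.environ.get("TRANSFORMERS_TEST_BACKEND")
if backend is not None:
    try:
        importlib.import_module(backend)
    except ModuleNotFoundError as e:
        raise RuntimeError(
            f"Failed to import `TRANSFORMERS_TEST_BACKEND` {backend!r}. "
            "Make sure the backend package is installed in this environment."
        ) from e
```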
Hi! Thank you for the PR.
I would like to see a real example of usage, please.
I am also wondering whether a single backend is always enough. What happens if we need multiple ones?
Co-authored-by: Yih-Dar <2521628+ydshieh@users.noreply.github.com>
I have now provided example usage with the `torch_npu` backend in the documentation.
I feel just having one is fine for now; I am struggling to imagine a use case for multiple, unless a single worker somehow had CPU, GPU, and other backends all on one physical machine.
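For what it's worth, if multiple backends ever did become necessary, one hypothetical extension (not part of this PR) could accept a comma-separated list:

```python
import importlib
import os

# Hypothetical: TRANSFORMERS_TEST_BACKEND="torch_npu,some_other_backend" would
# import each module in turn so every backend can register itself with torch.
backends = os.environ.get("TRANSFORMERS_TEST_BACKEND", "")
for name in (b.strip() for b in backends.split(",") if b.strip()):
    importlib.import_module(name)
```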
```bash
TRANSFORMERS_TEST_DEVICE="cpu" pytest tests/test_logging.py
```

This variable is useful for testing custom or less common PyTorch backends such as `mps`. It can also be used to achieve the same effect as `CUDA_VISIBLE_DEVICES` by targeting specific GPUs or testing in CPU-only mode.

Certain devices will require an additional import after importing `torch` for the first time. This can be specified using the environment variable `TRANSFORMERS_TEST_BACKEND`:

```bash
TRANSFORMERS_TEST_BACKEND="torch_npu" pytest tests/test_logging.py
```
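To illustrate why the extra import matters, here is a rough sketch assuming an Ascend machine with the `torch_npu` package installed (the tensor code is illustrative, not taken from the docs):

```python
import torch
import torch_npu  # noqa: F401  - importing registers the "npu" device type with torch

# After the import above, the custom device can be used like any built-in one.
x = torch.ones(2, 2, device="npu:0")
print(x.device)  # npu:0
```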
`tests/test_logging.py` doesn't exist in `transformers`.
It would be nice if the example ran against a test file (or even a single test) that could use the npu device with the necessary backend. But I guess such a test doesn't exist yet and will have to wait for your work on #25654.
Let's keep this in mind (to update later).
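For reference, such a test might eventually look something like this sketch (file name, class, and assertions are all hypothetical until #25654 lands):

```python
import os
import unittest

import torch


class BackendSmokeTest(unittest.TestCase):
    # Hypothetical test that respects TRANSFORMERS_TEST_DEVICE, so it could run
    # on "npu" once the backend import has registered the device with torch.
    def test_ones_on_test_device(self):
        device = os.environ.get("TRANSFORMERS_TEST_DEVICE", "cpu")
        x = torch.ones(2, 2, device=device)
        self.assertTrue(str(x.device).startswith(device.split(":")[0]))
```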
> `tests/test_logging.py` doesn't exist in `transformers`.

Interesting, it is referenced numerous times in the `docs/source/en/testing.md` file!
Well, probably we moved the test files. Will check, but thanks for pointing this out ❤️
Thanks! Looking forward to seeing a real example running!
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.
* Adds `TRANSFORMERS_TEST_BACKEND`

  Allows specifying an arbitrary additional import following the first `import torch`. This is useful for some custom backends that require additional imports to trigger backend registration with upstream torch. See pytorch/benchmark#1805 for a similar change in `torchbench`.

* Update src/transformers/testing_utils.py

  Co-authored-by: Yih-Dar <2521628+ydshieh@users.noreply.github.com>

* Adds real backend example to documentation

---------

Co-authored-by: Yih-Dar <2521628+ydshieh@users.noreply.github.com>
What does this PR do?
Allows specifying an arbitrary additional import following the first `import torch`. This is useful for some custom backends that require additional imports to trigger backend registration with upstream torch. See pytorch/benchmark#1805 for a similar change in `torchbench`.

If the specified backend does not exist, we throw a helpful error. I have updated the docs to include this new variable.
Before submitting
- [ ] Did you read the contributor guideline, Pull Request section?
- [ ] Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
@sgugger would you mind taking a look at this? It relates to my previous PR.