PyTorch XLA/PJRT TPU support for bitsandbytes

Feature request
Add PyTorch XLA/PJRT (TPU) support to bitsandbytes.

Motivation
This would allow faster and more memory-efficient training of models on TPUs.

Your contribution
Happy to provide TPUs.
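For context, a minimal sketch of what the requested integration might look like from the user's side, assuming the existing bnb.optim.Adam8bit API and torch_xla's PJRT runtime (with PJRT_DEVICE=TPU set in the environment). This does not work today, since bitsandbytes' kernels are CUDA-only; enabling something like it is the point of this request:

```python
# Hypothetical usage if bitsandbytes gained XLA/PJRT TPU support.
# Assumes PJRT_DEVICE=TPU is set so torch_xla selects the TPU runtime.
import torch
import torch.nn as nn
import torch_xla.core.xla_model as xm
import bitsandbytes as bnb

device = xm.xla_device()                 # PJRT TPU device
model = nn.Linear(1024, 1024).to(device)

# bnb.optim.Adam8bit is the existing 8-bit optimizer API; it currently
# requires CUDA, so this line is what the feature would need to enable.
optimizer = bnb.optim.Adam8bit(model.parameters(), lr=1e-4)

x = torch.randn(8, 1024, device=device)
loss = model(x).sum()
loss.backward()
optimizer.step()
xm.mark_step()                           # flush the lazily-built XLA graph
```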
+1 please!
+1 as well please!