Revert workaround for TF safetensors loading #30128
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
After testing on the PR branch, this looked good, so I merged the changes to
All the actual tests are passing, but we have some rate limit errors on other tests. Raised the issue internally!
Thanks for handling!
Merging as the failing tests are unrelated.
* See if we can get tests to pass with the fixed weights
* See if we can get tests to pass with the fixed weights
* Replace the revisions now that we don't need them anymore
In #30118 @amyeroberts skipped safetensors loading on TF models. I believe I've worked out the cause: the safetensors weights in that repo were a bit malformed. I tried fixing them by loading and saving them again with a torch `AutoModelForCausalLM`, so hopefully we can use the fixed weights and revert the changes.