fix pytorch division warning by using suggested torch.div rounding_mode #14577
Conversation
Note that this argument of
Agree! Let's put it in
I'd be happy to put forward something to that effect in the next few days.
Hi @mgoldey, did you have time to work on this? It looks like it's going to be needed in the PR mentioned above as well, so we can take over the implementation of the custom function Patrick mentioned if you don't have time :-)
Feel free to jump in if you have time. I'll let you know if I wind up being free, but I had to switch directions for time-sensitive projects.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Moot - resolved by #15180
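For reference, the shared helper floated in the discussion above might look roughly like this. The name `floor_div` and its signature are hypothetical; this is a sketch of the idea, not the implementation that landed in #15180:

```python
import torch

def floor_div(a: torch.Tensor, b) -> torch.Tensor:
    # Integer floor division that avoids the __floordiv__ deprecation
    # warning emitted by recent PyTorch releases for the // operator.
    return torch.div(a, b, rounding_mode="floor")
```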
What does this PR do?
This PR removes a warning that is repeatedly thrown with the latest releases of transformers and pytorch.
For example, recent PyTorch releases emit a deprecation warning along these lines whenever `//` is used on a tensor:
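```
UserWarning: __floordiv__ is deprecated, and its behavior will change in a future version of pytorch. It currently rounds toward 0 (like the 'trunc' function NOT 'floor'). This results in incorrect rounding for negative values. To keep the current behavior, use torch.div(a, b, rounding_mode='trunc'), or for actual floor division, use torch.div(a, b, rounding_mode='floor').
```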
The fact that this warning fires in multiple places suggests there's an opportunity for a shared helper function. Instead of a larger refactor, though, this PR touches only the few specific places that use `//`, to avoid causing other side effects.
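A minimal sketch of the substitution (the variable names here are illustrative, not an actual call site from this PR):

```python
import torch

seq_len = torch.tensor(10)

# Before: the // operator on tensors triggers the __floordiv__
# deprecation warning on recent PyTorch releases.
num_blocks = seq_len // 2

# After: an explicit rounding mode gives the same result without the warning.
num_blocks = torch.div(seq_len, 2, rounding_mode="floor")
```

For non-negative operands the two forms agree; for negative values, `rounding_mode="floor"` matches Python's `//` semantics, while `rounding_mode="trunc"` reproduces the old round-toward-zero tensor behavior.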
Before submitting

No
Docs should be unaffected.
The test coverage should be unchanged.