torch.compile graph break introduced due to new loss function API #34615
Ensures no additional graph break is introduced when torch.compile'd. Fixes huggingface#34615. Signed-off-by: ChanderG <mail@chandergovind.org>
Thanks, do you have a small reproducer? We would need to add this to our tests!
Very straightforward use of compile actually, nothing out of the ordinary:

Running it like this:

results in no logs in
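The code blocks in this comment were lost when the page was extracted, so the exact reproducer is unknown. For orientation only, a typical way to surface Dynamo graph breaks from a compiled script looks like the sketch below; the script name `repro.py` is illustrative, while `TORCH_LOGS="graph_breaks"` is a documented PyTorch logging channel.

```shell
# Run the compiled reproducer with Dynamo's graph-break logging enabled;
# each break is reported along with the source line that caused it.
# An empty log under this channel means no graph breaks occurred.
TORCH_LOGS="graph_breaks" python repro.py
```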
It should have been fixed by #34511
@ArthurZucker Not really? As I mention in the first comment, the graph break is introduced in that very PR, the fix for #34402. Before that PR Dynamo was broken; that PR fixed the error, but graph breaks were introduced in the process.
Yep, sorry; working on #34616 with the fixes
System Info

transformers version: 4.47.0.dev0

Who can help?

@ArthurZucker

Information

Tasks

examples folder (such as GLUE/SQuAD, ...)

Reproduction
PR #34191 introduced a new loss API. After that PR, Dynamo was broken; this was identified and fixed in #34402. With that fix (on master), Dynamo now runs without errors.
However, in the process a new graph break was introduced by this line:

This is due to the new regex check.

Since the dispatch function actually checks for an attribute on the config, the fix is quite simple: set the loss_type at model init time itself.
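The two dispatch strategies can be illustrated without transformers at all. Below is a self-contained sketch (class names, the mapping pattern, and method names are invented for illustration, not the actual transformers internals) contrasting resolving the loss type with a regex on every forward call, which Dynamo cannot trace through and so breaks the graph, with resolving it once at init and caching it on the config so forward only does a plain attribute read:

```python
import re
from types import SimpleNamespace

# Illustrative subset of task suffixes the dispatcher recognizes.
LOSS_SUFFIX_PATTERN = r"ForCausalLM|ForSequenceClassification"

class RegexAtForward:
    """Resolves the loss type inside the forward path. Under
    torch.compile, re.findall is an untraceable call, so running it
    on every forward forces a graph break."""

    def loss_type(self):
        matches = re.findall(LOSS_SUFFIX_PATTERN, self.__class__.__name__)
        return matches[0] if matches else None

class ResolveAtInit:
    """Resolves the loss type once in __init__ and caches it on the
    config; the forward path then does a plain attribute read, which
    Dynamo can trace without breaking the graph."""

    def __init__(self, config):
        self.config = config
        matches = re.findall(LOSS_SUFFIX_PATTERN, self.__class__.__name__)
        self.config.loss_type = matches[0] if matches else None

    def loss_type(self):
        return self.config.loss_type

# Hypothetical model class whose name carries the task suffix.
class DemoForCausalLM(ResolveAtInit):
    pass

model = DemoForCausalLM(SimpleNamespace())
# The regex ran exactly once, at construction time.
assert model.config.loss_type == "ForCausalLM"
assert model.loss_type() == "ForCausalLM"
```

Both classes compute the same answer; the difference is only when the regex runs, which is what determines whether the compiled graph stays whole.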
Expected behavior
No additional graph breaks.