Fix metrics not being torch-scriptable due to new is_differentiable property #172
Conversation
Hello @maximsch2! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2021-04-15 02:51:02 UTC
Codecov Report
@@ Coverage Diff @@
## master #172 +/- ##
==========================================
+ Coverage 96.14% 96.16% +0.01%
==========================================
Files 180 90 -90
Lines 5580 2791 -2789
==========================================
- Hits 5365 2684 -2681
+ Misses 215 107 -108
Flags with carried forward coverage won't be shown. Continue to review the full report at Codecov.
heh, can we also add a test for this to prevent it in the future... :]
raise NotImplementedError
# There is a bug in PyTorch that leads to properties being executed during scripting
# To make the metric scriptable, we add the property to the ignore list and return None here
return None
that was my concern earlier... #154 (comment)
It should hopefully be OK to throw once PyTorch fixes the issue (it's not respecting the ignore on properties!), but it will be a while until the fix reaches a released version, etc.
6f98744 to 124116e
Actually, there is a fix on the PyTorch side already, but it's not part of the released version yet, and given the desire for backwards compatibility I don't think we'll be able to use it for a while: pytorch/pytorch#52367. We need to use
What does this PR do?
Metrics were made scriptable in Lightning-AI/pytorch-lightning#4428, but this was broken by the addition of the
is_differentiable
property, which triggers a TorchScript issue (reported to PyTorch). For now, I'm suggesting we just return None there instead of raising an exception.
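To illustrate the workaround discussed above, here is a minimal sketch of the pattern: a scriptable module whose property is listed in `__jit_unused_properties__` (the class attribute PyTorch's scripting machinery checks for properties to skip) and which returns `None` instead of raising, so that if the buggy compiler executes the getter during scripting anyway, nothing blows up. The `MyMetric` class and its `total` state are hypothetical stand-ins for a torchmetrics-style metric, not code from this PR.

```python
import torch


class MyMetric(torch.nn.Module):
    # Properties named here are skipped by torch.jit.script; needed because
    # TorchScript cannot compile Python properties on modules.
    __jit_unused_properties__ = ["is_differentiable"]

    def __init__(self) -> None:
        super().__init__()
        self.total = torch.tensor(0.0)  # hypothetical metric state

    @property
    def is_differentiable(self):
        # A `raise NotImplementedError` here can break scripting, since the
        # PyTorch bug discussed above executes property getters during
        # scripting despite the ignore list. Returning None is the safe
        # placeholder until the upstream fix is released.
        return None

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.total + x.sum()


# Scripting succeeds because the property is ignored (and harmless if run).
scripted = torch.jit.script(MyMetric())
```

A test like the one requested above would simply call `torch.jit.script` on each metric class, so a future property that raises again would fail CI immediately.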