
Fix metrics not being torch-scriptable due to new is_differentiable property #172

Merged
merged 3 commits into Lightning-AI:master from fix_differentiable on Apr 15, 2021

Conversation

maximsch2 (Contributor)

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

What does this PR do?

Metrics were made scriptable in Lightning-AI/pytorch-lightning#4428, but this was broken by the addition of the is_differentiable property, which triggers a TorchScript issue (reported to PyTorch). For now, I'm suggesting we simply return None there instead of raising an exception.
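The workaround can be sketched with a toy module. This is a hypothetical stand-in, not torchmetrics' actual Metric class: on affected PyTorch versions, scripting executes properties eagerly, so a property that raises breaks torch.jit.script, while one that returns None does not.

```python
import torch
from torch import nn


class ToyMetric(nn.Module):
    """Hypothetical minimal metric; illustrates the workaround, not the real API."""

    def __init__(self) -> None:
        super().__init__()
        self.register_buffer("total", torch.tensor(0.0))

    @property
    def is_differentiable(self):
        # Raising NotImplementedError here broke torch.jit.script on affected
        # PyTorch versions, because scripting executed the property eagerly.
        # Returning None instead keeps the module scriptable.
        return None

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        self.total += x.sum()
        return self.total


scripted = torch.jit.script(ToyMetric())
```

Eager and scripted instances behave the same; the only change is that querying is_differentiable now yields None rather than an error.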

pep8speaks commented Apr 14, 2021

Hello @maximsch2! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2021-04-15 02:51:02 UTC

codecov bot commented Apr 14, 2021

Codecov Report

Merging #172 (16a642e) into master (da00174) will increase coverage by 0.01%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##           master     #172      +/-   ##
==========================================
+ Coverage   96.14%   96.16%   +0.01%     
==========================================
  Files         180       90      -90     
  Lines        5580     2791    -2789     
==========================================
- Hits         5365     2684    -2681     
+ Misses        215      107     -108     
Flag         Coverage Δ
Linux        79.79% <50.00%> (+<0.01%) ⬆️
Windows      79.79% <50.00%> (+<0.01%) ⬆️
cpu          96.16% <100.00%> (+0.03%) ⬆️
gpu          ?
macOS        96.16% <100.00%> (+0.03%) ⬆️
pytest       96.16% <100.00%> (+0.01%) ⬆️
python3.6    96.15% <100.00%> (+0.03%) ⬆️
python3.8    96.16% <100.00%> (+0.03%) ⬆️
python3.9    96.05% <100.00%> (+0.03%) ⬆️
torch1.3.1   95.07% <100.00%> (+0.03%) ⬆️
torch1.4.0   95.19% <100.00%> (+0.03%) ⬆️
torch1.8.1   96.05% <100.00%> (+0.03%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files            Coverage Δ
torchmetrics/metric.py    95.14% <100.00%> (+0.39%) ⬆️
... and 90 more files with indirect coverage changes

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@maximsch2 maximsch2 changed the title Fix differentiable Fix metrics not being torch-scriptable due to new is_differentiable property Apr 14, 2021
Borda (Member) commented Apr 14, 2021

> but this was broken with addition of is_differentiable property which is triggering TorchScript issue

heh, can we also add a test for this to prevent it in the future... :]

-    raise NotImplementedError
+    # There is a bug in PyTorch that leads to properties being executed during scripting.
+    # To make the metric scriptable, we add the property to the ignore list and return None here.
+    return None
Borda (Member): that was my concern earlier... #154 (comment)

maximsch2 (Contributor, Author): It should hopefully be OK to throw once PyTorch fixes the issue (it's not respecting the ignore list on properties!), but it will be a while before that fix reaches a released version.

@Borda force-pushed the fix_differentiable branch from 6f98744 to 124116e on April 14, 2021 21:39
maximsch2 (Contributor, Author)

Actually, there is already a fix on the PyTorch side, but it's not part of a released version yet, and given the desire for backwards compatibility I don't think we'll be able to rely on it for a while: pytorch/pytorch#52367

Essentially, we need to use __jit_ignored_attributes__ there, which will be supported in the new PyTorch. I've updated the PR.
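On PyTorch versions that include pytorch/pytorch#52367, the mechanism looks roughly like the following sketch. The class and its contents are illustrative assumptions, not torchmetrics code: attributes listed in __jit_ignored_attributes__ are skipped by the TorchScript compiler, so the property can safely raise again.

```python
import torch
from torch import nn


class IgnoredPropMetric(nn.Module):
    # Hypothetical sketch: attributes listed here are ignored during scripting
    # (supported once pytorch/pytorch#52367 is in the installed PyTorch),
    # so the raising property no longer breaks torch.jit.script.
    __jit_ignored_attributes__ = ["is_differentiable"]

    @property
    def is_differentiable(self) -> bool:
        raise NotImplementedError

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * 2


scripted = torch.jit.script(IgnoredPropMetric())
```

In eager mode the property still raises as before; only the TorchScript compiler ignores it.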

SkafteNicki merged commit 82eab8f into Lightning-AI:master on Apr 15, 2021
maximsch2 deleted the fix_differentiable branch on April 15, 2021 16:35