Fix _sync_dist for compositional metrics #454

Merged: 8 commits merged into master from fix_compositional_sync_dist on Aug 17, 2021

Conversation

SkafteNicki
Member

@SkafteNicki SkafteNicki commented Aug 17, 2021

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

What does this PR do?

Fixes #438
self._sync_dist in compositional metrics has a different signature than in the base class. Compare this:
https://github.com/PyTorchLightning/metrics/blob/1171a1f12507df5c8206475f3692b31ddab8d934/torchmetrics/metric.py#L216
to this:
https://github.com/PyTorchLightning/metrics/blob/1171a1f12507df5c8206475f3692b31ddab8d934/torchmetrics/metric.py#L666
which leads to the error:
TypeError: _sync_dist() got an unexpected keyword argument 'process_group'

This PR fixes it and adds a test.
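To make the mismatch concrete, here is a minimal sketch. The class and method names mirror torchmetrics, but the simplified bodies and the `sync()` caller are illustrative assumptions, not the library's actual code:

```python
# Illustrative sketch of the signature mismatch; the real torchmetrics
# signatures and bodies are simplified here.
from typing import Any, Callable, Optional


class Metric:
    def _sync_dist(
        self,
        dist_sync_fn: Optional[Callable] = None,
        process_group: Optional[Any] = None,
    ) -> None:
        ...  # base class reduces metric states across processes

    def sync(self, process_group: Optional[Any] = None) -> None:
        # the sync machinery forwards process_group to _sync_dist
        self._sync_dist(process_group=process_group)


class CompositionalMetric(Metric):
    # Before the fix the override did not accept the keyword, e.g.
    #     def _sync_dist(self, dist_sync_fn=None) -> None: ...
    # so sync() raised:
    #     TypeError: _sync_dist() got an unexpected keyword argument 'process_group'

    # After the fix: accept the same keyword arguments as the base class.
    def _sync_dist(
        self,
        dist_sync_fn: Optional[Callable] = None,
        process_group: Optional[Any] = None,
    ) -> None:
        # compositional metrics sync via their constituent metrics,
        # so this override is intentionally a no-op
        pass
```

With the override taking the same keyword arguments as the base class, the forwarded process_group no longer triggers the TypeError.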

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@SkafteNicki SkafteNicki added the bug / fix Something isn't working label Aug 17, 2021
@SkafteNicki SkafteNicki added this to the v0.6 milestone Aug 17, 2021
@codecov

codecov bot commented Aug 17, 2021

Codecov Report

Merging #454 (35ac1df) into master (a2712eb) will increase coverage by 20.96%.
The diff coverage is 100.00%.

Impacted file tree graph

@@             Coverage Diff             @@
##           master     #454       +/-   ##
===========================================
+ Coverage   74.94%   95.90%   +20.96%     
===========================================
  Files         129      129               
  Lines        4227     4227               
===========================================
+ Hits         3168     4054      +886     
+ Misses       1059      173      -886     
Flag     Coverage Δ
Linux    74.94% <100.00%> (ø)
Windows  74.94% <100.00%> (ø)
cpu      74.94% <100.00%> (ø)
gpu      95.90% <100.00%> (?)
macOS    74.94% <100.00%> (ø)
pytest   95.90% <100.00%> (+20.96%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files                                          Coverage Δ
torchmetrics/metric.py                                  95.78% <100.00%> (+38.85%) ⬆️
torchmetrics/regression/r2.py                           94.28% <0.00%> (+2.85%) ⬆️
torchmetrics/regression/explained_variance.py           97.05% <0.00%> (+2.94%) ⬆️
torchmetrics/functional/regression/pearson.py           100.00% <0.00%> (+3.12%) ⬆️
torchmetrics/classification/average_precision.py        96.87% <0.00%> (+3.12%) ⬆️
...chmetrics/classification/precision_recall_curve.py   96.87% <0.00%> (+3.12%) ⬆️
torchmetrics/classification/roc.py                      96.77% <0.00%> (+3.22%) ⬆️
torchmetrics/utilities/enums.py                         100.00% <0.00%> (+3.44%) ⬆️
torchmetrics/regression/cosine_similarity.py            96.42% <0.00%> (+3.57%) ⬆️
torchmetrics/regression/spearman.py                     100.00% <0.00%> (+3.84%) ⬆️
... and 78 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update a2712eb...35ac1df.

@Borda Borda enabled auto-merge (squash) August 17, 2021 10:21
@Borda Borda merged commit 00f0256 into master Aug 17, 2021
@Borda Borda deleted the fix_compositional_sync_dist branch August 17, 2021 10:55
@Borda Borda modified the milestones: v0.6, v0.5 Aug 18, 2021