Compositional metrics #5464

Merged: 36 commits merged into release/1.2-dev from compositional_metric on Jan 26, 2021

Conversation

@justusschock (Member) commented on Jan 11, 2021

What does this PR do?

Fixes #5392

Implements composition of metrics with a simple operator interface, e.g. metric_a + metric_b.
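
For illustration, a minimal usage sketch of the interface this PR adds. Accuracy is an existing class in pytorch_lightning.metrics; that the composed object supports the usual update/compute workflow is an assumption based on this description, not taken from the diff:

import torch
from pytorch_lightning.metrics import Accuracy

# Two independent metric instances; the composition below is itself a
# metric-like object that is evaluated when computed.
metric_a = Accuracy()
metric_b = Accuracy()
combined = metric_a + metric_b  # operator interface added by this PR

preds = torch.tensor([0, 1, 1, 0])
target = torch.tensor([0, 1, 0, 0])
metric_a.update(preds, target)
metric_b.update(preds, target)

# Expected to equal metric_a.compute() + metric_b.compute()
print(combined.compute())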

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following bullet list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified
  • Check that target branch and milestone match!

Did you have fun?

Make sure you had fun coding 🙃

@justusschock justusschock added the feature Is an improvement or enhancement label Jan 11, 2021
@justusschock justusschock added this to the 1.2 milestone Jan 11, 2021
@justusschock justusschock self-assigned this Jan 11, 2021
@pep8speaks commented on Jan 11, 2021

Hello @justusschock! Thanks for updating this PR.

Line 89:13: W503 line break before binary operator

Line 319:13: W503 line break before binary operator

Comment last updated at 2021-01-26 14:35:27 UTC

@teddykoker (Contributor) commented:

Very cool! Is there a reason that the CompositionalMetric import is needed for each operation?

Also, is there a way we could register all of these methods dynamically? Something like:

operations = {
    "__add__": torch.add,
    "__mul__": torch.mul,
    ...
}

def register_operations():
    for attr, fn in operations.items():
        # bind fn per iteration (fn=fn); a plain closure would capture only the last operation
        setattr(Metric, attr, lambda self, other, fn=fn: CompositionalMetric(fn, self, other))

@justusschock (Member, Author) commented:

@teddykoker the registering should be possible; we just have to split it into two groups: the operations that take two arguments and the ones that take only one. I also think we should not use lambdas here, since that could cause pickling issues with some DP/DDP backends.

Regarding the import: it has to be local, since CompositionalMetric depends on Metric and therefore cannot be imported before Metric itself is defined. That is why I import it locally, so the import is only triggered when the function is called. The import statement is written multiple times, but it is effectively executed only once, since Python caches imports.
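
For context, a rough sketch of the pattern described above; the import path and the operator body are illustrative assumptions, not copied from the diff:

import torch

class Metric:

    def __add__(self, other):
        # Imported inside the method: CompositionalMetric depends on Metric,
        # so a module-level import here would be circular. Python caches
        # modules in sys.modules, so after the first call this line is just
        # a cached lookup.
        from pytorch_lightning.metrics.compositional import CompositionalMetric  # path is an assumption
        return CompositionalMetric(torch.add, self, other)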

@SkafteNicki (Member) commented:

Since some of the operations also need the inputs in the opposite order, e.g. __radd__ uses CompositionalMetric(torch.add, other, self), IMO it would probably be better to just leave it as it is (hopefully this will not need to change in the future).
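
A short sketch of the asymmetry pointed out above; the minimal CompositionalMetric stand-in is for illustration only:

import torch

class CompositionalMetric:  # minimal stand-in, just stores the operation and its operands
    def __init__(self, operator, metric_a, metric_b):
        self.operator, self.metric_a, self.metric_b = operator, metric_a, metric_b

class Metric:

    def __add__(self, other):
        # metric + other
        return CompositionalMetric(torch.add, self, other)

    def __radd__(self, other):
        # other + metric: the operands are passed in the opposite order, so a
        # single {dunder: torch op} mapping cannot generate both variants
        # (for non-commutative operations such as torch.sub the order changes the result).
        return CompositionalMetric(torch.add, other, self)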

@justusschock (Member, Author) commented:

@SkafteNicki I have one more question for you. I just reused your kwargs filter, but I noticed that if the user does not explicitly define argument names and only uses *args, **kwargs, the filter does not work, right? Should we forward all the kwargs in that case?

@SkafteNicki (Member) commented:

You are right that it does not work, since the signature keys would then simply be args and kwargs:

import inspect

def f(*args, **kwargs):
    return 1

signature = inspect.signature(f)
print(signature.parameters.keys())  # odict_keys(['args', 'kwargs'])

I agree that in this case we should just forward everything.
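
A hedged sketch of the kind of check this implies; the helper name and its exact placement in the PR are assumptions:

import inspect

def filter_kwargs(fn, **kwargs):
    params = inspect.signature(fn).parameters
    # If the callable accepts **kwargs itself, filtering by parameter name
    # would drop everything, so forward all keyword arguments unchanged.
    if any(p.kind == inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return kwargs
    # Otherwise keep only the keyword arguments the callable can accept.
    return {k: v for k, v in kwargs.items() if k in params}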

@SkafteNicki SkafteNicki linked an issue Jan 12, 2021 that may be closed by this pull request
@Borda Borda enabled auto-merge (squash) January 24, 2021 09:12
@mergify mergify bot removed the has conflicts label Jan 26, 2021
@Borda Borda merged commit 8c55a08 into release/1.2-dev Jan 26, 2021
@Borda Borda deleted the compositional_metric branch January 26, 2021 16:56
@codecov (bot) commented on Jan 26, 2021

Codecov Report

Merging #5464 (4a5c6be) into release/1.2-dev (4e7e1df) will decrease coverage by 0%.
The diff coverage is 94%.

@@               Coverage Diff                @@
##           release/1.2-dev   #5464    +/-   ##
================================================
- Coverage               93%     93%    -0%     
================================================
  Files                  153     154     +1     
  Lines                10890   11041   +151     
================================================
+ Hits                 10077   10214   +137     
- Misses                 813     827    +14     

@Borda (Member) commented on Jan 26, 2021

@justusschock it seems that we have merged something where most tests are failing: https://github.com/PyTorchLightning/pytorch-lightning/runs/1769896518
Could you please check it?

Labels: feature (Is an improvement or enhancement), ready (PRs ready to be merged)
Linked issue that may be closed by this PR: [Metrics] Add Depth Loss
6 participants