fixes metric hashing (#478)
* fixes metric hashing

* Update torchmetrics/metric.py

* Apply suggestions from code review

* chlog

* fix test

* fix decorator

* fix decorator

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: thomas chaton <thomas@grid.ai>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>
Co-authored-by: Jirka <jirka.borovec@seznam.cz>
(cherry picked from commit 3257c60)
justusschock authored and Borda committed Aug 27, 2021
1 parent 760b888 commit 830f5c7
Showing 4 changed files with 33 additions and 2 deletions.
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -33,6 +33,10 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 - Fixed bug where compositional metrics were unable to sync because of type mismatch ([#454](https://github.com/PyTorchLightning/metrics/pull/454))
 
+
+- Fixed metric hashing ([#478](https://github.com/PyTorchLightning/metrics/pull/478))
+
+
 ## [0.5.0] - 2021-08-09
 
 ### Added
22 changes: 22 additions & 0 deletions tests/bases/test_hashing.py
@@ -0,0 +1,22 @@
+import pytest
+
+from tests.helpers.testers import DummyListMetric, DummyMetric
+
+
+@pytest.mark.parametrize(
+    "metric_cls",
+    [
+        DummyMetric,
+        DummyListMetric,
+    ],
+)
+def test_metric_hashing(metric_cls):
+    """Tests that hashes are different.
+
+    See the Metric's hash function for details on why this is required.
+    """
+    instance_1 = metric_cls()
+    instance_2 = metric_cls()
+
+    assert hash(instance_1) != hash(instance_2)
+    assert id(instance_1) != id(instance_2)
2 changes: 1 addition & 1 deletion tests/bases/test_metric.py
@@ -160,7 +160,7 @@ class B(DummyListMetric):
 
     b1 = B()
     b2 = B()
-    assert hash(b1) == hash(b2)
+    assert hash(b1) != hash(b2)  # different ids
     assert isinstance(b1.x, list) and len(b1.x) == 0
     b1.x.append(tensor(5))
    assert isinstance(hash(b1), int)  # <- check that nothing crashes
7 changes: 6 additions & 1 deletion torchmetrics/metric.py
@@ -502,7 +502,12 @@ def _filter_kwargs(self, **kwargs: Any) -> Dict[str, Any]:
         return filtered_kwargs
 
     def __hash__(self) -> int:
-        hash_vals = [self.__class__.__name__]
+        # we need to add the id here, since PyTorch requires a module hash to be unique.
+        # Internally, PyTorch nn.Module relies on that for children discovery
+        # (see https://github.com/pytorch/pytorch/blob/v1.9.0/torch/nn/modules/module.py#L1544)
+        # For metrics that include tensors it is not a problem,
+        # since their hash is unique based on the memory location, but we cannot rely on that for every metric.
+        hash_vals = [self.__class__.__name__, id(self)]
 
         for key in self._defaults:
             val = getattr(self, key)
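For context, a minimal sketch (not part of this commit) of the behavior the new `__hash__` guarantees. It assumes a torchmetrics version with this fix applied; `ListMetric` is a hypothetical example class that mirrors `DummyListMetric` and is not part of the library:

import torch
from torchmetrics import Metric


class ListMetric(Metric):
    """Hypothetical metric with a list state, mirroring DummyListMetric."""

    def __init__(self):
        super().__init__()
        # A list default hashes by its contents, so before this fix two fresh
        # instances with identical (empty) state could produce equal hashes.
        self.add_state("x", default=[], dist_reduce_fx=None)

    def update(self, value: torch.Tensor) -> None:
        self.x.append(value)

    def compute(self):
        return self.x


m1, m2 = ListMetric(), ListMetric()

# With id(self) mixed into the hash, every instance hashes uniquely,
# which is what PyTorch's nn.Module machinery expects of its children.
assert hash(m1) != hash(m2)
assert isinstance(hash(m1), int)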
