
Configurable metrics #230

Merged
samet-akcay merged 11 commits into development from da/feature/optional-metrics on Apr 20, 2022

Conversation

djdameln (Contributor)

Description

Changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist

  • My code follows the pre-commit style and check guidelines of this project.
  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing tests pass locally with my changes

@djdameln added the Enhancement (New feature or request), Metrics (Metric Component.), and Do not Merge labels on Apr 13, 2022
- pixel_f1 = F1(num_classes=1, compute_on_step=False, threshold=self.hparams.model.threshold.pixel_default)
- self.image_metrics = MetricCollection([image_auroc, image_f1], prefix="image_").cpu()
- self.pixel_metrics = MetricCollection([pixel_auroc, pixel_f1], prefix="pixel_").cpu()
+ self.image_metrics, self.pixel_metrics = get_metrics(self.hparams)
Collaborator

Is there something planned for classification models, which do not have pixel metrics? When I removed the pixel metrics key from the config file, it threw an error for padim.

@djdameln (Contributor, Author) commented on Apr 13, 2022

This should work:

metrics:
  image:
    - F1
    - AUROC
  pixel: []

Collaborator

Works now 🙂

Contributor

Not sure why my comments disappeared from here, but ideally

metrics:
  image:
    - F1
    - AUROC

should also work fine.
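
For context, here is a minimal sketch of how get_metrics could treat a missing key exactly like an empty list. This is not the PR's actual code: the metric registry and the use of omegaconf's .get() fallback are assumptions for illustration.

from typing import Tuple, Union

from omegaconf import DictConfig, ListConfig
from torchmetrics import AUROC, F1, MetricCollection

# Illustrative registry; the real module would expose more metrics.
AVAILABLE_METRICS = {"AUROC": AUROC, "F1": F1}

def get_metrics(config: Union[ListConfig, DictConfig]) -> Tuple[MetricCollection, MetricCollection]:
    # .get() falls back to an empty list, so omitting the "pixel" key
    # behaves the same as the explicit "pixel: []" shown above.
    image_names = config.metrics.get("image", [])
    pixel_names = config.metrics.get("pixel", [])
    image_metrics = MetricCollection([AVAILABLE_METRICS[name]() for name in image_names], prefix="image_")
    pixel_metrics = MetricCollection([AVAILABLE_METRICS[name]() for name in pixel_names], prefix="pixel_")
    return image_metrics, pixel_metrics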

@alexriedel1 (Contributor)

Works like a charm!

from .min_max import MinMax
from .optimal_f1 import OptimalF1

__all__ = ["AUROC", "OptimalF1", "AdaptiveThreshold", "AnomalyScoreDistribution", "MinMax"]


def get_metrics(config: Union[ListConfig, DictConfig]) -> Tuple[AnomalibMetricCollection, AnomalibMetricCollection]:
Contributor

We might need to modify this for LightningCLI
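
AnomalibMetricCollection itself is not shown in this excerpt. A plausible minimal reading, assumed here rather than taken from the diff, is a thin MetricCollection subclass that can broadcast a shared anomaly threshold to its members:

from torchmetrics import MetricCollection

class AnomalibMetricCollection(MetricCollection):
    """MetricCollection that can push one threshold into all member metrics."""

    def __init__(self, *args, **kwargs) -> None:
        super().__init__(*args, **kwargs)
        self._threshold = 0.5

    def set_threshold(self, value: float) -> None:
        # Update every contained metric that uses a threshold (e.g. F1),
        # so thresholds no longer have to be hard-coded per metric.
        self._threshold = value
        for metric in self.values():
            if hasattr(metric, "threshold"):
                metric.threshold = value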


@ashwinvaidya17 (Collaborator)

Might be out of scope for this PR, but we could have a look at Lightning-AI/torchmetrics#709, which updates the base metrics only once. Maybe we can find a way to automatically group metrics.
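
(For reference, torchmetrics 0.8 shipped that change as the compute_groups argument of MetricCollection, which detects metrics that share state and updates them only once. A small example, with made-up predictions and assuming torchmetrics>=0.8:)

import torch
from torchmetrics import MetricCollection, Precision, Recall

# Precision and Recall share the same underlying statistics, so with
# compute_groups enabled each update() processes the inputs only once.
collection = MetricCollection([Precision(), Recall()], compute_groups=True)

preds = torch.tensor([0.2, 0.7, 0.9, 0.4])
target = torch.tensor([0, 1, 1, 0])
collection.update(preds, target)
print(collection.compute())  # {'Precision': tensor(...), 'Recall': tensor(...)}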

@djdameln mentioned this pull request on Apr 19, 2022
@ashwinvaidya17 (Collaborator)

Can you pin the torchmetrics version in this PR as well (>=0.8.0)?

@samet-akcay changed the title from "WIP: configurable metrics" to "Configurable metrics" on Apr 20, 2022
@samet-akcay merged commit a1d49f9 into development on Apr 20, 2022
@samet-akcay deleted the da/feature/optional-metrics branch on Apr 20, 2022 at 19:49
Labels
Enhancement (New feature or request), Metrics (Metric Component.)

Projects
None yet

Development
Successfully merging this pull request may close these issues: Optional performance metrics

4 participants