
Breaking change in MeanAveragePrecision by removing average param #2004

Closed
Advitya17 opened this issue Aug 17, 2023 · 7 comments · Fixed by #2018
Labels: bug / fix, help wanted, v1.0.x

Comments

@Advitya17 commented Aug 17, 2023

🐛 Bug

We're using the MeanAveragePrecision class in the Responsible AI Toolbox repo for calculating object detection metrics, and noticed that a breaking update in torchmetrics appears to have removed the average argument.

How can a 'macro' or 'micro' averaging method now be specified when calculating object detection metrics, and will this be addressed in an upcoming release?

To Reproduce

Steps to reproduce the behavior...

Code sample

https://github.com/microsoft/responsible-ai-toolbox/blob/main/responsibleai_vision/responsibleai_vision/rai_vision_insights/rai_vision_insights.py#L1110
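
For reference, a minimal sketch of the kind of call this issue concerns; the exact arguments used in the linked code are an assumption based on the description above:

from torchmetrics.detection import MeanAveragePrecision

# Hypothetical reconstruction of the affected call: passing an
# averaging mode when constructing the detection metric.
metric = MeanAveragePrecision(average="macro")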

Expected behavior

Environment

  • TorchMetrics version (and how you installed TM, e.g. conda, pip, build from source): pip
  • Python & PyTorch version: PyTorch 1.13.1, Python 3.7-3.10
  • Any other relevant information such as OS: Windows, Ubuntu, macOS

Additional context

The average argument appears to have been removed from https://torchmetrics.readthedocs.io/en/stable/detection/mean_average_precision.html, though it still exists for other metrics, e.g. https://torchmetrics.readthedocs.io/en/stable/classification/average_precision.html
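
For comparison, a minimal sketch of the classification metric linked above, which does accept an average argument (the argument values here are illustrative):

import torch
from torchmetrics.classification import AveragePrecision

# The classification AveragePrecision accepts average="macro".
metric = AveragePrecision(task="multiclass", num_classes=3, average="macro")
preds = torch.softmax(torch.randn(10, 3), dim=-1)
target = torch.randint(0, 3, (10,))
score = metric(preds, target)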

@Advitya17 added the bug / fix and help wanted labels on Aug 17, 2023
@github-actions commented

Hi! Thanks for your contribution, great first issue!

@SkafteNicki (Member) commented

Hi @Advitya17, thanks for raising this issue.
I am a bit confused, because I am fairly sure that MeanAveragePrecision has never had an average argument, but I may be wrong.
What version of torchmetrics have you been using?

@Advitya17 (Author) commented

torchmetrics==1.0.1; we pinned it in our code quite a few months back and were able to use torchmetrics until the breaking change happened.

@SkafteNicki (Member) commented

Then I still do not understand, because in v1.0.1 of torchmetrics MeanAveragePrecision did not have an average argument:

# Signature of MeanAveragePrecision.__init__ as of torchmetrics v1.0.1:
def __init__(
    self,
    box_format: Literal["xyxy", "xywh", "cxcywh"] = "xyxy",
    iou_type: Literal["bbox", "segm"] = "bbox",
    iou_thresholds: Optional[List[float]] = None,
    rec_thresholds: Optional[List[float]] = None,
    max_detection_thresholds: Optional[List[int]] = None,
    class_metrics: bool = False,
    **kwargs: Any,
) -> None:

@SkafteNicki (Member) commented

@Advitya17 I can see that you have decided to remove the argument from your code.
Even though I cannot find a release where this has been a feature, would you still like it to be part of a future release, or is it not needed anymore?

@Advitya17 (Author) commented

Yes, if it can be included in a future release, that would be very helpful, thank you very much!

@Borda (Member) commented Aug 25, 2023

> Then I still do not understand, because in v1.0.1 of torchmetrics MeanAveragePrecision did not have an average argument:

The average arg was taken/eaten by the **kwargs, so it had zero effect but did not crash...
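
A minimal, self-contained sketch of that behavior (illustrative only, not the actual torchmetrics code): an unknown keyword is absorbed by **kwargs instead of raising a TypeError.

from typing import Any

class FakeMetric:
    # Simplified stand-in for a metric whose __init__ accepts **kwargs.
    def __init__(self, class_metrics: bool = False, **kwargs: Any) -> None:
        # "average" lands in kwargs and is never read again.
        self.unused_kwargs = kwargs

m = FakeMetric(average="macro")  # no error raised
print(m.unused_kwargs)           # {'average': 'macro'} -- silently ignored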
