
Support ignore_index in ConfusionMatrix #1101

Closed
idow09 opened this issue Jun 20, 2022 · 2 comments · Fixed by #1195
Labels
enhancement New feature or request
Milestone

Comments

@idow09

idow09 commented Jun 20, 2022

🚀 Feature

Add an ignore_index parameter to the classification metric ConfusionMatrix, as it already exists in Accuracy, Precision, Recall, etc.

Motivation

I use a MetricCollection for my classifier evaluation, and every other metric in this collection supports ignore_index except ConfusionMatrix. This causes a lot of trouble and I don't see a reason for not supporting it.

Pitch

ConfusionMatrix should support ignore_index so this should work totally fine:

import torch
from torchmetrics import ConfusionMatrix

target = torch.tensor([1, 1, 0, 0, -1])
preds = torch.tensor([0, 1, 0, 0, 1])
confmat = ConfusionMatrix(num_classes=2, ignore_index=-1)
confmat(preds, target)

and output:

tensor([[2, 0],
        [1, 1]])

Alternatives

I tried implementing my own IgnoreIndexConfusionMatrix, but I ran into device synchronization issues.
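For reference, the intended ignore_index semantics — drop every position where the target equals ignore_index, then count as usual — can be sketched in plain Python (a hypothetical helper for illustration, not the torchmetrics implementation):

```python
def confusion_matrix(preds, target, num_classes, ignore_index=None):
    """Confusion matrix indexed as [true class][predicted class].

    Positions where the target equals ignore_index are skipped entirely,
    matching the behavior of ignore_index in other torchmetrics metrics.
    """
    mat = [[0] * num_classes for _ in range(num_classes)]
    for p, t in zip(preds, target):
        if t == ignore_index:
            continue  # ignored position: contributes to no cell
        mat[t][p] += 1
    return mat


target = [1, 1, 0, 0, -1]
preds = [0, 1, 0, 0, 1]
print(confusion_matrix(preds, target, num_classes=2, ignore_index=-1))
# [[2, 0], [1, 1]]
```

The last (preds=1, target=-1) pair is dropped, so only four samples are counted, which yields the matrix shown in the pitch above.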

@idow09 idow09 added the enhancement New feature or request label Jun 20, 2022
@github-actions

Hi! Thanks for your contribution, great first issue!

@Borda Borda added this to the v0.10 milestone Jul 27, 2022
@SkafteNicki
Member

This issue will be fixed by the classification refactor; see issue #1001 and PR #1195 for the full set of changes.

Small recap: the issue asks that ConfusionMatrix also support the ignore_index argument known from Accuracy, Precision, and Recall. This is now supported in all the newly introduced versions, e.g. BinaryConfusionMatrix, MulticlassConfusionMatrix, MultilabelConfusionMatrix:

import torch
from torchmetrics.classification import BinaryConfusionMatrix

target = torch.tensor([1, 1, 0, 0, -1])
preds = torch.tensor([0, 1, 0, 0, 1])
confmat = BinaryConfusionMatrix(ignore_index=-1)  # binary metrics take no num_classes
confmat(preds, target)
# tensor([[2, 0],
#         [1, 1]])

The issue will be closed when #1195 is merged.
