Some metrics are handling absent values incorrectly #1017
Hi! Thanks for your contribution, great first issue!
This issue will be fixed by the classification refactor: see issue #1001 and PR #1195 for all changes.

Small recap: this issue describes that the accuracy metric does not compute the right value in the binary setting. The problem with the current implementation is that the metric is calculated as an average over the 0 and 1 classes, which is wrong. After the refactor this has been fixed. Using the new `binary_*` version of the metric on the provided example:

```python
import torch
from torchmetrics.functional import binary_accuracy

target = torch.tensor(
    [
        [0, 0, 0, 0],
        [0, 0, 1, 1],
    ]
)
preds = torch.tensor(
    [
        [0, 0, 0, 0],
        [0, 0, 1, 1],
    ]
)
binary_accuracy(preds, target, multidim_average="samplewise")  # tensor([1., 1.])
```

which gives the correct result.
🐛 Bug
Some metrics, such as `Accuracy`, `Precision`, `Recall` and `F1Score`, are handling absent values incorrectly. A value absent in both target and preds, and therefore correctly predicted, is considered incorrect.
To Reproduce
Steps to reproduce the behavior: run the code sample with `num_classes = 2`, `mdmc_average = "samplewise"` and `average = "none"`.
Code sample
Expected behavior
The result should be `torch.tensor([1., 1.])`, because the two classes are predicted correctly for both elements of the batch. In fact, in the first element of the batch the absence of class 1 is expected by the target tensor.
Despite this, the result of the metric is `torch.tensor([1., .5])`, because in the first element of the batch the value of the metric for class 1 is `0.0`.
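The flawed averaging can be illustrated in plain Python (a minimal sketch of the described behavior; `per_class_accuracy` is a hypothetical helper, not torchmetrics' actual implementation):

```python
# Hypothetical sketch of per-class accuracy, as with average="none".
# A class that never occurs in target is scored 0.0 under the buggy
# behaviour, even when preds also never predicts it, i.e. even when
# the absence itself was predicted correctly.
def per_class_accuracy(preds, target, num_classes, absent_score=0.0):
    scores = []
    for c in range(num_classes):
        support = sum(1 for t in target if t == c)
        if support == 0:
            # Class absent from target: a naive implementation
            # falls back to 0.0 instead of treating it as correct.
            scores.append(absent_score)
        else:
            correct = sum(1 for p, t in zip(preds, target) if p == t == c)
            scores.append(correct / support)
    return scores

per_class_accuracy([0, 0, 0, 0], [0, 0, 0, 0], num_classes=2)  # [1.0, 0.0]
per_class_accuracy([0, 0, 1, 1], [0, 0, 1, 1], num_classes=2)  # [1.0, 1.0]
```

Averaging the first sample's per-class scores then yields 0.5 rather than 1.0, which is exactly the mismatch reported above.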
Environment
- Python version: 3.8.12
- PyTorch version: 1.11
- How you installed PyTorch (`conda`, `pip`, build command if you used source): `pip`
Additional context
The metric `JaccardIndex` provides the argument `absent_score` to handle such cases.