ignore_index for F1 doesn't behave as expected #613
Comments
Hi! Thanks for your contribution, great first issue!

It seems like if reduce is not …

I've had a similar experience using ignore_index with IoU (Jaccard index): the IoU value starts at 100.00 and tends towards 0 as training progresses.

Since I am not sure about the original intention of …

@tchayintr would you be interested in sending a PR? @stancld may help if needed 🐰

@Borda Sure.
Issue will be fixed by the classification refactor: see issue #1001 and PR #1195 for all changes. Small recap: this issue describes that the …

```python
from torchmetrics.functional import multiclass_f1_score
import torch

preds = torch.tensor([1, 1, 1, 1, 2, 1, 1])
target = torch.tensor([0, 0, 1, 1, 2, 0, 0])
multiclass_f1_score(preds, target, num_classes=3, average="micro", ignore_index=0)  # tensor(1.)
```

which gives the correct result.
🐛 Bug
F1 doesn't ignore indices properly.
To Reproduce
Run the following code.
This gives you tensor(0.6000), not tensor(1.0).
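Why tensor(1.0) is the expected value: with ignore_index=0, positions whose target equals 0 should be dropped before computing micro-averaged F1, and for multiclass micro averaging F1 reduces to plain accuracy over the kept positions. A minimal plain-Python sketch of that masking, using the tensors quoted in the comments above (the helper name is ours, not torchmetrics code):

```python
# Tensors from the issue's repro, as plain lists.
preds = [1, 1, 1, 1, 2, 1, 1]
target = [0, 0, 1, 1, 2, 0, 0]

def micro_f1_with_ignore(preds, target, ignore_index):
    """Micro-averaged multiclass F1 after dropping positions where
    target == ignore_index.

    With micro averaging over all classes, every correct prediction is a
    TP and every incorrect one counts once as FP and once as FN, so the
    score collapses to accuracy over the kept positions.
    """
    kept = [(p, t) for p, t in zip(preds, target) if t != ignore_index]
    correct = sum(p == t for p, t in kept)
    return correct / len(kept)

print(micro_f1_with_ignore(preds, target, ignore_index=0))  # 1.0
```

Only positions 2, 3, and 4 survive the mask (targets 1, 1, 2), and the predictions match at all three, so the expected score is 1.0 — which is what the refactored `multiclass_f1_score` returns.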
Expected behavior
The specified ignore_index should not count towards the F1 score. For example, the above code example should be effectively equivalent to the following: …

Environment
- How you installed (conda, pip, source): pip