[Fix] Make accuracy take into account ignore_index #1259
Conversation
Codecov Report
```
@@           Coverage Diff           @@
##           master    #1259   +/-   ##
=======================================
  Coverage   90.22%   90.22%
=======================================
  Files         130      130
  Lines        7558     7560       +2
  Branches     1258     1258
=======================================
+ Hits         6819     6821       +2
  Misses        531      531
  Partials      208      208
```
Please add a unit test for this feature in `def test_accuracy():`.
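For reference, a minimal sketch of what such a test could assert, assuming the masking behavior proposed in this PR and the `ignore_index=None` default suggested below (the tensors and expected percentages are illustrative, not taken from the final test suite):

```python
import torch

from mmseg.models.losses import accuracy


def test_accuracy_ignore_index():
    # Four samples, three classes; the model predicts class 0 everywhere.
    pred = torch.tensor([[1., 0., 0.],
                         [1., 0., 0.],
                         [1., 0., 0.],
                         [1., 0., 0.]])
    # Two correct labels, one wrong label, one ignored label (255).
    target = torch.tensor([0, 0, 1, 255])

    # Without ignore_index, the ignored pixel counts as a miss: 2/4 = 50%.
    acc = accuracy(pred, target, topk=1)
    assert torch.allclose(acc, torch.tensor([50.]))

    # With ignore_index=255, only 3 pixels are eligible: 2/3 ≈ 66.67%.
    acc = accuracy(pred, target, topk=1, ignore_index=255)
    assert torch.allclose(acc, torch.tensor([200. / 3]))
```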
mmseg/models/losses/accuracy.py
Outdated
```diff
@@ -2,12 +2,13 @@
 import torch.nn as nn


-def accuracy(pred, target, topk=1, thresh=None):
+def accuracy(pred, target, topk=1, thresh=None, ignore_index=255):
```
I think it would be better to set `None` as the default value of `ignore_index`.
Sure, I will do that and add a unit test by tonight (KST). Thank you for the comment :)
mmseg/models/losses/accuracy.py
Outdated
"""Calculate accuracy according to the prediction and target. | ||
|
||
Args: | ||
pred (torch.Tensor): The model prediction, shape (N, num_class, ...) | ||
target (torch.Tensor): The target of each prediction, shape (N, , ...) | ||
ignore_index (int | None): The label index to be ignored. Default: 255 |
Suggested change:

```diff
-ignore_index (int | None): The label index to be ignored. Default: 255
+ignore_index (int | None): The label index to be ignored.
+    If it is None, any label will be taken into account. Default: None
```
Hi, thanks for your contribution.
Please fix the lint error and add unit tests.
I found a bug while writing unit tests for it, and it is fixed now. I ran pre-commit and all checks passed :)
Great.
Hi! I have a question: should I rebase onto the most recent commit on the master branch, or am I done here? :)
Yes, you should.
@MeowZheng Okay, I rebased onto the current master branch and pushed again!
* make accuracy take into account ignore_index
* add UT for accuracy
Motivation
Your repository has been very helpful for my projects and I deeply appreciate it.
I may be wrong about this, but please look into it; I hope it is at least a little helpful.
I tried to use the ignore index to ignore the background label, and it turned out that the value of `acc_seg` from the decode head was oddly low, while other values such as the loss and IoU all looked fine. Looking into `decode_head.py`, `loss_decode` receives `self.ignore_index` as an input argument, while `accuracy` does not. I think that explains why the accuracy looked so different from the other metrics. Thus, the `accuracy` function should also receive the ignore index as an argument, so that `ignore_index` is taken into account when calculating the accuracy of the decode head.
I also found that `point_head.py` uses the same function without receiving `ignore_index`, so I think both should be fixed.
Modification
Firstly, the `accuracy` function can take `ignore_index` as a new argument, and inside the function the ignored index can be excluded from the calculation, as sketched below.
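A minimal self-contained sketch of the whole function under that proposal; the masking lines are the substance of this PR, while the surrounding top-k bookkeeping is simplified from the existing `mmseg` implementation rather than copied verbatim (the default here is `None`, as suggested in the review above):

```python
def accuracy(pred, target, topk=1, thresh=None, ignore_index=None):
    """Calculate accuracy, excluding positions labeled ``ignore_index``.

    Args:
        pred (torch.Tensor): Model prediction, shape (N, num_class, ...).
        target (torch.Tensor): Target of each prediction, shape (N, ...).
        topk (int | tuple[int]): Top-k accuracies to compute.
        thresh (float | None): Predictions below this value count as wrong.
        ignore_index (int | None): Label to exclude. If None, every label
            is taken into account. Default: None.
    """
    return_single = isinstance(topk, int)
    topk = (topk, ) if return_single else topk
    maxk = max(topk)

    # Top-k predicted labels, transposed to shape (maxk, N, ...).
    pred_value, pred_label = pred.topk(maxk, dim=1)
    pred_label = pred_label.transpose(0, 1)
    correct = pred_label.eq(target.unsqueeze(0).expand_as(pred_label))
    if thresh is not None:
        # Only predictions above the threshold count as correct.
        correct = correct & (pred_value > thresh).transpose(0, 1)

    if ignore_index is not None:
        # Drop ignored positions from both numerator and denominator.
        eligible = target != ignore_index
        correct = correct[:, eligible]
        total_num = eligible.sum().clamp(min=1)
    else:
        total_num = target.numel()

    res = []
    for k in topk:
        correct_k = correct[:k].reshape(-1).float().sum(0, keepdim=True)
        res.append(correct_k.mul_(100.0 / total_num))
    return res[0] if return_single else res
```

With the tensors from the unit-test sketch earlier in this thread, this returns 50% without `ignore_index` and about 66.7% with `ignore_index=255`, since the ignored pixel is removed from both the hit count and the pixel count.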
Secondly, in `decode_head.py`, we can simply pass `self.ignore_index` as one of the arguments:
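A sketch of the relevant line inside `BaseDecodeHead.losses()` (the local names `seg_logit` and `seg_label` mirror the existing code):

```python
# mmseg/models/decode_heads/decode_head.py, inside losses():
loss['acc_seg'] = accuracy(
    seg_logit, seg_label, ignore_index=self.ignore_index)
```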
Finally, the same change can be applied in `point_head.py`:
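And similarly inside `PointHead.losses()` (the names `point_logits` and `point_label` are assumed from the existing code):

```python
# mmseg/models/decode_heads/point_head.py, inside losses():
loss['acc_point'] = accuracy(
    point_logits, point_label, ignore_index=self.ignore_index)
```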
BC-breaking (Optional)
Use cases (Optional)
Checklist