[New Feature] Add Lovász loss #351
Conversation
Task linked: CU-j6umcn Lovász-Softmax loss #351
Codecov Report
@@            Coverage Diff             @@
##           master     #351      +/-   ##
==========================================
+ Coverage   86.07%   86.14%   +0.07%
==========================================
  Files          95       96       +1
  Lines        4789     4900     +111
  Branches      778      798      +20
==========================================
+ Hits         4122     4221      +99
- Misses        519      526       +7
- Partials      148      153       +5
Flags with carried forward coverage won't be shown.
@@ -0,0 +1,296 @@
import mmcv
We may add something like "Modified from ..."
Got it!
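For illustration, such a provenance note could look like the sketch below; the upstream URL is an assumption (the paper authors' reference Lovász-Softmax implementation), not something stated in this thread.

# Hypothetical header sketch for mmseg/models/losses/lovasz_loss.py;
# the URL is an assumption, pointing at the authors' reference implementation.
# Modified from https://github.com/bermanmaxim/LovaszSoftmax
import mmcv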
mmseg/models/losses/lovasz_loss.py
Outdated
"""Binary Lovasz hinge loss. | ||
|
||
Args: | ||
probas (torch.Tensor): [B, H, W], logits at each pixel |
probas is not an ideal abbreviation. We may use logits directly.
In our codebase, logits usually refers to unnormalized energy, while prob refers to a probability distribution.
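As a minimal sketch of that naming convention (a toy example, not the code under review), where logits are unnormalized scores and prob is their softmax:

import torch
import torch.nn.functional as F

logits = torch.randn(2, 4, 8, 8)   # [B, C, H, W], unnormalized energy
prob = F.softmax(logits, dim=1)    # [B, C, H, W], probability distribution

# The binary hinge variant, by contrast, consumes raw logits of shape [B, H, W].
binary_logits = torch.randn(2, 8, 8)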
mmseg/models/losses/lovasz_loss.py
Outdated
"""Multi-class Lovasz-Softmax loss. | ||
|
||
Args: | ||
probas (torch.Tensor): [P, C], class probabilities at each prediction |
Similarly.
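For context, a self-contained sketch of the flat multi-class term that consumes probas of shape [P, C] and integer labels of shape [P], following Algorithm 1 of the paper; the names and simplifications here are illustrative, not the exact code in this PR.

import torch


def lovasz_grad(gt_sorted):
    # Gradient of the Lovasz extension w.r.t. sorted errors (Alg. 1 of the
    # paper); gt_sorted is a 1-D binary ground-truth vector sorted by error.
    p = len(gt_sorted)
    gts = gt_sorted.sum()
    intersection = gts - gt_sorted.cumsum(0)
    union = gts + (1. - gt_sorted).cumsum(0)
    jaccard = 1. - intersection / union
    if p > 1:
        jaccard[1:p] = jaccard[1:p] - jaccard[0:-1]
    return jaccard


def lovasz_softmax_flat(probas, labels):
    # probas: [P, C] class probabilities, labels: [P] integer ground truth.
    # Averages the per-class Lovasz surrogates over all classes.
    C = probas.size(1)
    losses = []
    for c in range(C):
        fg = (labels == c).float()            # binary mask for class c
        errors = (fg - probas[:, c]).abs()    # per-pixel prediction errors
        errors_sorted, perm = torch.sort(errors, 0, descending=True)
        grad = lovasz_grad(fg[perm])
        losses.append(torch.dot(errors_sorted, grad))
    return sum(losses) / C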
mmseg/models/losses/lovasz_loss.py
Outdated
"""Multi-class Lovasz-Softmax loss. | ||
|
||
Args: | ||
probas (torch.Tensor): [B, C, H, W], class probabilities at each |
Similarly.
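The [B, C, H, W] form is typically reduced to that flat [P, C] layout before computing the loss; a rough sketch (the helper name and the ignore_index default are assumptions, not this PR's API):

import torch


def flatten_probas(probas, labels, ignore_index=255):
    # Flatten [B, C, H, W] probabilities and [B, H, W] labels to [P, C] / [P],
    # dropping pixels whose label equals ignore_index.
    B, C, H, W = probas.size()
    probas = probas.permute(0, 2, 3, 1).reshape(-1, C)   # [B*H*W, C]
    labels = labels.reshape(-1)                          # [B*H*W]
    valid = labels != ignore_index
    return probas[valid], labels[valid]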
mmseg/models/losses/lovasz_loss.py
Outdated
This loss is proposed in `The Lovasz-Softmax loss: A tractable surrogate
for the optimization of the intersection-over-union measure in neural
networks <https://openaccess.thecvf.com/content_cvpr_2018/html/Berman_The_LovaSz-Softmax_Loss_CVPR_2018_paper.html>`_.  # noqa
https://arxiv.org/abs/1705.08790 should be fine.
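For reference, whichever URL ends up in the docstring, the quantity the paper optimizes is the per-class Jaccard loss relaxed through its Lovász extension, stated compactly in the paper's notation:

\Delta_{J_c}(y^*, \tilde{y}) = 1 -
  \frac{|\{y^* = c\} \cap \{\tilde{y} = c\}|}{|\{y^* = c\} \cup \{\tilde{y} = c\}|}

\overline{\Delta_{J_c}}(m) = \sum_{i=1}^{p} m_{\pi_i}\, g_i(m), \qquad
g_i(m) = \Delta_{J_c}(\{\pi_1, \dots, \pi_i\}) - \Delta_{J_c}(\{\pi_1, \dots, \pi_{i-1}\})

where \pi is a permutation ordering the per-pixel errors m in decreasing order.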
mmseg/models/losses/lovasz_loss.py
Outdated
# only void pixels, the gradients should be 0
return logits.sum() * 0.
signs = 2. * labels.float() - 1.
errors = (1. - logits * Variable(signs))
Why Variable here?
I don't know, it is the original code.
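For what it is worth, torch.autograd.Variable has been a no-op wrapper since PyTorch 0.4, so the questioned step behaves the same without it; a minimal standalone sketch with toy tensors, not the actual diff:

import torch

logits = torch.randn(6, requires_grad=True)       # flattened per-pixel logits
labels = torch.tensor([1., 0., 1., 1., 0., 0.])   # flattened binary labels
signs = 2. * labels - 1.                          # map {0, 1} -> {-1, +1}
errors = 1. - logits * signs                      # same step, no Variable()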
* add lovasz loss
* Modify as comments
* Modify paper url
* add unittest and remove Var
* improve unittest
No description provided.
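If it helps future readers, switching a decode head to this loss once it is registered would look roughly like the config fragment below; the argument names (loss_type, loss_weight) are assumptions, so check the merged LovaszLoss docstring for the exact interface.

# Hypothetical mmseg config fragment; verify argument names against the
# merged LovaszLoss implementation before use.
model = dict(
    decode_head=dict(
        loss_decode=dict(
            type='LovaszLoss',         # registered loss name
            loss_type='multi_class',   # or 'binary' for the hinge variant
            loss_weight=1.0)))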