I am seeking clarification on the AverageMeter and metric reduction implementations in 3d_segmentation/swin_unetr_brats21_segmentation_3d.ipynb.

The AverageMeter implementation provided in the BraTS SwinUNETR tutorial is similar to the ImageNet reference implementation apart from its last line; ImageNet's last line is simply:

self.avg = self.sum / self.count

The reason I think this could be problematic is how the tutorial uses the meter during validation, as sketched below: it updates the meter with n set to the per-class not_nans counts returned by DiceMetric.aggregate(). This implies that any time the model has a NaN prediction and the ground truth is not NaN (or vice versa), the resulting Dice coefficient is not penalized as though the prediction had a Dice coefficient of zero.
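For reference, here is a minimal sketch of the relevant pieces, assuming the current version of the notebook (the validation-loop lines are abbreviated to the ones that matter here):

```python
import numpy as np

class AverageMeter(object):
    def __init__(self):
        self.reset()

    def reset(self):
        self.val = 0
        self.avg = 0
        self.sum = 0
        self.count = 0

    def update(self, val, n=1):
        # val and n may be per-class numpy arrays; n acts as a weight.
        self.val = val
        self.sum += val * n
        self.count += n
        # ImageNet's last line is `self.avg = self.sum / self.count`; here,
        # classes whose count is still zero fall back to the raw sum instead.
        self.avg = np.where(self.count > 0, self.sum / self.count, self.sum)

# Inside the tutorial's validation loop (sketch):
#   acc, not_nans = acc_func.aggregate()  # acc_func is a DiceMetric(..., get_not_nans=True)
#   run_acc.update(acc.cpu().numpy(), n=not_nans.cpu().numpy())
# A class whose Dice came back NaN has not_nans == 0, i.e. zero weight in the average.
```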
For instance, let's assume the ground truth has pixels in it for every class, but the model predicts nothing for one class, resulting in these per-class Dice values: [0.0000, 0.8775, 0.1213].
The way AverageMeter is currently used might simply ignore the 0 Dice value in the first class if DiceMetric returned NaN for the corresponding class.
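A quick demonstration of that masking, using the AverageMeter sketch above (the second batch's numbers are invented for illustration):

```python
import numpy as np

meter = AverageMeter()

# Batch 1: class 0's Dice came back NaN (zeroed during aggregation), so its
# not_nans weight is 0 and the 0.0 carries no weight in the running average.
meter.update(np.array([0.0, 0.8775, 0.1213]), n=np.array([0.0, 1.0, 1.0]))

# Batch 2: all three classes are valid.
meter.update(np.array([0.9, 0.8, 0.7]), n=np.array([1.0, 1.0, 1.0]))

print(meter.avg)  # [0.9, 0.83875, 0.41065] -- batch 1's class-0 value was ignored
```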
However, NaNs are caused by the ground truth being NaN. What happens when the ground truth is NaN and the prediction is not NaN? That case (a false positive) is also not desirable.
Edit: please see my next comment for further clarification.
To Reproduce
Just run the notebook or use the AverageMeter implementation within your own code.
Expected behavior
Would it be possible to refactor the aggregation/average meter to yield zero when the ground truth is NaN and the prediction is not NaN?
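One possible shape for such a refactor (hypothetical code, not anything that currently exists in the tutorial or in MONAI) would be to re-weight false-positive classes before updating the meter:

```python
import torch

def penalize_false_positives(acc, not_nans, y_pred, y):
    """Force a Dice of 0, with full weight, for any class where the ground
    truth is empty but the prediction is not (a false positive).

    acc, not_nans: per-class outputs of DiceMetric.aggregate(), shape (classes,)
    y_pred, y: one-hot tensors of shape (batch, classes, *spatial)
    """
    spatial_dims = tuple(range(2, y.ndim))
    gt_empty = y.sum(dim=spatial_dims) == 0            # (batch, classes)
    pred_nonempty = y_pred.sum(dim=spatial_dims) > 0   # (batch, classes)
    fp = (gt_empty & pred_nonempty).any(dim=0)         # per-class flag
    acc = torch.where(fp, torch.zeros_like(acc), acc)
    not_nans = torch.where(fp, torch.ones_like(not_nans), not_nans)
    return acc, not_nans
```

This is coarse (it operates on the already batch-reduced acc), but it illustrates the idea: a false positive would then drag the average toward zero instead of being silently dropped.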
Perhaps the intended purpose is to avoid giving weight to ground truths that have NaNs? However, the implementation still doesn't make sense if we view it from that perspective.

In other words, as I see it, there are four possibilities for NaNs between the ground truth (GT) and the prediction (PRD):
1. GT: Not NaN, PRD: Not NaN
2. GT: NaN, PRD: Not NaN
3. GT: Not NaN, PRD: NaN
4. GT: NaN, PRD: NaN
If a parameter like ignore_empty is set to True, scenarios 2 and 4 are both set to NaN. Scenario 4 makes sense: we want to ignore any Dice calculation where both the ground truth and the prediction are NaN. However, what about scenario 2 (a false positive), which is not desirable? A concrete check of all four cases is sketched below.
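To make the four scenarios concrete, here is a small sketch (assuming a recent MONAI version, where DiceMetric exposes the ignore_empty parameter; "NaN" ground truth here means a class with no foreground voxels):

```python
import torch
from monai.metrics import DiceMetric

fg = torch.tensor([[1.0, 1.0], [0.0, 0.0]]).unsqueeze(0)  # single-class map with foreground
empty = torch.zeros(1, 2, 2)                               # single-class map with none

# Batch of four cases matching scenarios 1-4 above:
y      = torch.stack([fg, empty, fg,    empty])
y_pred = torch.stack([fg, fg,    empty, empty])

metric = DiceMetric(include_background=True, reduction="none", ignore_empty=True)
print(metric(y_pred=y_pred, y=y).squeeze())
# tensor([1., nan, 0., nan]) -- scenarios 2 and 4 both come back NaN, so the
# false positive in scenario 2 is indistinguishable from the empty-empty case.
```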
25benjaminli changed the title from "AverageMeter not used properly in swin unetr brain tumor tutorial?" to "AverageMeter and Metric Reduction Clarification" on Aug 16, 2024.