I am working on medical image segmentation and trying different networks from MONAI: 1) AHNet, 2) BasicUNet, 3) UNETR, 4) UNet, 5) pre-trained UNet. Are the results correct? It does not make sense for one network to have both a high loss and a high metric value while another has a high metric value and a low loss. I searched the internet but found no clear example that uses both the Dice metric and the Dice loss at the same time, except one that computed the loss with the built-in function and derived the Dice score from the relation loss = 1 - dice. I also tried changing many settings, but nothing worked. This is the code (I added blank lines between the relevant parts):
Thanks
Thanks for the question. It may be because of the different final blocks of the different networks. For example, BasicUNet returns the logits directly: https://github.com/Project-MONAI/MONAI/blob/e6ec945e4b87b90835cdad29ea64b1f27b8accda/monai/networks/nets/basic_unet.py#L251
while UNet by default returns the result after a ReLU: https://github.com/Project-MONAI/MONAI/blob/e6ec945e4b87b90835cdad29ea64b1f27b8accda/monai/networks/nets/unet.py#L122
Also, dice_metric is not 1 - DiceLoss, because the metric is computed after
post_pred
(which is argmax in this case), whereas DiceLoss is based on softmax=True in your example.
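To see concretely why the two numbers need not agree, here is a minimal NumPy sketch (not MONAI's actual implementation; the function names are invented for illustration) comparing a soft Dice loss computed on softmax probabilities, as `DiceLoss(softmax=True)` does, with a hard Dice score computed after argmax binarization, as `DiceMetric` sees predictions after a `post_pred` transform:

```python
import numpy as np

def softmax(logits, axis=1):
    # Numerically stable softmax over the class axis.
    e = np.exp(logits - logits.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def soft_dice_loss(logits, target_onehot, eps=1e-5):
    # Dice loss on *soft* probabilities (illustrative, not MONAI's code).
    probs = softmax(logits)
    inter = (probs * target_onehot).sum(axis=(0, 2))
    denom = probs.sum(axis=(0, 2)) + target_onehot.sum(axis=(0, 2))
    return 1.0 - ((2.0 * inter + eps) / (denom + eps)).mean()

def hard_dice_metric(logits, target_onehot, eps=1e-5):
    # Dice score on *binarized* predictions, i.e. after an
    # argmax + one-hot post-processing step.
    pred = np.argmax(logits, axis=1)                    # (B, V)
    pred_onehot = np.eye(logits.shape[1])[pred]         # (B, V, C)
    pred_onehot = np.transpose(pred_onehot, (0, 2, 1))  # (B, C, V)
    inter = (pred_onehot * target_onehot).sum(axis=(0, 2))
    denom = pred_onehot.sum(axis=(0, 2)) + target_onehot.sum(axis=(0, 2))
    return ((2.0 * inter + eps) / (denom + eps)).mean()

# Two classes, four voxels: the argmax is correct everywhere,
# but the softmax probabilities are only ~0.95, not 1.0.
logits = np.array([[[2., 2., -1., -1.],
                    [-1., -1., 2., 2.]]])
target = np.array([[[1., 1., 0., 0.],
                    [0., 0., 1., 1.]]])

print(hard_dice_metric(logits, target))  # 1.0: perfect after argmax
print(soft_dice_loss(logits, target))    # ~0.047: loss is still non-zero
```

So a network can reach a perfect metric (1.0) while its loss stays well above 0, and two networks with different final activations (raw logits vs. ReLU) can map the same quality of segmentation to quite different loss values.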