Pretrained models #9
@saeedizadi maybe has some?
I've modified the models by @bodokaiser for a binary segmentation problem on medical images (using nn.Sigmoid + BCELoss), so my pretrained models may not be helpful in general. However, training on other datasets is very straightforward.
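For readers following along, here is a minimal sketch of such a modification (my own illustration, not @saeedizadi's actual code): a final 1x1 convolution with a single output channel followed by nn.Sigmoid, so the prediction can be fed to BCELoss.

import torch
import torch.nn as nn

class BinarySegHead(nn.Module):
    # Hypothetical head: reduce backbone features to 1 mask channel and squash to [0, 1]
    def __init__(self, in_channels):
        super(BinarySegHead, self).__init__()
        self.classifier = nn.Conv2d(in_channels, 1, kernel_size=1)
        self.activation = nn.Sigmoid()

    def forward(self, x):
        return self.activation(self.classifier(x))

features = torch.randn(4, 64, 32, 32)          # assumed backbone output with 64 channels
probs = BinarySegHead(64)(features)            # probabilities in [0, 1], shape (4, 1, 32, 32)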
Hi @saeedizadi , can you share what you changed for the binary segmentation? I have set the number of classes = 2 everywhere and used Sigmoid + BCELoss. However, I got some weird results. How did you modify the output of the net with shape (2xHxW) to match the mask's shape (1xHxW)?
Hi @ZweeLe , let me know if you need more information.
@saeedizadi
Hi @saeedizadi ,
It means:
and
?
Hi @ZweeLe ,
I think you are right that binary segmentation only needs 1 output class / channel, with 0 corresponding to the first class and 1 to the second. So something like:

model = Net(1)
criterion = nn.BCEWithLogitsLoss()
outputs = model(inputs)
loss = criterion(outputs, targets)

should work. Further check
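For what it's worth, here is a self-contained version of that snippet with dummy tensors to check the shapes; the 1x1 convolution below just stands in for whatever Net(1) is in your setup.

import torch
import torch.nn as nn

model = nn.Conv2d(3, 1, kernel_size=1)                 # placeholder for Net(1)
criterion = nn.BCEWithLogitsLoss()                     # applies the sigmoid internally

inputs = torch.randn(4, 3, 64, 64)                     # batch of RGB images
targets = torch.randint(0, 2, (4, 1, 64, 64)).float()  # binary masks with values in {0, 1}

outputs = model(inputs)                                # raw logits, shape (4, 1, 64, 64)
loss = criterion(outputs, targets)
loss.backward()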
Hi @andrewssobral , you just need to follow the above modifications from @bodokaiser. For calculating the loss, the outputs and targets must be 1D arrays:
"thres" is the threshold pixel value you set to make a binary mask. You can find the best value with validation (mine is 120). If you use the default transform then you don't need to multiply by 255.
Thank you @ZweeLe !
It means?
@andrewssobral Yes, they are similar. However, the BCELoss only accepts 1D tensors, so you have to reshape your tensors, or you can define a new BCELoss2d function.
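One possible BCELoss2d, sketched from the description above (my own definition, not necessarily the one used in this thread): it flattens both tensors to 1D before calling the standard BCELoss. The usage below also shows the thresholding step mentioned earlier; the 120 value comes from this thread and should be tuned on validation.

import torch
import torch.nn as nn

class BCELoss2d(nn.Module):
    def __init__(self):
        super(BCELoss2d, self).__init__()
        self.bce = nn.BCELoss()

    def forward(self, probs, targets):
        # probs: (N, 1, H, W) sigmoid outputs; targets: (N, 1, H, W) binary masks
        return self.bce(probs.view(-1), targets.view(-1).float())

criterion = BCELoss2d()
probs = torch.sigmoid(torch.randn(4, 1, 64, 64))       # stand-in for the model's sigmoid output
masks = torch.randint(0, 2, (4, 1, 64, 64)).float()    # ground-truth binary masks
loss = criterion(probs, masks)

thres = 120 / 255.0                                    # pixel threshold on the [0, 1] scale, tuned on validation
pred_mask = (probs > thres).float()                    # final binary mask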
Thank you @ZweeLe , I followed your tips and now the algorithm is running, but the loss becomes negative, please see:
Is this normal?
@andrewssobral That is weird, the loss must be positive and decrease over time. You should check the model's outputs and the target masks before applying BCELoss. All values should be in the range 0~1. Did you apply the transformation to the labels?
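If the masks are stored as 0/255 images (an assumption on my part, based on the 255 discussion above), a label transform along these lines brings them into the 0~1 range that BCELoss expects; leaving them at 255 is one way the loss can end up negative.

import torch

def to_binary_label(mask):
    # mask: uint8 tensor with background pixels at 0 and foreground pixels at 255
    return (mask.float() / 255.0).round()

mask = torch.tensor([[0, 255], [255, 0]], dtype=torch.uint8)
print(to_binary_label(mask))                   # values are now 0.0 and 1.0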
Hi @ZweeLe ,
to this:
Now the algorithm is learning well, the loss is decreasing, and the preliminary results are very good! Thank you! And thanks to @bodokaiser for your advice ;-)
@andrewssobral Yes. The ToLabel() and Relabel() transforms are not necessary for binary segmentation. Can you share your training time? For me, it took nearly 5 seconds per iteration and 0.5 s to evaluate an image, which is a bit slow since I have a large dataset.
@ZweeLe as I have only 126 images and I'm using an Amazon EC2 p2.xlarge instance (NVIDIA K80 with 12 GB of RAM), the training time is very fast (less than 5 minutes). On my laptop with only a CPU it is much slower; I stopped the training before 10 epochs. I think the training would take around (or more than) 30 minutes for the whole dataset.
Can I directly use nn.CrossEntropyLoss() for binary segmentation? Do I need to set num_class=1? And I want to use a weighted loss. @andrewssobral @duylp @saeedizadi @bodokaiser
I would suggest using Sigmoid() + BinaryCrossEntropy for binary segmentation.
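For the weighted-loss part of the question, one option (my suggestion, not something this repo ships) is the pos_weight argument of nn.BCEWithLogitsLoss, which up-weights the positive (foreground) class when the masks are imbalanced:

import torch
import torch.nn as nn

pos_weight = torch.tensor([5.0])                       # e.g. foreground ~5x rarer than background
criterion = nn.BCEWithLogitsLoss(pos_weight=pos_weight)

logits = torch.randn(4, 1, 64, 64)
masks = torch.randint(0, 2, (4, 1, 64, 64)).float()
loss = criterion(logits, masks)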
@saeedizadi I don't understand how to set the parameter
@woaichipinngguo You should look at the definitions of the parameters. But for a first try, just use the default values and see the results; then you can customise them to your taste.
Hi @woaichipinngguo , you can find an example of training binary segmentation in PyTorch here:
When I set num_class=1 and use NLLLoss(), I encounter an error:
@woaichipinngguo You may have a label index exceeding the limit. Cross-entropy needs at least 2 classes.
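To make the two-class route concrete: with num_class=1, any label equal to 1 is already out of range for NLLLoss/CrossEntropyLoss, which matches the error above. A sketch like the following uses 2 output channels and integer class labels (per-class weights can be passed via the weight argument):

import torch
import torch.nn as nn

num_classes = 2
logits = torch.randn(4, num_classes, 64, 64)           # model output, one channel per class
labels = torch.randint(0, num_classes, (4, 64, 64))    # long tensor of class indices 0/1

criterion = nn.CrossEntropyLoss()                      # optionally weight=torch.tensor([w0, w1])
loss = criterion(logits, labels)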
Is it possible to make the weights of the trained models available?