Some errors in your shadow detection results on the test datasets #15
Also, could you upload your pretrained shadow detection model? We need it to compare against our method on our own dataset.
Q1: We generated the outputs again and obtained similar results. The similarity between SBU and SBU_crf may be caused by the binarization operation (prediction = (prediction > 90) * 255) that we apply before CRF on the SBU dataset. At the time, we observed that many pixels were positive but still below 127.5, so we compensated with a biased binarization threshold. After the submission, however, we found that a weighted BCE loss can balance this problem instead.
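To make the two ideas in this comment concrete, here is a minimal NumPy sketch of the biased binarization described above, plus one common formulation of a class-balanced (weighted) BCE loss. The binarization threshold of 90 comes from the comment; the exact weighting scheme of the authors' weighted BCE is not stated, so the inverse-frequency weighting below is an assumption for illustration only.

```python
import numpy as np

def binarize_with_bias(prediction, threshold=90):
    # Biased binarization applied before CRF on SBU: pixels above 90
    # (rather than the midpoint 127.5) are treated as shadow (255).
    return (prediction > threshold).astype(np.uint8) * 255

def weighted_bce(pred, target, eps=1e-7):
    # A common class-balanced BCE (assumed formulation, not necessarily
    # the authors'): weight each class by the other class's frequency,
    # so the rarer class contributes more to the loss.
    pred = np.clip(pred, eps, 1 - eps)
    pos = target.sum()
    neg = target.size - pos
    w_pos = neg / target.size
    w_neg = pos / target.size
    loss = -(w_pos * target * np.log(pred)
             + w_neg * (1 - target) * np.log(1 - pred))
    return loss.mean()

# Toy soft output in [0, 255]: values 95 and 150 exceed the biased
# threshold of 90 even though 95 is below 127.5.
soft = np.array([[30, 95], [150, 80]], dtype=np.float32)
print(binarize_with_bias(soft))
```

With the default midpoint threshold of 127.5, the pixel at 95 would be classified as non-shadow; the biased threshold of 90 recovers it, which matches the observation in the comment that many positive pixels fell below 127.5.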
Hello, I wonder whether you select different thresholds for different datasets. For example, do you use (prediction > 90) on SBU and (prediction > x) on ISTD with x != 90? Thanks!
Actually, we only apply this binarization on SBU. For UCF and ISTD, we save the soft output before CRF.
Hi, we think you may have uploaded the wrong result for SBU, because we find the SBU result is identical to the SBU_crf result.