focal loss modulator #1
Hi @bei-startdt Thanks for pointing this out! The implementation you mentioned is not very numerically stable (the same is true for the implementation in https://github.com/tensorflow/tpu/blob/master/models/official/retinanet/retinanet_model.py#L130-L162). When gamma is small (< 1), NaNs can occur during back-propagation. The full derivation can be found in the figure below. Hope this helps!
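Since the derivation figure is not reproduced in this thread, here is a minimal sketch of the idea, assuming labels y in {0, 1} and raw logits z (the helper names `focal_modulator` and `sigmoid_focal_loss` are illustrative, not the exact code in this repository): the modulating factor (1 - p_t)^gamma is evaluated in the log domain, where log(1 - p_t) = -y*z - log(1 + exp(-z)), so the NaN-prone pow(1 - p_t, gamma) with gamma < 1 is never differentiated directly.

```python
import tensorflow as tf

def focal_modulator(labels, logits, gamma):
    """(1 - p_t)^gamma computed in the log domain (illustrative sketch).

    With y in {0, 1} and p = sigmoid(z):
      p_t          = y * p + (1 - y) * (1 - p)
      log(1 - p_t) = -y * z - log(1 + exp(-z)) = -y * z - softplus(-z)
    so the modulator is exp(gamma * log(1 - p_t)), which stays in (0, 1]
    and has well-behaved gradients even for gamma < 1.
    """
    labels = tf.cast(labels, logits.dtype)
    log_one_minus_pt = -labels * logits - tf.math.softplus(-logits)
    return tf.exp(gamma * log_one_minus_pt)

def sigmoid_focal_loss(labels, logits, gamma=2.0):
    # Element-wise sigmoid cross-entropy, scaled by the focal modulator.
    ce = tf.nn.sigmoid_cross_entropy_with_logits(
        labels=tf.cast(labels, logits.dtype), logits=logits)
    return focal_modulator(labels, logits, gamma) * ce
```

Writing the same quantity as tf.pow(1.0 - p_t, gamma) is algebraically identical, but its gradient contains (1 - p_t)^(gamma - 1), which diverges as p_t approaches 1 when gamma < 1.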
Thanks a lot!
@richardaecn Hi, have you experimented on detection datasets such as COCO? If so, what were the results?
Hi @Angzz, we haven't tried it on detection datasets.
@richardaecn Hi, have you compared the class-balanced focal loss with the original focal loss using ResNet-50 or ResNet-101? When you did such a comparison in your paper, you used ResNet-32. Will stronger networks weaken the framework you proposed?
Hi @shawnthu, in the formulation, we are using 1 for positive labels and 0 for negative labels.
In fact we are both right, but your solution is more concise (^o^)/~
How do you infer the modulator in the code in your repo? For the focal loss in tensorflow/models/blob/master/research/object_detection, the focal loss form is the same as what is shown in the paper. Could you please tell me how to transform the paper form into your form? Thank you very much!
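For comparison, below is a sketch of the direct, paper-style form the question refers to (illustrative only, not the actual code in tensorflow/models). It computes the modulator from probabilities as pow(1 - p_t, gamma); this is the variant whose gradient can blow up, which motivates the log-domain form discussed above.

```python
import tensorflow as tf

def paper_form_modulator(labels, logits, gamma):
    """Direct 'paper form' of the focal modulator (illustrative only).

    Algebraically identical to the log-domain version, since
    (1 - p_t)^gamma = exp(gamma * log(1 - p_t)), but the gradient of
    pow(1 - p_t, gamma) contains (1 - p_t)^(gamma - 1), which diverges
    as p_t -> 1 when gamma < 1.
    """
    labels = tf.cast(labels, logits.dtype)
    p = tf.sigmoid(logits)
    p_t = labels * p + (1.0 - labels) * (1.0 - p)  # probability of the true class
    return tf.pow(1.0 - p_t, gamma)
```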