@chensnathan
I think the following passage may be the main cause of the NaN problem:
```python
normalized_cls_score = cls_score + objectness - torch.log(
    1. + torch.clamp(cls_score.exp(), max=self.INF) +
    torch.clamp(objectness.exp(), max=self.INF))
```
This code applies exp and then clamps the result to avoid explosion, but that still has a hidden danger: the exp may already have overflowed to inf before the clamp is applied, so the clamp comes too late. In particular, where the clamp saturates it contributes a zero gradient, which is multiplied by the inf gradient of exp in the backward pass, and 0 * inf yields NaN.
So I changed it to clamp first and then exp:
```python
normalized_cls_pred = cls_pred + obj_pred - torch.log(
    1. +
    torch.clamp(cls_pred, max=DEFAULT_EXP_CLAMP).exp() +
    torch.clamp(obj_pred, max=DEFAULT_EXP_CLAMP).exp())
```
where `DEFAULT_EXP_CLAMP = log(INF)`.
After the above modification, the NaN problem was no longer encountered.
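
For illustration, here is a minimal, self-contained sketch (not the repository's code; `INF = 1e8` is an assumed stand-in for the `self.INF` constant above) that reproduces the difference between the two orderings in isolation:

```python
# Minimal sketch, assuming INF = 1e8 as a stand-in for self.INF above.
import math
import torch

INF = 1e8
DEFAULT_EXP_CLAMP = math.log(INF)  # clamp threshold = log(INF)

x = torch.tensor(1000.0, requires_grad=True)  # logit large enough to overflow exp

# Original order: exp first, clamp second.
# Forward: exp(1000) overflows to inf, then clamp caps it at INF.
# Backward: the clamp contributes a zero gradient where it saturates,
# multiplied by exp's inf gradient -> 0 * inf = NaN.
y_bad = torch.clamp(x.exp(), max=INF)
y_bad.backward()
print(x.grad)  # tensor(nan)

x.grad = None

# Fixed order: clamp first, exp second.
# Forward: x is capped at log(INF) before exp, so exp never overflows.
# Backward: the gradient is exp(clamped_x) * 0 = 0, which is finite.
y_good = torch.clamp(x, max=DEFAULT_EXP_CLAMP).exp()
y_good.backward()
print(x.grad)  # tensor(0.)
```

Note that the forward values of the two orderings are essentially the same (both are capped near `INF`); the difference is entirely in the backward pass, which is why the NaN shows up in the gradients during training rather than in the loss value itself.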