Issue
https://github.com/PyTorchLightning/pytorch-lightning-bolts/blob/e6b10875d59a39a4dcf382d3a599528a40ba088c/pl_bolts/losses/self_supervised_learning.py#L17-L24
I am experimenting with the SimCLR framework for audio data, and I found that in the objective function, the exponential operation applied to the negative and positive samples may induce numerical instability, i.e. the loss becomes NaN. I don't think this is unexpected.
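For context, the linked lines compute the NT-Xent objective with an explicit torch.exp. Below is a minimal sketch of that pattern (a reconstruction following the names out_1, out_2, pos, and neg from this thread, not a verbatim copy of the linked file), showing how the loss goes to inf/nan when the inputs are not unit-normalized:

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(out_1, out_2, temperature):
    """NT-Xent with an explicit exp, in the style of the linked code (reconstruction)."""
    out = torch.cat([out_1, out_2], dim=0)          # [2B, dim]
    n_samples = len(out)

    # Exponentiated pairwise similarities.
    sim = torch.exp(torch.mm(out, out.t().contiguous()) / temperature)

    # neg: sum over all pairs except self-similarity (the diagonal).
    mask = ~torch.eye(n_samples, dtype=torch.bool, device=sim.device)
    neg = sim.masked_select(mask).view(n_samples, -1).sum(dim=-1)

    # pos: similarity between the two views of the same sample.
    pos = torch.exp(torch.sum(out_1 * out_2, dim=-1) / temperature)
    pos = torch.cat([pos, pos], dim=0)

    return -torch.log(pos / neg).mean()

torch.manual_seed(0)
raw_1, raw_2 = torch.randn(4, 128) * 10, torch.randn(4, 128) * 10

# Unnormalized inputs: similarities are unbounded, torch.exp overflows to inf,
# and the loss comes out as inf or nan.
print(nt_xent_loss(raw_1, raw_2, temperature=0.5))

# Unit-normalized inputs: similarities are cosines in [-1, 1], loss stays finite.
print(nt_xent_loss(F.normalize(raw_1, dim=1), F.normalize(raw_2, dim=1), temperature=0.5))
```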
Temporary solution
Use torch.nn.CrossEntropyLoss instead.
This is what some other open-source implementations do, e.g. this one: https://github.com/sthalles/SimCLR/blob/master/loss/nt_xent.py
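For illustration, here is a hedged sketch of that formulation (similar in spirit to the linked sthalles/SimCLR loss, not a verbatim copy): the pairwise similarities are treated as logits and the index of each sample's positive pair as the target, so F.cross_entropy performs the log-sum-exp through a numerically stable log-softmax instead of an explicit torch.exp.

```python
import torch
import torch.nn.functional as F

def nt_xent_ce(out_1, out_2, temperature):
    """NT-Xent via CrossEntropyLoss (sketch, not the exact linked implementation)."""
    out = F.normalize(torch.cat([out_1, out_2], dim=0), dim=1)   # [2B, dim]
    n = out.size(0)

    logits = torch.mm(out, out.t()) / temperature                # [2B, 2B]
    logits.fill_diagonal_(float('-inf'))     # a sample must not match itself

    # Row i's positive is its augmented counterpart: i + B for i < B, i - B otherwise.
    b = n // 2
    targets = torch.cat([torch.arange(b, n), torch.arange(0, b)])

    # cross_entropy applies log-softmax internally, i.e. a stable log-sum-exp.
    return F.cross_entropy(logits, targets)
```

Because the log-sum-exp happens inside the log-softmax, large logits are shifted by their row maximum before exponentiation, which is exactly the safeguard the explicit torch.exp version lacks.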
@jefflai108 out_1 and out_2 are normalized vectors and should not create numerical instability with the current loss function. For unit vectors the cosine similarity lies in [-1, 1], so the maximum value of any element after torch.exp is e and the minimum is 1/e (or e^{1/τ} and e^{-1/τ} once the temperature τ is applied).

Given that range for the exponentiated similarities of normalized vectors, every term is strictly positive, which also rules out division by 0 in the line loss = -torch.log(pos / neg).mean().

Can you verify for me that out_1 and out_2 are normalized in your code?
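A quick sanity check of that bound (a minimal sketch; the temperature divisor is taken as 1 here):

```python
import math
import torch
import torch.nn.functional as F

# For unit vectors, cosine similarity is in [-1, 1], so exp(sim) is in [1/e, e].
out_1 = F.normalize(torch.randn(256, 128), dim=1)
out_2 = F.normalize(torch.randn(256, 128), dim=1)

sim = torch.exp(torch.mm(out_1, out_2.t()))
assert sim.min().item() >= 1 / math.e - 1e-6
assert sim.max().item() <= math.e + 1e-6
```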