I replaced GLPDepth's backbone with MiT-B1. After about 10 epochs of training, the loss started increasing with every batch and eventually became NaN. Has the author encountered this? I changed nothing other than the backbone; the dataset is NYU Depth V2 and the input size is 640x480.
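A loss that grows batch after batch and then turns NaN is the classic signature of optimizer divergence, commonly caused by a learning rate that is too large for the new backbone (a smaller backbone like MiT-B1 may need a different rate, warmup, or gradient clipping than the original). As a toy illustration only (nothing here is from GLPDepth; the function and values are made up), gradient descent on a simple quadratic shows the same pattern, stdlib-only:

```python
import math

def train(lr, steps=600):
    # Gradient descent on f(x) = x**2, whose gradient is 2*x.
    # If lr is small enough, x shrinks each step and the loss
    # decreases; if lr is too large, each update overshoots the
    # minimum, |x| grows, and the loss climbs until it overflows
    # to inf (and would become NaN in further arithmetic).
    x = 1.0
    loss = x * x
    for _ in range(steps):
        grad = 2.0 * x
        x = x - lr * grad
        loss = x * x
        if math.isinf(loss) or math.isnan(loss):
            break  # diverged, just like a training run whose loss goes NaN
    return loss

print(train(lr=0.1))  # stable: loss shrinks toward 0
print(train(lr=1.5))  # unstable: loss grows every step and blows up
```

In a real run the usual first checks are the same idea in reverse: lower the learning rate, add or extend warmup, and clip gradients, then see whether the per-batch loss stops climbing.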