About SiLog Loss NaN #30

Open
cheryllbl opened this issue Dec 9, 2022 · 0 comments
Comments

@cheryllbl
I replaced GLPDepth's backbone with MiT-B1. After about 10 epochs of training, the loss increased with each batch and eventually became NaN. Has the author encountered this? I changed nothing other than the backbone; the dataset is NYU Depth V2 and the input size is 640x480.
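For context, a common cause of NaN in a SiLog (scale-invariant log) loss is taking the log of a non-positive predicted depth, or taking the square root of a variance term that floating-point error has pushed slightly below zero. Below is a minimal NumPy sketch of the loss with the usual guards (clamping predictions, masking invalid ground truth, clamping before the sqrt). The function name, `lambd`, and `eps` values are illustrative assumptions, not GLPDepth's actual implementation; the same guards apply in the PyTorch version.

```python
import numpy as np

def silog_loss(pred, target, lambd=0.5, eps=1e-6):
    """Scale-invariant log loss with NaN guards (illustrative sketch)."""
    pred = np.clip(pred, eps, None)   # avoid log(0) / log(negative)
    mask = target > eps               # ignore invalid (zero) depth pixels
    d = np.log(pred[mask]) - np.log(target[mask])
    # Clamp at 0 before sqrt: rounding can make this marginally negative.
    var = max((d ** 2).mean() - lambd * d.mean() ** 2, 0.0)
    return float(np.sqrt(var))

# A prediction that dips to exactly 0 would make an unguarded loss NaN;
# with the clamps above it stays finite.
pred = np.array([0.0, 1.0, 2.0, 4.0])
gt = np.array([1.0, 1.0, 2.0, 4.0])
loss = silog_loss(pred, gt)
```

If the guards are already in place, a diverging loss before the NaN (as described here) more often points to a learning rate that is too high for the new backbone, so lowering it or adding gradient clipping is worth trying first.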
