Silence layer deletes data during backprop #3151
Thanks for reporting @cvondrick. You're right, this is a bug and should be fixed. In practice, though, I think it would rarely be hit.
I agree that it is an odd case, but it can also be triggered by setting force_backward on the entire network (which is how I found this bug).
You're right, that would trigger it as well. I'll send a PR with the fix shortly. Thanks again!
SilenceLayer Backward bugfix (fixes #3151)
During Backward_*, the Silence layer performs the following operation when propagate_down[i] is set to true:
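The original snippet did not survive extraction, so here is a minimal self-contained sketch of the buggy behavior. The `Blob` struct and the function name `silence_backward_buggy` are illustrative stand-ins, not Caffe's actual API; in Caffe itself, the layer calls `caffe_set` on `bottom[i]->mutable_cpu_data()`, i.e. it zeroes the data pointer rather than the diff pointer:

```cpp
#include <algorithm>
#include <vector>

// Minimal stand-in for a Caffe Blob, just enough to illustrate the bug:
// a blob carries both forward activations (data) and gradients (diff).
struct Blob {
  std::vector<float> data;  // forward activations
  std::vector<float> diff;  // gradients written during backprop
};

// Sketch of the buggy Backward pass: when propagate_down[i] is true,
// it zeroes the *data* (the equivalent of mutable_cpu_data() in Caffe),
// destroying the bottom blob's activations.
void silence_backward_buggy(std::vector<Blob*>& bottom,
                            const std::vector<bool>& propagate_down) {
  for (size_t i = 0; i < bottom.size(); ++i) {
    if (propagate_down[i]) {
      std::fill(bottom[i]->data.begin(), bottom[i]->data.end(), 0.0f);  // bug
    }
  }
}
```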
and similarly for the GPU.
Usually this code will not run because the Silence layer does not generally need backprop. However, if you force the network to do backprop, this will have the consequence of overwriting the bottom data blob with all zeros.
I think the correct behavior is to set the bottom diff to 0:
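The proposed fix's snippet was also lost in extraction; here is the same illustrative sketch with the one-line change applied. Again, the `Blob` struct and function name are stand-ins: the actual Caffe fix replaces `mutable_cpu_data()` with `mutable_cpu_diff()` (and `mutable_gpu_data()` with `mutable_gpu_diff()` on the GPU path), so only the gradient buffer is zeroed:

```cpp
#include <algorithm>
#include <vector>

// Minimal stand-in for a Caffe Blob (illustrative, not Caffe's actual API).
struct Blob {
  std::vector<float> data;  // forward activations
  std::vector<float> diff;  // gradients written during backprop
};

// Fixed Backward pass: the Silence layer contributes no gradient to its
// inputs, so the correct behavior is to zero the bottom *diff* (the
// equivalent of mutable_cpu_diff() in Caffe) and leave the data untouched.
void silence_backward_fixed(std::vector<Blob*>& bottom,
                            const std::vector<bool>& propagate_down) {
  for (size_t i = 0; i < bottom.size(); ++i) {
    if (propagate_down[i]) {
      std::fill(bottom[i]->diff.begin(), bottom[i]->diff.end(), 0.0f);
    }
  }
}
```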
Should this be changed?
Thanks!