Dropout with prob == 0 doesn't validate consistently #1799
Comments
Using

O.o any suggestion as to what we should do?
Tried to flip the drop prob to 1.0, and it looks like the issue with rand_like is real: we are producing samples exactly equal to 1.0.

Fixes #1799: 1. Updates rand_like by remapping output == 1 to 0 via `where`; 2. Patches codegen float output.
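A minimal NumPy sketch of the remap described in the fix (the array values here are illustrative, not taken from the actual kernel): samples that land exactly on 1.0 are pushed back to 0.0 via `where`, so the output stays in the half-open interval [0, 1).

```python
import numpy as np

# Pretend these are raw uniform samples; one lands exactly on 1.0.
samples = np.array([0.25, 1.0, 0.75])

# Remap output == 1 to 0, analogous to the `where`-based fix for rand_like.
remapped = np.where(samples == 1.0, 0.0, samples)
# remapped -> [0.25, 0.0, 0.75]
```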
🐛 Describe the bug
The following script doesn't validate consistently on TOT. It seems we may still be dropping out some values even though probability == 0. I think this may be because of https://github.com/csarofeen/pytorch/blob/devel/torch/csrc/jit/codegen/cuda/ops/composite.cpp#L31, which maybe should be `le`, not `lt`?

Versions

TOT
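To see why the comparison operator matters, here is a small sketch (not the fuser's actual kernel) of the keep-mask logic. If the RNG can emit exactly 1.0, then with prob == 0 a strict `lt` against (1 - prob) drops that element, while `le` keeps everything:

```python
import numpy as np

# Pretend rand_like produced these uniform samples, including exactly 1.0.
r = np.array([0.0, 0.5, 1.0])
prob = 0.0  # dropout probability of zero should drop nothing

mask_lt = r < (1.0 - prob)   # strict comparison: the 1.0 sample is dropped
mask_le = r <= (1.0 - prob)  # inclusive comparison: nothing is dropped
# mask_lt -> [True, True, False]
# mask_le -> [True, True, True]
```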