When implementing nn.Dropout (training mode), passing either self.p or (1 - self.p) as the prob argument of init.randb allows test_nn_dropout's cases to pass.
Although test_nn_dropout_forward_1 and test_nn_dropout_backward_1 pass 0.45 and 0.26 as prob respectively, under the current random seed the array generated by init.rand always has half its values below prob and half above. In fact, none of the generated values falls between prob and 1 - prob, so thresholding at either value produces the exact same boolean mask, leading to the result above.
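For context, here is a minimal sketch of the intended training-mode forward, emulated with NumPy; the helper name is hypothetical, and the assumed semantics are that init.randb(p) keeps a unit with probability p, so the correct argument is 1 - self.p with inverted-dropout rescaling by 1/(1 - self.p):

```python
import numpy as np

def dropout_forward(x, p, rng):
    # Emulates a needle-style init.randb(*shape, p=1-p): each entry is
    # True (kept) with probability 1 - p. Kept units are rescaled by
    # 1/(1 - p) so the expected activation is unchanged (inverted dropout).
    mask = rng.random(x.shape) < (1 - p)
    return x * mask / (1 - p)

rng = np.random.default_rng(0)
x = np.ones((2, 3))
out = dropout_forward(x, 0.5, rng)
# With p = 0.5, every surviving entry of an all-ones input becomes 2.0
# and every dropped entry becomes 0.0.
```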
Specifically, the output of init.randb would be:
test_nn_dropout_forward_1 :
[[ True False False]
[ True False True]]
test_nn_dropout_backward_1:
[[ True False False]
[ True False True]]
because the output of the init.rand function is:
test_nn_dropout_forward_1 :
[[0.01457496 0.91874701 0.90071485]
[0.03342143 0.95694934 0.13720932]]
test_nn_dropout_backward_1:
[[0.01457496 0.91874701 0.90071485]
[0.03342143 0.95694934 0.13720932]]
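The effect is easy to reproduce outside the framework. A small NumPy check over the init.rand values above (assuming randb(p) amounts to rand < p) confirms that thresholding at prob and at 1 - prob yields identical masks for both tests:

```python
import numpy as np

# Values produced by init.rand under the fixed test seed (copied from above;
# both tests happen to generate the same array).
rand = np.array([[0.01457496, 0.91874701, 0.90071485],
                 [0.03342143, 0.95694934, 0.13720932]])

# No entry lies in the interval (0.26, 0.74) or (0.45, 0.55), so comparing
# against p and against 1 - p selects exactly the same entries.
for p in (0.45, 0.26):
    mask_p  = rand < p        # mask from passing self.p
    mask_1p = rand < 1 - p    # mask from passing 1 - self.p
    print(p, np.array_equal(mask_p, mask_1p))  # prints "0.45 True", "0.26 True"
```

This is why both choices of argument pass: the tests never sample a value between the two thresholds, so they cannot distinguish the correct implementation from the inverted one.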