
test_nn_dropout cases fail to validate drop_prob correctly #18

Open
wong-1994 opened this issue Dec 13, 2024 · 0 comments
When implementing nn.Dropout (training mode), passing either self.p or (1 - self.p) as the prob argument to init.randb passes test_nn_dropout's cases.
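For context, the intended semantics of dropout in training mode keep each element with probability 1 - p and rescale the kept values. A minimal NumPy sketch (not the project's needle code; `dropout` and the threshold comparison are illustrative assumptions):

```python
import numpy as np

def dropout(x, p, rng):
    # Keep each element with probability 1 - p (True = keep),
    # then rescale so the expected value of the output matches x.
    mask = rng.random(x.shape) < (1 - p)
    return x * mask / (1 - p)

x = np.arange(6.0).reshape(2, 3)
rng = np.random.default_rng(0)
print(dropout(x, 0.45, rng))
```

Passing p instead of 1 - p to the mask generator silently inverts the drop probability, which is exactly the mistake the current tests fail to catch.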

Although test_nn_dropout_forward_1 and test_nn_dropout_backward_1 pass 0.45 and 0.26 as the prob parameter respectively, under the current random seed half of the values generated by init.rand fall below both prob and 1 - prob while the other half lie above both, so prob and 1 - prob produce identical masks.

Specifically, the output of init.randb is:

test_nn_dropout_forward_1 :
[[ True False False]
[ True False True]]

test_nn_dropout_backward_1:
[[ True False False]
[ True False True]]

because the output of init.rand is:

test_nn_dropout_forward_1 :
[[0.01457496 0.91874701 0.90071485]
[0.03342143 0.95694934 0.13720932]]

test_nn_dropout_backward_1:
[[0.01457496 0.91874701 0.90071485]
[0.03342143 0.95694934 0.13720932]]
