tests for ReLU do not provide negative inputs #9

Open
navalnica opened this issue Nov 6, 2022 · 0 comments

test_nn_relu_forward_1() and test_nn_relu_backward_1() both pass only non-negative inputs to the ReLU layer, so they never test how the layer behaves on negative inputs.

Inputs are generated as x = get_tensor(*shape), and the get_tensor() function samples data uniformly from the [0, 5) interval (in steps of 0.05):

import numpy as np
import needle as ndl

def get_tensor(*shape, entropy=1):
    np.random.seed(np.prod(shape) * len(shape) * entropy)
    return ndl.Tensor(np.random.randint(0, 100, size=shape) / 20, dtype="float32")
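
One possible fix is to sample from a range that is symmetric around zero instead; the sketch below is only a suggestion (the [-2.5, 2.5) range is an assumption, any change that yields negative entries would do):

def get_tensor(*shape, entropy=1):
    np.random.seed(np.prod(shape) * len(shape) * entropy)
    # Hypothetical variant: sample from [-2.5, 2.5) instead of [0, 5),
    # so roughly half the entries are negative.
    return ndl.Tensor(np.random.randint(-50, 50, size=shape) / 20, dtype="float32")

With inputs like these, the forward test would also check that negative entries map to 0, and the backward test would check that the gradient is zeroed on those entries.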