
Different dropout rate between training and testing? #1

Open
elwell67 opened this issue Mar 7, 2018 · 0 comments

elwell67 commented Mar 7, 2018

Any suggestions on how to implement the stochastic predictor with a dropout rate different from the one used in training? I have tried modifying the layer attribute (.rate), but this does not change the output of the stochastic predictor function (which is built on the Keras backend function).
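A likely cause is that the compiled backend function captures the dropout rate as a constant when the graph is built, so mutating `.rate` afterwards has no effect unless the predictor function is rebuilt. One workaround is to apply Monte Carlo dropout yourself, outside the compiled graph, at an arbitrary test-time rate. Below is a minimal NumPy sketch for a single dense layer using inverted dropout; the function name and signature are illustrative, not part of this repo:

```python
import numpy as np

def mc_dropout_predict(weights, x, rate, n_samples=100, rng=None):
    """Monte Carlo prediction for one dense layer, applying dropout
    to the inputs with an arbitrary test-time `rate`.

    weights: (d_in, d_out) weight matrix taken from the trained model.
    x:       (batch, d_in) inputs.
    Inputs are rescaled by 1/(1 - rate) (inverted dropout) so the
    expected activation matches the no-dropout forward pass.
    Returns the per-output predictive mean and standard deviation
    across the stochastic forward passes.
    """
    rng = rng or np.random.default_rng(0)
    keep = 1.0 - rate
    preds = []
    for _ in range(n_samples):
        mask = rng.random(x.shape) < keep   # Bernoulli keep-mask
        preds.append(((x * mask) / keep) @ weights)
    preds = np.stack(preds)
    return preds.mean(axis=0), preds.std(axis=0)
```

Because the rate is an ordinary Python argument here rather than a graph constant, it can differ freely between training and testing. The same idea applies to a full Keras model: rebuild the backend function (or clone the model with the new rates and copy the weights over) instead of mutating `.rate` in place.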
