
Confusion about initialization in bigger nets #10

Open
koenigpeter opened this issue Jun 26, 2019 · 2 comments

Comments

koenigpeter commented Jun 26, 2019

Hey,
I'm trying out concrete dropout with bigger nets (namely DenseNet121 and ResNet18), and for that I tried to port the Keras implementation of spatial concrete dropout to PyTorch.
Since it works for DenseNet121 (the model converges) but, strangely, not for ResNet18, I was wondering whether the initialization I used was wrong.
For both weight_regularizer and dropout_regularizer I used the initialization given in the MNIST example of the spatial concrete dropout Keras implementation (both obtained by dividing by the length of the training dataset). However, looking at the paper, you seem to have used 0.01 x N x H x W for the dropout regularizer with bigger models, and that multiplication would give a much, much bigger factor than the 2. / N specified in the example.
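For reference, here is roughly what I mean (just a sketch with made-up numbers, not my actual training code; the first two lines follow the Keras MNIST example, the last one is my reading of the paper):

```python
# Illustrative values only (assumptions, not real hyperparameters).
N = 50000        # number of training examples
l = 1e-4         # prior length scale -- the value I'm unsure about
H, W = 7, 7      # example spatial size of the feature maps seen by the layer

# Initialization from the Keras MNIST spatial concrete dropout example:
weight_regularizer = l ** 2 / N      # ~2e-13 with these numbers
dropout_regularizer = 2. / N         # 4e-5

# My reading of the paper for bigger models:
dropout_regularizer_paper = 0.01 * N * H * W   # 24500 -- orders of magnitude larger

print(dropout_regularizer, dropout_regularizer_paper)
```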
Which initialization is correct?
I would greatly appreciate if you could clear up my confusion!
Cheers!

axel971 commented Feb 12, 2021

Hi!
I agree, and I am confused for the same reasons. I read the paper and did not understand how the weight regularizer and dropout regularizer are initialized. Could you please tell us what the prior length scale means, and which value to assign to this variable?

JFagin commented Feb 3, 2022

I am also confused about this.
