Number of parameters inconsistent with the paper? #11
Comments (excerpts):
- "Hi @yinmustark, Maybe double check your code."
- "Thanks for your reply!"
- "@yinmustark,"
- "Problem solved."
Hello! Thanks for sharing this awesome repository!
When I try your get_model_summary function in the demo file with the default model (m2o DIN, nonlinear + context), I get a parameter count of 5,953,515.
According to your paper, it should be about 8.15M, right?
I also notice that when setting use_nonlinear=False, the parameter count and GFLOPs match the numbers reported in your paper (with or without context).
Do you have a clue about this discrepancy?
Here's what the summary function returns:
Total Parameters: 5,953,515
Total Multiply Adds (For Convolution and Linear Layers only): 5.189814209938049 GFLOPs
Number of layers:
- Conv2d: 109
- BatchNorm2d: 88
- ReLU6: 71
- DepthwiseM2OIndexBlock: 5
- InvertedResidual: 17
- _ASPPModule: 4
- AdaptiveAvgPool2d: 1
- Dropout: 1
- ASPP: 1
- IndexedUpsamlping: 7
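In case it helps to narrow this down, here is a minimal sketch of how one could cross-check the count directly from the model's parameters, independent of the summary helper. `build_model()` is a hypothetical placeholder for however the demo constructs the default m2o DIN (nonlinear + context) network; everything else is plain PyTorch.

```python
from collections import defaultdict

# Hypothetical stand-in for the demo's model construction.
model = build_model()

# Count every learnable parameter directly, independent of get_model_summary,
# to see which figure (5.95M vs. 8.15M) it agrees with.
total = sum(p.numel() for p in model.parameters())
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"total: {total:,}  trainable: {trainable:,}")

# Group the count by top-level submodule to localize where the ~2.2M gap sits
# (e.g., backbone vs. index blocks vs. decoder).
per_module = defaultdict(int)
for name, p in model.named_parameters():
    per_module[name.split(".")[0]] += p.numel()
for name, n in sorted(per_module.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {n:,}")
```

Comparing this breakdown against the per-block sizes implied by the paper might show whether the difference comes from one specific module or from how the summary function counts parameters.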