How to calculate the FLOPs and Params after freezing and pruning a model #4
Thanks for your attention to our work! I'm glad to address your questions. You can try using the " For the freezing operation, you can count only the unfrozen part, since the frozen part doesn't need to be trained. I hope these answers are helpful. Once again, thank you for your interest in our work!
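A minimal sketch of the "count only the unfrozen part" idea suggested above. The layer names and parameter counts here are made up for illustration; the point is simply that frozen entries are excluded from the trainable total:

```python
# Model summarized as (layer_name, param_count, frozen) entries.
# Names and counts are illustrative, not taken from the actual model.
layers = [
    ("conv1", 1_728, True),    # frozen: excluded from the trainable total
    ("conv2", 36_864, True),   # frozen
    ("fc",    5_130, False),   # still trainable
]

trainable_params = sum(n for _, n, frozen in layers if not frozen)
total_params = sum(n for _, n, _ in layers)

print(trainable_params, total_params)  # 5130 43722
```

The PyTorch equivalent of the trainable count is `sum(p.numel() for p in model.parameters() if p.requires_grad)`, since freezing is typically done by setting `requires_grad = False`.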
Thank you very much for your patient response! I had also considered using the
Before reaching out to you, I tried many methods to resolve the issue, but none were successful. Have you encountered this problem before? How did you solve it? I'm sorry to bother you again, and I look forward to your reply.
Dear Dr. Jia,
I attempted to run your code and encountered some questions regarding the calculation of the model's params and FLOPs.
import torch
from thop import profile  # profiler inferred from the call below

dummy_input = torch.randn(1, 3, 32, 32)
flops, params = profile(glb_model, (dummy_input,))
It seems you are using the statements above to compute the model's params and FLOPs. However, after operations such as freezing and pruning, the resulting params and FLOPs should change as described in your paper. Yet regardless of the pruning ratio, the computed values remain unchanged and match the pre-pruning model's params and FLOPs.
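One common cause of this symptom, offered here as an assumption rather than a diagnosis of this particular codebase: mask-based pruning (e.g. `torch.nn.utils.prune`) zeroes weights without shrinking the layers, so a shape-based profiler like thop still sees the original architecture and reports the pre-pruning numbers. The sketch below computes conv FLOPs analytically from layer shapes to show that only structural channel removal changes the count:

```python
def conv2d_flops(in_c, out_c, k, out_h, out_w):
    """FLOPs of a Conv2d from its shape alone (one multiply-accumulate
    counted as 2 FLOPs). Shape-based profilers work this way, which is
    why weight masks (zeroed entries) leave the reported count unchanged."""
    return 2 * in_c * out_c * k * k * out_h * out_w

full   = conv2d_flops(in_c=3, out_c=64, k=3, out_h=32, out_w=32)
masked = conv2d_flops(in_c=3, out_c=64, k=3, out_h=32, out_w=32)  # masks don't alter shapes
struct = conv2d_flops(in_c=3, out_c=32, k=3, out_h=32, out_w=32)  # half the channels removed

print(full, masked, struct)  # masked == full; struct == full // 2
```

If this is what is happening, one fix is to make the pruning structural before profiling (e.g. rebuild the layers with the kept channels, or call `torch.nn.utils.prune.remove` and physically drop the pruned channels) so the profiler sees the smaller shapes.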
I suspect that my understanding of the code might be insufficient, so I would like to ask for your guidance on how to correctly compute the FLOPs and params after the model has undergone freezing and pruning.
Thank you for your time and assistance.