Hello, I'd like to know: if there is a self-attention module in the backbone, is it still accurate to use `get_flops.py` to calculate FLOPs?
The FLOP counter relies on the implementation in MMCV here. It seems that the FLOPs of multi-head attention and customized attention modules are not counted.
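For illustration, here is a minimal toy sketch (the `TinyAttnBackbone` module is made up for this example, not an mmdet backbone, and the exact counter behaviour may differ across MMCV versions). The conv layer contributes to the reported total, but the attention's matmuls add nothing:

```python
import torch
import torch.nn as nn
from mmcv.cnn import get_model_complexity_info


class TinyAttnBackbone(nn.Module):
    """Toy backbone: one conv followed by multi-head self-attention."""

    def __init__(self, dim=64, num_heads=4):
        super().__init__()
        self.conv = nn.Conv2d(3, dim, kernel_size=3, padding=1)
        self.attn = nn.MultiheadAttention(dim, num_heads)

    def forward(self, x):
        x = self.conv(x)                        # (B, C, H, W)
        b, c, h, w = x.shape
        tokens = x.flatten(2).permute(2, 0, 1)  # (HW, B, C)
        out, _ = self.attn(tokens, tokens, tokens)
        return out


model = TinyAttnBackbone()
# The counter walks module hooks, so nn.MultiheadAttention (an
# unsupported module type) is skipped and the total underestimates
# the true cost for transformer-style backbones.
flops, params = get_model_complexity_info(
    model, (3, 32, 32), print_per_layer_stat=False)
print(flops, params)
```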
OK, I see that now, thank you~
In fact, if the `torch.bmm` operator were counted, the FLOPs would be accurate. Also, `LayerNorm` is ignored by the `get_model_complexity_info` function.
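For reference, a back-of-the-envelope count of the two `torch.bmm` matmuls inside standard self-attention (my own formula, not an MMCV API) would look like this:

```python
def attn_bmm_macs(batch, seq_len, dim, num_heads):
    """MACs for the two batched matmuls in scaled dot-product attention."""
    head_dim = dim // num_heads
    # Q @ K^T: (B*h, N, d) x (B*h, d, N) -> B*h * N * N * d MACs
    qk = batch * num_heads * seq_len * seq_len * head_dim
    # attn @ V: (B*h, N, N) x (B*h, N, d) -> same count
    av = batch * num_heads * seq_len * seq_len * head_dim
    return qk + av  # multiply-accumulates; double this for raw FLOPs


# Matches the toy backbone above: 32x32 feature map -> 1024 tokens.
print(attn_bmm_macs(batch=1, seq_len=32 * 32, dim=64, num_heads=4))
```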
Thanks for your suggestion. Would you like to create a PR to MMCV? If not, we may add this feature to MMCV next month.
Oh! I'm sorry, but I don't actually know how to implement it.
So I'm looking forward to the MMCV update!
Thank you~