
[Fix] Restrict the warning message #1267

Merged: 22 commits merged into open-mmlab:master on Sep 7, 2021

Conversation

@yyz561 (Contributor) commented Aug 13, 2021

Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry; just make the pull request and seek help from the maintainers.

Motivation

When I implemented a CenterNet2 model in mmdet, there was a situation where I had to use GroupNorm after Conv2d, and the Conv2d layer was supposed to have a bias. Once I finished the modeling part, I got the warning message 'ConvModule has norm and bias at the same time'. At first, I was not so sure about this warning; it made me uneasy. Then I did a little research and realized that for BatchNorm (not so sure about SyncBN), the bias does not affect the calculation. But for other kinds of norm, e.g., GN, I am pretty sure the bias does have an influence. I think we should restrict when this warning message is emitted, since it can be quite misleading for newcomers.
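
For a quick sanity check of the BatchNorm claim, here is a minimal stand-alone snippet (arbitrary shapes and seeds, not part of this PR); re-seeding before each conv gives both layers identical weights, so any difference could only come from the bias:

import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(2, 4, 8, 8)

torch.manual_seed(1)
conv_with_bias = nn.Conv2d(4, 4, 3, padding=1, bias=True)
torch.manual_seed(1)
conv_no_bias = nn.Conv2d(4, 4, 3, padding=1, bias=False)

bn = nn.BatchNorm2d(4)  # training mode: uses per-channel batch statistics
out_with_bias = bn(conv_with_bias(x))
out_no_bias = bn(conv_no_bias(x))

# The per-channel bias only shifts the per-channel mean, which BN subtracts,
# so the two outputs agree up to floating-point noise.
print((out_with_bias - out_no_bias).abs().max())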

Modification

Restrict the warning message 'ConvModule has norm and bias at the same time' to the BN, BN1d, BN2d and BN3d normalization types.
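
As a rough sketch of the kind of check this implies (a hypothetical helper, not the actual code in mmcv/cnn/bricks/conv_module.py, and written to also cover the instance-norm case that comes up later in this thread):

import warnings

import torch.nn as nn

def warn_if_conv_bias_is_redundant(norm_layer, conv_has_bias):
    # Only norm layers that cancel a preceding conv bias should trigger the
    # warning: batch norm (and, per the later discussion, instance norm).
    bias_cancelling_norms = (nn.modules.batchnorm._BatchNorm,
                             nn.modules.instancenorm._InstanceNorm)
    if conv_has_bias and isinstance(norm_layer, bias_cancelling_norms):
        warnings.warn('Unnecessary conv bias before batch/instance norm')

warn_if_conv_bias_is_redundant(nn.GroupNorm(2, 4), conv_has_bias=True)  # silent
warn_if_conv_bias_is_redundant(nn.BatchNorm2d(4), conv_has_bias=True)   # warns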

BC-breaking (Optional)

Does the modification introduce changes that break the backward-compatibility of the downstream repos?
If so, please describe how it breaks the compatibility and how the downstream projects should modify their code to keep compatibility with this PR.

Use cases (Optional)

If this PR introduces a new feature, it is better to list some use cases here, and update the documentation.

Checklist

  1. Pre-commit or other linting tools are used to fix the potential lint issues.
  2. The modification is covered by complete unit tests. If not, please add more unit tests to ensure correctness.
  3. If the modification has potential influence on downstream projects, this PR should be tested with downstream projects, like MMDet or MMCls.
  4. The documentation has been modified accordingly, like docstring or example tutorials.

@CLAassistant commented Aug 13, 2021

CLA assistant check
All committers have signed the CLA.

@zhouzaida (Collaborator) commented Aug 13, 2021

> CLA assistant check
> Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
> You have signed the CLA already but the status is still pending? Let us recheck it.

Hi @yyz561, please sign the CLA.

@zhouzaida changed the title from "[FIX] Restrict the warning message" to "[Fix] Restrict the warning message" on Aug 13, 2021
@zhouzaida (Collaborator)

hi, please fix the CI

@yyz561 (Contributor, Author) commented Aug 14, 2021

> hi, please fix the CI

Sorry, didn't notice that the code was too long; will fix it immediately.

@yyz561 (Contributor, Author) commented Aug 14, 2021

> hi, please fix the CI

It seems that my password has not been valid for git push since yesterday. It took me some time to solve the problem; sorry for the delay. It should work now.

@zhouzaida (Collaborator)

> hi, please fix the CI
>
> It seems that my password has not been valid for git push since yesterday. It took me some time to solve the problem; sorry for the delay. It should work now.

Yep, thanks for your contribution.

@zhouzaida (Collaborator)

we also need to add a unittest

@yyz561 (Contributor, Author) commented Aug 16, 2021

> we also need to add a unittest

will this work? I didn't

> we also need to add a unittest

Sorry mate, my first time to write a unit test, will this work?

@zhouzaida (Collaborator)

> > we also need to add a unittest
>
> will this work? I didn't
>
> > we also need to add a unittest
>
> Sorry mate, my first time to write a unit test, will this work?

I think it will work
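
For readers wondering what such a test might look like, here is a hedged sketch (the test actually added in this PR is not reproduced in this thread, and the exact warning text and category are assumptions): it checks that the warning fires for a conv bias followed by BN but not for one followed by GN.

import warnings

import pytest

from mmcv.cnn import ConvModule

def test_conv_module_bias_warning():
    # bias + BN: the bias is redundant, so a warning is expected
    with pytest.warns(UserWarning, match='Unnecessary conv bias'):
        ConvModule(3, 8, 3, bias=True, norm_cfg=dict(type='BN'))

    # bias + GN: the bias does affect the output, so no warning should appear
    with warnings.catch_warnings():
        warnings.simplefilter('error')
        ConvModule(3, 8, 3, bias=True, norm_cfg=dict(type='GN', num_groups=2))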

@yyz561 (Contributor, Author) commented Aug 16, 2021

> > we also need to add a unittest
> >
> > will this work? I didn't
> >
> > we also need to add a unittest
> >
> > Sorry mate, my first time to write a unit test, will this work?
>
> I think it will work

I am exhausted by the formatting issues... both here and in another PR. Could you please name a few external tools I can use? Thanks a lot.

@zhouzaida (Collaborator)

Hi, you can refer to https://github.com/open-mmlab/mmcv/blob/master/CONTRIBUTING.md.

@codecov (bot) commented Aug 18, 2021

Codecov Report

Merging #1267 (76af0d0) into master (8aab4f2) will increase coverage by 0.05%.
The diff coverage is 93.10%.

❗ Current head 76af0d0 differs from pull request most recent head 817ea57. Consider uploading reports for the commit 817ea57 to get more accurate results

@@            Coverage Diff             @@
##           master    #1267      +/-   ##
==========================================
+ Coverage   68.23%   68.28%   +0.05%     
==========================================
  Files         160      161       +1     
  Lines       10722    10744      +22     
  Branches     1969     1973       +4     
==========================================
+ Hits         7316     7337      +21     
- Misses       3022     3023       +1     
  Partials      384      384              
| Flag | Coverage Δ |
| --- | --- |
| unittests | 68.28% <93.10%> (+0.05%) ⬆️ |

Flags with carried forward coverage won't be shown.

| Impacted Files | Coverage | Δ |
| --- | --- | --- |
| mmcv/runner/default_constructor.py | 77.77% <77.77%> | (ø) |
| mmcv/cnn/bricks/conv_module.py | 100.00% <100.00%> | (ø) |
| mmcv/runner/__init__.py | 100.00% <100.00%> | (ø) |
| mmcv/runner/builder.py | 100.00% <100.00%> | (ø) |
| mmcv/runner/hooks/logger/pavi.py | 71.42% <100.00%> | (+0.41%) ⬆️ |
| mmcv/version.py | 82.35% <100.00%> | (ø) |
| mmcv/video/processing.py | 77.41% <100.00%> | (+0.37%) ⬆️ |
| mmcv/cnn/utils/weight_init.py | 85.92% <0.00%> | (+0.36%) ⬆️ |

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 8aab4f2...817ea57.

@zhouzaida (Collaborator) left a comment

LGTM

@zhouzaida (Collaborator)

Hi @yyz561, after reading the GN formula, I found that the conv bias also has no effect on the output of GN.

@yyz561 (Contributor, Author) commented Aug 23, 2021

> Hi @yyz561, after reading the GN formula, I found that the conv bias also has no effect on the output of GN.

Hey, I didn't read the actual code, but it is easy to do a GN experiment. Try the following code:

import torch 
import torch.nn as nn
torch.manual_seed(0)
input = torch.randn(1, 4, 4, 4)
conv = nn.Conv2d(4, 4, 3, padding=1, bias=True)
norm = nn.GroupNorm(2, 4)
output = norm(conv(input))
print(output[0, 0, :, :].view(-1))

For comparison (the same seed means the conv weights are identical; only the bias term differs):

import torch 
import torch.nn as nn
torch.manual_seed(0)
input = torch.randn(1, 4, 4, 4)
conv = nn.Conv2d(4, 4, 3, padding=1, bias=False)
norm = nn.GroupNorm(2, 4)
output = norm(conv(input))
print(output[0, 0, :, :].view(-1))
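
The difference the two runs print is expected: GroupNorm pools its statistics over all channels in a group, so a per-channel conv bias shifts the group mean only by the average bias and, unless every channel in a group happens to share the same bias, it also changes the group variance. A small isolated check of just that effect (made-up per-channel biases, not from the PR):

import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(1, 4, 4, 4)
bias = torch.tensor([1.0, -1.0, 0.5, -0.5])   # stand-in for per-channel conv biases
y = x + bias.view(1, 4, 1, 1)

gn = nn.GroupNorm(2, 4, affine=False)
# Group statistics mix channels with different biases, so the outputs differ.
print((gn(x) - gn(y)).abs().max())            # clearly non-zero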

@yyz561 (Contributor, Author) commented Aug 25, 2021

Hey @zhouzaida, I wrote a blog post about the problem. At the same time, I realized that the conv bias also has no effect on the output of Instance Norm. Could you please have a look at the blog? https://zhuanlan.zhihu.com/p/403444336
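
The Instance Norm point can be checked in the same spirit as the GN experiment above: IN normalizes each channel of each sample separately, so a per-channel constant shift is removed by the mean subtraction (again a made-up example, not taken from the blog).

import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(1, 4, 4, 4)
bias = torch.tensor([1.0, -1.0, 0.5, -0.5])   # stand-in for per-channel conv biases
y = x + bias.view(1, 4, 1, 1)

inorm = nn.InstanceNorm2d(4, affine=False)
# Per-channel normalization subtracts the constant shift, so outputs match.
print((inorm(x) - inorm(y)).abs().max())      # ~0 up to floating-point noise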

@zhouzaida (Collaborator)

> Hey @zhouzaida, I wrote a blog post about the problem. At the same time, I realized that the conv bias also has no effect on the output of Instance Norm. Could you please have a look at the blog? https://zhuanlan.zhihu.com/p/403444336

Yep, you are right. Thanks for your patience.

@zhouzaida mentioned this pull request on Aug 25, 2021
@yyz561 (Contributor, Author) commented Aug 27, 2021

@zhouzaida do you agree with changing the warning message to: "Unnecessary conv bias before batch/instance norm"?

@zhouzaida (Collaborator)

> @zhouzaida do you agree with changing the warning message to: "Unnecessary conv bias before batch/instance norm"?

Agree

@yyz561 (Contributor, Author) commented Aug 27, 2021

> > @zhouzaida do you agree with changing the warning message to: "Unnecessary conv bias before batch/instance norm"?
>
> Agree

"First-time contributors need a maintainer to approve running workflows."

Thank you very much. Although it is a trivial change, I learned a lot through the process.

@ZwwWayne merged commit 99088c8 into open-mmlab:master on Sep 7, 2021