
[Refactory] Merge BEiT and ConvNeXt's LR decay optimizer constructors #1438

Merged
merged 36 commits into master from moveoptim
Apr 27, 2022

Conversation

linfangjian01
Contributor

@linfangjian01 linfangjian01 commented Mar 31, 2022

Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry, just make the pull request and seek help from the maintainers.

Motivation

The LR decay optimizer constructors of BEiT and ConvNeXt have very similar functionality; the only difference between them is how the layer/stage id is obtained, so this PR merges them.

Modification

  1. Rename mmseg/core/utils/layer_decay_optimizer_constructor.py to mmseg/core/optimizers/layer_decay_optimizer_constructor.py
  2. Remove mmseg/core/layer_decay_optimizer_constructor.py
  3. Rename get_num_layer_layer_wise and get_num_stage_stage_wise
  4. Add a condition for BEiT in layer_wise (see the sketch after this list)
  5. Add a deprecation warning to LayerDecayOptimizerConstructor
  6. Re-organize the unit tests based on these modifications
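
For concreteness, here is a rough sketch of what the merged layer-wise id helper with a BEiT branch (items 3 and 4) could look like; the function name, the `backbone_type` argument, and the branch details are illustrative only and are not taken from the actual implementation.

```python
# Illustrative sketch only: get_layer_id_layer_wise and its arguments are
# hypothetical names, not the functions that ship in mmsegmentation.
def get_layer_id_layer_wise(var_name: str, max_layer_id: int,
                            backbone_type: str) -> int:
    """Map a parameter name to a layer id for layer-wise LR decay."""
    if backbone_type == 'BEiT':
        # BEiT/ViT-style backbone: embedding parameters belong to the first "layer".
        if var_name.startswith(('backbone.cls_token', 'backbone.pos_embed',
                                'backbone.patch_embed')):
            return 0
        if var_name.startswith('backbone.layers'):
            # backbone.layers.<i>... -> layer i + 1
            return int(var_name.split('.')[2]) + 1
        # Anything else (e.g. the decode head) keeps the base learning rate.
        return max_layer_id - 1
    if backbone_type == 'ConvNeXt':
        # ConvNeXt-style backbone: depth is given by downsample layers and stages.
        if var_name.startswith('backbone.downsample_layers'):
            return int(var_name.split('.')[2])
        if var_name.startswith('backbone.stages'):
            return int(var_name.split('.')[2]) + 1
        return max_layer_id - 1
    raise ValueError(f'unsupported backbone type: {backbone_type}')
```

A constructor using such a helper would then scale each parameter group's learning rate by something like decay_rate ** (max_layer_id - 1 - layer_id), so earlier layers receive more decay and the last layers keep the base learning rate.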

@linfangjian01 linfangjian01 changed the title Moveoptim move layer_decay_optimizer_constructor Mar 31, 2022
@linfangjian01 linfangjian01 changed the title move layer_decay_optimizer_constructor [fix] move layer_decay_optimizer_constructor Mar 31, 2022
@linfangjian01 linfangjian01 changed the title [fix] move layer_decay_optimizer_constructor [Fix] move layer_decay_optimizer_constructor Mar 31, 2022
@linfangjian01 linfangjian01 changed the title [Fix] move layer_decay_optimizer_constructor [Fix] Move layer_decay_optimizer_constructor Mar 31, 2022
@codecov

codecov bot commented Mar 31, 2022

Codecov Report

Merging #1438 (23b920d) into master (239049c) will increase coverage by 0.03%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master    #1438      +/-   ##
==========================================
+ Coverage   90.30%   90.34%   +0.03%     
==========================================
  Files         140      140              
  Lines        8345     8327      -18     
  Branches     1403     1400       -3     
==========================================
- Hits         7536     7523      -13     
+ Misses        571      569       -2     
+ Partials      238      235       -3     
| Flag | Coverage Δ |
|------|------------|
| unittests | 90.34% <100.00%> (+0.03%) ⬆️ |

Flags with carried forward coverage won't be shown.

| Impacted Files | Coverage Δ |
|----------------|------------|
| mmseg/core/__init__.py | 100.00% <100.00%> (ø) |
| mmseg/core/optimizers/__init__.py | 100.00% <100.00%> (ø) |
| ...re/optimizers/layer_decay_optimizer_constructor.py | 96.11% <100.00%> (ø) |
| mmseg/core/utils/__init__.py | 100.00% <100.00%> (ø) |

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 239049c...23b920d.



@OPTIMIZER_BUILDERS.register_module()
class LayerDecayOptimizerConstructor(DefaultOptimizerConstructor):
Collaborator

Can we merge LayerDecayOptimizerConstructor and LearningRateDecayOptimizerConstructor to one class?

Contributor Author

Yep.

Collaborator

We must add a deprecation warning for the deleted one.
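
As a rough sketch of that request (not the exact code that landed in this PR), a deprecation shim could subclass the merged constructor and warn when it is instantiated:

```python
# Sketch only: assumes the merged LearningRateDecayOptimizerConstructor is
# defined earlier in the same module, e.g.
# mmseg/core/optimizers/layer_decay_optimizer_constructor.py.
import warnings

from mmcv.runner import OPTIMIZER_BUILDERS


@OPTIMIZER_BUILDERS.register_module()
class LayerDecayOptimizerConstructor(LearningRateDecayOptimizerConstructor):
    """Deprecated alias kept so existing configs keep working."""

    def __init__(self, optimizer_cfg, paramwise_cfg):
        warnings.warn('LayerDecayOptimizerConstructor is deprecated, please '
                      'use LearningRateDecayOptimizerConstructor instead',
                      DeprecationWarning)
        super().__init__(optimizer_cfg, paramwise_cfg)
```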

@MengzhangLI
Contributor

MengzhangLI commented Apr 1, 2022

I suggest splitting the LearningRateDecayOptimizerConstructor of ConvNeXt and BEiT only if you can ensure their training performance stays the same as before and the log info does not leave out important messages.

Because the diff shows that the difference between the two original LearningRateDecayOptimizerConstructor files does not only lie in the get_num_layer_for_vit function.

(screenshot of the diff between the two original LearningRateDecayOptimizerConstructor files)

Btw, it would be excellent if you can merge these two LearningRateDecayOptimizerConstructors into one file, which means spending more effort to check their detailed differences and performance.
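
One way to back up that check (a generic sketch, not mmseg's own test code) is to build optimizers with the old and the merged constructor on the same model instance and assert that every parameter ends up with the same lr and weight_decay:

```python
# Generic sketch: `old_optimizer` and `new_optimizer` are torch optimizers
# built by the old and the merged constructor on the *same* model instance,
# so parameter identities line up between the two.
def assert_same_param_groups(old_optimizer, new_optimizer):
    def per_param_settings(optimizer):
        settings = {}
        for group in optimizer.param_groups:
            for param in group['params']:
                settings[id(param)] = (group['lr'],
                                       group.get('weight_decay', 0.0))
        return settings

    old = per_param_settings(old_optimizer)
    new = per_param_settings(new_optimizer)
    assert old.keys() == new.keys(), 'constructors saw different parameters'
    for key, value in old.items():
        assert new[key] == value, f'lr/weight_decay mismatch for param {key}'
```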

@MengzhangLI
Contributor

MengzhangLI commented Apr 3, 2022

I suggest we remind users what these two optimizer constructors are for, because based on their names users may not know the relation between them. The wording of the related comments is flexible, so it is up to you.
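
For instance, the relation could be spelled out in the class docstring; the wording below is only a hypothetical suggestion, not the documentation that was actually added:

```python
from mmcv.runner import DefaultOptimizerConstructor


class LearningRateDecayOptimizerConstructor(DefaultOptimizerConstructor):
    """Assign different learning rates to different layers of a backbone.

    Supports layer-wise decay (e.g. BEiT-style transformer backbones) and
    stage-wise decay (e.g. ConvNeXt). The older LayerDecayOptimizerConstructor
    is kept only as a deprecated alias of this class for backwards
    compatibility with existing configs.
    """
```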

@MeowZheng MeowZheng merged commit f16bb06 into open-mmlab:master Apr 27, 2022
@linfangjian01 linfangjian01 deleted the moveoptim branch June 21, 2022 17:47
@linfangjian01 linfangjian01 restored the moveoptim branch June 21, 2022 17:47
@linfangjian01 linfangjian01 deleted the moveoptim branch June 21, 2022 17:48
ZhimingNJ pushed a commit to AetrexTechnology/mmsegmentation that referenced this pull request Jun 29, 2022
[Refactory] Merge BEiT and ConvNeXt's LR decay optimizer constructors (open-mmlab#1438)

* move layer_decay_optimizer_constructor

* fix (2 commits)

* merge test_core

* fix

* add DeprecationWarning

* fix DeprecationWarning

* fix (12 commits)

* fix test

* fix (4 commits)

* fix ut

* fix (2 commits)

* Update tests/test_core/test_layer_decay_optimizer_constructor.py

* fix (4 commits)

Co-authored-by: MeowZheng <meowzheng@outlook.com>
Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>
6 participants