
Add decay to CyclicLrUpdater #1655

Merged (13 commits) on Jan 24, 2022

Conversation

@HAOCHENYE (Collaborator) commented Jan 10, 2022

Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry, just open the pull request and ask the maintainers for help.

Motivation

I followed #1059 and fixed the cyclic LR decay.

Modification

Fixed the LR update error and the unit tests. The LR curves for the following configs are shown below.

lr_config = dict(
    _delete_="step",
    policy='cyclic',
    # anneal_strategy='linear',
    target_ratio=(10, 1),
    cyclic_times=5,
    step_ratio_up=0.4,
    gamma=0.5,
)
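With these settings, and assuming the peak decays as target_ratio[0] * gamma ** cycle, the five peaks would be 10x, 5x, 2.5x, 1.25x, and 0.625x the base LR, while each cycle still anneals back to 1x.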

(figure: LR curve with five cycles; the peak halves each cycle and each cycle anneals back to the base LR)

lr_config = dict(
    _delete_="step",
    policy='cyclic',
    # anneal_strategy='linear',
    target_ratio=(10, 1e-4),
    cyclic_times=5,
    step_ratio_up=0.4,
    gamma=0.5,
)

(figure: LR curve as above, but each cycle anneals down to 1e-4 times the base LR)

lr_config = dict(
    _delete_="step",
    policy='cyclic',
    # anneal_strategy='linear',
    target_ratio=(0.5, 1),
    cyclic_times=5,
    step_ratio_up=0.4,
    gamma=0.5,
) # support target_ratio[0] < 1

(figure: LR curve with target_ratio[0] < 1, so each cycle dips below the base LR)
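For readers without the images, the sketch below is a standalone re-derivation of the schedule these curves imply. It is not mmcv's implementation: the decay rule peak = target_ratio[0] * gamma ** cycle, the fixed lower bound, and the cosine annealing helper are assumptions inferred from the configs and plots above.

import math

def annealing_cos(start, end, factor):
    """Cosine interpolation: returns `start` at factor=0 and `end` at factor=1."""
    cos_out = math.cos(math.pi * factor) + 1
    return end + 0.5 * (start - end) * cos_out

def cyclic_lr_with_decay(curr_iter, base_lr, max_iters, cyclic_times=5,
                         step_ratio_up=0.4, target_ratio=(10, 1), gamma=0.5):
    """LR at `curr_iter` for a cyclic schedule whose peak decays each cycle."""
    cycle_len = max_iters // cyclic_times
    cycle, t = divmod(curr_iter, cycle_len)
    # Assumed decay rule: the peak ratio shrinks by `gamma` once per cycle.
    peak = base_lr * target_ratio[0] * gamma**cycle
    iters_up = int(cycle_len * step_ratio_up)
    if t < iters_up:
        # Up phase: anneal from the base LR to the (decayed) peak.
        return annealing_cos(base_lr, peak, t / iters_up)
    # Down phase: anneal from the peak to target_ratio[1] * base_lr.
    return annealing_cos(peak, base_lr * target_ratio[1],
                         (t - iters_up) / (cycle_len - iters_up))

With target_ratio=(10, 1) and gamma=0.5 this reproduces the halving peaks described above; with (10, 1e-4) each cycle bottoms out near zero; with (0.5, 1) the curve dips below the base LR.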

BC-breaking (Optional)

No.

Use cases (Optional)

If this PR introduces a new feature, it is better to list some use cases here, and update the documentation.

Checklist

Before PR:

  • I have read and followed the workflow indicated in the CONTRIBUTING.md to create this PR.
  • Pre-commit or linting tools indicated in CONTRIBUTING.md are used to fix the potential lint issues.
  • Bug fixes are covered by unit tests; the case that caused the bug should be added to the unit tests.
  • New functionalities are covered by complete unit tests. If not, please add more unit tests to ensure correctness.
  • The documentation has been modified accordingly, including docstring or example tutorials.

After PR:

  • If the modification has potential influence on downstream or other related projects, this PR should be tested with some of those projects, like MMDet or MMCls.
  • CLA has been signed and all committers have signed the CLA in this PR.

@CLAassistant commented Jan 10, 2022

CLA assistant check
All committers have signed the CLA.

@zhouzaida requested a review from teamwong111 on January 11, 2022
@teamwong111 (Contributor) left a comment


LGTM

@zhouzaida requested a review from ZwwWayne on January 11, 2022

@zhouzaida (Collaborator) commented:

It would be nice if an image could be provided in the PR message to visualize the learning rate. Maybe you can refer to https://mmclassification.readthedocs.io/en/latest/tools/visualization.html#learning-rate-schedule-visualization.
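As a rough stand-in for that tool, the schedule can also be plotted directly with matplotlib, reusing the hypothetical cyclic_lr_with_decay from the sketch above:

import matplotlib.pyplot as plt

# Reuses cyclic_lr_with_decay() from the sketch above (an assumption, not mmcv code).
max_iters, base_lr = 1000, 0.01
lrs = [cyclic_lr_with_decay(i, base_lr, max_iters, cyclic_times=5,
                            step_ratio_up=0.4, target_ratio=(10, 1), gamma=0.5)
       for i in range(max_iters)]

plt.plot(lrs)
plt.xlabel('iteration')
plt.ylabel('learning rate')
plt.title('Cyclic LR with per-cycle gamma decay (sketch)')
plt.show()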

HAOCHENYE and others added 2 commits January 12, 2022 12:06
update docstring

Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
@zhouzaida (Collaborator) commented:

We also need to add the two arguments anneal_strategy and gamma to CyclicMomentumUpdaterHook:

class CyclicMomentumUpdaterHook(MomentumUpdaterHook):
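A hypothetical shape for that change, mirroring the LR hook's constructor (a sketch only, not the merged code; the defaults and the commented body are assumptions):

from mmcv.runner.hooks import MomentumUpdaterHook  # mmcv 1.x import path

# Hypothetical sketch, not the merged mmcv code.
class CyclicMomentumUpdaterHook(MomentumUpdaterHook):

    def __init__(self,
                 by_epoch=False,
                 target_ratio=(0.85 / 0.95, 1),
                 cyclic_times=1,
                 step_ratio_up=0.4,
                 anneal_strategy='cos',  # new argument: 'cos' or 'linear'
                 gamma=1.0,              # new argument: per-cycle decay of the peak ratio
                 **kwargs):
        assert 0 < gamma <= 1
        self.target_ratio = target_ratio
        self.cyclic_times = cyclic_times
        self.step_ratio_up = step_ratio_up
        self.anneal_strategy = anneal_strategy
        self.gamma = gamma
        super().__init__(by_epoch=by_epoch, **kwargs)

    # get_momentum() would mirror CyclicLrUpdaterHook.get_lr(), scaling
    # target_ratio[0] by gamma ** current_cycle before annealing.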

@zhouzaida (Collaborator) left a comment


The PR can be merged after resolving the comments.

Fix CyclicMomentum comment