
Added a how-to guide for parameter schedulers #80

Closed · wants to merge 2 commits

Conversation

@divyanshugit commented Mar 17, 2022

This PR is a modified version of PR #49.

@Priyansi @sdesrozis Can you please give a review?

This PR is a modified version of PR #49
Co-authored-by: Sylvain Desroziers <sylvain.desroziers@gmail.com>
@sdesrozis (Contributor) commented Mar 17, 2022

Why not improve #49 instead of creating a new PR? Anyway.

@divyanshugit (Author) replied:

> Why not improve #49 instead of creating a new PR?

I was unable to add any new commits to #49

@sdesrozis (Contributor) commented:

I have doubts about the added description. It's copied and pasted from the official PyTorch docs, and I'm not sure those details really matter. I would have preferred something more personal to the project.

"metadata": {},
"outputs": [],
"source": [
"import ast\n",
@sdesrozis (Contributor) commented on this line:

Useful?

@divyanshugit (Author) replied:

It is used in the MultiStepLR section to parse the milestones string into a list.
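
For context, a minimal sketch of what (I assume) the notebook does here: the milestones arrive as a string, and `ast.literal_eval` converts them into the list that `MultiStepLR` expects. The concrete values and variable names below are illustrative, not taken from the PR.

```python
import ast

from torch import nn
from torch.optim import SGD
from torch.optim.lr_scheduler import MultiStepLR

model = nn.Linear(10, 2)
optimizer = SGD(model.parameters(), lr=0.1)

# The milestones are provided as a string (e.g. from user input);
# ast.literal_eval safely parses it into the list of epoch indices
# that MultiStepLR expects.
milestones = ast.literal_eval("[30, 60, 90]")  # -> [30, 60, 90]

scheduler = MultiStepLR(optimizer, milestones=milestones, gamma=0.1)
```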

"source": [
"## LambdaLR\n",
"\n",
"Sets the learning rate of each parameter group to the initial lr times a given function. When last_epoch=-1, sets initial lr as lr.\n",
@sdesrozis (Contributor) commented on this line, Mar 17, 2022:

last_epoch? Just copying and pasting is not enough. IMO this is useless and should be removed from the other description cells too.

@divyanshugit (Author) replied:

Should I try to rephrase it?
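
For reference, this is the behaviour the description is trying to capture; a minimal standalone sketch of standard `LambdaLR` usage, with an illustrative decay function:

```python
from torch import nn
from torch.optim import SGD
from torch.optim.lr_scheduler import LambdaLR

model = nn.Linear(10, 2)
optimizer = SGD(model.parameters(), lr=0.1)

# Each scheduler step sets lr to the *initial* lr times lr_lambda(step):
# lr = 0.1 * 0.95 ** step
scheduler = LambdaLR(optimizer, lr_lambda=lambda step: 0.95 ** step)

for step in range(3):
    optimizer.step()
    scheduler.step()
    print(optimizer.param_groups[0]["lr"])  # 0.095, 0.09025, ...
```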

@divyanshugit (Author) commented:

> I have doubts about the added description. It's copied and pasted from the official PyTorch docs, and I'm not sure those details really matter. I would have preferred something more personal to the project.

I thought this was the simplest way to explain these schedulers.

"Sets the learning rate of each parameter group to the initial lr times a given function. When last_epoch=-1, sets initial lr as lr.\n",
"\n",
"$$\n",
"l r_{\\text {epoch}} = l r_{\\text {initial}} * Lambda(epoch)\n",
@sdesrozis (Contributor) commented on this line, Mar 17, 2022:

In the PyTorch docs, scheduler.step() is called at the end of each epoch; in ignite it is more flexible. The formula should not be written with epoch as the index name.

@divyanshugit (Author) replied:

Oh, my bad. I'm sorry, I didn't think of it that way.
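
A sketch of the reviewer's point, assuming ignite's `LRScheduler` wrapper (`ignite.handlers.LRScheduler`): the wrapped scheduler can be stepped on any event, here every iteration, so indexing the formula by epoch is misleading. The engine, data, and decay function below are illustrative.

```python
from torch import nn
from torch.optim import SGD
from torch.optim.lr_scheduler import LambdaLR
from ignite.engine import Engine, Events
from ignite.handlers import LRScheduler

model = nn.Linear(10, 2)
optimizer = SGD(model.parameters(), lr=0.1)

def train_step(engine, batch):
    optimizer.step()

trainer = Engine(train_step)

# Wrapping the PyTorch scheduler and attaching it to ITERATION_COMPLETED
# advances it every iteration, not only at epoch end, so the multiplier's
# argument is a step count rather than an epoch index.
scheduler = LRScheduler(LambdaLR(optimizer, lr_lambda=lambda t: 0.95 ** t))
trainer.add_event_handler(Events.ITERATION_COMPLETED, scheduler)

trainer.run([0] * 4, max_epochs=2)
```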
