[Hackathon No.13] Add the CyclicLR learning-rate scheduler to Paddle #4315
Conversation
Thanks for your contribution!

CyclicLR
-----------------------------------

.. py:class:: paddle.optimizer.lr.CyclicLR(base_learning_rate, max_learning_rate, step_size_up, step_size_down, mode, gamma, scale_fn, scale_mode, last_epoch, verbose)
gamma -> exp_gamma
This API provides a policy that cycles the learning rate between two boundaries at a fixed frequency.
该接口
Fixed.
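The cycling policy described above can be sketched in plain Python. This is a minimal illustration of the common "triangular" recipe (rise linearly from the lower boundary to the upper one, then fall back), not Paddle's actual implementation; the function name `triangular_lr` is hypothetical.

```python
def triangular_lr(step, base_lr, max_lr, step_size_up, step_size_down):
    """Learning rate at `step` under the triangular cyclic policy:
    rises linearly from base_lr to max_lr over step_size_up steps,
    then falls back to base_lr over step_size_down steps, and repeats."""
    cycle_len = step_size_up + step_size_down
    pos = step % cycle_len  # position within the current cycle
    if pos < step_size_up:
        frac = pos / step_size_up  # ascending half
    else:
        frac = 1 - (pos - step_size_up) / step_size_down  # descending half
    return base_lr + (max_lr - base_lr) * frac

# One full cycle with step_size_up=2, step_size_down=2:
lrs = [round(triangular_lr(s, 0.1, 0.5, 2, 2), 6) for s in range(5)]
```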
Parameters
::::::::::::

- **base_learning_rate** (float) - The initial learning rate, which is also the lower boundary of the learning-rate cycle.
Align the semantics with the English documentation; the same applies to the other parts — the Chinese and English docs need to match semantically.
Fixed.
@@ -0,0 +1,61 @@
.. _cn_api_paddle_optimizer_lr_CyclicLR:

CyclicLR
CyclicLR
-----------------------------------

.. py:class:: paddle.optimizer.lr.CyclicLR(base_learning_rate, max_learning_rate, step_size_up, step_size_down, mode, exp_gamma, scale_fn, scale_mode, last_epoch, verbose)
Parameters that have default values should show those defaults in the signature.
Done.
Returns
::::::::::::
A ``CyclicLR`` instance for adjusting the learning rate.
There should be spaces before and after the ``​`` markers here.
Done.
- **mode** (str, optional) - One of triangular, triangular2, or exp_range; the corresponding policies are described above. This argument is ignored when scale_fn is specified. Default: triangular.
- **exp_gamma** (float, optional) - Constant used in the exp_range scaling function. Default: 1.0.
- **scale_fn** (function, optional) - A function of exactly one argument that must satisfy 0 ≤ scale_fn(x) ≤ 1 for any input x; if specified, the mode argument is ignored. Default: ``False``.
- **scale_mode** (str, optional) - Either cycle or iterations, indicating whether the scaling function takes the cycle number or the iteration count as its input.
The default value should be mentioned.
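The interplay of mode, exp_gamma, scale_fn, and scale_mode in the parameter list above can be sketched as follows. This is a hedged illustration following the commonly used CyclicLR recipe (triangular keeps a constant amplitude, triangular2 halves it each cycle, exp_range decays it as exp_gamma raised to the iteration count); the helper name `cyclic_scale` is hypothetical and not Paddle's internal API.

```python
def cyclic_scale(mode="triangular", exp_gamma=1.0, scale_fn=None, scale_mode="cycle"):
    """Return (fn, fn_mode): the amplitude-scaling function and whether it
    should be fed the cycle number ("cycle") or the iteration count
    ("iterations")."""
    if scale_fn is not None:
        # A user-supplied scale_fn overrides mode, as the docs state.
        return scale_fn, scale_mode
    if mode == "triangular":
        return (lambda x: 1.0), "cycle"  # constant amplitude
    if mode == "triangular2":
        return (lambda x: 1.0 / (2.0 ** (x - 1))), "cycle"  # halve each cycle
    if mode == "exp_range":
        return (lambda x: exp_gamma ** x), "iterations"  # exponential decay
    raise ValueError(f"unknown mode: {mode}")

fn, fn_mode = cyclic_scale("triangular2")
amplitudes = [fn(x) for x in (1, 2, 3)]  # amplitude over the first three cycles
```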
PADDLEPADDLE_PR=40698