
[Hackathon No.13] Add the CyclicLR learning-rate scheduler to Paddle #4315

Merged
merged 12 commits into from Jun 8, 2022
Conversation

@Asthestarsfalll (Contributor) commented Mar 18, 2022

[Hackathon No.13] Add the CyclicLR learning-rate scheduler to Paddle
PADDLEPADDLE_PR=40698

@paddle-bot-old

Thanks for your contribution!

@Asthestarsfalll Asthestarsfalll changed the title Hackathon No.26 Hackathon No.13 Mar 18, 2022
@Asthestarsfalll Asthestarsfalll changed the title Hackathon No.13 【Hackathon No.13】为 Paddle 新增 CyclicLR 优化调度器 Apr 8, 2022
CyclicLR
-----------------------------------

.. py:class:: paddle.optimizer.lr.CyclicLR(base_learning_rate, max_learning_rate, step_size_up, step_size_down, mode, gamma, scale_fn, scale_mode, last_epoch, verbose)
Collaborator:

gamma -> exp_gamma


.. py:class:: paddle.optimizer.lr.CyclicLR(base_learning_rate, max_learning_rate, step_size_up, step_size_down, mode, gamma, scale_fn, scale_mode, last_epoch, verbose)

This API provides a strategy in which the learning rate cycles between two boundaries at a fixed frequency.
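The cycling described above follows the "triangular" policy: the learning rate climbs linearly from the lower boundary to the upper boundary over `step_size` iterations, then descends back. A minimal sketch of that formula in plain Python (function and parameter names here are illustrative, not Paddle's internals):

```python
import math

def triangular_lr(iteration, base_lr=0.001, max_lr=0.01, step_size=200):
    """Triangular cyclic policy: the lr rises from base_lr to max_lr over
    step_size iterations, then falls back to base_lr, repeating each cycle."""
    # Cycle index, starting at 1 for the first cycle.
    cycle = math.floor(1 + iteration / (2 * step_size))
    # x goes 1 -> 0 -> 1 within each cycle, so (1 - x) is the triangle.
    x = abs(iteration / step_size - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1.0 - x)
```

For example, `triangular_lr(0)` gives the lower boundary, `triangular_lr(200)` the upper boundary, and `triangular_lr(400)` the lower boundary again.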
Collaborator:

该接口 (the phrase "This API" is flagged for rewording)

Contributor (author):

Fixed.

Parameters
::::::::::::

- **base_learning_rate** (float) - The initial learning rate, which is also the lower boundary of the cycle.
Collaborator:

Align the semantics with the English docs; the same applies to the other parts — the Chinese and English docs must match semantically.

Contributor (author):

Fixed.

@@ -0,0 +1,61 @@
.. _cn_api_paddle_optimizer_lr_CyclicLR:

CyclicLR
Collaborator:

http://preview-pr-4315.paddle-docs-preview.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/lr/CyclicLR_cn.html

You can check the formatting against this preview; there are currently some formatting issues, for example:

[screenshots of the rendering issues omitted]

CyclicLR
-----------------------------------

.. py:class:: paddle.optimizer.lr.CyclicLR(base_learning_rate, max_learning_rate, step_size_up, step_size_down, mode, exp_gamma, scale_fn, scale_mode, last_epoch, verbose)
Collaborator:

Parameters that have default values must state those defaults.

Contributor (author):

Done.


Returns:
::::::::::::
The``CyclicLR``instance used to adjust the learning rate.
Collaborator:

The `` here needs a space before and after it.

Contributor (author):

Done.

- **mode** (str, optional) - One of triangular, triangular2, or exp_range; the corresponding policies are described above. This parameter is ignored when scale_fn is specified. Default: triangular.
- **exp_gamma** (float, optional) - Constant used in the exp_range scaling function. Default: 1.0.
- **scale_fn** (function, optional) - A function taking exactly one argument, which must satisfy 0 ≤ scale_fn(x) ≤ 1 for any input x; if this parameter is specified, mode is ignored. Default: ``False``.
- **scale_mode** (str, optional) - Either cycle or iterations, indicating whether the scaling function takes the cycle number or the iteration count as its input.
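To make the scale_fn/scale_mode interaction concrete, here is a sketch of how a custom scaling function modulates the triangular amplitude. This is an illustration of the documented behavior, not Paddle's actual implementation; all names below are hypothetical:

```python
import math

def cyclic_lr(iteration, base_lr, max_lr, step_size, scale_fn,
              scale_mode="cycle"):
    """Triangular backbone modulated by a user-supplied scale_fn,
    whose output must lie in [0, 1]."""
    cycle = math.floor(1 + iteration / (2 * step_size))
    x = abs(iteration / step_size - 2 * cycle + 1)
    amplitude = (max_lr - base_lr) * max(0.0, 1.0 - x)
    # scale_mode selects whether scale_fn sees the cycle index
    # or the raw iteration count.
    arg = cycle if scale_mode == "cycle" else iteration
    return base_lr + amplitude * scale_fn(arg)

# Reproducing the triangular2 preset as a scale_fn:
# halve the amplitude on every successive cycle.
half_each_cycle = lambda c: 1.0 / (2 ** (c - 1))
```

With this sketch, the peak of cycle 1 (iteration = step_size) reaches max_lr, while the peak of cycle 2 only reaches the midpoint between base_lr and max_lr.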
Collaborator:

The default value should be stated here.

@TCChenlong TCChenlong merged commit 00ca528 into PaddlePaddle:develop Jun 8, 2022