chore: move LearningRateExp to deepmd.utils.learning_rate #4219
Conversation
Signed-off-by: Jinzhe Zeng <jinzhe.zeng@rutgers.edu>
📝 Walkthrough

The changes involve the removal of the LearningRateExp class from deepmd.pt.utils.learning_rate and its addition to deepmd.utils.learning_rate.
Sequence Diagram(s):

```mermaid
sequenceDiagram
    participant User
    participant Old_LearningRateExp
    participant New_LearningRateExp
    User->>Old_LearningRateExp: Instantiate
    Old_LearningRateExp-->>User: Class removed
    User->>New_LearningRateExp: Instantiate
    New_LearningRateExp-->>User: Class added
    User->>New_LearningRateExp: Call value(step)
    New_LearningRateExp-->>User: Returns computed learning rate
```
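A minimal usage sketch of the relocated class, assuming the destination module from the PR title and the constructor parameters suggested in the review below; all values are illustrative, not taken from the PR:

```python
from deepmd.utils.learning_rate import LearningRateExp

# Illustrative hyperparameters; see the review comments for the signature.
lr = LearningRateExp(
    start_lr=1e-3,
    stop_lr=1e-8,
    decay_steps=5000,
    stop_steps=1_000_000,
)
print(lr.value(0))        # start_lr
print(lr.value(500_000))  # exponentially decayed value midway through training
```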
Actionable comments posted: 4
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (2)
- deepmd/pt/utils/learning_rate.py (1 hunks)
- deepmd/utils/learning_rate.py (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- deepmd/pt/utils/learning_rate.py
Signed-off-by: Jinzhe Zeng <jinzhe.zeng@rutgers.edu>
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

```
@@            Coverage Diff             @@
##            devel    #4219      +/-   ##
==========================================
  Coverage   83.50%   83.51%
==========================================
  Files         541      542       +1
  Lines       52486    52488       +2
  Branches     3043     3047       +4
==========================================
+ Hits        43830    43833       +3
  Misses       7708     7708
+ Partials      948      947       -1
```

☔ View full report in Codecov by Sentry.
Or, should we move it to `deepmd.dpmodel`?
Is it easy to merge both tf and pt? If not (so far), …
No, for performance, TF needs a static graph, even for the learning rate.
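For context, the TF backend builds its learning rate as a node in the static graph rather than as a Python callback. A sketch of that pattern using the public TF 1.x-style API (not code from this PR; all values illustrative):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# In graph mode the learning rate is itself a tensor, recomputed from
# the global step each time the session runs a training op.
global_step = tf.train.get_or_create_global_step()
learning_rate = tf.train.exponential_decay(
    learning_rate=1e-3,  # start_lr (illustrative)
    global_step=global_step,
    decay_steps=5000,
    decay_rate=0.95,
    staircase=True,
)
```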
Signed-off-by: Jinzhe Zeng <jinzhe.zeng@rutgers.edu>
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (2)
deepmd/dpmodel/utils/learning_rate.py (2)
Lines 37-40: Clarify the fallback mechanism for `decay_steps` in the docstring.

The logic for setting `decay_steps` includes a fallback mechanism that's not explained in the docstring. Consider adding an explanation in the docstring of how `decay_steps` is adjusted if it's greater than or equal to `stop_steps`.
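A sketch of the kind of fallback the comment refers to; the helper name and default interval are hypothetical, reconstructed from the description rather than quoted from the file:

```python
def _resolve_decay_steps(decay_steps: int, stop_steps: int) -> int:
    # Hypothetical reconstruction: if decay_steps >= stop_steps, the decay
    # would never trigger during training, so fall back to a smaller
    # default interval.
    if decay_steps >= stop_steps:
        return max(1, stop_steps // 100)  # illustrative default
    return decay_steps
```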
Lines 6-14: Consider adding type hints to the constructor parameters.

Adding type hints to the constructor parameters would improve code readability and maintainability. It would also help catch potential type-related errors earlier in the development process.

Here's a suggested implementation with type hints:

```python
def __init__(
    self,
    start_lr: float,
    stop_lr: float,
    decay_steps: int,
    stop_steps: int,
    decay_rate: Optional[float] = None,
    **kwargs,
) -> None:
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (2)
- deepmd/dpmodel/utils/learning_rate.py (1 hunks)
- deepmd/pt/utils/learning_rate.py (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- deepmd/pt/utils/learning_rate.py
🔇 Additional comments (3)
deepmd/dpmodel/utils/learning_rate.py (3)
Line 5: LGTM: Class structure is appropriate for an exponential decay learning rate scheduler.

The `LearningRateExp` class provides a good structure for implementing an exponential decay learning rate scheduler. It includes a constructor for initialization and a method to compute the learning rate at a given step.
Lines 48-53: LGTM: The `value` method is correctly implemented.

The `value` method correctly implements the exponential decay formula for the learning rate. It also properly enforces the minimum learning rate (`stop_lr`). The use of a return type hint (`np.float64`) is a good practice for code clarity.
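For readers without the diff open, a minimal sketch of a `value` method matching the description above (reconstructed from this review, not quoted from the PR):

```python
import numpy as np

class LearningRateExp:
    # Sketch: only the attributes value() needs.
    def __init__(self, start_lr, stop_lr, decay_steps, decay_rate):
        self.start_lr = start_lr
        self.stop_lr = stop_lr
        self.decay_steps = decay_steps
        self.decay_rate = decay_rate

    def value(self, step: int) -> np.float64:
        # Exponential decay applied once per completed decay interval,
        # clamped from below at stop_lr.
        step_lr = self.start_lr * np.power(self.decay_rate, step // self.decay_steps)
        return np.float64(max(step_lr, self.stop_lr))
```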
Lines 1-53: Overall assessment: Well-implemented exponential decay learning rate scheduler.

The `LearningRateExp` class provides a solid implementation of an exponential decay learning rate scheduler. The code is well structured, documented, and follows good practices. The suggested improvements (clarifying the fallback mechanism for `decay_steps` and adding type hints) are minor and would further enhance the code's readability and maintainability. Great job on implementing this functionality!
Summary by CodeRabbit

- New Features
  - Added the LearningRateExp class, an exponential decay learning rate scheduler, in its new module.
- Bug Fixes
  - Removed the LearningRateExp class from the previous module to avoid confusion.