Chore(pt): fix warning in test_training
#4245
Conversation
📝 Walkthrough

The changes in this pull request involve modifications to `deepmd/pt/train/training.py`.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant TestFparam
    participant FileSystem
    TestFparam->>FileSystem: Create temporary directory
    FileSystem-->>TestFparam: Temporary directory created
    TestFparam->>FileSystem: Copy original data to temporary directory
    FileSystem-->>TestFparam: Data copied
    TestFparam->>FileSystem: Remove temporary directory
    FileSystem-->>TestFparam: Temporary directory deleted
```
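The flow in the diagram can be sketched as a small helper that copies the test data into a throwaway directory and tears it down afterwards; the function name and directory layout below are hypothetical illustrations, not the actual test code:

```python
import shutil
import tempfile
from pathlib import Path

def make_isolated_data_dir(source_dir: str):
    """Copy test data into a fresh temporary directory so the test
    never mutates the original data on disk.

    Returns the path of the copied data and a cleanup callback that
    removes the temporary directory (hypothetical helper, for
    illustration only).
    """
    tmp_dir = tempfile.mkdtemp(prefix="test_fparam_")
    dest = Path(tmp_dir) / "data"
    # copytree requires that dest does not exist yet; mkdtemp only
    # created tmp_dir, so dest is free to use.
    shutil.copytree(source_dir, dest)

    def cleanup():
        # ignore_errors makes teardown safe even if files were removed
        shutil.rmtree(tmp_dir, ignore_errors=True)

    return str(dest), cleanup
```

A test would call the helper in its setup, run against the returned path, and invoke `cleanup()` in its teardown.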
Codecov Report — Attention: Patch coverage is …

Additional details and impacted files:

```
@@            Coverage Diff             @@
##            devel    #4245      +/-   ##
==========================================
- Coverage   84.40%   84.22%   -0.18%
==========================================
  Files         570      570
  Lines       53071    53076       +5
  Branches     3054     3054
==========================================
- Hits        44794    44705      -89
- Misses       7318     7412      +94
  Partials      959      959
```

☔ View full report in Codecov by Sentry.
[pre-commit.ci] auto fixes from pre-commit hooks — for more information, see https://pre-commit.ci
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (1)
deepmd/pt/train/training.py (1)
Lines 1036-1050: **Improve memory cleanup by explicitly deleting data structures.**

The code properly handles cleanup of training and validation data structures based on the training mode (multi-task vs single-task). This is a good practice for memory management, especially when dealing with large datasets.
Consider using a context manager pattern for better resource management:
```diff
-        if self.multi_task:
-            for model_key in self.model_keys:
-                del (
-                    self.training_data[model_key],
-                    self.training_dataloader[model_key],
-                    self.validation_data[model_key],
-                    self.validation_dataloader[model_key],
-                )
-        else:
-            del (
-                self.training_data,
-                self.training_dataloader,
-                self.validation_data,
-                self.validation_dataloader,
-            )
+        def cleanup_data():
+            if self.multi_task:
+                for model_key in self.model_keys:
+                    del (
+                        self.training_data[model_key],
+                        self.training_dataloader[model_key],
+                        self.validation_data[model_key],
+                        self.validation_dataloader[model_key],
+                    )
+            else:
+                del (
+                    self.training_data,
+                    self.training_dataloader,
+                    self.validation_data,
+                    self.validation_dataloader,
+                )
+
+        try:
+            # Your training code here
+            pass
+        finally:
+            cleanup_data()
```

This ensures cleanup happens even if an exception occurs during training.
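The same try/finally idea can be packaged with `contextlib.contextmanager`, which guarantees the cleanup runs whether the wrapped block returns normally or raises. A minimal standalone sketch — the `Trainer` class and attribute names here only mirror the real trainer for illustration:

```python
import contextlib

class Trainer:
    """Hypothetical stand-in for the real trainer, holding only the
    data-related attributes that need to be released."""
    def __init__(self):
        self.training_data = [0] * 1000
        self.validation_data = [1] * 1000

    def cleanup_data(self):
        # Dropping the references lets the garbage collector reclaim
        # the datasets once nothing else points at them.
        self.training_data = None
        self.validation_data = None

@contextlib.contextmanager
def managed_data(trainer):
    """Run a block with the trainer's data, cleaning up even on error."""
    try:
        yield trainer
    finally:
        trainer.cleanup_data()

# Usage: the training loop would go inside the with-block.
t = Trainer()
with managed_data(t):
    pass
```

After the `with` block exits — normally or via an exception — the data attributes are released.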
```python
del (
    self.training_data,
    self.training_dataloader,
    self.validation_data,
    self.validation_dataloader,
)
```
> **Check warning** — Code scanning / CodeQL: Unnecessary delete statement in function (Warning): unnecessary deletion of local variable `Tuple` in function `delete_dataloader`.
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (1)
deepmd/pt/train/training.py (1)
1040-1056
: Add error handling and docstring to the delete_dataloader method.The implementation looks good, but could benefit from these improvements:
- Add a docstring explaining the purpose and when to call this method
- Add error handling for cases where attributes don't exist
```diff
     def delete_dataloader(self):
+        """Delete training and validation data structures to free up memory.
+
+        This method should be called when the data loaders are no longer needed,
+        typically after training is complete or when switching between different
+        training phases.
+        """
+        try:
             if self.multi_task:
                 for model_key in self.model_keys:
                     del (
                         self.training_data[model_key],
                         self.training_dataloader[model_key],
                         self.validation_data[model_key],
                         self.validation_dataloader[model_key],
                     )
             else:
                 del (
                     self.training_data,
                     self.training_dataloader,
                     self.validation_data,
                     self.validation_dataloader,
                 )
+        except AttributeError as e:
+            log.warning(f"Failed to delete some data loaders: {str(e)}")
```

🧰 Tools
🪛 GitHub Check: CodeQL
[warning] 1050-1055: Unnecessary delete statement in function
Unnecessary deletion of local variable Tuple in function delete_dataloader.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (1)
deepmd/pt/train/training.py (1 hunks)
🧰 Additional context used
🪛 GitHub Check: CodeQL
deepmd/pt/train/training.py
[warning] 1050-1055: Unnecessary delete statement in function
Unnecessary deletion of local variable Tuple in function delete_dataloader.
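For context on what the flagged statement actually does: in Python, `del (a, b)` treats the parenthesized expression as a tuple of deletion *targets*, not as a tuple value, so each name inside the parentheses is deleted individually. A small self-contained illustration (the `Holder` class is hypothetical):

```python
class Holder:
    """Hypothetical object with two data attributes."""
    def __init__(self):
        self.training_data = [1, 2, 3]
        self.validation_data = [4, 5, 6]

h = Holder()
# Parenthesized del is a tuple of targets: both attributes are removed
# from the instance __dict__ in a single statement.
del (h.training_data, h.validation_data)
print(hasattr(h, "training_data"))  # → False
```

Writing the deletions as separate unparenthesized `del` statements keeps the same effect while avoiding the tuple-target form that static analyzers may flag.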
The warning still exists... We may consider giving up this PR.
This PR will be closed as discussed.