
Chore(pt): fix warning in test_training #4245

Closed · wants to merge 7 commits

Conversation

@iProzd (Collaborator) commented Oct 23, 2024

Summary by CodeRabbit

  • Bug Fixes

    • Test cleanup now removes the temporary directory completely, preventing file-deletion conflicts between separate test runs.
  • Tests

    • Test data is copied into a temporary directory, improving test isolation and the reliability of test outcomes while keeping the existing training and fine-tuning test cases intact.
  • Improvements

    • Resource management during training is streamlined by a dedicated method that deletes training and validation data structures according to the training mode (multi-task or single-task).

@coderabbitai bot (Contributor) commented Oct 23, 2024

📝 Walkthrough

This pull request modifies the TestFparam class in source/tests/pt/test_training.py. The setUp method now creates a temporary directory for test data and copies the original data directory into it, with the data_file variable referencing the temporary path. The tearDown method removes the entire temporary directory instead of unlinking a single file, improving test isolation and reliability.
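For illustration, a minimal sketch of the setUp/tearDown pattern described above, using only the standard library; the data path and class internals are assumptions, not the repository's actual code:

```python
import shutil
import tempfile
import unittest
from pathlib import Path

class TestFparam(unittest.TestCase):
    def setUp(self):
        # Copy the shared test data into a fresh temporary directory so that
        # separate test executions never delete each other's files.
        self.tmp_dir = Path(tempfile.mkdtemp())
        shutil.copytree("data", self.tmp_dir / "data")  # "data" is a placeholder path
        self.data_file = str(self.tmp_dir / "data")

    def tearDown(self):
        # Remove the entire temporary tree rather than unlinking a single file.
        shutil.rmtree(self.tmp_dir, ignore_errors=True)
```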

Changes

| File Path | Change Summary |
| --- | --- |
| source/tests/pt/test_training.py | Updated setUp to create a temporary directory for test data and modified tearDown to remove the entire temporary directory. |
| deepmd/pt/train/training.py | Added a delete_dataloader method to the Trainer class to manage deletion of training and validation data based on the task configuration. |

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant TestFparam
    participant FileSystem

    TestFparam->>FileSystem: Create temporary directory
    FileSystem-->>TestFparam: Temporary directory created
    TestFparam->>FileSystem: Copy original data to temporary directory
    FileSystem-->>TestFparam: Data copied
    TestFparam->>FileSystem: Remove temporary directory
    FileSystem-->>TestFparam: Temporary directory deleted
```

Possibly related PRs

  • Chore(pt): slim uts for dpa1 #4244: modifies test parameters and logic in the TestDPA1 class, whose test structure and execution parallel those of TestFparam.

Suggested reviewers

  • njzjz
  • wanghan-iapcm


@coderabbitai bot (Contributor) left a comment

Actionable comments posted: 1

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR between 911f41b and 75e89c8.

📒 Files selected for processing (1)
  • source/tests/pt/test_training.py (1 hunks)
source/tests/pt/test_training.py — review comment (outdated, resolved)
@codecov bot commented Oct 23, 2024

Codecov Report

Attention: Patch coverage is 60.00000% with 2 lines in your changes missing coverage. Please review.

Project coverage is 84.22%. Comparing base (dabedd2) to head (225f99e).
Report is 3 commits behind head on devel.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| deepmd/pt/train/training.py | 60.00% | 2 Missing ⚠️ |
Additional details and impacted files
```diff
@@            Coverage Diff             @@
##            devel    #4245      +/-   ##
==========================================
- Coverage   84.40%   84.22%   -0.18%
==========================================
  Files         570      570
  Lines       53071    53076       +5
  Branches     3054     3054
==========================================
- Hits        44794    44705      -89
- Misses       7318     7412      +94
  Partials      959      959
```

☔ View full report in Codecov by Sentry.

@iProzd requested review from njzjz and wanghan-iapcm on Oct 23, 2024
@njzjz added the Test CUDA (Trigger test CUDA workflow) label on Oct 23, 2024
@github-actions bot removed the Test CUDA (Trigger test CUDA workflow) label on Oct 23, 2024
@njzjz (Member) left a comment

@iProzd added the Test CUDA (Trigger test CUDA workflow) label on Oct 26, 2024
@github-actions bot removed the Test CUDA (Trigger test CUDA workflow) label on Oct 26, 2024
@coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

🧹 Outside diff range and nitpick comments (1)
deepmd/pt/train/training.py (1)

1036-1050: Improve memory cleanup by explicitly deleting data structures.

The code properly handles cleanup of training and validation data structures based on the training mode (multi-task vs single-task). This is a good practice for memory management, especially when dealing with large datasets.

Consider using a context manager pattern for better resource management:

```diff
-        if self.multi_task:
-            for model_key in self.model_keys:
-                del (
-                    self.training_data[model_key],
-                    self.training_dataloader[model_key],
-                    self.validation_data[model_key],
-                    self.validation_dataloader[model_key],
-                )
-        else:
-            del (
-                self.training_data,
-                self.training_dataloader,
-                self.validation_data,
-                self.validation_dataloader,
-            )
+        def cleanup_data():
+            if self.multi_task:
+                for model_key in self.model_keys:
+                    del (
+                        self.training_data[model_key],
+                        self.training_dataloader[model_key],
+                        self.validation_data[model_key],
+                        self.validation_dataloader[model_key],
+                    )
+            else:
+                del (
+                    self.training_data,
+                    self.training_dataloader,
+                    self.validation_data,
+                    self.validation_dataloader,
+                )
+
+        try:
+            # Your training code here
+            pass
+        finally:
+            cleanup_data()
```

This ensures cleanup happens even if an exception occurs during training.
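As a side note, the suggested diff above uses try/finally rather than a literal context manager. A minimal sketch of the same guarantee written as a true context manager via contextlib; the dataloaders helper and the t.run() call are hypothetical, while delete_dataloader is the method this PR adds:

```python
from contextlib import contextmanager

@contextmanager
def dataloaders(trainer):
    """Yield the trainer, guaranteeing data-loader cleanup on exit."""
    try:
        yield trainer
    finally:
        # delete_dataloader is the cleanup method introduced in this PR.
        trainer.delete_dataloader()

# Hypothetical usage: cleanup runs even if training raises.
# with dataloaders(trainer) as t:
#     t.run()
```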

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR between 17dafea and 9f69bb1.

📒 Files selected for processing (1)
  • deepmd/pt/train/training.py (1 hunks)

@iProzd iProzd added the Test CUDA Trigger test CUDA workflow label Nov 5, 2024
@github-actions github-actions bot removed the Test CUDA Trigger test CUDA workflow label Nov 5, 2024
Comment on lines +1050 to +1055
```python
del (
    self.training_data,
    self.training_dataloader,
    self.validation_data,
    self.validation_dataloader,
)
```

Check warning — Code scanning / CodeQL

Unnecessary delete statement in function: unnecessary deletion of local variable Tuple in function delete_dataloader.
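For background on why CodeQL reports a deleted "Tuple" here: in Python's grammar, the parentheses in a del statement merely group the target list, so del (a, b) is the same statement as del a, b and no tuple object is involved. A quick standard-library check (not part of the PR) illustrating this:

```python
import dis

def bare():
    x, y = 1, 2
    del x, y

def parenthesized():
    x, y = 1, 2
    del (x, y)

# Both functions disassemble to the same DELETE_FAST x / DELETE_FAST y
# sequence, confirming the parentheses are purely syntactic.
dis.dis(bare)
dis.dis(parenthesized)
```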
@coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

🧹 Outside diff range and nitpick comments (1)
deepmd/pt/train/training.py (1)

1040-1056: Add error handling and docstring to the delete_dataloader method.

The implementation looks good, but could benefit from these improvements:

  1. Add a docstring explaining the purpose and when to call this method
  2. Add error handling for cases where attributes don't exist
```diff
 def delete_dataloader(self):
+    """Delete training and validation data structures to free up memory.
+
+    This method should be called when the data loaders are no longer needed,
+    typically after training is complete or when switching between different
+    training phases.
+    """
+    try:
         if self.multi_task:
             for model_key in self.model_keys:
                 del (
                     self.training_data[model_key],
                     self.training_dataloader[model_key],
                     self.validation_data[model_key],
                     self.validation_dataloader[model_key],
                 )
         else:
             del (
                 self.training_data,
                 self.training_dataloader,
                 self.validation_data,
                 self.validation_dataloader,
             )
+    except AttributeError as e:
+        log.warning(f"Failed to delete some data loaders: {str(e)}")
```
🧰 Tools
🪛 GitHub Check: CodeQL

[warning] 1050-1055: Unnecessary delete statement in function
Unnecessary deletion of local variable Tuple in function delete_dataloader.

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR between 9f69bb1 and 225f99e.

📒 Files selected for processing (1)
  • deepmd/pt/train/training.py (1 hunks)
🧰 Additional context used
🪛 GitHub Check: CodeQL
deepmd/pt/train/training.py

[warning] 1050-1055: Unnecessary delete statement in function
Unnecessary deletion of local variable Tuple in function delete_dataloader.

@njzjz (Member) commented Nov 5, 2024

The warning still exists... We may consider giving up this PR.

@iProzd (Collaborator, Author) commented Nov 7, 2024

This PR will be closed as discussed.

@iProzd closed this on Nov 7, 2024
Labels: none
Projects: none yet

Development

Successfully merging this pull request may close these issues:
  • [BUG] source/tests/pt/test_training.py::TestFparam::test_dp_train throws a warning (instead of an error)

3 participants