Aadi/sleap-anchors #21
Conversation
- gtr_runner.py: add logging function, condense metrics, persistent tracking
- train.py: allow selection of batch task when running locally
- config.py: minor bug fix
Co-authored-by: aaprasad <aaprasad.ucsd.edu>
@aaprasad, I've reviewed the changes.

Key Changes:

Suggestions:
Overall, the modifications align well with the stated goal of the PR to allow user-defined anchors for cropping, with sensible defaults and improved error handling. If you have specific areas in these changes or other parts of the PR you'd like me to look deeper into, please let me know!

Analysis chain

Scripts executed

The following scripts were executed for the analysis:

Script:
#!/bin/bash
# List the files changed in the PR
git diff --name-only HEAD~1..HEAD

Length of output: 1129

Script:
#!/bin/bash
# Display the changes made to sleap_dataset.py
git diff HEAD~1..HEAD -- biogtr/datasets/sleap_dataset.py

Length of output: 10805

Actions performed

Review triggered.
Actionable comments posted: 9

Actionable comments outside the diff hunks (1)

biogtr/datasets/data_utils.py (1)

Line range hint 342-342: Avoid using bare `except` statements. Using a bare `except` statement can catch unexpected exceptions and make debugging more difficult. Specify the exception type to improve error handling and maintainability.
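A minimal sketch of the suggested pattern, catching a specific exception rather than a bare `except:`. The function and its inputs are illustrative, not code from this PR:

```python
import warnings

def parse_frame_index(name: str) -> int:
    """Extract a trailing integer from a name like 'frame_0042'.

    Catching only ValueError (instead of a bare `except:`) leaves
    KeyboardInterrupt, SystemExit, and genuine bugs visible to the caller.
    """
    try:
        return int(name.split("_")[-1])
    except ValueError:
        warnings.warn(f"Could not parse a frame index from {name!r}; defaulting to 0")
        return 0

print(parse_frame_index("frame_0042"))  # 42
print(parse_frame_index("frame_final"))  # 0 (with a warning)
```

A bare `except:` here would also swallow a Ctrl-C pressed during the call, which is exactly the kind of debugging surprise the review comment warns about.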
if not self.persistent_tracking:
    if self.verbose:
        warnings.warn(f"Clearing Queue after tracking")
Use of f-string without placeholders.
- warnings.warn(f"Clearing Queue after tracking")
+ warnings.warn("Clearing Queue after tracking")
def to(self, map_location):
    """Move instance to different device or change dtype. (See `torch.to` for more info).

    Args:
        map_location: Either the device or dtype for the instance to be moved.

    Returns:
        self: reference to the instance moved to correct device/dtype.
    """
    self._gt_track_id = self._gt_track_id.to(map_location)
    self._pred_track_id = self._pred_track_id.to(map_location)
    self._bbox = self._bbox.to(map_location)
    self._crop = self._crop.to(map_location)
    self._features = self._features.to(map_location)
    self.device = map_location
    return self
The method `to` should handle cases where `map_location` is `None` to avoid potential errors when no device or dtype is specified.
+ if map_location is not None:
Committable suggestion:

def to(self, map_location):
    """Move instance to different device or change dtype. (See `torch.to` for more info).

    Args:
        map_location: Either the device or dtype for the instance to be moved.

    Returns:
        self: reference to the instance moved to correct device/dtype.
    """
    if map_location is not None:
        self._gt_track_id = self._gt_track_id.to(map_location)
        self._pred_track_id = self._pred_track_id.to(map_location)
        self._bbox = self._bbox.to(map_location)
        self._crop = self._crop.to(map_location)
        self._features = self._features.to(map_location)
        self.device = map_location
    return self
import seaborn as sns
import imageio
import hydra
import pandas as pd
import numpy as np
import cv2
import imageio
from matplotlib import pyplot
Remove unused import `matplotlib.pyplot`.

- from matplotlib import pyplot
@@ -80,7 +81,7 @@ def main(cfg: DictConfig):
     if cfg.view_batch.no_train:
         return

-    model = train_cfg.get_gtr_runner()
+    model = train_cfg.get_gtr_runner()  # TODO see if we can use torch.compile()
A TODO comment has been added regarding the use of `torch.compile()`. This should be addressed or tracked in the project's issue tracker so it does not get overlooked.

Would you like me to open a GitHub issue to track the investigation into using `torch.compile()` in this context?
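For reference, a minimal sketch of what adopting `torch.compile()` could look like. This assumes PyTorch >= 2.0; the toy model stands in for the runner returned by `train_cfg.get_gtr_runner()`, and the `backend="eager"` debug backend is used here only so the sketch runs without a compiler toolchain (the default inductor backend is what would normally be benchmarked):

```python
import torch

# Toy stand-in for the model under discussion; not the actual GTR runner.
model = torch.nn.Sequential(torch.nn.Linear(8, 4), torch.nn.ReLU())

# torch.compile returns an optimized callable wrapping the module.
# The first call triggers graph capture; later calls reuse the result.
compiled = torch.compile(model, backend="eager")

x = torch.randn(2, 8)
out = compiled(x)
print(tuple(out.shape))  # (2, 4)
```

Whether compilation helps here would need profiling: models with dynamic shapes or frequent graph breaks (common in tracking loops) may see little benefit.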
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Fix SleapDataset to allow a user-defined anchor around which the crop will be centered. Default to the midpoint if the anchor isn't in the skeleton.
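The described fallback can be sketched as follows; `get_crop_center` and the pose dictionary are hypothetical illustrations of the behavior, not the actual `SleapDataset` API:

```python
import numpy as np

def get_crop_center(points: dict, anchor: str) -> np.ndarray:
    """Return the (x, y) center to crop around.

    `points` maps skeleton node names to (x, y) coordinates. If the
    requested anchor node is missing from the skeleton (or its
    coordinates are NaN), fall back to the midpoint (mean) of the
    available points, mirroring the PR's described default.
    """
    xy = points.get(anchor)
    if xy is not None and not np.isnan(xy).any():
        return np.asarray(xy, dtype=float)
    # Anchor absent: average all nodes with valid coordinates.
    coords = np.array([p for p in points.values() if not np.isnan(p).any()])
    return coords.mean(axis=0)

pose = {"head": (10.0, 12.0), "tail": (30.0, 20.0)}
print(get_crop_center(pose, "head"))    # [10. 12.]
print(get_crop_center(pose, "thorax"))  # [20. 16.]  (midpoint fallback)
```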
Summary by CodeRabbit

New Features
- `EvalDataset` class for merging datasets.

Enhancements

Bug Fixes

Documentation

Refactor