
Merge main to inference #35

Closed
wants to merge 43 commits into from

Conversation

gitttt-1234
Collaborator

No description provided.

gitttt-1234 and others added 30 commits October 19, 2023 14:18
* added make_centered_bboxes & normalize_bboxes

* added make_centered_bboxes & normalize_bboxes
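
As a rough illustration of what a `make_centered_bboxes` helper like the one added above might do, here is a minimal NumPy sketch. The function name comes from the commit message; the `(y1, x1, y2, x2)` box convention and the signature are assumptions, and the actual sleap-nn implementation operates on PyTorch tensors:

```python
import numpy as np

def make_centered_bboxes(centroids: np.ndarray, box_height: int, box_width: int) -> np.ndarray:
    """Return (n, 4) boxes as (y1, x1, y2, x2), each centered on a (y, x) centroid.

    Using (size - 1) / 2 half-extents keeps the centroid exactly at the
    center pixel of an odd-sized crop.
    """
    half_h = (box_height - 1) / 2
    half_w = (box_width - 1) / 2
    y, x = centroids[:, 0], centroids[:, 1]
    return np.stack([y - half_h, x - half_w, y + half_h, x + half_w], axis=1)

# Example: a 5x5 box centered on the point (10, 20).
boxes = make_centered_bboxes(np.array([[10.0, 20.0]]), box_height=5, box_width=5)
```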

* created test_instance_cropping.py

* added test normalize bboxes; added find_global_peaks_rough

* black formatted

* black formatted peak_finding

* added make_grid_vectors, normalize_bboxes, integral_regression, added docstring to make_centered_bboxes, fixed find_global_peaks_rough; added crop_bboxes

* finished find_global_peaks with integral regression over centroid crops!
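
The "integral regression over centroid crops" mentioned above can be sketched as a rough argmax peak refined by a soft-argmax (probability-weighted expected coordinate) over a small window. This is a minimal NumPy sketch under assumed names and a default 5x5 crop; it is not the sleap-nn implementation, which works on batched PyTorch tensors:

```python
import numpy as np

def find_global_peaks(cms: np.ndarray, crop_size: int = 5) -> np.ndarray:
    """Find one (y, x) peak per confidence map in a (n, h, w) stack.

    Step 1: rough integer peak via argmax.
    Step 2: integral regression -- normalize a crop around the rough peak
    into a probability map and take the expected coordinate under it.
    """
    n, h, w = cms.shape
    flat = cms.reshape(n, -1)
    rough = np.stack(np.unravel_index(flat.argmax(axis=1), (h, w)), axis=1).astype(np.float32)
    half = crop_size // 2
    refined = rough.copy()
    for i, (py, px) in enumerate(rough.astype(int)):
        y0, y1 = max(py - half, 0), min(py + half + 1, h)
        x0, x1 = max(px - half, 0), min(px + half + 1, w)
        crop = cms[i, y0:y1, x0:x1]
        p = crop / crop.sum()  # normalize crop to a probability map
        gy = np.arange(y0, y1, dtype=np.float32)
        gx = np.arange(x0, x1, dtype=np.float32)
        refined[i] = [(p.sum(axis=1) * gy).sum(), (p.sum(axis=0) * gx).sum()]
    return refined

# Example: a Gaussian confidence map centered at (10, 20).
yy, xx = np.mgrid[0:32, 0:32]
cm = np.exp(-((yy - 10) ** 2 + (xx - 20) ** 2) / 8.0)[np.newaxis]
peaks = find_global_peaks(cm)
```

Because the crop around the true maximum is symmetric here, the refined peak matches the rough one; with sub-pixel-shifted maps, the weighted average moves the estimate off the integer grid.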

* reformatted with pydocstyle & black

* moved make_grid_vectors to data/utils

* removed normalize_bboxes

* added tests docstrings

* sorted imports with isort

* remove unused imports

* updated test cases for instance cropping

* added minimal_cms.pt fixture + unit tests

* added minimal_bboxes fixture; added unit tests for crop_bboxes & integral_regression

* added find_global_peaks unit tests

* finished find_local_peaks_rough!

* finished find_local_peaks!

* added unit tests for find_local_peaks and find_local_peaks_rough

* updated test cases

* added more test cases for find_local_peaks

* updated test cases

* added architectures folder

* added maxpool2d same padding, get_act_fn; added simpleconvblock, simpleupsamplingblock, encoder, decoder; added unet

* added test_unet_reference

* black formatted common.py & test_unet.py

* deleted tmp nb

* _calc_same_pad returns int

* fixed test case

* added simpleconvblock tests

* added tests

* added tests for simple upsampling block

* updated test_unet

* removed unnecessary variables

* updated augmentation random erase default values

* created data/pipelines.py

* added base config in config/data; temporary till config system settled

* updated variable defaults to 0 and edited variable names in augmentation

* updated parameter names in data/instance_cropping

* added data/pipelines topdown pipeline make_base_pipeline

* added test_pipelines

* removed configs

* updated augmentation class

* modified test

* removed cuda cache

* added Model builder class and heads

* added type hinting for init

* black reformatted heads.py

* updated model.py

* updated test_model.py

* updated test_model.py

* updated pipelines docstring

* added from_config for Model

* added more act fn to get_act_fn
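
A `get_act_fn`-style helper is typically a name-to-callable registry. Below is a plain-Python stand-in for the pattern (the real version would map names to `torch.nn` modules such as `ReLU` or `SiLU`; the registry contents here are illustrative only):

```python
from typing import Callable

def _relu(x: float) -> float:
    return max(0.0, x)

# Hypothetical registry; the actual sleap-nn version maps to torch.nn modules.
ACT_FNS: dict[str, Callable[[float], float]] = {
    "relu": _relu,
    "identity": lambda x: x,
}

def get_act_fn(name: str) -> Callable[[float], float]:
    """Look up an activation by name, failing loudly on unknown keys."""
    if name not in ACT_FNS:
        raise KeyError(f"Unknown activation: {name!r}; expected one of {sorted(ACT_FNS)}")
    return ACT_FNS[name]
```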

* black reformatted & updated model.py & test_model.py

* updated config, typehints, black formatted & added doc strings

* added test_heads.py

* updated module docstring

* updated Model docstring

* added coderabbit suggestions

* black reformat

* added 2 helper methods for getting backbone/head; separated common and utils in architectures

* removed comments

* updated test_get_act_fn

* added multi-head feature to Model

* black reformatted model.py

* added all test cases for heads.py

* reformatted test_heads.py

* updated L44 in confidence_maps.py

* added output channels to unet

* resolved merge conflicts + small bugs

* black reformatted

* added coderabbit suggestions

* not sure how intermediate features + multi head would work

* Separate Augmentations into Intensity and Geometric (#18)

* initial commit

* separated intensity and geometric augmentations

* test

* pseudo code in model.py

* small fix

* name property in heads.py

* name of head docstring added

* added ruff cache to gitignore; added head selection in Model class

* updated return value for decoder

* small change to model.py

* made model.py forward more efficient

* small comments updated in instance_cropping for clarity

* updated output structure of unet to dict; updated model.py attribute head

* added single instance confmaps pipeline

* added test cases

* fixed batching issue

* updated test cases

* fixed pydocstyle

* updated toml; added keep keys explicitly; added test case for singleinstance pipeline

* updated toml

* updated test_pipelines.py test case

* black reformatted

---------

Co-authored-by: Talmo Pereira <talmo@salk.edu>
Contributor

coderabbitai bot commented Dec 23, 2023

Important

Auto Review Skipped

Auto reviews are disabled on base/target branches other than the default branch. Please add the base/target branch pattern to the list of additional branches to be reviewed in the settings.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository.

To trigger a single review, invoke the @coderabbitai review command.



Tips

Chat with CodeRabbit Bot (@coderabbitai)

  • You can directly reply to a review comment made by CodeRabbit. Example:
    • I pushed a fix in commit <commit_id>.
  • You can tag CodeRabbit on specific lines of code or entire files in the PR by tagging @coderabbitai in a comment.
  • You can tag @coderabbitai in a PR comment and ask questions about the PR and the codebase. Use quoted replies to pass the context for follow-up questions. Examples:
    • @coderabbitai render interesting statistics about this repository as a table.
    • @coderabbitai show all the console.log statements in this repository.
    • @coderabbitai generate unit tests for the src/utils.ts file.

CodeRabbit Commands (invoked as PR comments)

  • @coderabbitai pause to pause the reviews on a PR.
  • @coderabbitai resume to resume the paused reviews.
  • @coderabbitai review to trigger a review. This is useful when automatic reviews are disabled for the repository.
  • @coderabbitai resolve to resolve all the CodeRabbit review comments.
  • @coderabbitai help to get help.

Additionally, you can add @coderabbitai ignore anywhere in the PR description to prevent this PR from being reviewed.

CodeRabbit Configuration File (.coderabbit.yaml)

  • You can programmatically configure CodeRabbit by adding a .coderabbit.yaml file to the root of your repository.
  • The JSON schema for the configuration file is available here.
  • If your editor has YAML language server enabled, you can add the path at the top of this file to enable auto-completion and validation: # yaml-language-server: $schema=https://coderabbit.ai/integrations/coderabbit-overrides.v2.json

CodeRabbit Discord Community

Join our Discord Community to get help, request features, and share feedback.

@gitttt-1234 gitttt-1234 requested a review from alckasoc December 23, 2023 00:50

codecov bot commented Dec 23, 2023

Codecov Report

Attention: 5 lines in your changes are missing coverage. Please review.

Comparison: base (649280d) at 99.78% coverage vs. head (3db6fd7) at 84.77%.
The report is 2 commits behind head on divya/inference.

File                        Patch %    Lines missing
sleap_nn/model_trainer.py   96.29%     5 ⚠️
Additional details and impacted files
@@                 Coverage Diff                  @@
##           divya/inference      #35       +/-   ##
====================================================
- Coverage            99.78%   84.77%   -15.02%     
====================================================
  Files                   19       21        +2     
  Lines                  936     1274      +338     
====================================================
+ Hits                   934     1080      +146     
- Misses                   2      194      +192     

