Simplified chained-comparison. #265
Conversation
Codecov Report
@@ Coverage Diff @@
## master #265 +/- ##
==========================================
+ Coverage 83.03% 83.09% +0.05%
==========================================
Files 112 112
Lines 6448 6441 -7
Branches 1029 1029
==========================================
- Hits 5354 5352 -2
+ Misses 901 896 -5
Partials 193 193
Before:
if (pt[2] > 0 and x >= 0 and y >= 0 and x < self.output_res
        and y < self.output_res):
After:
if (pt[2] > 0 and self.output_res > y >= 0
        and self.output_res > x >= 0):
< is more natural, e.g. 0 <= x < self.output_res
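The reviewer's suggestion relies on Python's comparison chaining: `0 <= x < n` is evaluated as `0 <= x and x < n`, with `x` evaluated only once. A minimal sketch of the bounds check being discussed (the helper name `in_bounds` is hypothetical, not from the PR):

```python
def in_bounds(x, y, output_res):
    # Chained form reads left-to-right like the math notation 0 <= x < n;
    # it is equivalent to (0 <= x) and (x < output_res), etc.
    return 0 <= x < output_res and 0 <= y < output_res


print(in_bounds(3, 4, 5))   # True: both coordinates lie in [0, 5)
print(in_bounds(5, 4, 5))   # False: x == output_res is out of range
print(in_bounds(-1, 0, 5))  # False: negative x is out of range
```

The reversed form `self.output_res > x >= 0` is logically identical, but writing the bounds in ascending order matches the usual interval notation, which is why the reviewer calls it more natural.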
Modified.
Force-pushed from 0933706 to 7b39dd5
* Simplified chained-comparison.
* Modified following mentor's convention.
* Pre-commit.
* Support multiple optimizers
* minor refinement
* improve unit tests
* minor fix
* Update unit tests for resuming or saving ckpt for multiple optimizers
* refine docstring
* refine docstring
* fix typo
* update docstring
* refactor the logic to build multiple optimizers
* resolve comments
* ParamSchedulers supports multiple optimizers
* add optimizer_wrapper
* fix comment and docstring
* fix unit test
* add unit test
* refine docstring
* RuntimeInfoHook supports printing multi learning rates
* resolve comments
* add optimizer_wrapper
* fix mypy
* fix lint
* fix OptimizerWrapperDict docstring and add unit test
* rename OptimizerWrapper to OptimWrapper, OptimWrapperDict inherits OptimWrapper, and fix as commented
* Fix AmpOptimizerWrapper
* rename build_optmizer_wrapper to build_optim_wrapper
* refine optimizer wrapper
* fix AmpOptimWrapper.step, docstring
* resolve conflict
* rename DefaultOptimConstructor
* fix as commented
* rename clip grad arguments
* refactor optim_wrapper config
* fix docstring of DefaultOptimWrapperConstructor
* add get_lr method to OptimWrapper and OptimWrapperDict
* skip some amp unit tests
* fix unit test
* fix get_lr, get_momentum docstring
* refactor get_lr, get_momentum, fix as commented
* fix error message

Co-authored-by: zhouzaida <zhouzaida@163.com>
Fix #245