support fp16 training and testing #230
Conversation
Codecov Report
@@            Coverage Diff             @@
##           master     #230      +/-   ##
==========================================
+ Coverage   70.01%   70.05%    +0.03%
==========================================
  Files          87       87
  Lines        4496     4532       +36
  Branches      877      884        +7
==========================================
+ Hits         3148     3175       +27
- Misses       1052     1062       +10
+ Partials      296      295        -1
mmtrack/models/mot/deep_sort.py (outdated diff)

-    'Please train `detector` and `reid` models first and \
-        inference with Tracktor.')
+    'Please train `detector` and `reid` models firstly, then \
+        inference with Sort/DeepSort.')
"SORT/DeepSORT"
mmtrack/apis/train.py (outdated diff; the hunk opens inside an `if fp16_cfg is not None:` branch)

    optimizer_config = Fp16OptimizerHook(
        **cfg.optimizer_config, **fp16_cfg, distributed=distributed)
elif distributed and 'type' not in cfg.optimizer_config:
    optimizer_config = OptimizerHook(**cfg.optimizer_config)
else:
    optimizer_config = cfg.optimizer_config
if fp16_cfg is not None:
    optimizer_config.update(fp16_cfg, distributed=distributed)
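The hook selection above is driven by an `fp16` entry in the config; in mmcv-style codebases this is typically a one-line addition to the config file. A minimal sketch (the `loss_scale` value is only an example):

```python
# config fragment: turn on mixed-precision training; apis/train.py reads
# this as fp16_cfg = cfg.get('fp16', None) and picks Fp16OptimizerHook
fp16 = dict(loss_scale=512.)
```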
Review suggestion: build the hook config as a plain dict instead of instantiating the hook directly, e.g.

if 'type' not in cfg.optimizer_config:
    optimizer_config['type'] = 'Fp16OptimizerHook' if fp16_cfg else 'OptimizerHook'
if fp16_cfg:
    optimizer_config.update(fp16_cfg)
optimizer_config.update(distributed=distributed)
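Under this suggestion, the resulting dict would be turned into a hook through mmcv's registry rather than by calling a constructor; a minimal sketch, assuming mmcv's `HOOKS` registry (both `OptimizerHook` and `Fp16OptimizerHook` are registered there):

```python
from mmcv.runner import HOOKS
from mmcv.utils import build_from_cfg

# example dict produced by the suggestion above (values are illustrative)
optimizer_config = dict(
    type='Fp16OptimizerHook', grad_clip=None, loss_scale=512.,
    distributed=True)

# the registry resolves 'type' to the hook class and passes the rest as kwargs
optimizer_hook = build_from_cfg(optimizer_config, HOOKS)
```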