
support fp16 training and testing #230

Merged: 14 commits into open-mmlab:master on Aug 5, 2021
Conversation

@GT9505 (Collaborator) commented Aug 2, 2021

No description provided.

codecov bot commented Aug 2, 2021

Codecov Report

Merging #230 (ca8d3b3) into master (f24cbfc) will increase coverage by 0.03%.
The diff coverage is 75.86%.

❗ Current head ca8d3b3 differs from pull request most recent head c63227a. Consider uploading reports for the commit c63227a to get more accurate results.

@@            Coverage Diff             @@
##           master     #230      +/-   ##
==========================================
+ Coverage   70.01%   70.05%   +0.03%     
==========================================
  Files          87       87              
  Lines        4496     4532      +36     
  Branches      877      884       +7     
==========================================
+ Hits         3148     3175      +27     
- Misses       1052     1062      +10     
+ Partials      296      295       -1     
Flag        Coverage Δ
unittests   70.01% <75.86%> (+0.03%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files                                 Coverage Δ
mmtrack/apis/train.py                          16.07% <0.00%> (-0.60%) ⬇️
mmtrack/models/mot/deep_sort.py                60.37% <ø> (ø)
mmtrack/models/mot/tracktor.py                 65.62% <0.00%> (+2.93%) ⬆️
mmtrack/core/optimizer/sot_optimizer_hook.py   33.33% <33.33%> (-4.17%) ⬇️
mmtrack/models/reid/linear_reid_head.py        90.66% <87.50%> (+0.52%) ⬆️
mmtrack/models/vid/base.py                     43.20% <88.88%> (+0.45%) ⬆️
mmtrack/core/motion/flow.py                    100.00% <100.00%> (ø)
mmtrack/core/optimizer/__init__.py             100.00% <100.00%> (ø)
mmtrack/core/track/transforms.py               97.95% <100.00%> (ø)
mmtrack/models/mot/base.py                     42.85% <100.00%> (+0.54%) ⬆️
... and 8 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update f24cbfc...c63227a

Review thread on the error messages raised by the MOT models:

'Please train `detector` and `reid` models first and \
inference with Tracktor.')
'Please train `detector` and `reid` models firstly, then \
inference with Sort/DeepSort.')
A collaborator commented on the second message: use the official capitalization "SORT/DeepSORT" rather than "Sort/DeepSort".

Review thread on the optimizer-hook setup in mmtrack/apis/train.py:

    optimizer_config = Fp16OptimizerHook(
        **cfg.optimizer_config, **fp16_cfg, distributed=distributed)
elif distributed and 'type' not in cfg.optimizer_config:
    optimizer_config = OptimizerHook(**cfg.optimizer_config)
else:
    optimizer_config = cfg.optimizer_config
    if fp16_cfg is not None:
        optimizer_config.update(fp16_cfg, distributed=distributed)
A collaborator suggested driving the choice through the hook's `type` field instead:

if 'type' not in cfg.optimizer_config:
    optimizer_config['type'] = 'Fp16OptimizerHook' if fp16_cfg else 'OptimizerHook'
if fp16_cfg:
    optimizer_config.update(fp16_cfg)
optimizer_config.update(distributed=distributed)
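A minimal sketch of how that dict-based variant would be consumed, assuming mmcv 1.x, where both hooks are registered in the HOOKS registry. The cfg values below are stand-ins; note that plain OptimizerHook takes no `distributed` argument, so the final update only type-checks on the fp16 path:

from mmcv.runner import HOOKS  # registry holding OptimizerHook and Fp16OptimizerHook
from mmcv.utils import build_from_cfg

# Stand-ins for cfg.optimizer_config and cfg.get('fp16', None).
optimizer_config = dict(grad_clip=None)
fp16_cfg = dict(loss_scale=512.)
distributed = True

# Select the hook class by its registry name, as suggested above.
if 'type' not in optimizer_config:
    optimizer_config['type'] = 'Fp16OptimizerHook' if fp16_cfg else 'OptimizerHook'
if fp16_cfg:
    optimizer_config.update(fp16_cfg)
optimizer_config.update(distributed=distributed)

# The runner later instantiates the hook from this dict, e.g.:
# Fp16OptimizerHook(grad_clip=None, loss_scale=512., distributed=True)
optimizer_hook = build_from_cfg(optimizer_config, HOOKS)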
    

@OceanPang merged commit c89fe8e into open-mmlab:master on Aug 5, 2021
@GT9505 deleted the fp16 branch on August 11, 2021
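With fp16 support merged, enabling it should only need the mmdet-style `fp16` field in a model config, which the train.py hunk above picks up as `fp16_cfg`. A hedged usage sketch (the loss-scale value is illustrative):

# Adding this line to a model config routes training through Fp16OptimizerHook.
# 512. is a common static loss scale.
fp16 = dict(loss_scale=512.)

For fp16 testing, the same field is typically consumed by casting the built model with mmcv's wrap_fp16_model before inference.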