
Topdown multi batch inference #390

Merged 12 commits into master from topdown_multi_batch_inference on Jan 4, 2021

Conversation

@wusize wusize (Collaborator) commented Dec 30, 2020

Topdown multi batch inference
fix #25
fix #120
fix #197

@codecov codecov bot commented Dec 30, 2020

Codecov Report

Merging #390 (b7d4199) into master (e6801ba) will decrease coverage by 0.33%.
The diff coverage is 65.76%.


@@            Coverage Diff             @@
##           master     #390      +/-   ##
==========================================
- Coverage   82.77%   82.44%   -0.33%     
==========================================
  Files         120      121       +1     
  Lines        7874     8028     +154     
  Branches     1251     1298      +47     
==========================================
+ Hits         6518     6619     +101     
- Misses       1097     1145      +48     
- Partials      259      264       +5     
Flag Coverage Δ
unittests 82.44% <65.76%> (-0.33%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
mmpose/apis/train.py 17.02% <ø> (ø)
...s/datasets/top_down/topdown_posetrack18_dataset.py 35.21% <0.00%> (-0.51%) ⬇️
...sets/datasets/top_down/topdown_mpii_trb_dataset.py 48.38% <18.75%> (-2.51%) ⬇️
...datasets/datasets/top_down/topdown_mpii_dataset.py 40.83% <27.77%> (-1.89%) ⬇️
...atasets/datasets/top_down/topdown_jhmdb_dataset.py 90.44% <86.66%> (-0.66%) ⬇️
...datasets/datasets/top_down/topdown_coco_dataset.py 87.96% <88.88%> (-0.28%) ⬇️
mmpose/models/detectors/top_down.py 79.44% <93.75%> (+0.03%) ⬆️
...atasets/top_down/topdown_coco_wholebody_dataset.py 97.16% <100.00%> (+0.04%) ⬆️
mmpose/datasets/pipelines/shared_transform.py 96.87% <100.00%> (+0.10%) ⬆️
mmpose/models/backbones/mspn.py 88.03% <0.00%> (-6.30%) ⬇️
... and 4 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update e6801ba...b7d4199.

@innerlee innerlee (Contributor) commented

Some minor comments

@open-mmlab open-mmlab deleted a comment from CLAassistant Jan 4, 2021
@innerlee innerlee merged commit 9f077d8 into master Jan 4, 2021
@innerlee innerlee deleted the topdown_multi_batch_inference branch January 4, 2021 13:21
rollingman1 pushed a commit to rollingman1/mmpose that referenced this pull request Nov 5, 2021
* resolve comments

* update changelog

* add test_dataloader settings to localization cfg
shuheilocale pushed a commit to shuheilocale/mmpose that referenced this pull request May 6, 2023
* batch_inference

* batch_inference

* batch_inference for topdown

* topdown_multi_batch_inference

* topdown_multi_batch_inference

* topdown_multi_batch_inference

* topdown_multi_batch_inference

* topdown_multi_batch_inference

* some modifications

* topdown_mpii_dataset

* topdown_posetrack18_dataset

* assert bbox_id while bs > 1

Co-authored-by: sensetime <sensetime@sensetime.domain.sensetime.com>
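The final commit message above, "assert bbox_id while bs > 1", points at the key bookkeeping problem in batched top-down testing: when several boxes are processed per forward pass, predictions must be re-associated with their original boxes before evaluation. A hedged sketch of that regrouping; the `bbox_id` field name comes from the commit message, while the helper and data layout are illustrative assumptions, not the PR's actual code:

```python
# Illustrative regrouping of batched top-down predictions by bbox_id.
# Each result dict carries the bbox_id assigned before batching, so the
# original per-box order can be restored regardless of batch boundaries.
def sort_and_unique_bboxes(results, key='bbox_id'):
    """Sort predictions by bbox_id and drop duplicated boxes."""
    results = sorted(results, key=lambda r: r[key])
    deduped = []
    for r in results:
        if not deduped or deduped[-1][key] != r[key]:
            deduped.append(r)
    return deduped

batched_out = [{'bbox_id': 2, 'score': 0.7},
               {'bbox_id': 0, 'score': 0.9},
               {'bbox_id': 1, 'score': 0.8},
               {'bbox_id': 1, 'score': 0.8}]  # duplicate from batch padding
print([r['bbox_id'] for r in sort_and_unique_bboxes(batched_out)])  # → [0, 1, 2]
```

Asserting that every sample carries a `bbox_id` when the batch size exceeds 1 guards against silently mis-ordered or duplicated predictions after this regrouping step.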
HAOCHENYE added a commit to HAOCHENYE/mmpose that referenced this pull request Jun 27, 2023
* Support use 'global var' in config function

* upload test file
ajgrafton pushed a commit to ajgrafton/mmpose that referenced this pull request Mar 6, 2024
Successfully merging this pull request may close these issues:

* Support Batch Inference
* Multiple samples per GPU while testing
* batch inference

3 participants