decouple batch_size to det_batch_size, rec_batch_size and kie_batch_size in MMOCRInferencer #1801
Conversation
**Codecov Report**

```diff
@@            Coverage Diff             @@
##           dev-1.x    #1801      +/-  ##
===========================================
- Coverage    89.36%   89.29%    -0.07%
===========================================
  Files          192      192
  Lines        11279    11291       +12
  Branches      1596     1602        +6
===========================================
+ Hits         10079    10082        +3
- Misses         886      889        +3
- Partials       314      320        +6
```
Thanks for your contribution! The PR overall looks good, and it would be even better if you could polish the documentation to make others aware of this change by appending instructions after line 462:
mmocr/docs/en/user_guides/inference.md, lines 456 to 462 in bfb36d8:
**MMOCRInferencer.\_\_call\_\_()**

| Arguments | Type | Default | Description |
| -------------------- | ----------------------- | ------------ | ------------------------------------------------------------------------------------------------ |
| `inputs` | str/list/tuple/np.array | **required** | It can be a path to an image/a folder, an np array or a list/tuple (with img paths or np arrays) |
| `return_datasamples` | bool | False | Whether to return results as DataSamples. If False, the results will be packed into a dict. |
| `batch_size` | int | 1 | Inference batch size. |
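For illustration, the appended rows might look roughly like this (a sketch of possible wording, not necessarily what was merged):

| `det_batch_size` | int, optional | None | Batch size for text detection model. Overwrites `batch_size` if it is not None. |
| `rec_batch_size` | int, optional | None | Batch size for text recognition model. Overwrites `batch_size` if it is not None. |
| `kie_batch_size` | int, optional | None | Batch size for KIE model. Overwrites `batch_size` if it is not None. |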
```python
chunk_size (int): For chunking inputs. Defaults to None.
    If it is None, batch_size will be used as chunk_size.
```
It's hard to tell the conceptual difference between `chunk_size` and `batch_size` when they are presented together. I think keeping only `batch_size` may be the better choice - simplicity always goes first.
```python
det_batch_size: int = None,
rec_batch_size: int = None,
kie_batch_size: int = None,
```
Suggested change:

```diff
-det_batch_size: int = None,
-rec_batch_size: int = None,
-kie_batch_size: int = None,
+det_batch_size: Optional[int] = None,
+rec_batch_size: Optional[int] = None,
+kie_batch_size: Optional[int] = None,
```
```python
det_batch_size (int): Batch size for text detection model.
    Defaults to None. Overwrite batch_size if it is not None.
rec_batch_size (int): Batch size for text recognition model.
    Defaults to None. Overwrite batch_size if it is not None.
kie_batch_size (int): Batch size for KIE model.
```
Suggested change:

```diff
-det_batch_size (int): Batch size for text detection model.
-    Defaults to None. Overwrite batch_size if it is not None.
-rec_batch_size (int): Batch size for text recognition model.
-    Defaults to None. Overwrite batch_size if it is not None.
-kie_batch_size (int): Batch size for KIE model.
+det_batch_size (int, optional): Batch size for text detection model.
+    Defaults to None. Overwrite batch_size if it is not None.
+rec_batch_size (int, optional): Batch size for text recognition model.
+    Defaults to None. Overwrite batch_size if it is not None.
+kie_batch_size (int, optional): Batch size for KIE model.
```
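The "overwrite `batch_size` if it is not None" rule amounts to a simple per-stage fallback. A hypothetical helper sketching that semantics (not the PR's actual code):

```python
from typing import Optional, Tuple


def resolve_batch_sizes(
    batch_size: int,
    det_batch_size: Optional[int] = None,
    rec_batch_size: Optional[int] = None,
    kie_batch_size: Optional[int] = None,
) -> Tuple[int, int, int]:
    """Fall back to batch_size for any stage-specific size left as None."""
    return (
        det_batch_size if det_batch_size is not None else batch_size,
        rec_batch_size if rec_batch_size is not None else batch_size,
        kie_batch_size if kie_batch_size is not None else batch_size,
    )


# e.g. resolve_batch_sizes(4, rec_batch_size=16) returns (4, 16, 4)
```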
@gaotongxiao Thanks for the feedback! I updated the code and documentation.
Thanks! LGTM now.
Hi @hugotong6425! First of all, we want to express our gratitude for your significant PR in the mmocr project. Your contribution is highly appreciated, and we are grateful for your efforts in helping improve this open-source project during your personal time. We believe that many developers will benefit from your PR.

We would also like to invite you to join our Special Interest Group (SIG) private channel on Discord, where you can share your experiences and ideas and build connections with like-minded peers. To join the SIG channel, simply message the moderator, OpenMMLab, on Discord, or briefly share your open-source contributions in the #introductions channel and we will assist you. We look forward to seeing you there! Join us: https://discord.gg/raweFPmdzG

If you have WeChat, you are welcome to join our community there as well. You can add our assistant: openmmlabwx. Please add "mmsig + GitHub ID" as a remark when adding friends :)
Motivation
In response to #1799.
Modification
The `MMOCRInferencer.__call__` method is modified to provide the flexibility of setting different values for `det_batch_size`, `rec_batch_size`, `kie_batch_size`, and `chunk_size`. Any of these four parameters overrides `batch_size` when it is not None.
To maintain backward compatibility, the default behavior of `MMOCRInferencer.__call__` remains unchanged.
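For example, a call with decoupled batch sizes might look like this (a minimal sketch assuming the mmocr 1.x inference API; the model aliases and image path are illustrative):

```python
from mmocr.apis import MMOCRInferencer

# Build an end-to-end pipeline with detection, recognition and KIE models.
infer = MMOCRInferencer(det='DBNet', rec='SAR', kie='SDMGR')

# Each stage now batches independently; any size left unset falls back
# to the generic batch_size.
results = infer(
    'demo/demo_kie.jpeg',
    det_batch_size=2,
    rec_batch_size=16,
    kie_batch_size=4,
)
```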