[Clean-up] Planned removal of the `max_size` argument
#35090
base: main
Conversation
…` and triggered unnecessary deprecated warning
Add a test to ensure tests pass successfully and backward compatibility is preserved
Remove `max_size` from test pipelines and replace `size` by a `Dict` with `'shortest_edge'` and `'longest_edge'` as keys
…ze mismatching results with previous processing
I started by testing on my own repository to make sure everything works well and to avoid spamming all the models. Here is the link to the returned commit. The pushed config file adds optional arguments and rearranges the arguments in an order sometimes different from that of the original file. If this is ok, we can start pushing the first 5 models.
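For reference, a minimal sketch of the flow being tested here; the repo id is a hypothetical placeholder, and `create_pr=True` is just one possible way to push the updated config back:

```python
from transformers import AutoImageProcessor

# Hypothetical test repository, used only to illustrate the flow.
repo_id = "username/detr-test-repo"

# Loading already converts the legacy `max_size` config into the new `size` dict...
processor = AutoImageProcessor.from_pretrained(repo_id)

# ...so pushing it back re-serializes the full set of (optional) arguments,
# which is presumably why the uploaded file ends up rearranged.
processor.push_to_hub(repo_id, create_pr=True)
```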
Hi @HichTala! Thanks for continuing with this issue 🤗 It's great that you tried using your own model! I suggest:
This way, it will be easier to merge those pull requests, and the difference will be smaller! I suppose it will require modifying the raw JSON file, instead of saving with ImageProcessor. |
Hi @HichTala! Looks good to me, let's proceed with the first five 👍
Hi @qubvel, here are the first 5!
Hi @HichTala! Thanks for creating the PRs and providing links. I just verified them, and everything looks perfect. Special thanks for providing a message with the PR. 🤗 I will ask to merge them and then we can proceed with the next batch.
Hi @HichTala, hope you are doing great! 2 PRs were merged by the owners; however, we can't merge the other PRs on our side, so we'll have to wait a while to see if they get accepted. Let's open more PRs 🤗 I would say the next 20-30 would be enough. And if you can, please provide links to the PRs directly instead of commits, thank you!
Hi @qubvel,
Thank you, 10 is enough! We also need to check the other tags, e.g.
What does this PR do?
This PR aims to serve as a progress tracker for updating the size configurations of detection models on the Hugging Face Hub. The updates focus on aligning the configs with the deprecation of the `max_size` argument while ensuring backward compatibility where needed. Each update will be submitted as an individual PR directly to the model repositories on the Hub, following the script and guidance provided by @qubvel. All individual pull requests made directly to the model repositories on the Hub will be tracked here via links, providing a clear overview of progress and ensuring comprehensive updates across relevant models.
Here's a brief summary of what's been done so far regarding the planned deletion of the `max_size` argument.

This pull request includes multiple changes across several image processing files in the transformers library. The primary focus is on deprecating the `max_size` parameter and replacing it with `size['longest_edge']`. Additionally, backwards compatibility adjustments have been made to handle the `size` parameter more flexibly.
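To make the change concrete, here is a minimal before/after sketch of a processor call; the 800/1333 values are just the commonly used DETR-style numbers, shown for illustration only:

```python
from transformers import DetrImageProcessor

# Old style (deprecated): an integer `size` plus a separate `max_size` argument
# processor = DetrImageProcessor(size=800, max_size=1333)

# New style: a single `size` dict with both edges made explicit
processor = DetrImageProcessor(size={"shortest_edge": 800, "longest_edge": 1333})
```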
**Deprecation of the `max_size` parameter:**

- `src/transformers/models/conditional_detr/image_processing_conditional_detr.py`: Replaced `logger.warning_once` with `logger.error` for the `max_size` parameter deprecation and removed handling of `max_size` in the `__init__`, `resize`, and `preprocess` methods. [1] [2] [3] [4]
- `src/transformers/models/deformable_detr/image_processing_deformable_detr.py`: Similar changes to deprecate `max_size` and update the methods accordingly. [1] [2] [3] [4]
- `src/transformers/models/detr/image_processing_detr.py`: Updated to use `logger.error` for the deprecation and removed `max_size` handling. [1] [2] [3] [4]
- `src/transformers/models/grounding_dino/image_processing_grounding_dino.py`: Changed `logger.warning_once` to `raise ValueError` for the `max_size` deprecation and removed related code. [1] [2] [3] [4]
- `src/transformers/models/rt_detr/image_processing_rt_detr.py`: Updated to handle the `max_size` deprecation similarly.
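As a rough illustration of the pattern described above (not the exact diff, and whether a file logs an error or raises differs per model), a hypothetical guard rejecting the removed argument might look like this:

```python
# Sketch only: `_reject_max_size` is a hypothetical helper, not code from this PR.
def _reject_max_size(max_size=None):
    if max_size is not None:
        raise ValueError(
            "The `max_size` argument has been removed. Pass "
            "`size={'shortest_edge': ..., 'longest_edge': ...}` instead."
        )
```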
**Backwards compatibility adjustments:**

- `src/transformers/models/conditional_detr/image_processing_conditional_detr.py`: Added handling for `size` when it is an integer.
- `src/transformers/models/deformable_detr/image_processing_deformable_detr.py`: Included similar handling for integer `size`.
- `src/transformers/models/detr/image_processing_detr.py`: Adjusted `size` handling for backwards compatibility.
- `src/transformers/models/grounding_dino/image_processing_grounding_dino.py`: Added compatibility handling for `size`.
- `src/transformers/models/rt_detr/image_processing_rt_detr.py`: Included backwards compatibility for `size`.
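For context, a minimal sketch of the kind of normalization this implies (assumed shape, not the actual code; 1333 is only the historical DETR default, used here for illustration):

```python
# Hypothetical helper: turn a legacy integer `size` into the dict form.
def normalize_size(size, default_longest_edge=1333):
    if isinstance(size, int):
        # A bare int used to mean "resize the shortest edge to this value".
        return {"shortest_edge": size, "longest_edge": default_longest_edge}
    return size

assert normalize_size(800) == {"shortest_edge": 800, "longest_edge": 1333}
```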
This work follows the issue and pull request linked below:

- `max_size` in `DetrImageProcessor.preprocess` #34977

Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
@qubvel