
[Usage]: set num_crops in LVLM #7861

Closed
Liyan06 opened this issue Aug 26, 2024 · 2 comments · Fixed by #8658
Labels
usage How to use vllm

Comments


Liyan06 commented Aug 26, 2024

How do I set num_crops for LVLMs? For example, when initializing the processor for Phi-3.5-vision-instruct, the Hugging Face code looks like the following:

from transformers import AutoProcessor

model_id = "microsoft/Phi-3.5-vision-instruct"
processor = AutoProcessor.from_pretrained(
    model_id,
    trust_remote_code=True,
    num_crops=4,
)

But I didn't find a way to set num_crops in vLLM.

I checked pull request #7710, but it doesn't seem to address this.

@Liyan06 Liyan06 added the usage How to use vllm label Aug 26, 2024
@vllm-project vllm-project deleted a comment Aug 26, 2024
DarkLight1337 (Member) commented Aug 26, 2024

Currently there is no way to pass options to the processor directly. Any help with that is welcome!

alex-jw-brooks (Contributor) commented

Hi @Liyan06, are you planning to submit a PR for this? I'm interested in being able to enable different image processor options for different models as well, and I'm happy to take a pass at adding this, with num_crops for phi3v as an example, if you aren't.
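
For reference, the issue header marks this as fixed by #8658. As a minimal sketch, assuming that PR adds an mm_processor_kwargs argument for forwarding options to the underlying Hugging Face processor, the original example would map to vLLM roughly as follows (the Phi-3.5-vision prompt template and the image loading below are illustrative assumptions, not taken from this thread):

from PIL import Image
from vllm import LLM, SamplingParams

# Assumed interface: mm_processor_kwargs forwards keyword arguments
# (e.g. num_crops) to the model's Hugging Face processor.
llm = LLM(
    model="microsoft/Phi-3.5-vision-instruct",
    trust_remote_code=True,
    max_model_len=4096,
    mm_processor_kwargs={"num_crops": 4},
)

image = Image.open("example.jpg")  # placeholder image path
prompt = "<|user|>\n<|image_1|>\nWhat is shown in this image?<|end|>\n<|assistant|>\n"

outputs = llm.generate(
    {"prompt": prompt, "multi_modal_data": {"image": image}},
    SamplingParams(max_tokens=64),
)
print(outputs[0].outputs[0].text)

For the OpenAI-compatible server, the same options would presumably be passed on the command line (e.g. --mm-processor-kwargs '{"num_crops": 4}'), again assuming the interface added by that PR.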
