Conversation

@Isotr0py Isotr0py commented Mar 26, 2025

Slack Discussion: https://vllm-dev.slack.com/archives/C07QCGVDNUF/p1743003881430129?thread_ts=1743001000.770969&cid=C07QCGVDNUF

  • Fix broken Mllama interleaved images input support
  • Rewrite the multi-images example to use interleaved images input (see the sketch below)
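
As a rough illustration of what the rewritten multi-images example aims at, here is a minimal sketch of an interleaved-images request to vLLM. This is not the PR's actual example code: the model ID matches the one used in the tests, but the image paths, prompt layout, and engine arguments are assumptions — the authoritative version is the updated example in the repository.

```python
# Minimal sketch of an interleaved-images request (illustrative, not the PR's
# example code). Image paths and the exact Mllama prompt layout are assumptions;
# check the updated multi-image example and the model's chat template.
from PIL import Image
from vllm import LLM, SamplingParams

image_1 = Image.open("stop_sign.jpg").convert("RGB")       # hypothetical files
image_2 = Image.open("cherry_blossom.jpg").convert("RGB")

llm = LLM(
    model="meta-llama/Llama-3.2-11B-Vision-Instruct",
    max_model_len=8192,
    max_num_seqs=2,
    limit_mm_per_prompt={"image": 2},  # allow two images per prompt
)

# Interleaved layout: each <|image|> placeholder sits next to the text that
# refers to it, instead of stacking all images at the start of the prompt.
prompt = (
    "<|begin_of_text|><|image|>Describe the first image briefly. "
    "<|image|>Now compare it with this second image."
)

outputs = llm.generate(
    {"prompt": prompt, "multi_modal_data": {"image": [image_1, image_2]}},
    SamplingParams(temperature=0.0, max_tokens=128),
)
print(outputs[0].outputs[0].text)
```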

cc @heheda12345

Signed-off-by: Isotr0py <2037008807@qq.com>
@Isotr0py Isotr0py requested a review from heheda12345 March 26, 2025 17:59
@github-actions

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small but essential subset of tests to quickly catch errors. You can run additional CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run full CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

@mergify mergify bot added the documentation Improvements or additions to documentation label Mar 26, 2025
@Isotr0py Isotr0py commented

[Screenshot attachment: QQ截图20250327021546]
Interleaved image test should pass now.

@heheda12345 heheda12345 left a comment

Will it be better to put this hack in get_replacement_mllama?

@heheda12345 heheda12345 commented

> Interleaved image test should pass now.

I ran the tests in my local env and got the following errors:

FAILED test_mllama.py::test_models_interleaved_images[_Backend.XFORMERS-5-128-bfloat16-meta-llama/Llama-3.2-11B-Vision-Instruct] - AttributeError: 'list' object has no attribute 'shape'
FAILED test_mllama.py::test_models_interleaved_images[_Backend.FLASH_ATTN-5-128-bfloat16-meta-llama/Llama-3.2-11B-Vision-Instruct] - AttributeError: 'list' object has no attribute 'shape'

I'm expecting #14883 to fix it.

How can you pass the interleaved image tests?

@Isotr0py Isotr0py commented

> I ran the tests in my local env and got the following errors:

Hmmm, that's weird... I re-ran the test locally and it still passes. The only change I made was setting tensor_parallel_size=2 in the test, because I can't load the model on a single GPU; that should not affect the result...

Co-authored-by: Chen Zhang <zhangch99@outlook.com>
@heheda12345 heheda12345 left a comment

Discussed with @Isotr0py offline. This PR can pass the tests in test_mllama.py if we set max_num_seqs=1 in vllm_runner. We can merge this PR first and wait for #14883 to support max_num_seqs > 1. A sketch of that workaround follows.
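
For reference, a rough sketch of the workaround described above, assuming the vllm_runner fixture from vLLM's test suite. The helper function, prompt/image plumbing, and exact keyword arguments here are illustrative rather than the actual test_mllama.py code; only max_num_seqs=1 (and, per the earlier comment, tensor_parallel_size=2 when the model does not fit on one GPU) reflect the discussion.

```python
# Illustrative sketch only (not the actual test_mllama.py code): how the
# interleaved-images test can be driven with the max_num_seqs=1 workaround.
MODEL = "meta-llama/Llama-3.2-11B-Vision-Instruct"

def run_interleaved_images(vllm_runner, prompts, images):
    # max_num_seqs=1 sidesteps the batching path that still fails before #14883;
    # tensor_parallel_size=2 is only needed when the model doesn't fit on one GPU.
    with vllm_runner(
        MODEL,
        dtype="bfloat16",
        max_num_seqs=1,
        tensor_parallel_size=2,
        limit_mm_per_prompt={"image": 2},
    ) as vllm_model:
        return vllm_model.generate_greedy(prompts, max_tokens=128, images=images)
```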

@heheda12345 heheda12345 enabled auto-merge (squash) March 29, 2025 15:29
@github-actions github-actions bot added the ready ONLY add when PR is ready to merge/full CI is needed label Mar 29, 2025
@heheda12345 heheda12345 merged commit 3c0ff91 into vllm-project:main Mar 29, 2025
44 checks passed
@Isotr0py Isotr0py deleted the fix-mllama-interleave branch March 30, 2025 06:37
zhouyu5 pushed a commit to HabanaAI/vllm-fork that referenced this pull request Mar 31, 2025
Alex4210987 pushed a commit to LeiWang1999/vllm-bitblas that referenced this pull request Apr 5, 2025
lulmer pushed a commit to lulmer/vllm that referenced this pull request Apr 7, 2025
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Apr 29, 2025
shreyankg pushed a commit to shreyankg/vllm that referenced this pull request May 3, 2025
RichardoMrMu pushed a commit to RichardoMrMu/vllm that referenced this pull request May 12, 2025