
Conversation

@prashantgupta24
Collaborator

@prashantgupta24 prashantgupta24 commented Jul 25, 2025

Changes

Related Issues

fix #336

@github-actions

👋 Hi! Thank you for contributing to vLLM support on Spyre.
Just a reminder: make sure your code passes all the linting checks, otherwise your PR can't be merged. To do so, first install the linting requirements, then run format.sh and commit the changes. With uv directly:

uv sync --frozen --group lint --active --inexact

Or this can be done with pip:

uv pip compile --group lint > requirements-lint.txt
pip install -r requirements-lint.txt
bash format.sh

Now you are good to go 🚀

@prashantgupta24 prashantgupta24 force-pushed the remove-prompt-adapters branch from 9074133 to 5ae4775 Compare July 25, 2025 17:19
@prashantgupta24 prashantgupta24 changed the title 🔥 Remove Prompt Adapters ♻️ Compatibility with vllm 0.10.0 Jul 25, 2025
@prashantgupta24 prashantgupta24 force-pushed the remove-prompt-adapters branch from 5ae4775 to 59ef9c9 Compare July 25, 2025 17:25
@prashantgupta24 prashantgupta24 changed the title ♻️ Compatibility with vllm 0.10.0 ♻️ Compatibility with vllm main Jul 25, 2025
@maxdebayser
Collaborator

The pooling task changes are already part of the embedding PR. There are more upstream changes around the pooling tasks and I think they should be part of a follow-up PR. I don't want to pile up more changes in a PR that is already big enough.

@prashantgupta24
Collaborator Author

Yeah, I was trying to get a minimal set of changes in so that the tests pass against vllm:main. I think the pooling task changes are non-breaking; I just need get_supported_tasks in for vllm:main to work at the moment.

Based on upstream vllm changes in vllm-project/vllm#20588

Signed-off-by: Prashant Gupta <prashantgupta@us.ibm.com>
Signed-off-by: Prashant Gupta <prashantgupta@us.ibm.com>
@prashantgupta24 prashantgupta24 force-pushed the remove-prompt-adapters branch from 652eaa5 to a7cf3ab Compare July 25, 2025 20:06
Comment on lines +45 to +58
if hasattr(Pooler, "from_config_with_defaults"):
# TODO: remove this when we no longer support
# vllm version v0.9.2
self.pooler = Pooler.from_config_with_defaults(
pooler_config,
pooling_type=PoolingType.CLS,
normalize=True,
softmax=False)
else:
self.pooler = Pooler.for_embed(
pooler_config=pooler_config,
default_pooling_type=PoolingType.CLS,
default_normalize=True,
default_softmax=False)
Collaborator Author

@prashantgupta24 prashantgupta24 Jul 25, 2025

@maxdebayser I think I just need this piece of code from your PR to handle the vllm:main breaking changes; I've removed all other pooling changes from this PR. I'm fine with waiting for your PR to get in first, in which case I'll rebase and remove this change.

Signed-off-by: Prashant Gupta <prashantgupta@us.ibm.com>
Comment on lines +392 to +393
if "generate" in self.model_config.supported_tasks:
tasks.extend(["generate"])
Collaborator Author

Ideally we would want this to be coming from the model directly:

if is_text_generation_model(model):
    supported_tasks.append("generate")

but SpyreCausalLM doesn't seem to support it atm

>>> type(self.model)
<class 'vllm_spyre.model_executor.model_loader.spyre.SpyreCausalLM'>
>>> is_text_generation_model(self.model)
False

Signed-off-by: Prashant Gupta <prashantgupta@us.ibm.com>
@prashantgupta24 prashantgupta24 marked this pull request as ready for review July 25, 2025 22:41
@yannicks1 yannicks1 enabled auto-merge (squash) July 28, 2025 08:55
Collaborator

@yannicks1 yannicks1 left a comment

LGTM! Thanks for taking that on!

@yannicks1 yannicks1 merged commit 91e1a00 into main Jul 28, 2025
19 checks passed
@yannicks1 yannicks1 deleted the remove-prompt-adapters branch July 28, 2025 08:55
@github-actions github-actions bot added the ready label Jul 28, 2025
Successfully merging this pull request may close these issues.

re-establish compatibility with vllm main