
[Serve][LLM] unexpected keyword argument 'trust_remote_code' with vllm>0.8.5post1 #52975

@lk-chen

Description

What happened + What you expected to happen

  1. Install ray==2.46.0 and the latest vLLM (a nightly wheel from CI).
  2. Go through the Ray Serve deployment steps.
  3. It fails with resolve_chat_template_content_format() got an unexpected keyword argument 'trust_remote_code', because [Frontend] Chat template fallbacks for multimodal models vllm-project/vllm#17805 removed that argument (see the compatibility sketch below).
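
One way caller code can stay compatible across the signature change is to probe for the argument before passing it. A minimal sketch, not Ray's actual fix: it assumes the function is importable from vllm.entrypoints.chat_utils, and the wrapper name resolve_content_format_compat is made up for illustration.

```python
import inspect

from vllm.entrypoints.chat_utils import resolve_chat_template_content_format


def resolve_content_format_compat(*args, trust_remote_code=False, **kwargs):
    """Call resolve_chat_template_content_format on both sides of vllm#17805."""
    params = inspect.signature(resolve_chat_template_content_format).parameters
    if "trust_remote_code" in params:
        # vLLM <= 0.8.5.post1: the flag is still accepted as a keyword.
        kwargs["trust_remote_code"] = trust_remote_code
    # Newer vLLM (post-#17805): the flag is gone, so it is dropped here.
    return resolve_chat_template_content_format(*args, **kwargs)
```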

Marking the severity as "low" because the breaking change has not shipped in a vLLM release yet, but 0.9.0 will be out today.

Versions / Dependencies

ray==2.46.0
vllm@https://wheels.vllm.ai/d19110204c03e9b77ed957fc70c1262ff370f5e2/vllm-1.0.0.dev-cp38-abi3-manylinux1_x86_64.whl

Reproduction script

serve run config_TIMESTAMP.yaml
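
The YAML itself is elided in the report; a config of roughly the shape the Ray Serve LLM docs describe would look like the sketch below. All values here (app name, model IDs, autoscaling numbers) are placeholders, not taken from the original report.

```yaml
# Hypothetical stand-in for config_TIMESTAMP.yaml; only the overall
# shape follows the documented ray.serve.llm:build_openai_app builder.
applications:
- name: llm_app
  route_prefix: /
  import_path: ray.serve.llm:build_openai_app
  args:
    llm_configs:
    - model_loading_config:
        model_id: qwen-0.5b
        model_source: Qwen/Qwen2.5-0.5B-Instruct
      deployment_config:
        autoscaling_config:
          min_replicas: 1
          max_replicas: 1
```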

Issue Severity

Low: It annoys or frustrates me.

Metadata

Assignees

No one assigned

    Labels

    bug, community-backlog, llm, triage, vllm
