
[Bug]: vllm v0.7.3 - The following fields were present in the request but ignored: {'top_logprobs'} #13881

@tonyaw

Description

Your current environment

N/A

🐛 Describe the bug

I'm using vllm v0.7.3 and openai 1.59.7.

I'm using the OpenAI client in stream mode with "logprobs" set to True and "top_logprobs" set to 5.
I observed that the following message keeps flooding the vllm log:

The following fields were present in the request but ignored: {'top_logprobs'}
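
For reference, a minimal sketch of the kind of request that triggers the warning; the base_url, api_key, and model name below are placeholders rather than values from the original report:

from openai import OpenAI

# Placeholder endpoint and model; point these at your vllm server.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

stream = client.chat.completions.create(
    model="my-model",  # placeholder
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
    logprobs=True,
    top_logprobs=5,
)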

This log disappears after I add a top_logprobs field to DeltaMessage and ChatCompletionLogProb:

# In vllm/entrypoints/openai/protocol.py; OpenAIBaseModel and DeltaToolCall
# are defined earlier in the same file.
from typing import List, Optional

from pydantic import Field


class DeltaMessage(OpenAIBaseModel):
    role: Optional[str] = None
    content: Optional[str] = None
    reasoning_content: Optional[str] = None
    tool_calls: List[DeltaToolCall] = Field(default_factory=list)
    top_logprobs: Optional[int] = 0  # added field


class ChatCompletionLogProb(OpenAIBaseModel):
    token: str
    logprob: float = -9999.0
    bytes: Optional[List[int]] = None
    top_logprobs: Optional[int] = 0  # added field
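
For context, a sketch of how the streamed top logprobs are read on the client side with the OpenAI SDK, continuing the request sketch above; the chunk fields (choices, logprobs, content, top_logprobs) follow the OpenAI 1.x chat-completion chunk schema rather than vllm internals:

for chunk in stream:
    if not chunk.choices:
        continue
    choice_logprobs = chunk.choices[0].logprobs
    if choice_logprobs and choice_logprobs.content:
        for token_info in choice_logprobs.content:
            # token_info.top_logprobs holds up to top_logprobs alternatives per token.
            print(token_info.token,
                  [(alt.token, alt.logprob) for alt in token_info.top_logprobs])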

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
