
vLLM PR #27490 core-file change alert #40

@Kay-Tian

Description

vLLM PR monitoring notification

PR title: [Attention] Add missing kv cache scale setup
PR number: vllm-project#27490
PR link: vllm-project#27490

Changed core files:

  • vllm/attention/layer.py

Created automatically by GitHub Actions
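The alert above is raised when a PR touches a watched core file. A minimal sketch of that matching step, assuming a glob-style watchlist; the pattern list and function name here are illustrative assumptions, not the actual vLLM-monitor implementation:

```python
from fnmatch import fnmatch

# Hypothetical watchlist of "core" paths to alert on (illustrative only).
CORE_PATTERNS = [
    "vllm/attention/*.py",
    "vllm/core/*.py",
]

def core_files_changed(changed_files):
    """Return the subset of a PR's changed files that match a core pattern."""
    return [
        path
        for path in changed_files
        if any(fnmatch(path, pattern) for pattern in CORE_PATTERNS)
    ]

# Example: PR #27490 changed vllm/attention/layer.py, which matches the watchlist.
print(core_files_changed(["vllm/attention/layer.py", "docs/index.md"]))
# → ['vllm/attention/layer.py']
```

In a real workflow, the list of changed files would come from the GitHub REST API endpoint `GET /repos/{owner}/{repo}/pulls/{pull_number}/files`, and a non-empty result would trigger the issue creation.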


Labels

    vLLM-monitor (reported by vLLM-monitor)
