vLLM PR #27994 core-file change notice #96

@Kay-Tian

Description

vLLM PR monitoring notification

PR title: [FlashInfer] Avoid FlashInfer block_size 16 + head_size 256 on blackwell
PR number: vllm-project#27994
PR link: vllm-project#27994

Changed core files:

  • vllm/model_executor/models/config.py

Created automatically by GitHub Actions
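For context on what a change like this typically looks like: the PR title says the FlashInfer backend should avoid the block_size 16 + head_size 256 combination on Blackwell GPUs. A minimal sketch of such a guard is below. This is an illustration only, not the actual code from `vllm/model_executor/models/config.py`: the function name, the compute-capability threshold for Blackwell, and the fallback block size of 32 are all assumptions.

```python
# Hypothetical sketch of the guard the PR title describes.
# Assumptions (not taken from the PR): SM 10.0 denotes Blackwell,
# and the fallback for the unsupported combination is block_size 32.

BLACKWELL_COMPUTE_CAPABILITY = (10, 0)  # assumption

def pick_flashinfer_block_size(
    block_size: int,
    head_size: int,
    compute_capability: tuple[int, int],
) -> int:
    """Return a block size safe for FlashInfer on the given GPU.

    Avoids block_size 16 with head_size 256 on Blackwell by bumping
    the block size to 32 (assumed fallback).
    """
    is_blackwell = compute_capability >= BLACKWELL_COMPUTE_CAPABILITY
    if is_blackwell and block_size == 16 and head_size == 256:
        return 32
    return block_size
```

A caller would apply the guard before building the KV-cache layout, e.g. `pick_flashinfer_block_size(16, 256, (10, 0))` bumps the block size, while the same call with `head_size=128` or an older compute capability leaves it unchanged.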

Metadata

Assignees: No one assigned
Labels: vLLM-monitor (reported by vLLM-monitor)
Projects: No projects
Milestone: No milestone
Relationships: None yet
Development: No branches or pull requests
