Labels: vLLM-monitor (Reported by vLLM-monitor)
Description
vLLM PR Monitoring Notification
PR Title: [FlashInfer] Avoid FlashInfer block_size 16 + head_size 256 on blackwell
PR Number: vllm-project#27994
PR Link: vllm-project#27994
Core files changed:
- vllm/model_executor/models/config.py
Automatically created by GitHub Actions
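
For context, a guard like the one the PR title describes typically inspects the attention head size and the GPU generation before settling on a KV-cache block size. The sketch below is only an illustration of that idea, not vLLM's actual code in vllm/model_executor/models/config.py: the helper name, the fallback block size of 32, and the Blackwell detection are all assumptions based solely on the PR title.

```python
# Hypothetical sketch; not vLLM's implementation. Constants and names are
# assumptions inferred from the PR title "[FlashInfer] Avoid FlashInfer
# block_size 16 + head_size 256 on blackwell".

def choose_flashinfer_block_size(head_size: int,
                                 is_blackwell: bool,
                                 requested_block_size: int = 16) -> int:
    """Return a KV-cache block size, steering clear of the unsupported
    block_size 16 + head_size 256 combination on Blackwell GPUs."""
    if is_blackwell and head_size == 256 and requested_block_size == 16:
        # Fall back to a larger block size (assumed here to be acceptable).
        return 32
    return requested_block_size


if __name__ == "__main__":
    # Problematic combination gets bumped to the fallback size.
    print(choose_flashinfer_block_size(head_size=256, is_blackwell=True))   # 32
    # Other combinations keep the requested block size.
    print(choose_flashinfer_block_size(head_size=128, is_blackwell=True))   # 16
```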