
Conversation

@whx-sjtu
Collaborator

@whx-sjtu whx-sjtu commented Jun 17, 2025

This PR fixes a bug where the sin/cos cache was constructed shorter than the model's max positional embedding.

Closes: #1038

Signed-off-by: whx-sjtu <2952154980@qq.com>
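
For illustration only, here is a minimal sketch of the idea behind the fix, not the actual vllm-ascend code: the rotary sin/cos cache is precomputed with a length equal to the model's `max_position_embeddings`, so a lookup at any valid position index never reads past the end of the cache. The function name `build_sin_cos_cache` and its parameters are hypothetical.

```python
import torch

def build_sin_cos_cache(head_dim: int, max_position_embeddings: int,
                        base: float = 10000.0):
    # Inverse frequencies for each pair of dimensions in the head.
    inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))
    # Size the cache to max_position_embeddings (not a shorter default),
    # so every valid position index has a precomputed sin/cos entry.
    positions = torch.arange(max_position_embeddings).float()
    freqs = torch.outer(positions, inv_freq)
    return freqs.sin(), freqs.cos()

# Example: a cache covering 131072 positions for a 128-dim head.
sin, cos = build_sin_cos_cache(head_dim=128, max_position_embeddings=131072)
assert sin.shape[0] == 131072  # long enough for every valid position index
```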
@wangxiyuan
Collaborator

wangxiyuan commented Jun 17, 2025

Please update the test as well, either now or in a follow-up.

@wangxiyuan wangxiyuan merged commit d7e19ed into vllm-project:main Jun 17, 2025
20 checks passed
Yikun added a commit to Yikun/vllm-ascend that referenced this pull request Jun 21, 2025
Yikun added a commit to Yikun/vllm-ascend that referenced this pull request Jun 21, 2025
shiyuan680 pushed a commit to raindaywhu/vllm-ascend that referenced this pull request Jul 7, 2025
@whx-sjtu whx-sjtu deleted the fix_ds_rope branch July 9, 2025 07:10
chopper0126 pushed a commit to chopper0126/vllm-ascend that referenced this pull request Oct 16, 2025
Angazenn pushed a commit to Angazenn/vllm-ascend that referenced this pull request Oct 21, 2025
Development

Successfully merging this pull request may close these issues.

[Bug]: Failed to complete vllm benchmark after enable VLLM_USE_V1=1 due to gather_v3 error
