
Conversation

@MengqingCao
Collaborator

What this PR does / why we need it?

Backport from #1273.

Add `max_num_tokens_across_dp` to `AscendMetadata` to fix DP

This PR fixes the bug introduced by #1229 by adding an argument `max_num_tokens_across_dp` to the attention metadata when `dp_size > 1`.
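For context, here is a minimal sketch of what such a fix amounts to. The field placement follows the PR title, but the dataclass layout, the helper name `compute_max_num_tokens_across_dp`, and the use of an all-reduce are illustrative assumptions, not the actual diff:

```python
# A minimal sketch, assuming illustrative names; not the actual diff.
from dataclasses import dataclass
from typing import Optional

import torch
import torch.distributed as dist


@dataclass
class AscendMetadata:
    # ... existing attention metadata fields elided ...

    # Maximum number of scheduled tokens across all data-parallel ranks.
    # When dp_size > 1, every rank must pad its batch to this value so the
    # collective communication ops across DP ranks stay in lockstep.
    max_num_tokens_across_dp: Optional[int] = None


def compute_max_num_tokens_across_dp(num_tokens: int,
                                     dp_group: dist.ProcessGroup) -> int:
    """All-reduce the local token count so every DP rank sees the global max.

    Assumes torch.distributed is already initialized with a backend that
    supports CPU tensors (e.g. gloo); on NPU the tensor would live on device.
    """
    t = torch.tensor([num_tokens], dtype=torch.int64)
    dist.all_reduce(t, op=dist.ReduceOp.MAX, group=dp_group)
    return int(t.item())
```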

Does this PR introduce any user-facing change?

How was this patch tested?

@MengqingCao
Collaborator Author

Please help review this, thanks! @wangxiyuan

@wangxiyuan
Collaborator

LGTM. But I'd like to see the PR for the main branch merged first.

@github-actions

This pull request has conflicts, please resolve those before we can evaluate the pull request.

@github-actions

This pull request has conflicts, please resolve those before we can evaluate the pull request.

Signed-off-by: MengqingCao <cmq0113@163.com>
@ganyi1996ppo
Collaborator

@zzzzwwjj has a PR to remove `max_num_tokens_across_dp` from the attention metadata; maybe we can drop this change? @MengqingCao

@MengqingCao
Collaborator Author

> @zzzzwwjj has a PR to remove `max_num_tokens_across_dp` from the attention metadata; maybe we can drop this change? @MengqingCao

Yes, I noticed that the main changes, including the bug fix and example modification, are also included in #1422. Thus closing this PR.
