
[Feature] Layernorm Tag #61

Merged: 5 commits, Feb 18, 2023
Conversation

szhengac (Contributor)

Description

Add a tag to layernorm parameters when sequence parallelism is enabled. The tag triggers an additional allreduce within the tensor parallel group in DeepSpeed, since layernorm parameters are replicated across that group rather than sharded.
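The idea can be sketched as follows. This is an illustrative Python sketch, not the repository's actual code: the `SimpleParam` class and both helper functions are hypothetical; only the `replicated_param` attribute name comes from this PR's diff.

```python
class SimpleParam:
    """Hypothetical stand-in for a framework parameter object."""

    def __init__(self, name):
        self.name = name


def tag_layernorm_params(params, sequence_parallel):
    """Tag layernorm parameters as replicated when sequence parallelism
    is enabled, so a downstream runtime (e.g. DeepSpeed) knows to
    allreduce their gradients across the tensor parallel group."""
    if not sequence_parallel:
        return
    for p in params:
        if "layernorm" in p.name.lower():
            p.replicated_param = True


def clone_with_tag(param):
    """Mirror the propagation logic from the PR's diff: carry the tag
    over when a new parameter object is created from an old one."""
    new_param = SimpleParam(param.name)
    if hasattr(param, "replicated_param") and param.replicated_param:
        new_param.replicated_param = True
    return new_param
```

Usage: after `tag_layernorm_params(params, sequence_parallel=True)`, only layernorm parameters carry `replicated_param = True`, and `clone_with_tag` preserves the tag across parameter re-creation.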

Checklist

  • PR's title starts with a category (e.g. [Bugfix], [Model], [Tutorial], etc.)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage
  • Code is well-documented

@szhengac szhengac changed the title [Feature] Layernorm [Feature] Layernorm Tag Feb 17, 2023
@comaniac (Contributor) left a comment

Otherwise LGTM

Comment on lines +1341 to +1342
if hasattr(param, "replicated_param") and param.replicated_param:
new_param.replicated_param = True

This is a bit hacky. We should add a TODO or FIXME saying that we will have a mechanism to maintain a list of user-specified tags.

@comaniac comaniac merged commit 3757cca into awslabs:main Feb 18, 2023
@comaniac (Contributor)

Thanks @szhengac
