
fix baichuan lm_head replace issue #34

Merged
delock merged 1 commit into delock:gma/run-opt-branch from Yejing-Lai:lyj/fix_baichuan_lmhead
Nov 23, 2023

Conversation

@Yejing-Lai

Update the lm_head replace logic: only when the model's last layer is named lm_head or embed_out, and that layer is a Linear module, will it be replaced by LmHeadLinearAllreduce.
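The condition described above can be sketched as a small helper. This is a hypothetical illustration of the check (not the actual DeepSpeed source); the helper name `should_replace_lm_head` and the `Toy` model are assumptions made for the example.

```python
import torch.nn as nn

def should_replace_lm_head(model: nn.Module, name: str, module: nn.Module) -> bool:
    """Hypothetical sketch of the condition in this PR: replace a module
    only if it is the model's final submodule, its name is lm_head or
    embed_out, and it is a plain nn.Linear."""
    last_name, _ = list(model.named_modules())[-1]  # last registered submodule
    leaf = name.split(".")[-1]                      # strip any parent prefix
    return name == last_name and leaf in ("lm_head", "embed_out") and isinstance(module, nn.Linear)

# Toy model for illustration: the lm_head Linear is registered last,
# as in baichuan-style causal LM architectures.
class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.body = nn.Linear(4, 4)
        self.lm_head = nn.Linear(4, 8)
```

With a `Toy` instance, only the trailing `lm_head` Linear passes the check; an intermediate `Linear` with a different name does not, which avoids accidentally replacing other layers.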

@delock (Owner) left a comment:

LGTM

@delock delock merged commit 547ac96 into delock:gma/run-opt-branch Nov 23, 2023

2 participants