
Fix lion8b error correction with torch 2.1 #656

Merged 3 commits from davis/lion8b-fsdp-fix into main on Oct 9, 2023

Conversation

@dblalock (Contributor) commented on Oct 8, 2023

Re-enables error correction and instead works around the new FSDP limitations.

This looks like a big diff because it reverts all the indentation changes from the PR that disabled error correction, but lion8b.py and test_lion8b.py differ by only a few lines from their state before that PR. Basically, we just had to tell torch that the errors are bf16 tensors and actually cast them to bf16 when saving the state dict.
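For illustration, here is a minimal sketch of those two changes. The class and method names below are invented, not the actual lion8b.py API; the real fix is a few lines in lion8b.py, and this only shows the shape of the workaround:

```python
import torch

class ErrorBuffer:
    """Hypothetical stand-in for lion8b's error-correction state."""

    def __init__(self, errors: torch.Tensor):
        self._errors = errors  # stored in a packed/quantized dtype

    @property
    def dtype(self) -> torch.dtype:
        # (1) Advertise the errors to torch (and FSDP) as bf16 tensors.
        return torch.bfloat16

    def for_state_dict(self) -> torch.Tensor:
        # (2) Actually cast to bf16 when saving, so the checkpointed
        # tensor matches the advertised dtype.
        return self._errors.to(torch.bfloat16)

# Example: a uint8-backed buffer checkpoints as bf16.
buf = ErrorBuffer(torch.zeros(8, dtype=torch.uint8))
assert buf.for_state_dict().dtype == torch.bfloat16
```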

@dblalock marked this pull request as ready for review on October 9, 2023 at 03:23
@dblalock requested a review from @dakinggg on October 9, 2023 at 03:23
@dakinggg (Collaborator) left a comment


Thanks Davis!

@dakinggg merged commit aa2ba9f into main on Oct 9, 2023
11 checks passed
@dakinggg deleted the davis/lion8b-fsdp-fix branch on October 11, 2023 at 21:30
2 participants