ValueError: paged_adamw_32bit is not a valid OptimizerNames #4

Closed
lxuechen opened this issue May 24, 2023 · 3 comments
Comments

@lxuechen

Trying to run the basic training script but getting this error. Is there a particular branch of transformers I should install?
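
For reference, a minimal sketch of the kind of call that raises this error on an older transformers release; `output_dir` here is just a placeholder:

```python
from transformers import TrainingArguments

# On transformers versions that predate the paged-optimizer support, the
# optim value fails enum validation with:
#   ValueError: 'paged_adamw_32bit' is not a valid OptimizerNames
args = TrainingArguments(
    output_dir="./output",          # placeholder directory
    optim="paged_adamw_32bit",      # paged 32-bit AdamW optimizer used by QLoRA
)
```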

@artidoro
Owner

Hey! Thanks a lot for your interest in QLoRA. The necessary changes to use QLoRA will be merged into the transformers library tomorrow morning, and we will update this repo with installation instructions.

@Qubitium
Contributor

Temporary fix via: #5 (comment)

@lxuechen
Author

Fixed after installing the transformers nightly build.
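
For anyone landing here later, a quick sanity check after installing transformers from the main branch (e.g. `pip install -U git+https://github.com/huggingface/transformers.git`); this is only an illustration of how the optimizer name is validated:

```python
from transformers.training_args import OptimizerNames

# Succeeds once the installed transformers knows about the paged optimizers;
# otherwise raises: ValueError: 'paged_adamw_32bit' is not a valid OptimizerNames
print(OptimizerNames("paged_adamw_32bit"))
```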
