
Llama 3.1 liger example is not working #1892

Closed
6 of 8 tasks
Stealthwriter opened this issue Sep 4, 2024 · 3 comments
Labels
bug (Something isn't working), waiting for reporter

Comments

@Stealthwriter

Please check that this issue hasn't been reported before.

  • I searched previous bug reports and didn't find any similar reports.

Expected Behavior

The example should train successfully.

Current behaviour

examples/llama-3/fft-8b-liger-fsdp.yaml

This example is not working: the optimizer `paged_adamw_8bit` is not compatible with FSDP. I tried changing the optimizer, but I still get this error: `Value error, FSDP Offload not compatible with adamw_bnb_8bit`.

When I commented out the FSDP settings and used DeepSpeed instead, training worked.
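A minimal sketch of the workaround described above, assuming axolotl's `deepspeed:` option; the config file path and the commented-out FSDP keys are illustrative, not the exact contents of the shipped example:

```yaml
# Workaround sketch: disable FSDP and point axolotl at a DeepSpeed config instead.
# The fsdp keys below stand in for whatever the example defines; comment them out.
# fsdp:
#   - full_shard
#   - auto_wrap
# fsdp_config:
#   fsdp_offload_params: true

# Illustrative path to a DeepSpeed ZeRO config shipped with axolotl.
deepspeed: deepspeed_configs/zero2.json

# A plain 32-bit optimizer avoids the 8-bit/FSDP-offload conflict entirely.
optimizer: adamw_torch
```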

Steps to reproduce

Run the example as-is.

Config yaml

No response

Possible solution

No response

Which Operating Systems are you using?

  • Linux
  • macOS
  • Windows

Python Version

3.10

axolotl branch-commit

latest

Acknowledgements

  • My issue title is concise, descriptive, and in title casing.
  • I have searched the existing issues to make sure this bug has not been reported yet.
  • I am using the latest version of axolotl.
  • I have provided enough information for the maintainers to reproduce and diagnose the issue.
@Stealthwriter added the bug label on Sep 4, 2024
@ganler

ganler commented Sep 18, 2024

It seems not to work for Mistral either.

I got the same error as in linkedin/Liger-Kernel#100, even after upgrading Liger.

@NanoCode012
Collaborator

NanoCode012 commented Oct 30, 2024

Hello! I could not reproduce this issue on current main. I ran it on 2x L40 and it works (with the edits below to the dataset section, due to some recent changes).

```diff
 datasets:
   - path: mlabonne/FineTome-100k
     type: chat_template
     split: train[:20%]
+    field_messages: conversations
+    message_field_role: from
+    message_field_content: value
```

```yaml
optimizer: adamw_torch
```

Could either of you clarify if you still see this issue?

@winglian
Collaborator

The current version of the example should be correct now. 8-bit optimizers do not work with FSDP1, so you should use regular 32-bit optimizers with FSDP.

bitsandbytes-foundation/bitsandbytes#89
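As a minimal sketch of the pairing recommended above (key names follow axolotl's config conventions; treat the FSDP values as illustrative rather than the exact shipped example):

```yaml
# FSDP1 sharding paired with a full-precision optimizer.
# 8-bit optimizers (e.g. paged_adamw_8bit, adamw_bnb_8bit) break under FSDP1
# because the sharded flat parameters conflict with bitsandbytes' optimizer state.
optimizer: adamw_torch

# Illustrative FSDP settings; see the repo's example yaml for the full set.
fsdp:
  - full_shard
  - auto_wrap
```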
