
Question: What's the recommended way to finetune newish models? #12571

Closed
0xDEADFED5 opened this issue Dec 18, 2024 · 4 comments
@0xDEADFED5

I want to finetune Qwen 2.5 3B with Intel GPU.

Can't do it with the IPEX axolotl.

What's currently the best way to do this?

@qiyuangong qiyuangong self-assigned this Dec 19, 2024
@qiyuangong
Contributor

> I want to finetune Qwen 2.5 3B with Intel GPU.
>
> Can't do it with the IPEX axolotl.
>
> What's currently the best way to do this?

It seems axolotl added Qwen 2 support in v0.5.0, but ipex-llm only supports axolotl v0.4.0 right now.

That means Qwen 2.5 3B is not supported at the moment. Can you try another model that v0.4.0 supports?

If you find a candidate model in v0.4.0, you can use Linux with a Docker container and this guide to fine-tune it.

@0xDEADFED5
Author

It doesn't have to be axolotl; I want to know what will work right now.

Will torchtune work? Or the transformers Trainer with a newish transformers version?

@qiyuangong
Contributor

> it doesn't have to be axolotl, i want to know what will work right now.
>
> will torchtune work? or transformers trainer with newish transformers version?

Torchtune is not supported yet, and the transformers Trainer is not recommended. You can use PEFT for finetuning.

You can check Running LLM Finetuning using IPEX-LLM on Intel GPU for all supported frameworks and examples.
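For context on why PEFT is the suggested route: PEFT methods such as LoRA freeze the base model and train only small low-rank adapter matrices, which drastically cuts trainable parameters and memory, important when fine-tuning a 3B model on a single GPU. A minimal, dependency-free sketch of the parameter-count argument (the layer size and rank below are made-up illustration values, not from ipex-llm):

```python
# LoRA idea: instead of updating a full d_out x d_in weight matrix,
# train two small matrices A (r x d_in) and B (d_out x r) and add
# B @ A to the frozen weight at inference time.

def lora_param_counts(d_out: int, d_in: int, r: int) -> tuple[int, int]:
    """Return (full, lora) trainable-parameter counts for one linear layer."""
    full = d_out * d_in           # full fine-tuning updates every weight
    lora = r * d_in + d_out * r   # LoRA only trains A and B
    return full, lora

# Hypothetical 4096x4096 attention projection with rank r=16:
full, lora = lora_param_counts(4096, 4096, 16)
print(full, lora, full // lora)  # 16777216 131072 128
```

With these illustrative numbers, LoRA trains roughly 128x fewer parameters per layer, which is what makes PEFT feasible on a single Intel GPU where the full transformers Trainer workflow is not.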

@0xDEADFED5
Author

Sorry for the late response, closing this now. Thanks @qiyuangong for the advice, I didn't know that folder was in the repo.
