Multi-GPU setting in example gpt-neox-20b_peft #276

Closed
Wsy002 opened this issue Apr 6, 2023 · 1 comment

Wsy002 commented Apr 6, 2023

Hi, I have successfully completed all three steps in a single-GPU setting. I want to ask how to do multi-GPU fine-tuning with the script below (Step 1):

https://github.com/lvwerra/trl/blob/main/examples/sentiment/scripts/gpt-neox-20b_peft/clm_finetune_peft_imdb.py

By the way, I have searched for solutions and tried to modify some code based on #219 and peft PR #145 provided by @younesbelkada, but they still don't seem to work. Could you please help with this?

One more question: do all three steps (CLM fine-tuning + merge + PPO) support a multi-GPU setting now? Thanks.

younesbelkada (Contributor) commented

Hi @Wsy002,

If I understood correctly, you want to run https://github.com/lvwerra/trl/blob/main/examples/sentiment/scripts/gpt-neox-20b_peft/clm_finetune_peft_imdb.py in a multi-GPU setup.
For that you need to use the main branch of transformers, as it contains a fix for Trainer + multi-GPU: huggingface/transformers#22532
Can you try with that and let us know if it works?
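
For reference, a minimal sketch of the per-process model loading that a data-parallel multi-GPU run of that script typically needs, assuming transformers installed from main (for the Trainer fix above) plus peft and bitsandbytes; the LoRA hyperparameters and launch commands below are illustrative, not taken from the example script:

```python
import os

from peft import LoraConfig, get_peft_model, prepare_model_for_int8_training
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-neox-20b"

# Under torchrun / accelerate launch, each data-parallel process loads its own
# 8-bit copy of the model on its own GPU (device_map={"": local_rank}) instead
# of sharding a single copy across all GPUs with device_map="auto".
local_rank = int(os.environ.get("LOCAL_RANK", 0))

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    load_in_8bit=True,
    device_map={"": local_rank},
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Prepare the 8-bit model for training and attach LoRA adapters
# (hyperparameters here are illustrative).
model = prepare_model_for_int8_training(model)
model = get_peft_model(
    model,
    LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05, bias="none", task_type="CAUSAL_LM"),
)

# Build the IMDB dataset and Trainer as in clm_finetune_peft_imdb.py, then launch with e.g.:
#   torchrun --nproc_per_node=2 clm_finetune_peft_imdb.py
# or
#   accelerate launch --multi_gpu clm_finetune_peft_imdb.py
```

Loading one full 8-bit copy per process this way is meant to avoid the device-placement errors that device_map="auto" can trigger when the script is launched as several DDP processes.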

lvwerra closed this as completed on Jun 1, 2023