
Example of Instruction-Tuning Training #10

Closed
BowieHsu opened this issue Mar 14, 2023 · 5 comments

Comments

@BowieHsu

Hello, thank you for open-sourcing this work. We are now interested in generating our own instructions to fine-tune the LLaMA model based on your documentation and approach. Could you please advise on any resources or references we can use? Also, is this code available on Hugging Face?

@Tiiiger
Collaborator

Tiiiger commented Mar 14, 2023

Hi @BowieHsu

This repo contains the relevant code for generating instructions; check the README for how to run it.

@Tiiiger Tiiiger closed this as completed Mar 14, 2023
@BowieHsu
Author

@Tiiiger Hi, thank you for your response.
Your documentation is excellent, and I now understand the instruction generation approach.
Now I want to reproduce the training task, and I found this repository, https://github.com/nebuly-ai/nebullvm, which seems to use a similar instruction fine-tuning approach to yours. Is it consistent with your training method, and is there any code there we can reference?

@Tiiiger
Collaborator

Tiiiger commented Mar 14, 2023

Hi @BowieHsu, unfortunately we have no knowledge of this repo and cannot be more helpful.

@rtaori
Contributor

rtaori commented Mar 15, 2023

Hi,

Quick update - We have released the training code, see https://github.com/tatsu-lab/stanford_alpaca#fine-tuning. Hope this is helpful.
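For anyone following along: the released training code expects the data as a list of `{"instruction", "input", "output"}` records (the format of `alpaca_data.json`), which get rendered into a prompt before fine-tuning. Below is a minimal sketch of that formatting step, assuming the Alpaca-style prompt template described in the repo's README; the exact template wording and the training hyperparameters should be double-checked against the repo itself.

```python
# Sketch of Alpaca-style instruction formatting (template wording assumed
# from the stanford_alpaca README; verify against the repo before training).

PROMPT_WITH_INPUT = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n"
)

PROMPT_NO_INPUT = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def format_example(example: dict) -> str:
    """Render one {"instruction", "input", "output"} record into a
    full training string (prompt + target response)."""
    if example.get("input"):
        prompt = PROMPT_WITH_INPUT.format(
            instruction=example["instruction"], input=example["input"]
        )
    else:
        prompt = PROMPT_NO_INPUT.format(instruction=example["instruction"])
    return prompt + example["output"]

if __name__ == "__main__":
    sample = {"instruction": "Name a primary color.", "input": "", "output": "Red"}
    print(format_example(sample))
```

During fine-tuning, the loss is typically computed only on the response tokens (the prompt portion is masked out), so keeping the prompt/response boundary explicit as above matters.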

@BowieHsu
Author

@rtaori Thank you, my friend. I'll have all my GPUs running at full power to express my gratitude.
