[BUG] LLaMA 3.1 RoPE #1699
We ran into the same issue when doing full fine-tuning on Llama 3.1 8B. We observed that after a few training samples, the model started to generate responses with misspelled words and grammatical errors. @zzhhjjj I'll suggest that you change this to a bug label if you can.
Thank you for confirming the issue @calvintwr
Thanks for reporting this. There are currently a few other issues on my list, but I hope to be able to address this sometime.
Dear LitGPT Maintainer,
Thank you for your great work. I encountered an issue while trying to fine-tune LLaMA 3.1 and came here for reference. I was looking for the LLaMA 3.1 RoPE function change, but I couldn't find it in your repository. Based on this PR, it seems like it hasn't been added yet.
https://github.com/Lightning-AI/litgpt/pull/1619/files#diff-3b8a58a4d021803b3171b886bb9162fd659e671131f3f61036f9210cb5d0bc7c
Reference: https://github.com/huggingface/transformers/blob/5c1027bf09717f664b579e01cbb8ec3ef5aeb140/src/transformers/modeling_rope_utils.py#L329-L347
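For context, the Llama 3.1 RoPE change in the referenced transformers code boils down to rescaling the inverse frequencies before they are used to build the rotary embeddings. The sketch below is a minimal, framework-free version of that idea; the parameter values are the Llama 3.1 defaults (`factor=8`, `low_freq_factor=1`, `high_freq_factor=4`, original context length 8192), and the function name is illustrative, not from either repository:

```python
import math

# Llama 3.1 default RoPE scaling parameters (assumed; check your
# checkpoint's rope_scaling config if it differs).
FACTOR = 8.0
LOW_FREQ_FACTOR = 1.0
HIGH_FREQ_FACTOR = 4.0
OLD_CONTEXT_LEN = 8192

def llama31_scale_inv_freq(inv_freq):
    """Rescale RoPE inverse frequencies following the Llama 3.1 scheme.

    High-frequency components are kept as-is, low-frequency components
    are divided by FACTOR, and the band in between is interpolated
    smoothly between the two regimes.
    """
    low_freq_wavelen = OLD_CONTEXT_LEN / LOW_FREQ_FACTOR
    high_freq_wavelen = OLD_CONTEXT_LEN / HIGH_FREQ_FACTOR
    scaled = []
    for freq in inv_freq:
        wavelen = 2 * math.pi / freq
        if wavelen < high_freq_wavelen:
            # High-frequency band: leave untouched.
            scaled.append(freq)
        elif wavelen > low_freq_wavelen:
            # Low-frequency band: divide by the scaling factor.
            scaled.append(freq / FACTOR)
        else:
            # Medium band: blend between scaled and unscaled.
            smooth = (OLD_CONTEXT_LEN / wavelen - LOW_FREQ_FACTOR) / (
                HIGH_FREQ_FACTOR - LOW_FREQ_FACTOR
            )
            scaled.append((1 - smooth) * freq / FACTOR + smooth * freq)
    return scaled
```

Without this step, a converted Llama 3.1 checkpoint runs with the plain Llama 3 rotary embeddings, which is consistent with the degraded generations reported above.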
Thanks for your help.