
added support for llama v2 and codellama in weight conversion for issue #28241 #28767

Closed
wants to merge 3 commits

Conversation

christoukmaji
Contributor

What does this PR do?

This PR adds support for LLaMa V2 and CodeLLaMa in the LLaMa-to-HuggingFace weight conversion script src/transformers/models/llama/convert_llama_weights_to_hf.py, while maintaining backwards compatibility with LLaMa V1. It sets max_position_embeddings to 4096 for LLaMa V2 and 16384 for CodeLLaMa, and keeps the default of 2048 for LLaMa V1.
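The version-dependent default described above can be sketched as a small lookup, assuming the script gains a version argument (the function and argument names here are illustrative, not the script's actual interface):

```python
# Sketch of a version-dependent default for max_position_embeddings,
# assuming a hypothetical llama_version argument ("1", "2", or "code").
def default_max_position_embeddings(llama_version: str) -> int:
    defaults = {
        "1": 2048,     # LLaMa V1 (existing default, kept for compatibility)
        "2": 4096,     # LLaMa V2
        "code": 16384, # CodeLLaMa
    }
    try:
        return defaults[llama_version]
    except KeyError:
        raise ValueError(f"Unknown llama_version: {llama_version!r}")
```

The conversion script would then write this value into the generated config instead of the hard-coded 2048.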

Fixes #28241

Who can review?

@ArthurZucker @amyeroberts

@ArthurZucker
Collaborator

Hey @christoukmaji, thanks for opening the PR! Seems like #28754 was opened a bit earlier so we'll try to get it merged! 🤗

@christoukmaji
Contributor Author

Hi @ArthurZucker, thanks for the response. I would like to avoid the duplication of work for future contributions.

What is the PR selection process for HuggingFace contributions: is it first comment or first PR? I thought it was first comment, as outlined in the contribution documentation and as other PRs have been handled.

@ArthurZucker
Collaborator

Hey, pretty sure that if you look at the PRs, it's first PR first; then if there is no activity, anyone can take it.
I'll update the contribution guidelines, as commenting is not really enough: we can't track the progress, whether you are stuck, or whether you even started. Sorry for that! 🤗


This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

@github-actions github-actions bot closed this Mar 8, 2024
Successfully merging this pull request may close these issues.

why max_position_embeddings = 2048 for llama2