Cohere: update RoPE structure #33408

Merged: 5 commits into huggingface:main, Sep 16, 2024

Conversation

@gante (Member) commented Sep 10, 2024

What does this PR do?

This PR propagates the updates to the RoPE structure to cohere -- the logic for RoPE was abstracted into a separate module for llama3.1 (#32135). Using the new structure, a model has access to all RoPE scaling strategies.
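For concreteness, a minimal sketch of what the unified structure enables on the config side. The tiny model dimensions and the specific `rope_scaling` keys used here ("rope_type", "factor") are assumptions for illustration, not values taken from this PR:

```python
from transformers import CohereConfig, CohereModel

# Illustrative sketch: with the unified RoPE structure, the scaling strategy is
# selected through the `rope_scaling` dict on the config rather than through
# per-model rotary-embedding classes. Keys/values below are assumptions.
config = CohereConfig(
    vocab_size=1000,            # small values so the sketch is cheap to instantiate
    hidden_size=128,
    intermediate_size=256,
    num_hidden_layers=2,
    num_attention_heads=4,
    rope_theta=10000.0,         # base period of the RoPE embeddings
    rope_scaling={"rope_type": "linear", "factor": 2.0},
)
model = CohereModel(config)     # RoPE init/scaling is resolved from the config
```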

While touching the modeling code, I've taken the liberty to:

  1. update copied from statements, which were disabled in previous PRs;
  2. update (postpone) deprecation messages to ensure deprecated features are removed from all models in the same version.

✅ all slow tests passing


Note: #31999 was originally opened to migrate all modern RoPE models to the upgraded structure. However, while working on Cohere, I noticed that there may be important implementation differences in RoPE. As such, I'll be opening multiple PRs, batching similar RoPE implementations together.

@@ -79,6 +80,43 @@ class CohereConfig(PretrainedConfig):
Whether to tie weight embeddings
rope_theta (`float`, *optional*, defaults to 10000.0):
The base period of the RoPE embeddings.
rope_scaling (`Dict`, *optional*):
@gante (Member Author) commented:

This is copy/paste from llama

Comment on lines +139 to +140
# Note: the forward pass of this RoPE is slightly different from Llama's, resulting in different `sin`/`cos` for
# the same parameterization. The differences are highlighted with a comment.
@gante (Member Author) commented:

Aside from the line highlighted with a comment, this is copy/paste from llama
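To make the note concrete, here is a small standalone sketch of one way two RoPE forward passes can produce different `sin`/`cos` for the same parameterization: laying the inverse frequencies out concatenated versus interleaved changes which value each channel sees. Treating the interleaved layout as the Cohere variant is my understanding and should be read as an assumption; this illustrates the kind of difference being flagged, not either model's actual code:

```python
import torch

# Identical parameterization for both variants.
dim, base = 8, 10000.0
positions = torch.arange(4, dtype=torch.float32)
inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2, dtype=torch.float32) / dim))
freqs = torch.outer(positions, inv_freq)              # (seq_len, dim // 2)

# Variant A: concatenate the frequency block with itself (Llama-style layout),
# so channel i is paired with channel i + dim // 2.
cos_cat = torch.cat((freqs, freqs), dim=-1).cos()

# Variant B: repeat each frequency twice, interleaved (assumed Cohere-style
# layout), so channel 2i is paired with channel 2i + 1.
cos_interleaved = torch.repeat_interleave(freqs, 2, dim=-1).cos()

# Same theta and positions, but the per-channel cos values differ, which is the
# kind of sin/cos mismatch the highlighted comment refers to.
print(torch.allclose(cos_cat, cos_interleaved))       # False
```

Either layout yields a valid rotary embedding as long as the channel pairing used in the attention forward pass matches it, which is presumably why the Cohere-specific line is kept rather than copying Llama's verbatim.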

@gante requested a review from @LysandreJik on September 10, 2024 at 15:14
@gante (Member Author) commented Sep 10, 2024

@LysandreJik a PR like this one will be opened for a few more modern models. Since part of the changes consists of having a global view of the model to update the copied from statements, would you like me to update the import structures as well? 🤗

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@LysandreJik (Member) left a review comment:

This looks clean, nice to reuse the llama code

@LysandreJik (Member), replying to the question above:

No need to update the import structure for now!

@gante merged commit 95e816f into huggingface:main on Sep 16, 2024 (17 checks passed)
@gante deleted the unify_rope branch on September 16, 2024 at 08:45
itazap pushed a commit to NielsRogge/transformers that referenced this pull request Sep 20, 2024
amyeroberts pushed a commit to amyeroberts/transformers that referenced this pull request Oct 2, 2024
BernardZach pushed a commit to BernardZach/transformers that referenced this pull request Dec 5, 2024
BernardZach pushed a commit to innovationcore/transformers that referenced this pull request Dec 6, 2024