
Bugfix/alexsherstinsky/fix none check for attention factor in rope scaling 2024 08 28 0 #33188

Conversation

@alexsherstinsky (Contributor) commented Aug 29, 2024

What does this PR do?

  • This PR fixes a small bug in "src/transformers/modeling_rope_utils.py" whereby the check on attention_factor can raise "TypeError: '<' not supported between instances of 'NoneType' and 'int'" when attention_factor is None (see the sketch after this list).
  • The test tests/utils/test_modeling_rope_utils.py::RopeTest::test_longrope_rope_numerically is extended to ensure that the config is validated and the above error is not raised.
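
The sketch below illustrates the kind of None guard this change adds. It is an illustration under stated assumptions only: the helper name `_check_attention_factor`, the dict-based signature, and the error message are made up for this example and are not the actual code in src/transformers/modeling_rope_utils.py.

```python
# Hypothetical sketch of the guard; names and messages are illustrative, not
# the real implementation in src/transformers/modeling_rope_utils.py.
def _check_attention_factor(rope_scaling: dict) -> None:
    attention_factor = rope_scaling.get("attention_factor")
    # Before the fix, a bare comparison like `attention_factor < 0` could run
    # even when the value was None, raising:
    #   TypeError: '<' not supported between instances of 'NoneType' and 'int'
    # Checking for None first treats an absent attention_factor as valid.
    if attention_factor is not None:
        if not isinstance(attention_factor, float) or attention_factor < 0:
            raise ValueError(
                f"`attention_factor` must be a non-negative float, got {attention_factor}"
            )


# An omitted or None attention_factor now passes validation without a TypeError:
_check_attention_factor({"rope_type": "longrope", "factor": 8.0})
_check_attention_factor({"rope_type": "longrope", "attention_factor": None})
```

This is the behavior the extended test above is meant to confirm: validating such a config no longer raises the TypeError.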

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@LysandreJik (Member) commented

cc @ArthurZucker

@alexsherstinsky (Contributor, Author) commented Sep 3, 2024

Hello, @LysandreJik and @ArthurZucker -- could you please help me with some guidance in terms of what I need to do in order to get this pull request reviewed (and merged, if approved)? Please feel free to make edits on my behalf. Thank you very much in advance for your time.

@LysandreJik (Member) commented

Hey @alexsherstinsky, Arthur will review your PR as soon as he has a bit of bandwidth available! Sorry for the delay, the repo is particularly active right now.

@ArthurZucker (Collaborator) left a comment

Thanks for finding and fixing! 🤗

@ArthurZucker merged commit 122ded0 into huggingface:main on Sep 4, 2024
14 checks passed
@alexsherstinsky deleted the bugfix/alexsherstinsky/fix_none_check_for_attention_factor_in_rope_scaling-2024_08_28-0 branch on September 4, 2024 at 15:41

@alexsherstinsky (Contributor, Author) commented

> Hey @alexsherstinsky, Arthur will review your PR as soon as he has a bit of bandwidth available! Sorry for the delay, the repo is particularly active right now.

@LysandreJik Many thanks to you and @ArthurZucker for reviewing and merging -- your libraries are a treasure trove!

@ArthurZucker (Collaborator) commented

🤗 thanks for your warm words

itazap pushed a commit to NielsRogge/transformers that referenced this pull request Sep 20, 2024
…aling 2024 08 28 0 (huggingface#33188)

* Fixing a bug in the way "attention_factor" is validated in ROPE utilities.

BernardZach pushed a commit to BernardZach/transformers that referenced this pull request Dec 5, 2024
…aling 2024 08 28 0 (huggingface#33188)

* Fixing a bug in the way "attention_factor" is validated in ROPE utilities.

BernardZach pushed a commit to innovationcore/transformers that referenced this pull request Dec 6, 2024
…aling 2024 08 28 0 (huggingface#33188)

* Fixing a bug in the way "attention_factor" is validated in ROPE utilities.