Bugfix/alexsherstinsky/fix none check for attention factor in rope scaling 2024 08 28 0 #33188
Conversation
Hello, @LysandreJik and @ArthurZucker -- could you please help me with some guidance in terms of what I need to do in order to get this pull request reviewed (and merged, if approved)? Please feel free to make edits on my behalf. Thank you very much in advance for your time. |
Hey @alexsherstinsky, Arthur will review your PR as soon as he has a bit of bandwidth available! Sorry for the delay, the repo is particularly active right now. |
Thanks for finding and fixing! 🤗
@LysandreJik Many thanks to you and @ArthurZucker for reviewing and merging -- your libraries are a treasure trove! |
🤗 thanks for your warm words |
…aling 2024 08 28 0 (huggingface#33188) * Fixing a bug in the way "attention_factor" is validated in ROPE utilities. * Fixing a bug in the way "attention_factor" is validated in ROPE utilities. * Fixing a bug in the way "attention_factor" is validated in ROPE utilities.
What does this PR do?
Validating `attention_factor` can raise `TypeError: '<' not supported between instances of 'NoneType' and 'int'` if `attention_factor` is `None`. `tests/utils/test_modeling_rope_utils.py::RopeTest::test_longrope_rope_numerically` is extended to ensure that the `config` is validated and the above error is not raised.
Fixes # (issue)
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.