fix: lora dropout applied to all models #995
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/torchtune/995
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit 408dbb5 with merge base f35e5d6.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Thanks @Optimox for finding this bug and making the fix! Can you also update the Mistral components while you're at it?
@ebsmothers yes, you are right, I forgot to update the Mistral model! I've just added a new commit to take care of it!
@ebsmothers good catch! Updated!
Looks good! Thanks again for catching and fixing this. Just kicked off CI; once it's green I can merge.
Context
What is the purpose of this PR? It is a bug fix: the lora_dropout argument was not being passed through to all models. Related to issue #996.
Changelog
I only pass the lora_dropout argument where needed (see the sketch below for the pattern).
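To illustrate the kind of fix this PR makes, here is a minimal sketch of threading a dropout argument from a model builder down to every LoRA-adapted layer. This is not torchtune's actual implementation: the LoRALinear and build_attention_projections names, signatures, and the q_proj/k_proj/v_proj/output_proj keys are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LoRALinear(nn.Module):
    """Frozen linear layer with a trainable low-rank update.

    Dropout is applied only on the LoRA branch, the usual convention.
    (Hypothetical class for illustration, not torchtune's.)
    """

    def __init__(self, in_dim: int, out_dim: int, rank: int = 8,
                 alpha: float = 16.0, dropout: float = 0.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_dim, in_dim),
                                   requires_grad=False)
        self.lora_a = nn.Linear(in_dim, rank, bias=False)
        self.lora_b = nn.Linear(rank, out_dim, bias=False)
        self.dropout = nn.Dropout(p=dropout)
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        frozen = F.linear(x, self.weight)
        # Dropout hits only the input of the low-rank branch.
        update = self.lora_b(self.lora_a(self.dropout(x))) * self.scaling
        return frozen + update


def build_attention_projections(embed_dim: int, lora_rank: int,
                                lora_alpha: float, lora_dropout: float):
    """Thread lora_dropout through to every LoRA-adapted projection.

    The bug pattern this PR fixes: a builder accepts lora_dropout but
    never forwards it, so the layers silently keep their default of 0.0.
    """
    return nn.ModuleDict({
        name: LoRALinear(embed_dim, embed_dim, rank=lora_rank,
                         alpha=lora_alpha, dropout=lora_dropout)
        for name in ("q_proj", "k_proj", "v_proj", "output_proj")
    })


projs = build_attention_projections(64, lora_rank=8, lora_alpha=16.0,
                                    lora_dropout=0.1)
print(projs["q_proj"].dropout.p)  # 0.1, not a stale default
```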
Test plan
Please make sure to do each of the following if applicable to your PR. (If you're not sure about any one of these, just ask and we will happily help.)
- run pre-commit hooks and linters (make sure you've first installed via pre-commit install)
- run unit tests via pytest tests
- run recipe tests via pytest tests -m integration_test