Unit tests #83
Conversation
This PR looks good to me. Before we add edge cases, @tharapalanivel, can we also add a unit test for fine tuning? There will not be any peft type associated with it.
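A rough sketch of what such a test might look like; the `train()` entry point and the `*_ARGS` fixtures below are assumed stand-ins, not the repository's confirmed API. The point is simply that full fine tuning passes no peft config at all:

```python
import copy
import os

# Hypothetical sketch of a fine-tuning unit test. train(), MODEL_ARGS,
# DATA_ARGS, and TRAIN_ARGS are assumed names for the real test fixtures;
# full fine tuning is selected by passing peft_config=None.
def test_run_causallm_fine_tuning(tmp_path):
    train_args = copy.deepcopy(TRAIN_ARGS)
    train_args.output_dir = str(tmp_path)

    # No peft type is involved: peft_config=None means full fine tuning
    train(MODEL_ARGS, DATA_ARGS, train_args, peft_config=None)

    # A successful run should leave at least one checkpoint behind
    assert any(d.startswith("checkpoint-") for d in os.listdir(tmp_path))
```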
requirements.txt
Outdated

```diff
@@ -1,7 +1,7 @@
 numpy
 accelerate>=0.20.3
 packaging
-transformers>=4.34.1
+transformers>=4.34.1,<4.38.0
```
#53 is merged, so you shouldn't need this cap. Thanks!
@Ssukriti Should we keep the cap (or a static version) for the transformers package to avoid unintended errors like the xla_fsdp_v2 one? We could create a GitHub workflow to run tests and then update the cap regularly.
We don't need to keep a static version, but yes, in the optional dependencies PR @gkumbhat is looking into how to cap, and we may cap to the next major release. Now that CI/CD will automatically pull new release versions, if we see failing builds we will update accordingly.
The errors we were seeing with xla_fsdp_v2 were actually due to code we wrote, which was good to catch and fix. It was not an API change in transformers; we were setting env variables incorrectly.
In general, if there is a specific version that doesn't work or has a bug, we can also ask pip to ignore that particular version.
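For reference, both the cap and a per-version exclusion can be written with standard pip version specifiers in requirements.txt (the versions below are illustrative only, not recommendations):

```
# Cap to the next major release (illustrative versions)
transformers>=4.34.1,<5.0.0

# Or stay uncapped but skip one specific broken release
transformers>=4.34.1,!=4.38.0
```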
add more unit tests and refactor
I think some more tests could be added (see the sketch after this list for the first one):
- if num_epochs or the number of gradient accumulation steps is 0, a ValueError is raised
- the prompt tuning test can cover both `"prompt_tuning_init": "RANDOM"` and `"TEXT"` (only RANDOM is tested)
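A minimal sketch of the first suggestion; `TRAIN_ARGS`, the other `*_ARGS` fixtures, and the `train()` entry point are hypothetical stand-ins for the real helpers in tests/test_sft_trainer.py:

```python
import copy

import pytest

# Hypothetical sketch: the *_ARGS fixtures and train() are assumed names.
# Zero epochs or zero gradient accumulation steps should be rejected early.
@pytest.mark.parametrize(
    "field", ["num_train_epochs", "gradient_accumulation_steps"]
)
def test_zero_training_value_raises(field):
    train_args = copy.deepcopy(TRAIN_ARGS)
    setattr(train_args, field, 0)

    with pytest.raises(ValueError):
        train(MODEL_ARGS, DATA_ARGS, train_args, PEFT_PT_ARGS)
```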
tests/test_sft_trainer.py
Outdated

```python
    assert "Simply put, the theory of relativity states that" in output_inference


def test_run_train_lora_target_modules():
```
What's the difference between this and the test above? Can we combine them into one?
My understanding is that the first checks that the default target modules are used, the next is for custom target modules specified by the user, and the last is for all-linear. I've parameterized it, but worth confirming with Anh.
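For instance, the three cases could be folded together roughly like this (a sketch only: the fixture names and the way target modules are plumbed through are assumptions, not the actual test code):

```python
import copy

import pytest

# Hypothetical sketch: PEFT_LORA_ARGS, TRAIN_ARGS, and train() stand in
# for the real fixtures. None keeps the library defaults, a list sets
# user-specified modules, and "all-linear" targets every linear layer.
@pytest.mark.parametrize(
    "target_modules",
    [None, ["q_proj", "k_proj", "v_proj"], "all-linear"],
)
def test_run_causallm_lora_target_modules(tmp_path, target_modules):
    lora_args = copy.deepcopy(PEFT_LORA_ARGS)
    if target_modules is not None:
        lora_args.target_modules = target_modules

    train_args = copy.deepcopy(TRAIN_ARGS)
    train_args.output_dir = str(tmp_path)
    train(MODEL_ARGS, DATA_ARGS, train_args, lora_args)
```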
```diff
@@ -0,0 +1,22 @@
+# Copyright The IBM Tuning Team
```
Curious about the copyright notice. Where is this coming from?
"IBM Tuning Team" was suggested by Raghu; the rest is from caikit.
```python
        invalid_params
    )

    with pytest.raises(ValueError, match=exc_msg):
```
I generally avoid matching the exact error message and instead just check for ValueError with a comment explaining why, but I'll let this go; I don't think we will update the message much.
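For comparison, here are both styles on a generic example (not the exact code under review):

```python
import pytest

def parse_positive(x: str) -> int:
    # Stand-in for any function expected to raise ValueError.
    value = int(x)
    if value <= 0:
        raise ValueError(f"expected a positive integer, got {value}")
    return value

# Style 1: also match the message (a regex, checked via re.search).
# Brittle if the wording changes, but it distinguishes this ValueError
# from any other ValueError the code might raise.
with pytest.raises(ValueError, match="expected a positive integer"):
    parse_positive("-3")

# Style 2: match only the exception type, with a comment explaining why.
# Survives message rewording at the cost of some precision.
with pytest.raises(ValueError):
    parse_positive("-3")  # negative input must be rejected
```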
Thanks so much!!
* Set up fixtures and data for tests
* Add basic unit tests
* Setting upper bound for transformers
* Ignore aim log files
* Include int num_train_epochs
* Fix formatting
* Add copyright notice
* Address review comments
* Run inference on tuned model
* Trainer downloads model
* add more unit tests and refactor
* Fix formatting
* Add FT unit test and refactor
* Removing transformers upper bound cap
* Address review comments

---------

Signed-off-by: Thara Palanivel <130496890+tharapalanivel@users.noreply.github.com>
Signed-off-by: Anh-Uong <anh.uong@ibm.com>
Co-authored-by: Anh-Uong <anh.uong@ibm.com>
Description of the change
Adding unit tests for the `pt` and `lora` tuning methods using a dummy model, covering edge cases, invalid requests, etc. Continuation of PR #79.
Related issue number
Closes #74
How to verify the PR
Was the PR tested