
Skipping test_save_load_low_cpu_mem_usage() for all failing models #29118

Conversation

@hackyon (Contributor) commented Feb 19, 2024

What does this PR do?

Skips all failing unit tests for test_save_load_low_cpu_mem_usage(). For some models, the skips should be temporary until the correct tie_weights() methods have been added.

I created this temporary PR in case it takes longer to make progress on #29024, so that other PRs aren't blocked while we wait for feedback.
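For illustration, here is a minimal sketch (not the exact diff in this PR) of what such a skip looks like: the shared test is redefined in the failing model's test class with a skip decorator, so the rest of the suite keeps running. The class name and skip message here are hypothetical.

```python
import unittest


class ExampleModelTest(unittest.TestCase):
    # Hypothetical override: redefining the shared test with a skip
    # decorator leaves the rest of the suite untouched while the model's
    # tie_weights() support is worked out.
    @unittest.skip(
        reason="Skipped until the model defines the tie_weights() needed "
        "for save/load with low_cpu_mem_usage=True."
    )
    def test_save_load_low_cpu_mem_usage(self):
        pass


if __name__ == "__main__":
    unittest.main(verbosity=2)
```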

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@hackyon (Contributor, Author) commented Feb 19, 2024

Hello @amyeroberts,

I created this PR to simply skip all the failing tests for test_save_load_low_cpu_mem_usage(). This change should be safe, as it only touches tests.

This should help unblock other PRs while we work on getting tie_weights() into some of these models with #29024.
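For context, weight tying means the output projection reuses the input embedding's parameter tensor, so saving, loading, and updating touch a single shared tensor. Below is a minimal, self-contained illustration in plain PyTorch — not the actual transformers implementation, where each PreTrainedModel defines its own tie_weights(); the TinyTiedLM class is hypothetical.

```python
import torch
import torch.nn as nn


class TinyTiedLM(nn.Module):
    """Toy LM whose output head shares its weight with the input embedding."""

    def __init__(self, vocab_size: int = 100, hidden: int = 16):
        super().__init__()
        self.embed_in = nn.Embedding(vocab_size, hidden)
        self.lm_head = nn.Linear(hidden, vocab_size, bias=False)
        self.tie_weights()

    def tie_weights(self):
        # Point the output head at the embedding's parameter; both modules
        # now reference one tensor, which is what low_cpu_mem_usage loading
        # relies on being declared correctly.
        self.lm_head.weight = self.embed_in.weight

    def forward(self, ids: torch.Tensor) -> torch.Tensor:
        return self.lm_head(self.embed_in(ids))


model = TinyTiedLM()
assert model.lm_head.weight is model.embed_in.weight  # one shared tensor
print(model(torch.tensor([[1, 2, 3]])).shape)  # torch.Size([1, 3, 100])
```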

I'll leave it up to you to decide whether this is worth merging.

Thanks! 🙏

@amyeroberts (Collaborator) left a comment


@hackyon Awesome work. Thanks for proactively adding this!

cc @ydshieh for reference

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@amyeroberts
Copy link
Collaborator

@hackyon Thanks for opening this and all of your work on adding this feature to our models!

We're going to revert #28948 and then add it back after the next release, which should happen tomorrow.

Unfortunately, our test suite doesn't seem to be correctly picking up all of the necessary tests. I know you've run a check against all models, but we want to be confident that everything is in a good and stable state before the release.

We can then combine the work in #29024 with similar changes across all models, addressing the current review comments.

@hackyon (Contributor, Author) commented Feb 20, 2024

Sounds good, thanks for taking care of this.

Once the rollback goes in, I can put together a single PR that covers everything in one go.

@hackyon closed this Feb 22, 2024
@hackyon deleted the fix-ignore-failing-tests-for-tie-weights branch February 22, 2024 23:13