
Update quality tooling for formatting #760

Merged: 7 commits into huggingface:main on Feb 9, 2023

Conversation

regisss (Contributor) commented on Feb 8, 2023

What does this PR do?

Following the recent changes in Transformers regarding quality tools (see huggingface/transformers#21480 and huggingface/transformers#21493), the package flake8 is replaced by ruff. Files were formatted and corrected accordingly.
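
For context, the switch is mostly a configuration change. Below is a minimal sketch of what a ruff setup might look like; the file location (pyproject.toml), rule codes, and line length are illustrative assumptions, not necessarily what this PR adds:

```toml
# Hypothetical pyproject.toml excerpt -- rule codes, line length, and
# even the file location are illustrative, not this PR's exact values.
[tool.ruff]
# E/W: pycodestyle, F: pyflakes, I: isort-style import sorting
select = ["E", "F", "I", "W"]
ignore = ["E501"]   # leave long lines to the formatter (black)
line-length = 119
```

With a config like this, one ruff invocation covers what flake8 and isort previously did in separate passes.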

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

HuggingFaceDocBuilderDev (bot) commented on Feb 8, 2023

The documentation is not available anymore as the PR was closed or merged.

@regisss regisss marked this pull request as ready for review February 8, 2023 11:18
regisss (Contributor, Author) commented on Feb 8, 2023

@JingyaHuang

  • In optimum/onnxruntime/trainer.py, the `import fairscale` statement is removed because it is unused. Is that okay?
  • In optimum/onnxruntime/trainer_seq2seq.py, `autocast` seems unused. Can we remove the conditional statement where it is imported? (A sketch of the pattern in question follows below.)
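
For illustration, the kind of legacy pattern being asked about typically looks like the hypothetical reconstruction below (not the exact code from optimum/onnxruntime/trainer_seq2seq.py): an import guarded by a torch version check that nothing afterwards uses.

```python
# Hypothetical reconstruction of the legacy pattern; not the actual
# code from optimum/onnxruntime/trainer_seq2seq.py.
from packaging import version

import torch

# autocast was only added in torch 1.6, hence the version guard...
if version.parse(torch.__version__) >= version.parse("1.6.0"):
    from torch.cuda.amp import autocast  # unused below -> ruff flags F401

# ...but since nothing here ever calls autocast, ruff reports the
# import as unused (F401) and the whole conditional can be removed.
```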

fxmarty (Contributor) left a comment

Awesome, thank you!

echarlaix (Collaborator) left a comment

Thanks for taking care of this, @regisss!

mht-sharma (Contributor) left a comment

LGTM! Thanks for the PR

michaelbenayoun (Member) left a comment

LGTM!

Is all the reformatting done by ruff? It seems very nice!

JingyaHuang (Collaborator) replied, quoting @regisss:

> In optimum/onnxruntime/trainer.py, the `import fairscale` statement is removed because it is unused. Is that okay?
> In optimum/onnxruntime/trainer_seq2seq.py, `autocast` seems unused. Can we remove the conditional statement where it is imported?

Hi @regisss, sure, you can remove them; they are legacy code. Thanks for improving it!

JingyaHuang (Collaborator) left a comment

That's great, thanks for updating the styling tools @regisss! Just left some questions to better understand the change. 💯

Files with review threads:
  • optimum/onnxruntime/trainer.py (resolved)
  • optimum/onnxruntime/trainer_seq2seq.py (outdated, resolved)
  • Makefile (resolved)
  • setup.py (outdated, resolved)
regisss (Contributor, Author) commented on Feb 9, 2023

> LGTM!
>
> Is all the reformatting done by ruff? It seems very nice!

@michaelbenayoun No, the formatting is still done by black. Ruff is a linter that can also do the same job as isort, so it will reorder imports, remove unused variables, etc. It's more of a replacement for the flake8 package, which we have never used in Optimum. And it's fast!
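
For reference, with that split of responsibilities the quality commands look roughly like the sketch below; the directory names are illustrative and the actual Makefile targets in this repo may differ:

```shell
# Sketch of the black + ruff combination (paths are illustrative).
black examples tests optimum        # formatting
ruff examples tests optimum --fix   # lint, sort imports, drop unused imports
```

The `--fix` flag applies ruff's auto-fixable rules, which is what produces changes like import reordering in a diff such as this one.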

regisss merged commit aa7e57b into huggingface:main on Feb 9, 2023.
regisss deleted the format branch on February 9, 2023 at 13:12.