Add batchnorm folding transformations (fx) #348
Conversation
Left a few comments but you should be able to merge after, thanks!!!
```yaml
pip install git+https://github.com/huggingface/transformers.git
- name: Test with unittest
  working-directory: tests
  run: |
    python -m unittest discover -s fx -p 'test_*.py'
    python -m pytest fx/optimization/test_transformations.py --exitfirst
```
No, we need to be able to also discover other files: for instance, when quantization is merged, we will need to discover those tests as well.
Yes I agree, I did this considering my comment above:

@michaelbenayoun I rolled back to having a subfolder. I use `python -m pytest fx/optimization/test_transformations.py --exitfirst` in the yml because `test_quantization.py` is currently not run on main and its tests don't pass, independently from this PR (`python -m unittest discover -s fx -p 'test_*.py'` was not capturing it, see e.g. https://github.com/huggingface/optimum/actions/runs/3068109805/jobs/4955165866).
Hey, @fxmarty! Found this PR following an old ORT issue to grab your folding implementation, thx! Sorry for digging up an old thread, just wanted to comment that there might be an error here and here: in case of affine=False, bias should probably be
Hi @jogepari Thank you for notifying, will fix!
What does this PR do?
Add the folding of `nn.BatchNorm2d` into `nn.Conv2d`, and of `nn.BatchNorm1d` into `nn.Linear`.

The former is useful for example for resnet, beit, regnet, dpt, levit, segformer and others.

The latter is useful solely for groupvit, and could be for levit; but there is a flatten in between, so the current transformation is not directly usable: https://github.com/huggingface/transformers/blob/ab2006e3d6db88654526a4169e65d4bfc52da2e3/src/transformers/models/levit/modeling_levit.py#L153-L156
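For reference, the conv + batchnorm folding this PR implements can be sketched in a few lines of PyTorch. This is a minimal illustration, not the PR's actual code (the function name `fold_conv_bn` is mine); it also accounts for the `affine=False` case raised in the review comment above, by substituting gamma=1 and beta=0 when the batchnorm has no learnable affine parameters:

```python
import torch
from torch import nn

def fold_conv_bn(conv: nn.Conv2d, bn: nn.BatchNorm2d) -> nn.Conv2d:
    """Return a Conv2d equivalent (in eval mode) to conv followed by bn.

    BN computes gamma * (y - mean) / sqrt(var + eps) + beta, so with
    scale = gamma / sqrt(var + eps) the fused conv has:
        W' = scale * W
        b' = scale * (b - mean) + beta
    """
    fused = nn.Conv2d(
        conv.in_channels, conv.out_channels, conv.kernel_size,
        stride=conv.stride, padding=conv.padding, dilation=conv.dilation,
        groups=conv.groups, bias=True,
    )
    # With affine=False, bn.weight and bn.bias are None: use gamma=1, beta=0.
    gamma = bn.weight if bn.weight is not None else torch.ones(bn.num_features)
    beta = bn.bias if bn.bias is not None else torch.zeros(bn.num_features)
    scale = gamma / torch.sqrt(bn.running_var + bn.eps)
    fused.weight.data = conv.weight.data * scale.reshape(-1, 1, 1, 1)
    conv_bias = (
        conv.bias.data if conv.bias is not None
        else torch.zeros(conv.out_channels)
    )
    fused.bias.data = (conv_bias - bn.running_mean) * scale + beta
    return fused
```

The `nn.BatchNorm1d` into `nn.Linear` case follows the same algebra, with the scale broadcast over the rows of the weight matrix instead of the output channels of the kernel.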
Fixes microsoft/onnxruntime#12522
Before submitting