Feature: Transpose optimization for Softmax #1716
Labels: pending on user response (waiting for more information or validation from user)

Comments
@fthielke Yes, this would be a good optimization to have. Do you think you could try to add it to the transpose optimizer and submit a PR?
fthielke pushed a commit to fthielke/tensorflow-onnx that referenced this issue on Sep 23, 2021:
In lower opsets, Softmax always coerces its inputs to a 2D tensor, making Transpose operations necessary if the permutation moves axes between the coerced batch and feature dimensions. While one could find and optimize away Transposes that only permute axes in a way that keeps the batch/feature split, I would not consider that a common use case; optimizing only for opset >=13 seems good enough for now.
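The 2D coercion mentioned above is why the rewrite is restricted to newer opsets. A minimal NumPy sketch of the two semantics (the helper functions are illustrative, not tf2onnx code): for opset < 13, Softmax flattens the input to 2D at `axis` and normalizes jointly over all trailing dimensions, whereas opset >= 13 normalizes over a single axis, so the two disagree for rank > 2 inputs.

```python
import numpy as np

def softmax_coerced_2d(x, axis):
    # opset < 13 Softmax: flatten to 2D [batch, features] at `axis`,
    # normalize jointly over all feature dimensions, then restore the shape
    flat = x.reshape(int(np.prod(x.shape[:axis], dtype=int)), -1)
    e = np.exp(flat - flat.max(axis=1, keepdims=True))
    return (e / e.sum(axis=1, keepdims=True)).reshape(x.shape)

def softmax_per_axis(x, axis):
    # opset >= 13 Softmax: normalize over a single axis only
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

x = np.random.default_rng(0).standard_normal((2, 3, 4))
old = softmax_coerced_2d(x, axis=1)  # each batch row of 12 elements sums to 1
new = softmax_per_axis(x, axis=1)    # sums to 1 along axis 1 alone
print(np.allclose(old, new))  # False: the semantics differ for rank > 2
```

Because the old form normalizes over everything at or after `axis`, a Transpose that moves axes across the batch/feature split changes which elements are grouped, and the axis-remapping trick below no longer applies.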
fthielke pushed a commit to fthielke/tensorflow-onnx that referenced this issue on Sep 23, 2021:
In lower opsets, Softmax always coerces its inputs to a 2D tensor, making Transpose operations necessary if the permutation moves axes between the coerced batch and feature dimensions. While one could find and optimize away Transposes that only permute axes in a way that keeps the batch/feature split, I would not consider that a common use case; optimizing only for opset >=13 seems good enough for now. Signed-off-by: Felix Thielke <felix.thielke@mevis.fraunhofer.de>
fthielke added a commit to fthielke/tensorflow-onnx that referenced this issue on Oct 19, 2021:
In opsets 13 and higher, the axis of the operation is arbitrary and can simply be changed according to the permutation of the Transpose. In lower opsets, Softmax always coerces its inputs to a 2D tensor, making Transpose operations necessary if the permutation moves axes between the coerced batch and feature dimensions. Signed-off-by: fthielke <fthielke@fb3.uni-bremen.de>
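The identity this relies on can be checked directly in NumPy (a sketch; the `softmax` helper is illustrative): pushing a Transpose through a per-axis Softmax only remaps the axis through the permutation.

```python
import numpy as np

def softmax(x, axis):
    # opset >= 13 Softmax: normalize over a single axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

x = np.random.default_rng(0).standard_normal((2, 3, 4, 5))
perm = (0, 2, 3, 1)  # e.g. NCHW -> NHWC
axis = 3             # channel axis after the Transpose

# Transpose -> Softmax(axis) ...
before = softmax(np.transpose(x, perm), axis=axis)
# ... equals Softmax(perm[axis]) -> Transpose: the Transpose has moved below Softmax
after = np.transpose(softmax(x, axis=perm[axis]), perm)
print(np.allclose(before, after))  # True
```

Once the Transpose sits below the Softmax, the optimizer's usual machinery can merge it with a downstream inverse Transpose and eliminate both.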
fatcat-z added the pending on user response label on Mar 16, 2022:
Waiting for the user to fix the related PR issue.
janbernloehr pushed a commit to janbernloehr/tensorflow-onnx that referenced this issue on Jun 10, 2022:
In opsets 13 and higher, the axis of the operation is arbitrary and can simply be changed according to the permutation of the Transpose. In lower opsets, Softmax always coerces its inputs to a 2D tensor, making Transpose operations necessary if the permutation moves axes between the coerced batch and feature dimensions. Signed-off-by: fthielke <fthielke@fb3.uni-bremen.de>
Thanks to the optimizations by tf2onnx, most Transpose operations in converted models are eliminated.
However, Softmax does not seem to be accounted for yet.
Take the following simple SavedModel:
When exporting it via

python -m tf2onnx.convert --saved-model test_softmax --output test_softmax/model.onnx --tag serve --signature_def serving_default --opset 14

the Softmax operation in the resulting graph is surrounded by Transposes. The same happens for every other channels_first model I export.
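The reported Transpose/Softmax/Transpose sandwich is mathematically redundant for per-axis Softmax. A NumPy sketch, assuming a 4D NCHW input (shapes and permutations here are illustrative, not taken from the exported graph):

```python
import numpy as np

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

x = np.random.default_rng(0).standard_normal((1, 3, 8, 8))  # NCHW
to_nhwc, to_nchw = (0, 2, 3, 1), (0, 3, 1, 2)

# What the converted graph computes: Transpose -> Softmax(axis=-1) -> Transpose
sandwich = np.transpose(softmax(np.transpose(x, to_nhwc), axis=-1), to_nchw)
# What the optimizer could emit instead: a single Softmax on the channel axis
direct = softmax(x, axis=1)
print(np.allclose(sandwich, direct))  # True
```

This is exactly the pattern the transpose optimizer already collapses for other elementwise-style ops, which is why extending it to Softmax (for opset >= 13) removes both Transposes.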