[Fix] In XNNPACK EP, the conversion for fused activation param isn't correct #23115
Open
mszhanyi wants to merge 17 commits into main from zhanyi/activationparam
+104 −5
Commits (17)
ba52bc0  fix activate parameter in fp16
6032820  add test data
242c182  rm useless change
7c7f16a  node assignment some for FP16
3d75696  update
c4f0455  update
dd9865f  head file
d556acb  update
a4dac51  update1
ee98190  rename
52d099a  typo and lint
3cc345d  revert some changes
67aa30c  Merge branch 'main' of https://github.com/microsoft/onnxruntime into …
0baa34b  fix
e0e8304  typo
f1d3b16  update
042e5cd  lint
What if GetType(arg, arg_type) failed here?
Generally type info is always available, so I think this is ok. Shape info may be missing depending on the model.
The Conv op looks to be set up to allow fp32, u8, s8 and optionally fp16. Should this also handle u8 and s8, or should ClipReluChecker limit fusion to fp32 and fp16?
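A minimal sketch of the type guard being discussed. The enum values mirror onnx TensorProto_DataType, but the function name and its placement in ClipReluChecker are hypothetical, not the EP's actual API — the idea is simply to restrict fusion to the element types FuseActivation actually handles:

```cpp
#include <cstdint>

// Subset of onnx::TensorProto_DataType values (numbering per the ONNX proto).
enum class ElemType : int32_t {
  kFloat = 1,
  kUint8 = 2,
  kInt8 = 3,
  kFloat16 = 10,
};

// Hypothetical guard ClipReluChecker could apply before allowing Clip/Relu
// fusion: FuseActivation stores float activation params, so only fp32 and
// fp16 inputs are safe to fuse; reject u8/s8 and anything else.
bool IsFusableActivationInputType(ElemType t) {
  return t == ElemType::kFloat || t == ElemType::kFloat16;
}
```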
So far, the core runtime's Clip fusion only supports float as well:
onnxruntime/onnxruntime/core/optimizer/utils.cc
Lines 335 to 349 in c6ba7ed
Shall we update them together?
cc @snnn
I'd leave the core Clip fusion as-is for now. Can be a separate PR if we think there's a use-case that would benefit.
Are you planning on updating ClipReluChecker to limit the types?
I may need more time to understand ClipQuantFusion
https://github.com/microsoft/onnxruntime/blob/main/onnxruntime/core/optimizer/qdq_transformer/clip_quantizelinear.cc
Given that, I don't have a concrete plan for the next step yet.
I think ClipQuantFusion is a separate topic as that's about ignoring a Clip or Relu when the Q zp and scale make it redundant.
I was asking if the XNNPACK EP ClipReluChecker needs to be updated to either limit the types it allows, or whether FuseActivation needs to handle u8 or s8 input for the Clip min/max.
This has no checks on types:
onnxruntime/onnxruntime/core/providers/xnnpack/detail/node_support_checker.cc
Lines 42 to 44 in 2d05c4b
But FuseActivation always uses a float in the activation params and with this PR is explicitly only checking for fp32 and fp16.
e.g. if there's a Conv node with u8 or s8 input, it looks like ClipReluChecker will allow the activation, but FuseActivation won't do the right thing, because the Clip min/max would be u8 or s8.
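As a standalone sketch of the fp16 case this PR addresses (the function names and the bit-level conversion below are illustrative; onnxruntime itself provides MLFloat16 for this): if the Clip min/max initializer stores IEEE-754 half-precision bits, they must be numerically converted to float before being placed in the float activation params, not reinterpreted byte-for-byte.

```cpp
#include <cstdint>
#include <cstring>

// Illustrative IEEE-754 half -> float conversion. The point: fp16 Clip
// min/max bits need a numeric conversion before use as the float clamp
// params, since XNNPACK's activation params are plain floats.
float HalfToFloat(uint16_t h) {
  const uint32_t sign = static_cast<uint32_t>(h & 0x8000u) << 16;
  const uint32_t exp = (h >> 10) & 0x1Fu;
  uint32_t mant = h & 0x3FFu;
  uint32_t bits;
  if (exp == 0) {
    if (mant == 0) {
      bits = sign;  // signed zero
    } else {
      // Subnormal half: normalize the mantissa into an implicit leading 1.
      int shift = 0;
      do {
        mant <<= 1;
        ++shift;
      } while ((mant & 0x400u) == 0);
      mant &= 0x3FFu;
      bits = sign | (static_cast<uint32_t>(113 - shift) << 23) | (mant << 13);
    }
  } else if (exp == 0x1Fu) {
    bits = sign | 0x7F800000u | (mant << 13);  // inf / NaN
  } else {
    bits = sign | ((exp + 112u) << 23) | (mant << 13);  // rebias: 127 - 15
  }
  float f;
  std::memcpy(&f, &bits, sizeof(f));
  return f;
}

// The failure mode: treating the raw 16 bits as the low half of a float32
// bit pattern yields a near-zero denormal, not the intended value.
float WrongReinterpret(uint16_t h) {
  const uint32_t bits = h;  // e.g. half 1.0 (0x3C00) -> float bits 0x00003C00
  float f;
  std::memcpy(&f, &bits, sizeof(f));
  return f;
}
```

For a Clip max stored as half 6.0 (bits 0x4600), HalfToFloat produces 6.0f for the activation's output_max, while the reinterpretation path produces a denormal on the order of 1e-41, which would effectively clamp the fused output to zero.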
I checked https://onnx.ai/onnx/operators/onnx__Conv.html#type-constraints; an ONNX Conv node shouldn't have u8 or s8 inputs.