Restrict scale of dequantizeLinear and quantizeLinear to be positive #906
base: main
Conversation
index.bs (Outdated)
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |scale|, and |zeroPoint| returns false, then [=exception/throw=] a {{TypeError}}.
1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#tensor-limits-quantizelinear)), then [=exception/throw=] a {{TypeError}}.
1. If |scale|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#tensor-limits-quantizelinear)), then [=exception/throw=] a {{TypeError}}.
1. If |scale|'s values contain non-positive values, then [=exception/throw=] a {{TypeError}}.
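For context, a minimal JavaScript sketch of the call these steps would validate (the shapes, scale value, and variable names are illustrative assumptions, not spec text; `builder` is assumed to be an MLGraphBuilder obtained elsewhere):

```js
// Hypothetical input shape for illustration only.
const input = builder.input("input", { dataType: "float32", shape: [2, 4] });

// Per-tensor scale and zero-point. Under this PR, scale values must be positive.
const scale = builder.constant(
    { dataType: "float32", shape: [1] },
    new Float32Array([0.5]));        // OK: positive and nonzero
const zeroPoint = builder.constant(
    { dataType: "uint8", shape: [1] },
    new Uint8Array([128]));

const quantized = builder.quantizeLinear(input, scale, zeroPoint);
```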
Given that scale is an MLOperand, I'm not sure we can synchronously throw a TypeError. Is it OK to say that if the scale contains non-positive values, it's implementation-dependent how the error is handled?
Yeah, the spec should just say they are unsupported in prose. We can't actually validate this until runtime.
...unless we introduce the concept of an MLOperand parameter that must be a build-time constant.
> unless we introduce the concept of an MLOperand parameter that must be a build-time constant

There still remain dynamic cases where these inputs can be computed from another operator's output, but a constant property on the MLOperand may be informative for optimization purposes. For some precedent, there's already the isConstant slot and the constant property getter from CreateConstantTensor.
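To illustrate the distinction, here is a hedged sketch (the `someOtherOutput`, `quantizedInput`, and `zeroPoint` operands are hypothetical, and the build-time constant check is only a proposed concept, not current spec behavior):

```js
// Dynamic case: scale is another operator's output, so its values are only
// known when the graph executes; build() cannot check positivity here.
const dynamicScale = builder.abs(someOtherOutput);
const out1 = builder.dequantizeLinear(quantizedInput, dynamicScale, zeroPoint);

// Constant case: the values are available at graph-build time, so a builder
// could in principle reject non-positive scales synchronously if the spec
// required scale to be a build-time constant.
const constScale = builder.constant(
    { dataType: "float32", shape: [1] },
    new Float32Array([0.5]));
const out2 = builder.dequantizeLinear(quantizedInput, constScale, zeroPoint);
```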
Thanks for your review! I've updated these descriptions in a new commit; please take another look! ☕
> it's implementation dependent on how the error is handled
@philloooo: 👀 I see one existing case with argMin/argMax: "In case of ties, the identity of the return value is implementation dependent". So we could update the current text "Values must be positive and nonzero" to something more like "Values must be positive and nonzero, or else results are implementation dependent"?
fdwr left a comment
👀
fdwr left a comment
👍 LGTM after 2 comments.
This is a spec change PR for #879, where we're discussing the lack of widespread support for negative scale values of `dequantizeLinear` and `quantizeLinear` across all backends. @fdwr @reillyeon @huningxin PTAL, thanks!
Preview | Diff