diff --git a/index.bs b/index.bs
index 113fcd4f..6bb68043 100644
--- a/index.bs
+++ b/index.bs
@@ -915,7 +915,7 @@ When the {{MLContext/[[contextType]]}} is set to [=context type/default=] with t
To validate buffer with descriptor given {{ArrayBufferView}} |bufferView| and {{MLOperandDescriptor}} |descriptor|, run the following steps:
- 1. If |bufferView|'s [=element type=] does not match to |descriptor|.{{MLOperandDescriptor/dataType}} according to [this table](#appendices-mloperanddatatype-arraybufferview-compatibility), return false.
+ 1. If |bufferView|'s [=element type=] does not match |descriptor|.{{MLOperandDescriptor/dataType}} according to [this table](#appendices-mloperanddatatype-arraybufferview-compatibility), return false.
1. If |bufferView|.\[[ByteLength]] is not equal to |descriptor|'s [=MLOperandDescriptor/byte length=], return false.
@@ -1276,6 +1276,10 @@ The shape [=getter steps=] are to return [=th
Since the {{MLOperand/[[builder]]}} object is bound by the {{MLGraphBuilder/constructor()}} constructor to an {{MLContext}} object, an {{MLOperand}} is also always bound to the same {{MLContext}} object.
+If an operation supports only a subset of {{MLOperandDataType}}s, the <dfn for="">allowed data types</dfn> for each of the operation's input operands, including both positional arguments and options, are given as either an explicit list of {{MLOperandDataType}}s, a constraint that the operand's [=MLOperand/dataType=] must be the <dfn for="">same as</dfn> the [=MLOperand/dataType=] of another input operand, or <dfn for="" lt="any data type">any</dfn> to allow any {{MLOperandDataType}}.
+
+If an operation requires input operands with a particular [=MLOperand/rank=], the <dfn for="">allowed ranks</dfn> for each of the operation's input operands, including both positional arguments and options, are given as an explicit rank (e.g. 1), or <dfn for="" lt="any rank">N</dfn> to allow any dimensionality. More specific constraints are common, such as when an input operand's shape must be [=/unidirectionally broadcastable=] to or [=/bidirectionally broadcastable=] with another input operand's shape; in these cases, the [=/allowed ranks=] are listed as a range, with the specific validation given as steps in the operation.
+
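The broadcastability constraints mentioned above can be sketched in plain JavaScript. This is a non-normative sketch with invented helper names, assuming the conventional right-aligned (NumPy-style) shape broadcasting rules:

```javascript
// Non-normative sketch of the two broadcast relations the constraint
// tables refer to. Shapes are arrays of sizes, aligned from the right.
function bidirectionalBroadcastShape(shapeA, shapeB) {
  const rank = Math.max(shapeA.length, shapeB.length);
  const out = [];
  for (let i = 0; i < rank; i++) {
    const a = shapeA[shapeA.length - 1 - i] ?? 1; // missing dims act as 1
    const b = shapeB[shapeB.length - 1 - i] ?? 1;
    if (a !== b && a !== 1 && b !== 1) return null; // not broadcastable
    out.unshift(Math.max(a, b));
  }
  return out;
}

// shapeA is unidirectionally broadcastable to shapeB when broadcasting
// the two together cannot change shapeB.
function isUnidirectionallyBroadcastable(shapeA, shapeB) {
  const out = bidirectionalBroadcastShape(shapeA, shapeB);
  return out !== null &&
    out.length === shapeB.length &&
    out.every((d, i) => d === shapeB[i]);
}
```

For example, shapes [4, 1, 3] and [2, 1] broadcast bidirectionally to [4, 2, 3], while [2, 3] is not unidirectionally broadcastable to [1, 3].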
{{MLOperatorOptions}} has the following members:
: label
@@ -1283,7 +1287,6 @@ Since the {{MLOperand/[[builder]]}} object is bound by the {{MLGraphBuilder/cons
Optionally provided when an [=operator=] is created using {{MLGraphBuilder}} methods that create {{MLOperand}}s. The implementation may use this value to initialize the [=operator=]'s [=operator/label=].
-
### Creating an {{MLOperand}} ### {#api-mloperand-create}
The {{MLOperand}} objects are created by the methods of {{MLGraphBuilder}}, internally using the following algorithms.
@@ -1559,6 +1562,22 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The N-D tensor of the reduced shape. The values must be of type |options|.{{MLArgMinMaxOptions/outputDataType}} in the range [0, N-1] where N is the size of the input dimension specified by axis.
+
+<table id="constraints-argMin-argMax" class="data">
+  <caption>Constraints for {{MLGraphBuilder/argMin()}}/{{MLGraphBuilder/argMax()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{input}}</td>
+    <td>[=/any data type|any=]</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+</table>
+
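As a non-normative illustration of the return value described above (index values in the range [0, N-1] along the reduced axis), a 1-D argMax can be sketched as follows; `argMax1d` is an invented helper, not part of the API:

```javascript
// Illustrative only: the value argMax() computes for a 1-D input along
// axis 0. The real API returns an MLOperand evaluated at execution time.
function argMax1d(values) {
  let best = 0;
  for (let i = 1; i < values.length; i++) {
    if (values[i] > values[best]) best = i;
  }
  return best; // always in [0, N-1], where N = values.length
}
```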
{{MLOpSupportLimits}} has the following members for {{MLGraphBuilder/argMin()}} and {{MLGraphBuilder/argMax()}}:
: argMin
@@ -1664,6 +1683,42 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The batch-normalized N-D tensor of the same shape as *input*.
+
+<table id="constraints-batchNormalization" class="data">
+  <caption>Constraints for {{MLGraphBuilder/batchNormalization()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{input}}</td>
+    <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+  <tr>
+    <td>{{mean}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>1</td>
+  </tr>
+  <tr>
+    <td>{{variance}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>1</td>
+  </tr>
+  <tr>
+    <td>{{MLBatchNormalizationOptions/scale}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>1</td>
+  </tr>
+  <tr>
+    <td>{{MLBatchNormalizationOptions/bias}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>1</td>
+  </tr>
+</table>
+
{{MLBatchNormalizationSupportLimits}} has the following members:
: input
@@ -1692,18 +1747,18 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |mean|, |variance|, |options|.{{MLBatchNormalizationOptions/scale}} (if it [=map/exists=]), and |options|.{{MLBatchNormalizationOptions/bias}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-batchNormalization)), then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLBatchNormalizationOptions/axis}} is not in [=the range=] 0 to |input|'s [=MLOperand/rank=], exclusive, then [=exception/throw=] a {{TypeError}}.
- 1. If |mean|'s [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If |mean|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-batchNormalization)), then [=exception/throw=] a {{TypeError}}.
1. If |mean|'s [=MLOperand/shape=] is not equal to « |input|'s [=MLOperand/shape=][|options|.{{MLBatchNormalizationOptions/axis}}] », then [=exception/throw=] a {{TypeError}}.
- 1. If |variance|'s [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If |variance|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-batchNormalization)), then [=exception/throw=] a {{TypeError}}.
1. If |variance|'s [=MLOperand/shape=] is not equal to « |input|'s [=MLOperand/shape=][|options|.{{MLBatchNormalizationOptions/axis}}] », then [=exception/throw=] a {{TypeError}}.
1. Set |options|.{{MLBatchNormalizationOptions/epsilon}} to the result of [=casting=] |options|.{{MLBatchNormalizationOptions/epsilon}} to |input|'s [=MLOperand/dataType=].
1. If |options|.{{MLBatchNormalizationOptions/scale}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-batchNormalization)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « |input|'s [=MLOperand/shape=][|options|.{{MLBatchNormalizationOptions/axis}}] », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLBatchNormalizationOptions/bias}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-batchNormalization)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « |input|'s [=MLOperand/shape=][|options|.{{MLBatchNormalizationOptions/axis}}] », then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |operator| be an [=operator=] for the "batchNormalization" operation, given |input|, |mean|, |variance| and |options|.
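The shape checks in the steps above all reduce to one rule: *mean*, *variance*, *scale* and *bias* must be 1-D tensors of size *input*'s shape[*axis*]. A non-normative sketch (the helper name is invented):

```javascript
// Non-normative sketch of the batchNormalization() shape rule: each of
// mean, variance, scale and bias must have shape [ inputShape[axis] ].
function expectedStatisticsShape(inputShape, axis) {
  if (axis < 0 || axis >= inputShape.length) {
    throw new TypeError("axis must be in the range [0, rank)");
  }
  return [inputShape[axis]];
}
```

For an [N, C, H, W] input with axis 1, each statistics tensor must therefore have shape [C].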
@@ -1761,6 +1816,22 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The N-D tensor of the same shape as *input* with each element casted to the target data type.
+
+<table id="constraints-cast" class="data">
+  <caption>Constraints for {{MLGraphBuilder/cast()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{input}}</td>
+    <td>[=/any data type|any=]</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+</table>
+
{{MLOpSupportLimits}} has the following members for {{MLGraphBuilder/cast()}}:
: cast
@@ -1880,6 +1951,22 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+<table id="constraints-clamp" class="data">
+  <caption>Constraints for {{MLGraphBuilder/clamp()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{input}}</td>
+    <td>[=/any data type|any=]</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+</table>
+
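The element-wise behavior behind the output description above can be sketched as follows; this assumes the conventional clamp definition `output = min(max(x, minValue), maxValue)` and an invented helper name:

```javascript
// Non-normative sketch of element-wise clamping. minValue and maxValue
// default to unbounded, matching a clamp() call with no options set.
function clampValues(values, minValue = -Infinity, maxValue = Infinity) {
  return values.map((x) => Math.min(Math.max(x, minValue), maxValue));
}
```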
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/clamp()}}:
: clamp
@@ -1966,6 +2053,22 @@ partial dictionary MLOpSupportLimits {
computed as the sum of all the input sizes of the same dimension.
+
+<table id="constraints-concat" class="data">
+  <caption>Constraints for {{MLGraphBuilder/concat()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{inputs}}</td>
+    <td>[=/any data type|any=]</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+</table>
+
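The output shape rule quoted above ("the sum of all the input sizes of the same dimension") can be sketched non-normatively; `concatShape` is an invented helper:

```javascript
// Non-normative sketch of the concat() output shape rule: sizes add up
// along the concatenation axis; every other dimension must match.
function concatShape(shapes, axis) {
  const out = [...shapes[0]];
  for (const shape of shapes.slice(1)) {
    for (let d = 0; d < out.length; d++) {
      if (d === axis) out[d] += shape[d];
      else if (shape[d] !== out[d]) throw new TypeError("dimension mismatch");
    }
  }
  return out;
}
```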
{{MLConcatSupportLimits}} has the following members:
: inputs
@@ -2110,6 +2213,32 @@ partial dictionary MLOpSupportLimits {
`outputSize = 1 + (inputSize - (filterSize - 1) * dilation - 1 + beginningPadding + endingPadding) / stride`
+
+<table id="constraints-conv2d" class="data">
+  <caption>Constraints for {{MLGraphBuilder/conv2d()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{input}}</td>
+    <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+    <td>4</td>
+  </tr>
+  <tr>
+    <td>{{filter}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>4</td>
+  </tr>
+  <tr>
+    <td>{{MLConv2dOptions/bias}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>1</td>
+  </tr>
+</table>
+
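The per-dimension output size formula quoted earlier for conv2d can be written out directly. This is a non-normative sketch with an invented function name, assuming floor (integer) division by the stride:

```javascript
// outputSize = 1 + (inputSize - (filterSize - 1) * dilation - 1
//                   + beginningPadding + endingPadding) / stride
function conv2dOutputSize(inputSize, filterSize, dilation, stride,
                          beginningPadding, endingPadding) {
  // (filterSize - 1) * dilation + 1 is the dilated ("effective") filter size.
  const effectiveFilterSize = (filterSize - 1) * dilation + 1;
  return 1 + Math.floor(
    (inputSize - effectiveFilterSize + beginningPadding + endingPadding) /
      stride);
}
```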
{{MLConv2dSupportLimits}} has the following members:
: input
@@ -2157,10 +2286,10 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |filter|, and |options|.{{MLConv2dOptions/bias}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/rank=] is not 4, then [=exception/throw=] a {{TypeError}}.
- 1. If |filter|'s [=MLOperand/rank=] is not 4, then [=exception/throw=] a {{TypeError}}.
- 1. If |filter|'s [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-conv2d)), then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/rank=] is not its [=/allowed rank=], then [=exception/throw=] a {{TypeError}}.
+ 1. If |filter|'s [=MLOperand/rank=] is not its [=/allowed rank=], then [=exception/throw=] a {{TypeError}}.
+ 1. If |filter|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-conv2d)), then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLConv2dOptions/padding}} does not [=map/exist=], set it to the [=/list=] « 0, 0, 0, 0 ».
1. Otherwise, if |options|.{{MLConv2dOptions/padding}}'s [=list/size=] is not 4, then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLConv2dOptions/strides}} does not [=map/exist=], set it to the [=/list=] « 1, 1 ».
@@ -2195,7 +2324,7 @@ partial dictionary MLOpSupportLimits {
1. Otherwise, if |inputChannels| / |options|.{{MLConv2dOptions/groups}} is not equal to |filterInputChannels|, then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLConv2dOptions/bias}} [=map/exists=]:
1. If its [=MLOperand/shape=] is not equal to « |outputChannels| », then [=exception/throw=] a {{TypeError}}.
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-conv2d)), then [=exception/throw=] a {{TypeError}}.
1. Let |outputSizes| be the result of [=MLGraphBuilder/calculating conv2d output sizes=] given |inputHeight|, |inputWidth|, |filterHeight|, |filterWidth|, |options|.{{MLConv2dOptions/padding}}, |options|.{{MLConv2dOptions/strides}}, and |options|.{{MLConv2dOptions/dilations}}.
1. Switch on |options|.{{MLConv2dOptions/inputLayout}}:
@@ -2322,6 +2451,32 @@ partial dictionary MLOpSupportLimits {
`outputSize = (inputSize - 1) * stride + (filterSize - 1) * dilation + 1 - beginningPadding - endingPadding + outputPadding`
+
+<table id="constraints-convTranspose2d" class="data">
+  <caption>Constraints for {{MLGraphBuilder/convTranspose2d()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{input}}</td>
+    <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+    <td>4</td>
+  </tr>
+  <tr>
+    <td>{{filter}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>4</td>
+  </tr>
+  <tr>
+    <td>{{MLConvTranspose2dOptions/bias}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>1</td>
+  </tr>
+</table>
+
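The convTranspose2d output size formula quoted earlier translates verbatim into code; the function name is invented for this non-normative sketch:

```javascript
// outputSize = (inputSize - 1) * stride + (filterSize - 1) * dilation + 1
//              - beginningPadding - endingPadding + outputPadding
function convTranspose2dOutputSize(inputSize, filterSize, dilation, stride,
                                   beginningPadding, endingPadding,
                                   outputPadding) {
  return (inputSize - 1) * stride + (filterSize - 1) * dilation + 1 -
    beginningPadding - endingPadding + outputPadding;
}
```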
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/convTranspose2d()}}:
: convTranspose2d
@@ -2352,10 +2507,10 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |filter|, and |options|.{{MLConvTranspose2dOptions/bias}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/rank=] is not 4, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
- 1. If |filter|'s [=MLOperand/rank=] is not 4, then [=exception/throw=] a {{TypeError}}.
- 1. If |filter|'s [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/rank=] is not its [=/allowed rank=], then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-convTranspose2d)), then [=exception/throw=] a {{TypeError}}.
+ 1. If |filter|'s [=MLOperand/rank=] is not its [=/allowed rank=], then [=exception/throw=] a {{TypeError}}.
+ 1. If |filter|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-convTranspose2d)), then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLConvTranspose2dOptions/padding}} does not [=map/exist=], set it to the [=/list=] « 0, 0, 0, 0 ».
1. Otherwise, if |options|.{{MLConvTranspose2dOptions/padding}}'s [=list/size=] is not 4, then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLConvTranspose2dOptions/strides}} does not [=map/exist=], set it to the [=/list=] « 1, 1 ».
@@ -2394,7 +2549,7 @@ partial dictionary MLOpSupportLimits {
1. Let |outputChannels| be |filterOutputChannels| * |options|.{{MLConvTranspose2dOptions/groups}}.
1. If |options|.{{MLConvTranspose2dOptions/bias}} [=map/exists=]:
1. If its [=MLOperand/shape=] is not equal to « |outputChannels| », then [=exception/throw=] a {{TypeError}}.
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-convTranspose2d)), then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLConvTranspose2dOptions/outputSizes}} [=map/exists=], let |outputSizes| be |options|.{{MLConvTranspose2dOptions/outputSizes}}.
1. Otherwise, let |outputSizes| be the result of [=MLGraphBuilder/calculating convtranspose2d output sizes=] given |inputHeight|, |inputWidth|, |filterHeight|, |filterWidth|, |options|.{{MLConvTranspose2dOptions/padding}}, |options|.{{MLConvTranspose2dOptions/strides}}, |options|.{{MLConvTranspose2dOptions/dilations}}, and |options|.{{MLConvTranspose2dOptions/outputPadding}}.
1. Switch on |options|.{{MLConvTranspose2dOptions/inputLayout}}:
@@ -2463,7 +2618,28 @@ partial dictionary MLOpSupportLimits {
- *pow*: Compute the values of the first input tensor to the power of the values of the second input tensor, element-wise.
-{{MLOpSupportLimits}} has the following members for elementwise-binary operations:
+
+<table id="constraints-binary" class="data">
+  <caption>Constraints for element-wise binary operations</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{a}}</td>
+    <td>[=/any data type|any=]</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+  <tr>
+    <td>{{b}}</td>
+    <td>[=/same as=] {{a}}</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+</table>
+
+{{MLOpSupportLimits}} has the following members for element-wise binary operations:
: add
:: Support limits for operator {{MLGraphBuilder/add()}}.
@@ -2604,6 +2780,28 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The output tensor that contains the result of element-wise comparison of the two input tensors.
+
+<table id="constraints-logical" class="data">
+  <caption>Constraints for element-wise logical operations</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{a}}</td>
+    <td>specified as part of operation steps</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+  <tr>
+    <td>{{b}}</td>
+    <td>[=/same as=] {{a}}</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+</table>
+
{{MLLogicalNotSupportLimits}} has the following members:
: a
@@ -2759,6 +2957,23 @@ partial dictionary MLOpSupportLimits {
tensor is the same as the shape of input tensor.
+
+<table id="constraints-unary" class="data">
+  <caption>Constraints for element-wise unary operations</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{input}}</td>
+    <td>specified as part of operation steps</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+</table>
+
{{MLOpSupportLimits}} has the following members for element-wise unary operations:
: abs
@@ -2951,6 +3166,22 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+<table id="constraints-elu" class="data">
+  <caption>Constraints for {{MLGraphBuilder/elu()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{input}}</td>
+    <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+</table>
+
{{MLOpSupportLimits}} has the following members for {{MLGraphBuilder/elu()}}:
: elu
@@ -2963,7 +3194,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-elu)), then [=exception/throw=] a {{TypeError}}.
1. Set |options|.{{MLEluOptions/alpha}} to the result of [=casting=] |options|.{{MLEluOptions/alpha}} to |input|'s [=MLOperand/dataType=].
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
@@ -3015,6 +3246,22 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The tensor with expanded size shape.
+
+<table id="constraints-expand" class="data">
+  <caption>Constraints for {{MLGraphBuilder/expand()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{input}}</td>
+    <td>[=/any data type|any=]</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+</table>
+
{{MLOpSupportLimits}} has the following members for {{MLGraphBuilder/expand()}}:
: expand
@@ -3079,6 +3326,31 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The output N-D tensor of [=MLOperand/rank=] equal to the [=MLOperand/rank=] of *input* + the [=MLOperand/rank=] of *indices* - 1.
+
+<div class="note">
+The {{MLGraphBuilder/gather(input, indices, options)/indices}} parameter to {{MLGraphBuilder/gather()}} cannot be clamped to the allowed range when the graph is built because the inputs are not known until execution. Implementations can introduce {{MLGraphBuilder/clamp()}} in the compiled graph if the required clamping behavior is not provided by the underlying platform. Similarly, if the underlying platform does not support negative indices, the implementation can introduce operations in the compiled graph to transform a negative index from the end of the dimension into a positive index.
+</div>
+
+<table id="constraints-gather" class="data">
+  <caption>Constraints for {{MLGraphBuilder/gather()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{input}}</td>
+    <td>[=/any data type|any=]</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+  <tr>
+    <td>{{indices}}</td>
+    <td>{{MLOperandDataType/"int32"}}, {{MLOperandDataType/"uint32"}}, {{MLOperandDataType/"int64"}}</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+</table>
+
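Two facts stated above can be sketched non-normatively: the output rank of gather(), and the execution-time index normalization (negative wrap, then clamp) that an implementation may need to insert itself. Both helper names are invented:

```javascript
// The output rank of gather(): rank(input) + rank(indices) - 1.
function gatherOutputRank(inputRank, indicesRank) {
  return inputRank + indicesRank - 1;
}

// Execution-time index normalization: wrap a negative index from the
// end of the dimension, then clamp into [0, dimensionSize - 1].
function normalizeIndex(index, dimensionSize) {
  if (index < 0) index += dimensionSize;
  return Math.min(Math.max(index, 0), dimensionSize - 1);
}
```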
{{MLGatherSupportLimits}} has the following members:
: input
@@ -3095,17 +3367,13 @@ partial dictionary MLOpSupportLimits {
:: Support limits for operator {{MLGraphBuilder/gather()}}.
-
- The {{MLGraphBuilder/gather(input, indices, options)/indices}} parameter to {{MLGraphBuilder/gather()}} can not be clamped to the allowed range when the graph is built because the inputs are not known until execution. Implementations can introduce {{MLGraphBuilder/clamp()}} in the compiled graph if the required clamping behavior is not provided by the underlying platform. Similarly, if the underlying platform does not support negative indices, the implementation can introduce operations in the compiled graph to transform a negative index from the end of the dimension into a positive index.
-
-
The gather(|input|, |indices|, |options|) method steps are:
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input| and |indices| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |indices|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"int32"}}, {{MLOperandDataType/"uint32"}} or {{MLOperandDataType/"int64"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |indices|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-gather)), then [=exception/throw=] a {{TypeError}}.
1. Let |shapeInput| be |input|'s [=MLOperand/shape=] and |rankInput| be |shapeInput|'s [=MLOperand/rank=].
1. Let |shapeIndices| be |indices|'s [=MLOperand/shape=].
1. Let |axis| be |options|.{{MLGatherOptions/axis}}.
@@ -3216,6 +3484,22 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+<table id="constraints-gelu" class="data">
+  <caption>Constraints for {{MLGraphBuilder/gelu()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{input}}</td>
+    <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+    <td>[=/any rank|N=]</td>
+  </tr>
+</table>
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/gelu()}}:
: gelu
@@ -3228,7 +3512,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-gelu)), then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
1. Let |operator| be an [=operator=] for the "gelu" operation given |options|.
@@ -3316,6 +3600,32 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The output 2-D tensor of shape *[M, N]* that contains the calculated product of all the inputs.
+
+<table id="constraints-gemm" class="data">
+  <caption>Constraints for {{MLGraphBuilder/gemm()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{a}}</td>
+    <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+    <td>2</td>
+  </tr>
+  <tr>
+    <td>{{b}}</td>
+    <td>[=/same as=] {{a}}</td>
+    <td>2</td>
+  </tr>
+  <tr>
+    <td>{{MLGemmOptions/c}}</td>
+    <td>[=/same as=] {{a}}</td>
+    <td>0 to 2</td>
+  </tr>
+</table>
+
{{MLGemmSupportLimits}} has the following members:
: a
@@ -3340,9 +3650,8 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |a| and |b| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |a|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
- 1. If |b|'s [=MLOperand/dataType=] is not equal to |a|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
- 1. If |a|'s [=MLOperand/rank=] is not 2 or |b|'s [=MLOperand/rank=] is not 2, then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/dataType=] of any of |a| or |b| is not one of its [=/allowed data types=] (according to [this table](#constraints-gemm)), then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/rank=] of any of |a| or |b| is not its [=/allowed rank=], then [=exception/throw=] a {{TypeError}}.
1. Set |options|.{{MLGemmOptions/alpha}} to the result of [=casting=] |options|.{{MLGemmOptions/alpha}} to |a|'s [=MLOperand/dataType=].
1. Set |options|.{{MLGemmOptions/beta}} to the result of [=casting=] |options|.{{MLGemmOptions/beta}} to |a|'s [=MLOperand/dataType=].
1. Let |shapeA| be a [=list/clone=] of |a|'s [=MLOperand/shape=].
@@ -3352,7 +3661,7 @@ partial dictionary MLOpSupportLimits {
1. If |shapeA|[1] is not equal to |shapeB|[0], then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLGemmOptions/c}} [=map/exists=]:
1. If it is not [=unidirectionally broadcastable=] to the shape « |shapeA|[0], |shapeB|[1] », then [=exception/throw=] a {{TypeError}}.
- 1. If its [=MLOperand/dataType=] is not equal to |a|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-gemm)), then [=exception/throw=] a {{TypeError}}.
1. Let |desc| be the result of [=creating an MLOperandDescriptor=] given |a|'s [=MLOperand/dataType=] and « |shapeA|[0], |shapeB|[1] ».
1. *Make graph connections:*
1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |desc|.
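The shape validation in the steps above can be sketched non-normatively (the `MLGemmOptions/aTranspose` and `MLGemmOptions/bTranspose` options are omitted for brevity, and the function name is invented):

```javascript
// Non-normative sketch of the gemm() shape checks: a and b must be 2-D,
// their inner dimensions must match, and the output is « shapeA[0], shapeB[1] ».
function gemmOutputShape(shapeA, shapeB) {
  if (shapeA.length !== 2 || shapeB.length !== 2) {
    throw new TypeError("a and b must be 2-D");
  }
  if (shapeA[1] !== shapeB[0]) {
    throw new TypeError("inner dimensions do not match");
  }
  return [shapeA[0], shapeB[1]];
}
```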
@@ -3493,6 +3802,47 @@ partial dictionary MLOpSupportLimits {
**Returns:** [=sequence=]<{{MLOperand}}>. The first element is a 3-D tensor of shape *[numDirections, batchSize, hiddenSize]*, the cell output from the last time step of the network. Additionally, if |options|.{{MLGruOptions/returnSequence}} is set to true, the second element is the 4-D output tensor of shape *[steps, numDirections, batchSize, hiddenSize]* containing every cell outputs from each time step in the temporal sequence.
+
+<table id="constraints-gru" class="data">
+  <caption>Constraints for {{MLGraphBuilder/gru()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{input}}</td>
+    <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+    <td>3</td>
+  </tr>
+  <tr>
+    <td>{{weight}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>3</td>
+  </tr>
+  <tr>
+    <td>{{recurrentWeight}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>3</td>
+  </tr>
+  <tr>
+    <td>{{MLGruOptions/bias}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>2</td>
+  </tr>
+  <tr>
+    <td>{{MLGruOptions/recurrentBias}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>2</td>
+  </tr>
+  <tr>
+    <td>{{MLGruOptions/initialHiddenState}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>3</td>
+  </tr>
+</table>
+
{{MLGruSupportLimits}} has the following members:
: input
@@ -3523,9 +3873,8 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |weight|, |recurrentWeight|, |options|.{{MLGruOptions/bias}} (if it [=map/exists=]), |options|.{{MLGruOptions/recurrentBias}} (if it [=map/exists=]), and |options|.{{MLGruOptions/initialHiddenState}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/rank=] is not 3, then [=exception/throw=] a {{TypeError}}.
- 1. If the [=MLOperand/dataType=] of either |weight| or |recurrentWeight| is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/dataType=] of any of |input|, |weight| or |recurrentWeight| is not one of its [=/allowed data types=] (according to [this table](#constraints-gru)), then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/rank=] of any of |input|, |weight| or |recurrentWeight| is not its [=/allowed rank=], then [=exception/throw=] a {{TypeError}}.
1. If |input|'s [=MLOperand/shape=][0] is not equal to |steps|, then [=exception/throw=] a {{TypeError}}.
1. Let |batchSize| be |input|'s [=MLOperand/shape=][1].
1. Let |inputSize| be |input|'s [=MLOperand/shape=][2].
@@ -3538,13 +3887,13 @@ partial dictionary MLOpSupportLimits {
Some underlying platforms operate on a single bias tensor which is a concatenation of {{MLGruOptions/bias}} and {{MLGruOptions/recurrentBias}}. Therefore, 3 * |hiddenSize| + 3 * |hiddenSize| must also be a [=valid dimension=].
1. If |options|.{{MLGruOptions/bias}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-gru)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « |numDirections|, 3 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLGruOptions/recurrentBias}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-gru)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « |numDirections|, 3 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLGruOptions/initialHiddenState}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-gru)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « |numDirections|, |batchSize|, |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLGruOptions/activations}} [=map/exists=]:
1. If its [=list/size=] is not 2, then [=exception/throw=] a {{TypeError}}.
@@ -3750,6 +4099,42 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The 2-D tensor of shape *[batchSize, hiddenSize]*, the cell output hidden state of a single time step of the recurrent network.
+
+<table id="constraints-gruCell" class="data">
+  <caption>Constraints for {{MLGraphBuilder/gruCell()}}</caption>
+  <thead>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+  </thead>
+  <tr>
+    <td>{{input}}</td>
+    <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+    <td>2</td>
+  </tr>
+  <tr>
+    <td>{{weight}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>2</td>
+  </tr>
+  <tr>
+    <td>{{recurrentWeight}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>2</td>
+  </tr>
+  <tr>
+    <td>{{hiddenState}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>2</td>
+  </tr>
+  <tr>
+    <td>{{MLGruCellOptions/bias}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>1</td>
+  </tr>
+  <tr>
+    <td>{{MLGruCellOptions/recurrentBias}}</td>
+    <td>[=/same as=] {{input}}</td>
+    <td>1</td>
+  </tr>
+</table>
+
{{MLGruCellSupportLimits}} has the following members:
: input
@@ -3780,11 +4165,10 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |weight|, |recurrentWeight|, |hiddenState|, |options|.{{MLGruCellOptions/bias}} (if it [=map/exists=]), and |options|.{{MLGruCellOptions/recurrentBias}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/rank=] is not 2, then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/dataType=] of any of |input|, |weight|, |recurrentWeight|, or |hiddenState| is not one of its [=/allowed data types=] (according to [this table](#constraints-gruCell)), then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/rank=] of any of |input|, |weight|, |recurrentWeight|, or |hiddenState| is not its [=/allowed rank=] (according to [this table](#constraints-gruCell)), then [=exception/throw=] a {{TypeError}}.
1. Let |batchSize| be |input|'s [=MLOperand/shape=][0].
1. Let |inputSize| be |input|'s [=MLOperand/shape=][1].
- 1. If the [=MLOperand/dataType=] of any of |weight|, |recurrentWeight|, or |hiddenState| is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
1. If |weight|'s [=MLOperand/shape=] is not equal to « 3 * |hiddenSize|, |inputSize| », then [=exception/throw=] a {{TypeError}}.
1. If |recurrentWeight|'s [=MLOperand/shape=] is not equal to « 3 * |hiddenSize|, |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |hiddenState|'s [=MLOperand/shape=] is not equal to « |batchSize|, |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
@@ -3794,10 +4178,10 @@ partial dictionary MLOpSupportLimits {
Some underlying platforms operate on a single bias tensor which is a concatenation of {{MLGruCellOptions/bias}} and {{MLGruCellOptions/recurrentBias}}. Therefore, 3 * |hiddenSize| + 3 * |hiddenSize| must also be a [=valid dimension=].
1. If |options|.{{MLGruCellOptions/bias}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-gruCell)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « 3 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLGruCellOptions/recurrentBias}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-gruCell)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « 3 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLGruCellOptions/activations}} [=map/exists=]:
1. If its [=list/size=] is not 2, then [=exception/throw=] a {{TypeError}}.
@@ -3957,6 +4341,22 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+  <table id=constraints-hardSigmoid class=data>
+    <caption>Constraints for {{MLGraphBuilder/hardSigmoid()}}</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{input}}
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}
+      <td>[=/any rank|N=]
+  </table>
+
+
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/hardSigmoid()}}:
: hardSigmoid
@@ -3969,7 +4369,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-hardSigmoid)), then [=exception/throw=] a {{TypeError}}.
1. Set |options|.{{MLHardSigmoidOptions/alpha}} to the result of [=casting=] |options|.{{MLHardSigmoidOptions/alpha}} to |input|'s [=MLOperand/dataType=].
1. Set |options|.{{MLHardSigmoidOptions/beta}} to the result of [=casting=] |options|.{{MLHardSigmoidOptions/beta}} to |input|'s [=MLOperand/dataType=].
1. *Make graph connections:*
@@ -4021,6 +4421,22 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+  <table id=constraints-hardSwish class=data>
+    <caption>Constraints for {{MLGraphBuilder/hardSwish()}}</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{input}}
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}
+      <td>[=/any rank|N=]
+  </table>
+
+
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/hardSwish()}}:
: hardSwish
@@ -4033,7 +4449,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-hardSwish)), then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
1. Let |operator| be an [=operator=] for the "hardSwish" operation, given |options|.
@@ -4120,6 +4536,32 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The instance-normalized 4-D tensor of the same shape as *input*.
+
+  <table id=constraints-instanceNormalization class=data>
+    <caption>Constraints for {{MLGraphBuilder/instanceNormalization()}}</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{input}}
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}
+      <td>4
+    <tr>
+      <td>{{MLInstanceNormalizationOptions/scale}}
+      <td>[=/same as=] {{input}}
+      <td>1
+    <tr>
+      <td>{{MLInstanceNormalizationOptions/bias}}
+      <td>[=/same as=] {{input}}
+      <td>1
+  </table>
+
+
+
{{MLNormalizationSupportLimits}} has the following members:
: input
@@ -4144,15 +4586,15 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |options|.{{MLInstanceNormalizationOptions/scale}} (if it [=map/exists=]), and |options|.{{MLInstanceNormalizationOptions/bias}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/rank=] is not 4, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-instanceNormalization)), then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/rank=] is not its [=/allowed rank=], then [=exception/throw=] a {{TypeError}}.
1. Set |options|.{{MLInstanceNormalizationOptions/epsilon}} to the result of [=casting=] |options|.{{MLInstanceNormalizationOptions/epsilon}} to |input|'s [=MLOperand/dataType=].
1. Let |axis| be 1 if |options|.{{MLInstanceNormalizationOptions/layout}} is {{MLInputOperandLayout/"nchw"}}, and 3 otherwise.
1. If |options|.{{MLInstanceNormalizationOptions/scale}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-instanceNormalization)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « |input|'s [=MLOperand/shape=][|axis|] », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLInstanceNormalizationOptions/bias}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-instanceNormalization)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « |input|'s [=MLOperand/shape=][|axis|] », then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
@@ -4243,6 +4685,32 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The layer-normalized N-D tensor of the same shape as *input*.
+
+  <table id=constraints-layerNormalization class=data>
+    <caption>Constraints for {{MLGraphBuilder/layerNormalization()}}</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{input}}
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}
+      <td>[=/any rank|N=]
+    <tr>
+      <td>{{MLLayerNormalizationOptions/scale}}
+      <td>[=/same as=] {{input}}
+      <td>0 to {{input}}'s [=MLOperand/rank=]
+    <tr>
+      <td>{{MLLayerNormalizationOptions/bias}}
+      <td>[=/same as=] {{input}}
+      <td>0 to {{input}}'s [=MLOperand/rank=]
+  </table>
+
+
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/layerNormalization()}}:
: layerNormalization
@@ -4255,15 +4723,15 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |options|.{{MLLayerNormalizationOptions/scale}} (if it [=map/exists=]), and |options|.{{MLLayerNormalizationOptions/bias}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-layerNormalization)), then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLLayerNormalizationOptions/axes}} does not [=map/exist=], then set |options|.{{MLLayerNormalizationOptions/axes}} to a new [=/list=], either equal to [=the range=] from 1 to |input|'s [=MLOperand/rank=], exclusive, if |input|'s [=MLOperand/rank=] is greater than 1, or an empty [=/list=] otherwise.
1. Otherwise, if |options|.{{MLLayerNormalizationOptions/axes}} contains duplicate values, or if any of its elements is not in [=the range=] 0 to |input|'s [=MLOperand/rank=], exclusive, then return failure.
1. Set |options|.{{MLLayerNormalizationOptions/epsilon}} to the result of [=casting=] |options|.{{MLLayerNormalizationOptions/epsilon}} to |input|'s [=MLOperand/dataType=].
1. If |options|.{{MLLayerNormalizationOptions/scale}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-layerNormalization)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/rank=] is not equal to |options|.{{MLLayerNormalizationOptions/axes}}'s [=list/size=], then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLLayerNormalizationOptions/bias}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-layerNormalization)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/rank=] is not equal to |options|.{{MLLayerNormalizationOptions/axes}}'s [=list/size=], then [=exception/throw=] a {{TypeError}}.
1. [=list/For each=] |index| in [=the range=] 0 to |options|.{{MLLayerNormalizationOptions/axes}}'s [=list/size=], exclusive:
1. Let |axis| be |options|.{{MLLayerNormalizationOptions/axes}}[|index|].
@@ -4348,6 +4816,22 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+  <table id=constraints-leakyRelu class=data>
+    <caption>Constraints for {{MLGraphBuilder/leakyRelu()}}</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{input}}
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}
+      <td>[=/any rank|N=]
+  </table>
+
+
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/leakyRelu()}}:
: leakyRelu
@@ -4360,7 +4844,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-leakyRelu)), then [=exception/throw=] a {{TypeError}}.
1. Set |options|.{{MLLeakyReluOptions/alpha}} to the result of [=casting=] |options|.{{MLLeakyReluOptions/alpha}} to |input|'s [=MLOperand/dataType=].
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
@@ -4425,6 +4909,22 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+  <table id=constraints-linear class=data>
+    <caption>Constraints for {{MLGraphBuilder/linear()}}</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{input}}
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}
+      <td>[=/any rank|N=]
+  </table>
+
+
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/linear()}}:
: linear
@@ -4437,7 +4937,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-linear)), then [=exception/throw=] a {{TypeError}}.
1. Set |options|.{{MLLinearOptions/alpha}} to the result of [=casting=] |options|.{{MLLinearOptions/alpha}} to |input|'s [=MLOperand/dataType=].
1. Set |options|.{{MLLinearOptions/beta}} to the result of [=casting=] |options|.{{MLLinearOptions/beta}} to |input|'s [=MLOperand/dataType=].
1. *Make graph connections:*
@@ -4563,6 +5063,57 @@ partial dictionary MLOpSupportLimits {
**Returns:** [=sequence=]<{{MLOperand}}>. The first element is a 3-D tensor of shape *[numDirections, batchSize, hiddenSize]*, the output hidden state from the last time step of the network. The second element is a 3-D tensor of shape *[numDirections, batchSize, hiddenSize]*, the output cell state from the last time step of the network. Additionally, if |options|.{{MLLstmOptions/returnSequence}} is set to true, the third element is the 4-D output tensor of shape *[steps, numDirections, batchSize, hiddenSize]* containing every output from each time step in the temporal sequence.
+
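As a non-normative sketch, the weight and bias shapes that the {{MLGraphBuilder/lstm()}} validation steps below expect can be computed from |numDirections|, |inputSize|, and |hiddenSize| (the helper name is illustrative, not part of the API):

```javascript
// Non-normative sketch: expected lstm() weight shapes, per the validation
// steps. numDirections is 2 when direction is "both", otherwise 1.
// lstmWeightShapes is an illustrative helper, not part of the WebNN API.
function lstmWeightShapes(numDirections, inputSize, hiddenSize) {
  return {
    weight: [numDirections, 4 * hiddenSize, inputSize],
    recurrentWeight: [numDirections, 4 * hiddenSize, hiddenSize],
    bias: [numDirections, 4 * hiddenSize],
    recurrentBias: [numDirections, 4 * hiddenSize],
    peepholeWeight: [numDirections, 3 * hiddenSize],
  };
}
```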
+  <table id=constraints-lstm class=data>
+    <caption>Constraints for {{MLGraphBuilder/lstm()}}</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{input}}
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}
+      <td>3
+    <tr>
+      <td>{{weight}}
+      <td>[=/same as=] {{input}}
+      <td>3
+    <tr>
+      <td>{{recurrentWeight}}
+      <td>[=/same as=] {{input}}
+      <td>3
+    <tr>
+      <td>{{MLLstmOptions/bias}}
+      <td>[=/same as=] {{input}}
+      <td>2
+    <tr>
+      <td>{{MLLstmOptions/recurrentBias}}
+      <td>[=/same as=] {{input}}
+      <td>2
+    <tr>
+      <td>{{MLLstmOptions/peepholeWeight}}
+      <td>[=/same as=] {{input}}
+      <td>2
+    <tr>
+      <td>{{MLLstmOptions/initialHiddenState}}
+      <td>[=/same as=] {{input}}
+      <td>3
+    <tr>
+      <td>{{MLLstmOptions/initialCellState}}
+      <td>[=/same as=] {{input}}
+      <td>3
+  </table>
+
+
+
{{MLLstmSupportLimits}} has the following members:
: input
@@ -4598,13 +5149,11 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |weight|, |recurrentWeight|, |options|.{{MLLstmOptions/bias}} (if it [=map/exists=]), |options|.{{MLLstmOptions/recurrentBias}} (if it [=map/exists=]), |options|.{{MLLstmOptions/peepholeWeight}} (if it [=map/exists=]), |options|.{{MLLstmOptions/initialHiddenState}} (if it [=map/exists=]), and |options|.{{MLLstmOptions/initialCellState}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}.
1. Let |numDirections| be 2 if |options|.{{MLLstmOptions/direction}} is {{MLRecurrentNetworkDirection/"both"}}, or 1 otherwise.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/rank=] is not 3, then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/dataType=] of any of |input|, |weight|, or |recurrentWeight| is not one of its [=/allowed data types=] (according to [this table](#constraints-lstm)), then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/rank=] of any of |input|, |weight|, or |recurrentWeight| is not its [=/allowed rank=], then [=exception/throw=] a {{TypeError}}.
1. If |input|'s [=MLOperand/shape=][0] is not equal to |steps|, then [=exception/throw=] a {{TypeError}}.
- 1. If the [=MLOperand/dataType=] of either |weight| or |recurrentWeight| is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
1. Let |batchSize| be |input|'s [=MLOperand/shape=][1].
1. Let |inputSize| be |input|'s [=MLOperand/shape=][2].
- 1. If the [=MLOperand/dataType=] of either |weight| or |recurrentWeight| is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
1. If |weight|'s [=MLOperand/shape=] is not equal to « |numDirections|, 4 * |hiddenSize|, |inputSize| », then [=exception/throw=] a {{TypeError}}.
1. If |recurrentWeight|'s [=MLOperand/shape=] is not equal to « |numDirections|, 4 * |hiddenSize|, |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |hiddenSize| * 8 is not a [=valid dimension=], then [=exception/throw=] a {{TypeError}}.
@@ -4613,19 +5162,19 @@ partial dictionary MLOpSupportLimits {
Some underlying platforms operate on a single bias tensor which is a concatenation of {{MLLstmOptions/bias}} and {{MLLstmOptions/recurrentBias}}. Therefore, 4 * |hiddenSize| + 4 * |hiddenSize| must also be a [=valid dimension=].
1. If |options|.{{MLLstmOptions/bias}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-lstm)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « |numDirections|, 4 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLLstmOptions/recurrentBias}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-lstm)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « |numDirections|, 4 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLLstmOptions/peepholeWeight}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-lstm)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « |numDirections|, 3 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLLstmOptions/initialHiddenState}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-lstm)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « |numDirections|, |batchSize|, |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLLstmOptions/initialCellState}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-lstm)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « |numDirections|, |batchSize|, |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLLstmOptions/activations}} [=map/exists=]:
1. If its [=list/size=] is not 3, then [=exception/throw=] a {{TypeError}}.
@@ -4865,6 +5414,57 @@ partial dictionary MLOpSupportLimits {
**Returns:** [=sequence=]<{{MLOperand}}>. The first element is the output hidden state of the current time step of the recurrent network. The following element is the output cell state. Both elements are 2-D tensors of shape *[batchSize, hiddenSize]*.
+
+  <table id=constraints-lstmCell class=data>
+    <caption>Constraints for {{MLGraphBuilder/lstmCell()}}</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{input}}
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}
+      <td>2
+    <tr>
+      <td>{{weight}}
+      <td>[=/same as=] {{input}}
+      <td>2
+    <tr>
+      <td>{{recurrentWeight}}
+      <td>[=/same as=] {{input}}
+      <td>2
+    <tr>
+      <td>{{hiddenState}}
+      <td>[=/same as=] {{input}}
+      <td>2
+    <tr>
+      <td>{{cellState}}
+      <td>[=/same as=] {{input}}
+      <td>2
+    <tr>
+      <td>{{MLLstmCellOptions/bias}}
+      <td>[=/same as=] {{input}}
+      <td>1
+    <tr>
+      <td>{{MLLstmCellOptions/recurrentBias}}
+      <td>[=/same as=] {{input}}
+      <td>1
+    <tr>
+      <td>{{MLLstmCellOptions/peepholeWeight}}
+      <td>[=/same as=] {{input}}
+      <td>1
+  </table>
+
+
+
{{MLLstmCellSupportLimits}} has the following members:
: input
@@ -4899,9 +5499,8 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |weight|, |recurrentWeight|, |hiddenState|, |cellState|, |options|.{{MLLstmCellOptions/bias}} (if it [=map/exists=]), |options|.{{MLLstmCellOptions/recurrentBias}} (if it [=map/exists=]), and |options|.{{MLLstmCellOptions/peepholeWeight}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/rank=] is not equal to 2, then [=exception/throw=] a {{TypeError}}.
- 1. If the [=MLOperand/dataType=] of any of |weight|, |recurrentWeight|, |hiddenState| or |cellState| is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/dataType=] of any of |input|, |weight|, |recurrentWeight|, |hiddenState|, or |cellState| is not one of its [=/allowed data types=] (according to [this table](#constraints-lstmCell)), then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/rank=] of any of |input|, |weight|, |recurrentWeight|, |hiddenState|, or |cellState| is not its [=/allowed rank=], then [=exception/throw=] a {{TypeError}}.
1. Let |batchSize| be |input|'s [=MLOperand/shape=][0].
1. Let |inputSize| be |input|'s [=MLOperand/shape=][1].
1. If |weight|'s [=MLOperand/shape=] is not equal to « 4 * |hiddenSize|, |inputSize| », then [=exception/throw=] a {{TypeError}}.
@@ -4914,13 +5513,13 @@ partial dictionary MLOpSupportLimits {
Some underlying platforms operate on a single bias tensor which is a concatenation of {{MLLstmCellOptions/bias}} and {{MLLstmCellOptions/recurrentBias}}. Therefore, 4 * |hiddenSize| + 4 * |hiddenSize| must also be a [=valid dimension=].
1. If |options|.{{MLLstmCellOptions/bias}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-lstmCell)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « 4 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLLstmCellOptions/recurrentBias}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-lstmCell)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « 4 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLLstmCellOptions/peepholeWeight}} [=map/exists=]:
- 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-lstmCell)), then [=exception/throw=] a {{TypeError}}.
1. If its [=MLOperand/shape=] is not equal to « 3 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=]:
1. If its [=list/size=] is not 3, then [=exception/throw=] a {{TypeError}}.
@@ -5099,6 +5698,27 @@ partial dictionary MLOpSupportLimits {
- If either *a* or *b* is `N`-dimensional where `N > 2`, it is treated as a stack of matrices with dimensions corresponding to the last two indices. The matrix multiplication will be [=broadcast=] according to [[!numpy-broadcasting-rule]]. The shapes of *a* and *b*, except the last two dimensions, must be [=bidirectionally broadcastable=]. The output is a `N`-dimensional tensor whose rank is the maximum [=MLOperand/rank=] of the input tensors. For each dimension, except the last two, of the output tensor, its size is the maximum size along that dimension of the input tensors.
+
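The broadcasting behavior described above can be sketched non-normatively as a shape calculation (the helper name is illustrative, not part of the API):

```javascript
// Non-normative sketch: compute the output shape of matmul(a, b) for
// N-dimensional inputs, following the numpy broadcasting rule described
// above. Throws on incompatible shapes. matmulOutputShape is an
// illustrative helper, not part of the WebNN API.
function matmulOutputShape(aShape, bShape) {
  if (aShape.length < 2 || bShape.length < 2)
    throw new TypeError('matmul requires rank 2 or greater');
  const [aRows, aCols] = aShape.slice(-2);
  const [bRows, bCols] = bShape.slice(-2);
  if (aCols !== bRows)
    throw new TypeError('inner dimensions must agree');
  // Bidirectionally broadcast the batch dimensions (all but the last two),
  // aligning from the right and treating missing dimensions as size 1.
  const aBatch = aShape.slice(0, -2);
  const bBatch = bShape.slice(0, -2);
  const rank = Math.max(aBatch.length, bBatch.length);
  const batch = [];
  for (let i = 0; i < rank; i++) {
    const da = aBatch[aBatch.length - rank + i] ?? 1;
    const db = bBatch[bBatch.length - rank + i] ?? 1;
    if (da !== db && da !== 1 && db !== 1)
      throw new TypeError('batch dimensions are not broadcastable');
    batch.push(Math.max(da, db));
  }
  return [...batch, aRows, bCols];
}
```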
+  <table id=constraints-matmul class=data>
+    <caption>Constraints for {{MLGraphBuilder/matmul()}}</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{a}}
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}
+      <td>2 or greater
+    <tr>
+      <td>{{b}}
+      <td>[=/same as=] {{a}}
+      <td>2 or greater
+  </table>
+
+
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/matmul()}}:
: matmul
@@ -5132,8 +5752,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |a| and |b| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |a|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
- 1. If |b|'s [=MLOperand/dataType=] is not equal to |a|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/dataType=] of any of |a| or |b| is not one of its [=/allowed data types=] (according to [this table](#constraints-matmul)), then [=exception/throw=] a {{TypeError}}.
1. Let |outputShape| be the result of [=MLGraphBuilder/calculating matmul output sizes=] given |a| and |b|.
1. If that throws an error, re-[=exception/throw=] the error.
1. Let |desc| be the result of [=creating an MLOperandDescriptor=] given |a|'s [=MLOperand/dataType=] and |outputShape|.
@@ -5196,6 +5815,22 @@ partial dictionary MLOpSupportLimits {
`output size = beginning padding + input size + ending padding`
+
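The per-dimension formula above can be sketched non-normatively in code (the helper name is illustrative, not part of the API):

```javascript
// Non-normative sketch: output shape of pad(), applying
// output size = beginning padding + input size + ending padding
// to each dimension. padOutputShape is an illustrative helper,
// not part of the WebNN API.
function padOutputShape(inputShape, beginningPadding, endingPadding) {
  return inputShape.map(
      (size, i) => beginningPadding[i] + size + endingPadding[i]);
}
```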
+  <table id=constraints-pad class=data>
+    <caption>Constraints for {{MLGraphBuilder/pad()}}</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{input}}
+      <td>[=/any data type|any=]
+      <td>[=/any rank|N=]
+  </table>
+
+
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/pad()}}:
: pad
@@ -5375,6 +6010,22 @@ partial dictionary MLOpSupportLimits {
`output size = ceil(1 + (input size - filter size + beginning padding + ending padding) / stride)`
+
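The formula above can be sketched non-normatively in code (the helper name is illustrative, not part of the API):

```javascript
// Non-normative sketch: output size of one spatial dimension for the
// pooling operations, per the formula
// output size = ceil(1 + (input size - filter size + beginning padding
//                          + ending padding) / stride).
// poolOutputSize is an illustrative helper, not part of the WebNN API.
function poolOutputSize(inputSize, filterSize, beginPad, endPad, stride) {
  return Math.ceil(1 + (inputSize - filterSize + beginPad + endPad) / stride);
}
```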
+  <table id=constraints-pool2d class=data>
+    <caption>Constraints for pooling operations</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{input}}
+      <td>specified as part of operation steps
+      <td>4
+  </table>
+
+
+
{{MLOpSupportLimits}} has the following members for pooling operations:
: averagePool2d
@@ -5531,6 +6182,26 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+  <table id=constraints-prelu class=data>
+    <caption>Constraints for {{MLGraphBuilder/prelu()}}</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{input}}
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}, {{MLOperandDataType/"int32"}}, {{MLOperandDataType/"int8"}}
+      <td>[=/any rank|N=]
+    <tr>
+      <td>{{slope}}
+      <td>[=/same as=] {{input}}
+      <td>[=/any rank|N=]
+  </table>
+
+
+
{{MLPreluSupportLimits}} has the following members:
: input
@@ -5553,8 +6224,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input| and |slope| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}, {{MLOperandDataType/"int32"}}, or {{MLOperandDataType/"int8"}}, then [=exception/throw=] a {{TypeError}}.
- 1. If |slope|'s [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/dataType=] of any of |input| or |slope| is not one of its [=/allowed data types=] (according to [this table](#constraints-prelu)), then [=exception/throw=] a {{TypeError}}.
1. Let |outputShape| be the result of [=bidirectionally broadcasting=] |slope|'s [=MLOperand/shape=] and |input|'s [=MLOperand/shape=].
1. If that returns failure, then [=exception/throw=] a {{TypeError}}.
1. Let |descriptor| be the result of [=creating an MLOperandDescriptor=] given |input|'s [=MLOperand/dataType=] and |outputShape|.
@@ -5644,6 +6314,22 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The reduced output tensor. If the input operand is a scalar, the reduction function is applied to the scalar value, and the output is also a scalar.
+
+  <table id=constraints-reduction class=data>
+    <caption>Constraints for reduction operations</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{input}}
+      <td>specified as part of operation steps
+      <td>[=/any rank|N=]
+  </table>
+
+
+
{{MLOpSupportLimits}} has the following members for reduction operations:
: reduceL1
@@ -5841,6 +6527,22 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+  <table id=constraints-relu class=data>
+    <caption>Constraints for {{MLGraphBuilder/relu()}}</caption>
+    <tr>
+      <th>input operand
+      <th>[=/allowed data types=]
+      <th>[=/allowed ranks=]
+    <tr>
+      <td>{{input}}
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}, {{MLOperandDataType/"int32"}}, {{MLOperandDataType/"int8"}}
+      <td>[=/any rank|N=]
+  </table>
+
+
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/relu()}}:
: relu
@@ -5853,7 +6555,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}, {{MLOperandDataType/"int32"}}, or {{MLOperandDataType/"int8"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-relu)), then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
1. Let |operator| be an [=operator=] for the "relu" operation, given |options|.
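The data type check in these steps amounts to an allow-list lookup against the operation's constraints table. A non-normative sketch (names are illustrative, not part of the API):

```javascript
// Allowed input data types for relu(), per its constraints table.
const reluAllowedTypes = new Set(["float32", "float16", "int32", "int8"]);

// Mirrors the validation step: throw a TypeError when the operand's
// dataType is not one of the operation's allowed data types.
function validateDataType(dataType, allowed) {
  if (!allowed.has(dataType)) {
    throw new TypeError(`unsupported dataType: ${dataType}`);
  }
}
```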
@@ -5931,6 +6633,22 @@ partial dictionary MLOpSupportLimits {
The default value is [2, 3].
+
+  <table class="data" id="constraints-resample2d">
+    <caption>Constraints for {{MLGraphBuilder/resample2d()}}</caption>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+    <tr>
+      <td>{{input}}</td>
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+      <td>4</td>
+    </tr>
+  </table>
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/resample2d()}}:
: resample2d
@@ -5969,8 +6687,8 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/rank=] is not 4, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-resample2d)), then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/rank=] is not one of its [=/allowed ranks=] (according to [this table](#constraints-resample2d)), then [=exception/throw=] a {{TypeError}}.
1. If [=MLGraphBuilder/checking resample options=] given |options| and |input| returns false, then [=exception/throw=] a {{TypeError}}.
1. Let |desc| be the result of [=MLGraphBuilder/calculating resample output sizes=] given |input| and |options|. If that returns failure, then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
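For illustration only, assuming each output size is computed as floor(inputSize × scale) along the two spatial axes when explicit sizes are absent, the size calculation can be sketched as follows (the function name and defaults are assumptions, not the normative steps):

```javascript
// Sketch of calculating resample output sizes: when explicit sizes
// are given they are used directly; otherwise each output size is
// floor(inputSize * scale) along the axes named in options.axes.
function resampleOutputSizes(inputShape, { scales = [1, 1], sizes, axes = [2, 3] } = {}) {
  return axes.map((axis, i) =>
    sizes ? sizes[i] : Math.floor(inputShape[axis] * scales[i])
  );
}
```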
@@ -6008,6 +6726,22 @@ partial dictionary MLOpSupportLimits {
tensor is specified by the *newShape* argument.
+
+  <table class="data" id="constraints-reshape">
+    <caption>Constraints for {{MLGraphBuilder/reshape()}}</caption>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+    <tr>
+      <td>{{input}}</td>
+      <td>[=/any data type|any=]</td>
+      <td>[=/any rank|N=]</td>
+    </tr>
+  </table>
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/reshape()}}:
: reshape
@@ -6058,6 +6792,22 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+  <table class="data" id="constraints-sigmoid">
+    <caption>Constraints for {{MLGraphBuilder/sigmoid()}}</caption>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+    <tr>
+      <td>{{input}}</td>
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+      <td>[=/any rank|N=]</td>
+    </tr>
+  </table>
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/sigmoid()}}:
: sigmoid
@@ -6070,7 +6820,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-sigmoid)), then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
1. Let |operator| be an [=operator=] for the "sigmoid" operation, given |options|.
@@ -6120,6 +6870,22 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The output tensor of the same rank as the input tensor with tensor values stripped to the specified starting and ending indices in each dimension.
+
+  <table class="data" id="constraints-slice">
+    <caption>Constraints for {{MLGraphBuilder/slice()}}</caption>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+    <tr>
+      <td>{{input}}</td>
+      <td>[=/any data type|any=]</td>
+      <td>[=/any rank|N=]</td>
+    </tr>
+  </table>
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/slice()}}:
: slice
@@ -6175,6 +6941,22 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output N-D tensor that contains the softmax results, of the same shape as *input*.
+
+  <table class="data" id="constraints-softmax">
+    <caption>Constraints for {{MLGraphBuilder/softmax()}}</caption>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+    <tr>
+      <td>{{input}}</td>
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+      <td>[=/any rank|N=]</td>
+    </tr>
+  </table>
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/softmax()}}:
: softmax
@@ -6187,7 +6969,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-softmax)), then [=exception/throw=] a {{TypeError}}.
1. If |axis| is greater than or equal to |input|'s [=MLOperand/rank=], then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
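For reference, the computation softmax represents over a single 1-D slice along the chosen axis can be sketched as follows (subtracting the maximum is the conventional overflow guard; this is an illustrative sketch, not the normative definition):

```javascript
// Softmax over a 1-D array: exp(x[i] - max) / sum(exp(x[j] - max)).
// Shifting by the maximum leaves the result unchanged mathematically
// while avoiding overflow in exp() for large inputs.
function softmax1d(values) {
  const max = Math.max(...values);
  const exps = values.map((v) => Math.exp(v - max));
  const sum = exps.reduce((acc, e) => acc + e, 0);
  return exps.map((e) => e / sum);
}
```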
@@ -6242,6 +7024,22 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+  <table class="data" id="constraints-softplus">
+    <caption>Constraints for {{MLGraphBuilder/softplus()}}</caption>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+    <tr>
+      <td>{{input}}</td>
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+      <td>[=/any rank|N=]</td>
+    </tr>
+  </table>
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/softplus()}}:
: softplus
@@ -6254,7 +7052,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-softplus)), then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
 1. Let |operator| be an [=operator=] for the "softplus" operation, given |options|.
@@ -6314,6 +7112,22 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+  <table class="data" id="constraints-softsign">
+    <caption>Constraints for {{MLGraphBuilder/softsign()}}</caption>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+    <tr>
+      <td>{{input}}</td>
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+      <td>[=/any rank|N=]</td>
+    </tr>
+  </table>
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/softsign()}}:
: softsign
@@ -6326,7 +7140,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-softsign)), then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
 1. Let |operator| be an [=operator=] for the "softsign" operation, given |options|.
@@ -6376,6 +7190,21 @@ partial dictionary MLOpSupportLimits {
The dimension along which to split. Its value must be in the range [0, N-1] where N is the [=MLOperand/rank=] of the input tensor.
+
+  <table class="data" id="constraints-split">
+    <caption>Constraints for {{MLGraphBuilder/split()}}</caption>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+    <tr>
+      <td>{{input}}</td>
+      <td>[=/any data type|any=]</td>
+      <td>[=/any rank|N=]</td>
+    </tr>
+  </table>
+
{{MLSplitSupportLimits}} has the following members:
@@ -6471,6 +7300,22 @@ partial dictionary MLOpSupportLimits {
- an {{MLOperand}}. The output tensor of the same shape as *input*.
+
+  <table class="data" id="constraints-tanh">
+    <caption>Constraints for {{MLGraphBuilder/tanh()}}</caption>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+    <tr>
+      <td>{{input}}</td>
+      <td>{{MLOperandDataType/"float32"}}, {{MLOperandDataType/"float16"}}</td>
+      <td>[=/any rank|N=]</td>
+    </tr>
+  </table>
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/tanh()}}:
: tanh
@@ -6483,7 +7328,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/dataType=] is not one of its [=/allowed data types=] (according to [this table](#constraints-tanh)), then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
1. Let |operator| be an [=operator=] for the "tanh" operation, given |options|.
@@ -6545,6 +7390,22 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The permuted or transposed N-D tensor.
+
+  <table class="data" id="constraints-transpose">
+    <caption>Constraints for {{MLGraphBuilder/transpose()}}</caption>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+    <tr>
+      <td>{{input}}</td>
+      <td>[=/any data type|any=]</td>
+      <td>[=/any rank|N=]</td>
+    </tr>
+  </table>
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/transpose()}}:
: transpose
@@ -6607,6 +7468,22 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The output tensor representing a triangular matrix, or batch of matrices which is the same shape as the input.
+
+  <table class="data" id="constraints-triangular">
+    <caption>Constraints for {{MLGraphBuilder/triangular()}}</caption>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+    <tr>
+      <td>{{input}}</td>
+      <td>[=/any data type|any=]</td>
+      <td>2 or greater</td>
+    </tr>
+  </table>
+
{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/triangular()}}:
: triangular
@@ -6619,7 +7496,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |input|'s [=MLOperand/rank=] is less than 2, then [=exception/throw=] a {{TypeError}}.
+ 1. If |input|'s [=MLOperand/rank=] is not one of its [=/allowed ranks=] (according to [this table](#constraints-triangular)), then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:*
1. Let |output| be the result of [=copying an MLOperand=] given |input|.
1. Let |operator| be an [=operator=] for the "triangular" operation, given |options|.
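The operation retains either the upper or the lower triangle of the innermost 2-D matrices, zeroing the rest. A sketch for a single matrix, with option names modeled on the operation's `upper` and `diagonal` options (illustrative, not normative):

```javascript
// Retain the upper (or lower) triangle of a 2-D matrix, zeroing the
// rest. `diagonal` shifts the boundary: 0 keeps the main diagonal,
// positive values move the boundary above it, negative values below.
function triangular(matrix, { upper = true, diagonal = 0 } = {}) {
  return matrix.map((row, i) =>
    row.map((v, j) => {
      const keep = upper ? j - i >= diagonal : j - i <= diagonal;
      return keep ? v : 0;
    })
  );
}
```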
@@ -6725,6 +7602,32 @@ partial dictionary MLOpSupportLimits {
**Returns:** an {{MLOperand}}. The output tensor that contains the values selected element-wise from either the trueValue or the falseValue tensor.
+
+  <table class="data" id="constraints-where">
+    <caption>Constraints for {{MLGraphBuilder/where()}}</caption>
+    <tr>
+      <th>input operand</th>
+      <th>[=/allowed data types=]</th>
+      <th>[=/allowed ranks=]</th>
+    </tr>
+    <tr>
+      <td>{{condition}}</td>
+      <td>{{MLOperandDataType/"uint8"}}</td>
+      <td>[=/any rank|N=]</td>
+    </tr>
+    <tr>
+      <td>{{trueValue}}</td>
+      <td>[=/any data type|any=]</td>
+      <td>[=/any rank|N=]</td>
+    </tr>
+    <tr>
+      <td>{{falseValue}}</td>
+      <td>[=/same as=] {{trueValue}}</td>
+      <td>[=/any rank|N=]</td>
+    </tr>
+  </table>
+
{{MLWhereSupportLimits}} has the following members:
: condition
@@ -6750,8 +7653,7 @@ partial dictionary MLOpSupportLimits {
1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |condition|, |trueValue|, and |falseValue| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |condition|'s [=MLOperand/dataType=] is not equal to {{MLOperandDataType/"uint8"}}, then [=exception/throw=] a {{TypeError}}.
- 1. If |trueValue|'s [=MLOperand/dataType=] is not equal to |falseValue|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/dataType=] of any of |condition|, |trueValue|, or |falseValue| is not one of its [=/allowed data types=] (according to [this table](#constraints-where)), then [=exception/throw=] a {{TypeError}}.
1. Let |outputShape| be the result of [=bidirectionally broadcasting=] |trueValue|'s [=MLOperand/shape=] and |falseValue|'s [=MLOperand/shape=].
1. If that returns failure, then [=exception/throw=] a {{TypeError}}.
 1. Set |outputShape| to the result of [=bidirectionally broadcasting=] |condition|'s [=MLOperand/shape=] and |outputShape|.
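The two broadcasting steps above can be sketched together: the trueValue and falseValue shapes are broadcast first, and the result is then broadcast with the condition shape (helper names are illustrative):

```javascript
// Bidirectionally broadcast two shapes, returning the output shape
// or null on failure (right-aligned; each dimension pair must be
// equal or contain a 1).
function broadcastShapes(a, b) {
  const rank = Math.max(a.length, b.length);
  const out = new Array(rank);
  for (let i = 0; i < rank; i++) {
    const da = a[a.length - 1 - i] ?? 1;
    const db = b[b.length - 1 - i] ?? 1;
    if (da !== db && da !== 1 && db !== 1) return null;
    out[rank - 1 - i] = Math.max(da, db);
  }
  return out;
}

// where()'s output shape: broadcast trueValue with falseValue,
// then broadcast that result with condition.
function whereOutputShape(condition, trueValue, falseValue) {
  const shape = broadcastShapes(trueValue, falseValue);
  return shape === null ? null : broadcastShapes(condition, shape);
}
```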
@@ -7106,35 +8008,37 @@ Operations present in other neural network inference APIs can often be emulated
## {{MLOperandDataType}} and {{ArrayBufferView}} compatibility ## {#appendices-mloperanddatatype-arraybufferview-compatibility}
-  <table class="data">
-    <tr>
-      <td>{{MLOperandDataType/float32}}</td>
-      <td>{{Float32Array}}</td>
-    </tr>
-    <tr>
-      <td>{{MLOperandDataType/float16}}</td>
-      <td>{{Float16Array}}</td>
-    </tr>
-    <tr>
-      <td>{{MLOperandDataType/int64}}</td>
-      <td>{{BigInt64Array}}</td>
-    </tr>
-    <tr>
-      <td>{{MLOperandDataType/uint64}}</td>
-      <td>{{BigUint64Array}}</td>
-    </tr>
-    <tr>
-      <td>{{MLOperandDataType/int32}}</td>
-      <td>{{Int32Array}}</td>
-    </tr>
-    <tr>
-      <td>{{MLOperandDataType/uint32}}</td>
-      <td>{{Uint32Array}}</td>
-    </tr>
-  </table>
+  <table class="data">
+    <tr>
+      <th>{{MLOperandDataType}}</th>
+      <th>{{ArrayBufferView}}</th>
+    </tr>
+    <tr>
+      <td>{{MLOperandDataType/float32}}</td>
+      <td>{{Float32Array}}</td>
+    </tr>
+    <tr>
+      <td>{{MLOperandDataType/float16}}</td>
+      <td>{{Float16Array}}</td>
+    </tr>
+    <tr>
+      <td>{{MLOperandDataType/int64}}</td>
+      <td>{{BigInt64Array}}</td>
+    </tr>
+    <tr>
+      <td>{{MLOperandDataType/uint64}}</td>
+      <td>{{BigUint64Array}}</td>
+    </tr>
+    <tr>
+      <td>{{MLOperandDataType/int32}}</td>
+      <td>{{Int32Array}}</td>
+    </tr>
+    <tr>
+      <td>{{MLOperandDataType/uint32}}</td>
+      <td>{{Uint32Array}}</td>
+    </tr>
+    <tr>
+      <td>{{MLOperandDataType/int8}}</td>
+      <td>{{Int8Array}}</td>
+    </tr>
+    <tr>
+      <td>{{MLOperandDataType/uint8}}</td>
+      <td>{{Uint8Array}}</td>
+    </tr>
+  </table>
{{Float16Array}} is at ECMA Stage 3, signaling that its design is finished. Implementers wanting to enable this type ahead of native implementations can emulate the type by passing raw bits via {{Uint16Array}}. [Issue webnn#373]
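The table pairs each {{MLOperandDataType}} with a typed-array constructor; the "validate buffer with descriptor" algorithm then checks both the view's element type and its total byte length against the descriptor. A non-normative sketch (field names are illustrative; float16 is omitted because {{Float16Array}} is not yet universally available):

```javascript
// MLOperandDataType → typed array constructor, per the table above.
// float16 is omitted here because Float16Array is still Stage 3.
const typedArrayFor = {
  float32: Float32Array,
  int64: BigInt64Array,
  uint64: BigUint64Array,
  int32: Int32Array,
  uint32: Uint32Array,
  int8: Int8Array,
  uint8: Uint8Array,
};

// Mirrors "validate buffer with descriptor": the view's element type
// must match the descriptor's data type, and its byte length must
// equal element count × element size.
function validateBuffer(view, { dataType, shape }) {
  const ctor = typedArrayFor[dataType];
  if (!ctor || !(view instanceof ctor)) return false;
  const elementCount = shape.reduce((acc, d) => acc * d, 1);
  return view.byteLength === elementCount * ctor.BYTES_PER_ELEMENT;
}
```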