From 3bcbe74842d4a4be239a151b10264e4fcbca5e11 Mon Sep 17 00:00:00 2001
From: Austin Sullivan
Date: Mon, 8 Jul 2024 11:37:20 -0700
Subject: [PATCH 1/4] replace MLActivation with MLRecurrentNetworkActivation

---
 index.bs | 338 +++++++------------------------------------------------
 1 file changed, 39 insertions(+), 299 deletions(-)

diff --git a/index.bs b/index.bs
index dd2c6b6e..5532eb56 100644
--- a/index.bs
+++ b/index.bs
@@ -641,9 +641,9 @@ The {{MLGraph}} interface represents a compiled computational graph that is immutable

 The {{MLGraphBuilder}} interface serves as a builder (factory) to construct a [=computational graph=] (its graph) that is then compiled to create an {{MLGraph}}.

-In WebNN, a [=computational graph=] is composed of operators which act on data, and are the nodes of the graph. {{MLOperand}}s are a representation of data that flows within the computational graph, and are the edges of the graph. {{MLOperand}}s include a [=computational graph=]'s input values for inference, constants (including trained weights) used for inference, intermediate values (often referred to as activations) computed during inference, as well as the output values of inference. An [=operator=]'s input is one or more {{MLOperand}}s. An [=operator=]'s output is one or more {{MLOperand}}s. [=Operators=] have operator-specific parameters that control their behavior, which can include zero or more activation functions, which are {{MLActivation}}s.
+In WebNN, a [=computational graph=] is composed of operators which act on data, and are the nodes of the graph. {{MLOperand}}s are a representation of data that flows within the computational graph, and are the edges of the graph. {{MLOperand}}s include a [=computational graph=]'s input values for inference, constants (including trained weights) used for inference, intermediate values (often referred to as activations) computed during inference, as well as the output values of inference. An [=operator=]'s input is one or more {{MLOperand}}s. An [=operator=]'s output is one or more {{MLOperand}}s. [=Operators=] have operator-specific parameters that control their behavior, which can include zero or more activation functions.

-A key part of the {{MLGraphBuilder}} interface are methods such as {{MLGraphBuilder/gemm()}} and {{MLGraphBuilder/relu()}} which create an [=operator=] which represents the actual operation to perform on the input data when the computation is run, and return a new {{MLOperand}} or {{MLActivation}} holding the operator. Methods that create an {{MLOperand}} connect any [=operator/inputs=] and [=operator/activations=] to the operator. Each method invocation returns a distinct new value, without changing the value of any other {{MLOperand}}.
+A key part of the {{MLGraphBuilder}} interface is its set of methods, such as {{MLGraphBuilder/gemm()}} and {{MLGraphBuilder/relu()}}, each of which creates an [=operator=] representing the actual operation to perform on the input data when the computation is run, and returns a new {{MLOperand}} holding the operator. Methods that create an {{MLOperand}} connect any [=operator/inputs=] and [=operator/activations=] to the operator. Each method invocation returns a distinct new value, without changing the value of any other {{MLOperand}}.

 At inference time, every {{MLOperand}} will be bound to a tensor (the actual data), which are essentially multidimensional arrays.
The representation of the tensors is implementation dependent, but it typically includes the array data stored in some buffer (memory) and some metadata describing the array data (such as its shape). @@ -1189,58 +1189,6 @@ Return a shape of the {{MLOperand}}. 1. Return [=this=]'s [=MLOperand/shape=]. -## {{MLActivation}} interface ## {#api-mlactivation} - -Objects implementing the {{MLActivation}} interface represent activation function types. - - - -
-{{MLActivation}} has the following internal slots: -
- : \[[name]] of type [=string=] - :: - The {{MLActivation}}'s name. - : \[[builder]] of type {{MLGraphBuilder}} - :: - A dictionary containing {{MLActivation}} options. - : \[[operator]] of type [=operator=] - :: - Reference to {{MLActivation}}'s corresponding [=operator=]. -
-
- -
-These activations functions are used to configure the behavior of recurrent operators such as {{MLGraphBuilder/gru()}} or {{MLGraphBuilder/lstm()}}. -
- -Each {{MLActivation}} has associated validation steps, which is an algorithm accepting an {{MLOperandDescriptor}} and returning a boolean. The default activation validation steps are to return true. - -### Creating {{MLActivation}} ### {#api-mlactivation-create} -
-The {{MLActivation}} objects (including the ones passed as input to methods) are created by the methods of {{MLGraphBuilder}} and are identified by their name. The |options| dictionary is defined by those methods. The actual creation of the activation function e.g. a {{MLGraphBuilder/sigmoid()}} or {{MLGraphBuilder/relu()}} can then be deferred until when the rest of the graph is ready to connect with it such as during the construction of {{MLGraphBuilder/lstm()}} for example. -
- -
- - To create an MLActivation given {{MLGraphBuilder}} |builder|, [=string=] |name|, optional [=ordered map=] |options|, and optional algorithm |validation steps|, run the following steps: - - 1. Let |activation| be a new {{MLActivation}}. - 1. Set |activation|.{{MLActivation/[[builder]]}} to |builder|. - 1. Set |activation|.{{MLActivation/[[name]]}} to |name|. - 1. Let |operator| be an [=operator=] for the |name| operation, given |options|. - 1. Set |activation|.{{MLActivation/[[operator]]}} to |operator|. - 1. Set |activation|'s [=MLActivation/validation steps=] to |validation steps| if given, or the [=MLActivation/default activation validation steps=] otherwise. - 1. Return |activation|. -
- -

-To validate activation given {{MLGraphBuilder}} |builder| and {{MLActivation}} |activation|, return true if |activation|.{{MLOperand/[[builder]]}} is |builder|, and false otherwise. -

- ## {{MLGraphBuilder}} interface ## {#api-mlgraphbuilder} The {{MLGraphBuilder}} interface defines a set of operations as identified by the [[#usecases]] that can be composed into a computational graph. It also represents the intermediate state of a graph building session. @@ -2558,7 +2506,6 @@ dictionary MLEluOptions { partial interface MLGraphBuilder { MLOperand elu(MLOperand input, optional MLEluOptions options = {}); - MLActivation elu(optional MLEluOptions options = {}); }; @@ -2614,26 +2561,6 @@ partial interface MLGraphBuilder { -#### {{MLGraphBuilder/elu(options)}} #### {#api-mlgraphbuilder-elu-options} -
- **Arguments:** - - *options*: an optional {{MLEluOptions}}. The optional parameters of the operation. - - **Returns:** - - an {{MLActivation}}. The activation function representing the elu operation. -
- -
- - The elu(|options|) method steps are: - - 1. Let |validationSteps| given {{MLOperandDescriptor}} |descriptor| be these steps: - 1. Set |options|.{{MLEluOptions/alpha}} to the result of [=casting=] |options|.{{MLEluOptions/alpha}} to |descriptor|.{{MLOperandDescriptor/dataType}}. - 1. Return true. - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=], "elu", |options|, and |validationSteps|. - 1. Return |op|. -
- ### expand ### {#api-mlgraphbuilder-expand} Expand any dimension of size 1 of the input tensor to a larger size according to the new shape. The expansion is consistent with [[!numpy-broadcasting-rule]]. The input dimensions must have the size of 1 or match the sizes of the corresponding output dimensions according to the new shape. @@ -2849,23 +2775,6 @@ partial interface MLGraphBuilder { -#### {{MLGraphBuilder/gelu()}} #### {#api-mlgraphbuilder-gelu} -
- **Arguments:** - - None. - - **Returns:** - - an {{MLActivation}}. The activation function representing the gelu operation. -
- -
- - The gelu() method steps are: - - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "gelu". - 1. Return |op|. -
-

### gemm ### {#api-mlgraphbuilder-gemm}
Calculate the [general matrix multiplication of the Basic Linear Algebra Subprograms](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms#Level_3). The calculation follows the expression `alpha * A * B + beta * C`, where `A` is a 2-D tensor with shape *[M, K]* or *[K, M]*, `B` is a 2-D tensor with shape *[K, N]* or *[N, K]*, and `C` is [=unidirectionally broadcastable=] to the shape *[M, N]*. `A` and `B` may optionally be transposed prior to the calculation.
@@ -2980,6 +2889,12 @@ enum MLGruWeightLayout {
   "rzn"  // reset-update-new gate ordering
 };

+enum MLRecurrentNetworkActivation {
+  "relu",
+  "sigmoid",
+  "tanh"
+};
+
 enum MLRecurrentNetworkDirection {
   "forward",
   "backward",
@@ -2994,7 +2909,7 @@ dictionary MLGruOptions {
   boolean returnSequence = false;
   MLRecurrentNetworkDirection direction = "forward";
   MLGruWeightLayout layout = "zrn";
-  sequence<MLActivation> activations;
+  sequence<MLRecurrentNetworkActivation> activations;
 };

 partial interface MLGraphBuilder {
@@ -3040,7 +2955,7 @@ partial interface MLGraphBuilder {

     : activations
     ::
-        Specifies a pair of activation functions with the first function used for the update and reset gate, and the second used for the new gate. When not specified, implementations SHOULD use the pair of sigmoid ("sigmoid") and the hyperbolic tangent ("tanh") functions, respectively.
+        Specifies a pair of [=operator/activation functions=], with the first function used for the update and reset gate, and the second used for the new gate. When not specified, this defaults to the "`sigmoid`" and "`tanh`" functions, respectively.
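For illustration, callers now pass plain {{MLRecurrentNetworkActivation}} strings instead of {{MLActivation}} objects. A minimal JavaScript sketch of the new gru() call shape; the |builder| variable, operand names, and dimensions are assumptions, not part of this patch:

    // Sketch only: 'builder' and all shapes below are illustrative assumptions.
    const steps = 4, batchSize = 2, inputSize = 8, hiddenSize = 16;
    const numDirections = 1; // direction: "forward"
    const input = builder.input('input',
        {dataType: 'float32', dimensions: [steps, batchSize, inputSize]});
    const weight = builder.input('weight',
        {dataType: 'float32', dimensions: [numDirections, 3 * hiddenSize, inputSize]});
    const recurrentWeight = builder.input('recurrentWeight',
        {dataType: 'float32', dimensions: [numDirections, 3 * hiddenSize, hiddenSize]});
    // Previously: activations: [builder.sigmoid(), builder.tanh()]
    const outputs = builder.gru(input, weight, recurrentWeight, steps, hiddenSize,
                                {activations: ['sigmoid', 'tanh']});
    // outputs[0] has shape [numDirections, batchSize, hiddenSize].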
@@ -3060,7 +2975,6 @@ partial interface MLGraphBuilder { The gru(|input|, |weight|, |recurrentWeight|, |steps|, |hiddenSize|, |options|) method steps are: 1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |weight|, |recurrentWeight|, |options|.{{MLGruOptions/bias}} (if it [=map/exists=]), |options|.{{MLGruOptions/recurrentBias}} (if it [=map/exists=]), and |options|.{{MLGruOptions/initialHiddenState}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLGruOptions/activations}} [=map/exists=], and [=MLGraphBuilder/validating activation=] with [=this=] and any [=list/item=] in it returns false, then [=exception/throw=] a {{TypeError}}. 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}. 1. If |input|'s [=MLOperand/rank=] is not 3, then [=exception/throw=] a {{TypeError}}. 1. If the [=MLOperand/dataType=] of either |weight| or |recurrentWeight| is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}. @@ -3084,12 +2998,11 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLGruOptions/initialHiddenState}} [=map/exists=]: 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}. 1. If its [=MLOperand/shape=] is not equal to « |numDirections|, |batchSize|, |hiddenSize| », then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and its [=list/size=] is not 2, then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLGruOptions/activations}} [=map/exists=]: - 1. Let |gateDescriptor| be a new {{MLOperandDescriptor}}. - 1. Set |gateDescriptor|.{{MLOperandDescriptor/dimensions}} to « |batchSize|, |hiddenSize| ». - 1. Set |gateDescriptor|.{{MLOperandDescriptor/dataType}} to |input|'s [=MLOperand/dataType=]. - 1. If running the [=MLActivation/validation steps=] of any [=list/item=] in |options|.{{MLGruOptions/activations}} with |gateDescriptor| returns false, then [=exception/throw=] a {{TypeError}}. + 1. If its [=list/size=] is not 2, then [=exception/throw=] a {{TypeError}}. + 1. Let |activations| be a [=list/clone=] of |options|.{{MLGruOptions/activations}}. + 1. Otherwise: + 1. Let |activations| be « "`sigmoid`", "`tanh`" ». 1. *Calculate the output shape:* 1. Let |desc0| be a new {{MLOperandDescriptor}}. 1. Set |desc0|.{{MLOperandDescriptor/dimensions}} to the [=/list=] « |numDirections|, |batchSize|, |hiddenSize| ». @@ -3112,7 +3025,7 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLGruOptions/bias}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. 1. If |options|.{{MLGruOptions/recurrentBias}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. 1. If |options|.{{MLGruOptions/initialHiddenState}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. - 1. If |options|.{{MLGruOptions/activations}} [=map/exists=], then add its [=list/items=] to |operator|'s [=operator/activation functions=]. + 1. Set |operator|'s [=operator/activation functions=] to a [=list/clone=] of |activations|. 1. Set |operator|'s [=operator/output=] to |output|. 1. Return |output|. 
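As the steps above make explicit, an omitted activations option now behaves exactly as if « "sigmoid", "tanh" » had been passed. A sketch of that equivalence, reusing the assumed operands from the gru() sketch above:

    // Both calls describe the same computation, per the defaulting steps above.
    const a = builder.gru(input, weight, recurrentWeight, steps, hiddenSize);
    const b = builder.gru(input, weight, recurrentWeight, steps, hiddenSize,
                          {activations: ['sigmoid', 'tanh']});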
@@ -3231,7 +3144,7 @@ dictionary MLGruCellOptions {
   MLOperand recurrentBias;
   boolean resetAfter = true;
   MLGruWeightLayout layout = "zrn";
-  sequence<MLActivation> activations;
+  sequence<MLRecurrentNetworkActivation> activations;
 };

 partial interface MLGraphBuilder {
@@ -3264,7 +3177,7 @@ partial interface MLGraphBuilder {

     : activations
     ::
-        Specifies a pair of activation functions with the first function used for the update and reset gate, and the second used for the new gate. When not specified, implementations SHOULD use the pair of sigmoid ("sigmoid") and the hyperbolic tangent ("tanh") functions, respectively.
+        Specifies a pair of [=operator/activation functions=], with the first function used for the update and reset gate, and the second used for the new gate. When not specified, this defaults to the "`sigmoid`" and "`tanh`" functions, respectively.
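For illustration, a single GRU time step with string-based activations. A sketch assuming "float32" operands and illustrative shapes:

    // Sketch only: names and shapes are assumptions.
    const batchSize = 2, inputSize = 8, hiddenSize = 16;
    const input = builder.input('x', {dataType: 'float32', dimensions: [batchSize, inputSize]});
    const weight = builder.input('w', {dataType: 'float32', dimensions: [3 * hiddenSize, inputSize]});
    const recurrentWeight = builder.input('r', {dataType: 'float32', dimensions: [3 * hiddenSize, hiddenSize]});
    const hiddenState = builder.input('h', {dataType: 'float32', dimensions: [batchSize, hiddenSize]});
    // Any MLRecurrentNetworkActivation values are accepted.
    const newHiddenState = builder.gruCell(input, weight, recurrentWeight, hiddenState, hiddenSize,
                                           {activations: ['relu', 'tanh']});
    // newHiddenState has shape [batchSize, hiddenSize].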
@@ -3284,7 +3197,6 @@ partial interface MLGraphBuilder { The gruCell(|input|, |weight|, |recurrentWeight|, |hiddenState|, |hiddenSize|, |options|) method steps are: 1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |weight|, |recurrentWeight|, |hiddenState|, |options|.{{MLGruCellOptions/bias}} (if it [=map/exists=]), and |options|.{{MLGruCellOptions/recurrentBias}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLGruCellOptions/activations}} [=map/exists=], and [=MLGraphBuilder/validating activation=] with [=this=] and any [=list/item=] in it returns false, then [=exception/throw=] a {{TypeError}}. 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}. 1. If |input|'s [=MLOperand/rank=] is not 2, then [=exception/throw=] a {{TypeError}}. 1. Let |batchSize| be |input|'s [=MLOperand/shape=][0]. @@ -3298,17 +3210,20 @@ partial interface MLGraphBuilder { Why |hiddenSize| * 6 ? Some underlying platforms operate on a single bias tensor which is a concatenation of {{MLGruCellOptions/bias}} and {{MLGruCellOptions/recurrentBias}}. Therefore, 3 * |hiddenSize| + 3 * |hiddenSize| must also be a [=valid dimension=]. - 1. If |options|.{{MLGruOptions/bias}} [=map/exists=]: + 1. If |options|.{{MLGruCellOptions/bias}} [=map/exists=]: 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}. 1. If its [=MLOperand/shape=] is not equal to « 3 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLGruOptions/recurrentBias}} [=map/exists=]: + 1. If |options|.{{MLGruCellOptions/recurrentBias}} [=map/exists=]: 1. If its [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}. 1. If its [=MLOperand/shape=] is not equal to « 3 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLGruOptions/activations}} [=map/exists=] and its [=list/size=] is not 2, then [=exception/throw=] a {{TypeError}}. + 1. If |options|.{{MLGruCellOptions/activations}} [=map/exists=]: + 1. If its [=list/size=] is not 2, then [=exception/throw=] a {{TypeError}}. + 1. Let |activations| be a [=list/clone=] of |options|.{{MLGruCellOptions/activations}}. + 1. Otherwise: + 1. Let |activations| be « "`sigmoid`", "`tanh`" ». 1. Let |desc| be a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the [=/list=] « |batchSize|, |hiddenSize| ». 1. Set |desc|.{{MLOperandDescriptor/dataType}} to |input|'s [=MLOperand/dataType=]. - 1. If |options|.{{MLGruCellOptions/activations}} [=map/exists=], and running the [=MLActivation/validation steps=] of any [=list/item=] in it with |desc| returns false, then [=exception/throw=] a {{TypeError}}. 1. *Make graph connections:* 1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |desc|. 1. Let |operator| be an [=operator=] for the "gruCell" operation, given |weight|, |recurrentWeight|, |hiddenState|, |hiddenSize| and |options| as parameters. @@ -3316,7 +3231,7 @@ partial interface MLGraphBuilder { 1. Set |operator|'s [=operator/inputs=] to |input|, |weight|, |recurrentWeight|, and |hiddenState|. 1. If |options|.{{MLGruCellOptions/bias}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. 1. 
If |options|.{{MLGruCellOptions/recurrentBias}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. - 1. If |options|.{{MLGruCellOptions/activations}} [=map/exists=], then add its [=list/items=] to |operator|'s [=operator/activation functions=]. + 1. Set |operator|'s [=operator/activation functions=] to a [=list/clone=] of |activations|. 1. Set |operator|'s [=operator/output=] to |output|. 1. Return |output|. @@ -3324,7 +3239,7 @@ partial interface MLGraphBuilder {
- The behavior of this operation when the weight layout is the default {{MLGruWeightLayout/"zrn"}} layout, and the activation functions of the update/reset gate and new gate are {{MLGraphBuilder/sigmoid()}} and {{MLGraphBuilder/tanh()}} respectively can be [EMULATED] + The behavior of this operation when the weight layout is the default {{MLGruWeightLayout/"zrn"}} layout, and the [=operator/activation functions=] of the update/reset gate and new gate are {{MLGraphBuilder/sigmoid()}} and {{MLGraphBuilder/tanh()}} respectively can be [EMULATED]
     function gruCell(
@@ -3435,7 +3350,6 @@ dictionary MLHardSigmoidOptions {
 
 partial interface MLGraphBuilder {
   MLOperand hardSigmoid(MLOperand input, optional MLHardSigmoidOptions options = {});
-  MLActivation hardSigmoid(optional MLHardSigmoidOptions options = {});
 };
 
 
@@ -3495,33 +3409,11 @@ partial interface MLGraphBuilder {
   
-#### {{MLGraphBuilder/hardSigmoid(options)}} #### {#api-mlgraphbuilder-hardsigmoid-options} -
- **Arguments:** - - *options*: an optional {{MLHardSigmoidOptions}}. The optional parameters of the operation. - - **Returns:** - - an {{MLActivation}}. The activation function representing the hard sigmoid operation. -
- -
- - The hardSigmoid(|options|) method steps are: - - 1. Let |validationSteps| given {{MLOperandDescriptor}} |descriptor| be these steps: - 1. Set |options|.{{MLHardSigmoidOptions/alpha}} to the result of [=casting=] |options|.{{MLHardSigmoidOptions/alpha}} to |descriptor|.{{MLOperandDescriptor/dataType}}. - 1. Set |options|.{{MLHardSigmoidOptions/beta}} to the result of [=casting=] |options|.{{MLHardSigmoidOptions/beta}} to |descriptor|.{{MLOperandDescriptor/dataType}}. - 1. Return true. - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=], "hardSigmoid", |options|, and |validationSteps|. - 1. Return |op|. -
- ### hardSwish ### {#api-mlgraphbuilder-hard-swish} Computes the nonlinear function `y = x * max(0, min(6, (x + 3))) / 6` that is introduced by [[MobileNetV3]] on the input tensor element-wise. @@ -3570,24 +3462,6 @@ partial interface MLGraphBuilder {
-#### {{MLGraphBuilder/hardSwish()}} #### {#api-mlgraphbuilder-hardswish} -
- **Arguments:** - - None. - - **Returns:** - - an {{MLActivation}}. The activation function representing the hard-swish operation. -
- -
- - The hardSwish() method steps are: - - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "hardSwish". - 1. Return |op|. -
- - ### instanceNormalization ### {#api-mlgraphbuilder-instancenorm} Normalize the input using [[Instance-Normalization]]. Unlike {{MLGraphBuilder/batchNormalization()}} where the mean and variance values used in the normalization are computed across all the samples in the batch dimension while the model is trained, the mean and variance values used in the instance normalization are computed on the fly for each input feature of each individual sample in the batch. @@ -3808,7 +3682,6 @@ dictionary MLLeakyReluOptions { partial interface MLGraphBuilder { MLOperand leakyRelu(MLOperand input, optional MLLeakyReluOptions options = {}); - MLActivation leakyRelu(optional MLLeakyReluOptions options = {}); }; @@ -3862,26 +3735,6 @@ partial interface MLGraphBuilder {
-#### {{MLGraphBuilder/leakyRelu(options)}} #### {#api-mlgraphbuilder-leaky-relu-options} -
- **Arguments:** - - *options*: an optional {{MLLeakyReluOptions}}. The optional parameters of the operation. - - **Returns:** - - an {{MLActivation}}. The activation function representing the leaky relu operation. -
- -
- - The leakyRelu(|options|) method steps are: - - 1. Let |validationSteps| given {{MLOperandDescriptor}} |descriptor| be these steps: - 1. Set |options|.{{MLLeakyReluOptions/alpha}} to the result of [=casting=] |options|.{{MLLeakyReluOptions/alpha}} to |descriptor|.{{MLOperandDescriptor/dataType}}. - 1. Return true. - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=], "leakyRelu", |options|, and |validationSteps|. - 1. Return |op|. -
- ### linear ### {#api-mlgraphbuilder-linear} Calculate a linear function `y = alpha * x + beta` on the input tensor. @@ -3893,7 +3746,6 @@ dictionary MLLinearOptions { partial interface MLGraphBuilder { MLOperand linear(MLOperand input, optional MLLinearOptions options = {}); - MLActivation linear(optional MLLinearOptions options = {}); }; @@ -3949,27 +3801,6 @@ partial interface MLGraphBuilder { -#### {{MLGraphBuilder/linear(options)}} #### {#api-mlgraphbuilder-linear-options} -
- **Arguments:** - - *options*: an optional {{MLLinearOptions}}. The optional parameters of the operation. - - **Returns:** - - an {{MLActivation}}. The activation function representing the linear operation. -
- -
- - The linear(|options|) method steps are: - - 1. Let |validationSteps| given {{MLOperandDescriptor}} |descriptor| be these steps: - 1. Set |options|.{{MLLinearOptions/alpha}} to the result of [=casting=] |options|.{{MLLinearOptions/alpha}} to |descriptor|.{{MLOperandDescriptor/dataType}}. - 1. Set |options|.{{MLLinearOptions/beta}} to the result of [=casting=] |options|.{{MLLinearOptions/beta}} to |descriptor|.{{MLOperandDescriptor/dataType}}. - 1. Return true. - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=], "linear", |options|, and |validationSteps|. - 1. Return |op|. -
-

### lstm ### {#api-mlgraphbuilder-lstm}
Long Short-Term Memory [[LSTM]] recurrent network uses an input, output, forget, and cell gate to compute the output state that rolls into the output across the temporal sequence of the network.
@@ -3988,7 +3819,7 @@ dictionary MLLstmOptions {
   boolean returnSequence = false;
   MLRecurrentNetworkDirection direction = "forward";
   MLLstmWeightLayout layout = "iofg";
-  sequence<MLActivation> activations;
+  sequence<MLRecurrentNetworkActivation> activations;
 };

 partial interface MLGraphBuilder {
@@ -4037,7 +3868,7 @@ partial interface MLGraphBuilder {

     : activations
     ::
-        A list of three activation functions, the first one is used for the `input (i)`, `forget (f)`, and `output (o)` gate, the second one is used for the `cell (g)` gate, and the last used for filtering the output cell state before combining it with the result of the output gate to form the output hidden state. When not specified, implementations SHOULD use the sequence of the sigmoid function ("sigmoid") followed by two hyperbolic tangent functions ("tanh") respectively.
+        A list of three [=operator/activation functions=]: the first is used for the `input (i)`, `forget (f)`, and `output (o)` gates, the second is used for the `cell (g)` gate, and the last is used for filtering the output cell state before combining it with the result of the output gate to form the output hidden state. When not specified, this defaults to a sequence of the "`sigmoid`", "`tanh`", and "`tanh`" functions, respectively.
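For illustration, a bidirectional lstm() call with the three string-based activations. A sketch assuming an existing |builder| and illustrative shapes:

    // Sketch only: 'builder' and all shapes below are illustrative assumptions.
    const steps = 4, batchSize = 2, inputSize = 8, hiddenSize = 16;
    const numDirections = 2; // direction: "both"
    const input = builder.input('input',
        {dataType: 'float32', dimensions: [steps, batchSize, inputSize]});
    const weight = builder.input('weight',
        {dataType: 'float32', dimensions: [numDirections, 4 * hiddenSize, inputSize]});
    const recurrentWeight = builder.input('recurrentWeight',
        {dataType: 'float32', dimensions: [numDirections, 4 * hiddenSize, hiddenSize]});
    const [hidden, cell] = builder.lstm(input, weight, recurrentWeight, steps, hiddenSize,
                                        {direction: 'both',
                                         activations: ['sigmoid', 'tanh', 'tanh']});
    // hidden and cell each have shape [numDirections, batchSize, hiddenSize].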
@@ -4057,7 +3888,6 @@ partial interface MLGraphBuilder { The lstm(|input|, |weight|, |recurrentWeight|, |steps|, |hiddenSize|, |options|) method steps are: 1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |weight|, |recurrentWeight|, |options|.{{MLLstmOptions/bias}} (if it [=map/exists=]), |options|.{{MLLstmOptions/recurrentBias}} (if it [=map/exists=]), |options|.{{MLLstmOptions/peepholeWeight}} (if it [=map/exists=]), |options|.{{MLLstmOptions/initialHiddenState}} (if it [=map/exists=]), and |options|.{{MLLstmOptions/initialCellState}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLLstmOptions/activations}} [=map/exists=], and [=MLGraphBuilder/validating activation=] with [=this=] and any [=list/item=] in it returns false, then [=exception/throw=] a {{TypeError}}. 1. Let |numDirections| be 2 if |options|.{{MLLstmOptions/direction}} is {{MLRecurrentNetworkDirection/"both"}}, or 1 otherwise. 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}. 1. If |input|'s [=MLOperand/rank=] is not 3, then [=exception/throw=] a {{TypeError}}. @@ -4090,10 +3920,9 @@ partial interface MLGraphBuilder { 1. If its [=MLOperand/shape=] is not equal to « |numDirections|, |batchSize|, |hiddenSize| », then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLLstmOptions/activations}} [=map/exists=]: 1. If its [=list/size=] is not 3, then [=exception/throw=] a {{TypeError}}. - 1. Let |gateDescriptor| be a new {{MLOperandDescriptor}}. - 1. Set |gateDescriptor|.{{MLOperandDescriptor/dimensions}} to the [=/list=] « |batchSize|, |hiddenSize| ». - 1. Set |gateDescriptor|.{{MLOperandDescriptor/dataType}} to |input|'s [=MLOperand/dataType=]. - 1. If running the [=MLActivation/validation steps=] of any [=list/item=] in |options|.{{MLLstmOptions/activations}} with |gateDescriptor| returns false, then [=exception/throw=] a {{TypeError}}. + 1. Let |activations| be a [=list/clone=] of |options|.{{MLLstmOptions/activations}}. + 1. Otherwise: + 1. Let |activations| be « "`sigmoid`", "`tanh`", "`tanh`" ». 1. *Calculate the output shape:* 1. Let |desc| be a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the [=/list=] « |numDirections|, |batchSize|, |hiddenSize| ». @@ -4119,7 +3948,7 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLLstmOptions/peepholeWeight}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. 1. If |options|.{{MLLstmOptions/initialHiddenState}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. 1. If |options|.{{MLLstmOptions/initialCellState}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. - 1. If |options|.{{MLLstmOptions/activations}} [=map/exists=], then add its [=list/items=] to |operator|'s [=operator/activation functions=]. + 1. Set |operator|'s [=operator/activation functions=] to a [=list/clone=] of |activations|. 1. Set |operator|'s [=operator/output=] to |output|. 1. Return |output|. 
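Per the validation steps above, a wrong-sized activations list is rejected when the graph is built. A sketch, reusing the assumed operands from the lstm() sketch above:

    try {
      builder.lstm(input, weight, recurrentWeight, steps, hiddenSize,
                   {activations: ['sigmoid', 'tanh']}); // size 2; lstm() requires 3
    } catch (e) {
      // TypeError, per the method steps above.
    }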
@@ -4265,7 +4094,7 @@ dictionary MLLstmCellOptions {
   MLOperand recurrentBias;
   MLOperand peepholeWeight;
   MLLstmWeightLayout layout = "iofg";
-  sequence<MLActivation> activations;
+  sequence<MLRecurrentNetworkActivation> activations;
 };

 partial interface MLGraphBuilder {
@@ -4299,7 +4128,7 @@ partial interface MLGraphBuilder {

     : activations
     ::
-        A list of three activation functions, the first one is used for the `input (i)`, `forget (f)`, and `output (o)` gate, the second one is used for the `cell (g)` gate, and the last used for filtering the output cell state before combining it with the result of the output gate to form the output hidden state. When not specified, they are assumed to be of the sigmoid function ("sigmoid") followed by two hyperbolic tangent functions ("tanh") respectively.
+        A list of three [=operator/activation functions=]: the first is used for the `input (i)`, `forget (f)`, and `output (o)` gates, the second is used for the `cell (g)` gate, and the last is used for filtering the output cell state before combining it with the result of the output gate to form the output hidden state. When not specified, this defaults to a sequence of the "`sigmoid`", "`tanh`", and "`tanh`" functions, respectively.
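For illustration, a single LSTM time step. A sketch assuming "float32" operands and illustrative shapes:

    // Sketch only: names and shapes are assumptions.
    const batchSize = 2, inputSize = 8, hiddenSize = 16;
    const input = builder.input('x', {dataType: 'float32', dimensions: [batchSize, inputSize]});
    const weight = builder.input('w', {dataType: 'float32', dimensions: [4 * hiddenSize, inputSize]});
    const recurrentWeight = builder.input('r', {dataType: 'float32', dimensions: [4 * hiddenSize, hiddenSize]});
    const hiddenState = builder.input('h', {dataType: 'float32', dimensions: [batchSize, hiddenSize]});
    const cellState = builder.input('c', {dataType: 'float32', dimensions: [batchSize, hiddenSize]});
    const [newHidden, newCell] = builder.lstmCell(
        input, weight, recurrentWeight, hiddenState, cellState, hiddenSize,
        {activations: ['sigmoid', 'tanh', 'tanh']});
    // Both outputs have shape [batchSize, hiddenSize].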
@@ -4320,7 +4149,6 @@ partial interface MLGraphBuilder { The lstmCell(|input|, |weight|, |recurrentWeight|, |hiddenState|, |cellState|, |hiddenSize|, |options|) method steps are: 1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |weight|, |recurrentWeight|, |hiddenState|, |cellState|, |options|.{{MLLstmCellOptions/bias}} (if it [=map/exists=]), |options|.{{MLLstmCellOptions/recurrentBias}} (if it [=map/exists=]), and |options|.{{MLLstmCellOptions/peepholeWeight}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=], and [=MLGraphBuilder/validating activation=] with [=this=] and any [=list/item=] in it returns false, then [=exception/throw=] a {{TypeError}}. 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}. 1. If |input|'s [=MLOperand/rank=] is not equal to 2, then [=exception/throw=] a {{TypeError}}. 1. If the [=MLOperand/dataType=] of any of |weight|, |recurrentWeight|, |hiddenState| or |cellState| is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}. @@ -4346,10 +4174,12 @@ partial interface MLGraphBuilder { 1. If its [=MLOperand/shape=] is not equal to « 3 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=]: 1. If its [=list/size=] is not 3, then [=exception/throw=] a {{TypeError}}. + 1. Let |activations| be a [=list/clone=] of |options|.{{MLLstmCellOptions/activations}}. + 1. Otherwise: + 1. Let |activations| be « "`sigmoid`", "`tanh`", "`tanh`" ». 1. Let |desc| be a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the [=/list=] « |batchSize|, |hiddenSize| ». 1. Set |desc|.{{MLOperandDescriptor/dataType}} to |input|'s [=MLOperand/dataType=]. - 1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=], and running the [=MLActivation/validation steps=] of any [=list/item=] in it with |desc| returns false, then [=exception/throw=] a {{TypeError}}. 1. *Make graph connections:* 1. Let |output0| be the result of [=creating an MLOperand=] given [=this=] and |desc|. 1. Let |output1| be the result of [=creating an MLOperand=] given [=this=] and |desc|. @@ -4360,7 +4190,7 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLLstmCellOptions/bias}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. 1. If |options|.{{MLLstmCellOptions/recurrentBias}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. 1. If |options|.{{MLLstmCellOptions/peepholeWeight}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. - 1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=], then add its [=list/items=] to |operator|'s [=operator/activation functions=]. + 1. Set |operator|'s [=operator/activation functions=] to a [=list/clone=] of |activations|. 1. Set |operator|'s [=operator/output=] to |output|. 1. Return |output|. @@ -4368,7 +4198,7 @@ partial interface MLGraphBuilder {
- The behavior of this operation when the weight layout is the default {{MLLstmWeightLayout/"iofg"}} layout, and the activation functions of the input/forget/output gate and the cell gate/the cell state's filter for the output hidden state are {{MLGraphBuilder/sigmoid()}} and {{MLGraphBuilder/tanh()}} respectively can be [EMULATED] + The behavior of this operation when the weight layout is the default {{MLLstmWeightLayout/"iofg"}} layout, and the [=operator/activation functions=] of the input/forget/output gate and the cell gate/the cell state's filter for the output hidden state are {{MLGraphBuilder/sigmoid()}} and {{MLGraphBuilder/tanh()}} respectively can be [EMULATED]
     function lstmCell(
@@ -5143,7 +4973,6 @@ Compute the 
 
 
@@ -5184,23 +5013,6 @@ partial interface MLGraphBuilder {
   
-#### {{MLGraphBuilder/relu()}} #### {#api-mlgraphbuilder-relu} -
- **Arguments:** - - None. - - **Returns:** - - an {{MLActivation}}. The activation function representing the relu operation. -
- -
- - The relu() method steps are: - - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "relu". - 1. Return |op|. -
- ### resample2d ### {#api-mlgraphbuilder-resample2d-method} Resample the tensor values from the source to the destination spatial dimensions according to the scaling factors. @@ -5385,23 +5196,6 @@ partial interface MLGraphBuilder {
-#### {{MLGraphBuilder/sigmoid()}} #### {#api-mlgraphbuilder-sigmoid} -
- **Arguments:** - - None. - - **Returns:** - - an {{MLActivation}}. The activation function representing the sigmoid operation. -
- -
- - The sigmoid() method steps are: - - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "sigmoid". - 1. Return |op|. -
- ### slice ### {#api-mlgraphbuilder-slice} Produce a slice of the input tensor. @@ -5547,29 +5340,11 @@ partial interface MLGraphBuilder {
-#### {{MLGraphBuilder/softplus()}} #### {#api-mlgraphbuilder-softplus} -
- **Arguments:** - - None. - - **Returns:** - - an {{MLActivation}}. The activation function representing the softplus operation. -
- -
- - The softplus() method steps are: - - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "softplus". - 1. Return |op|. -
- ### softsign ### {#api-mlgraphbuilder-softsign-method} Compute the
softsign function of the input tensor. The calculation follows the expression `x / (1 + |x|)`. @@ -5612,23 +5387,6 @@ partial interface MLGraphBuilder { 1. Return |output|. -#### {{MLGraphBuilder/softsign()}} #### {#api-mlgraphbuilder-softsign} -
- **Arguments:** - - None. - - **Returns:** - - an {{MLActivation}}. The activation function representing the softsign operation. -
- -
- - The softsign() method steps are: - - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "softsign". - 1. Return |op|. -
- ### split ### {#api-mlgraphbuilder-split} Split the input tensor into a number of sub tensors along the given axis. @@ -5770,23 +5527,6 @@ partial interface MLGraphBuilder { -#### {{MLGraphBuilder/tanh()}} #### {#api-mlgraphbuilder-tanh} -
- **Arguments:** - - None. - - **Returns:** - - an {{MLActivation}}. The activation function representing the tanh operation. -
- -
- - The tanh() method steps are: - - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "tanh". - 1. Return |op|. -
- ### transpose ### {#api-mlgraphbuilder-transpose} Permute the dimensions of the input tensor according to the *permutation* argument. -#### {{MLGraphBuilder/gelu(input)}} #### {#api-mlgraphbuilder-gelu-input}
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. @@ -3363,7 +3360,6 @@ partial interface MLGraphBuilder { A scalar addition. -#### {{MLGraphBuilder/hardSigmoid(input, options)}} #### {#api-mlgraphbuilder-hardsigmoid-input-options}
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. @@ -3417,7 +3413,6 @@ partial interface MLGraphBuilder { }; -#### {{MLGraphBuilder/hardSwish(input)}} #### {#api-mlgraphbuilder-hardswish-input}
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. @@ -3692,7 +3687,6 @@ partial interface MLGraphBuilder { A scalar multiplier. -#### {{MLGraphBuilder/leakyRelu(input, options)}} #### {#api-mlgraphbuilder-leaky-relu-input-options}
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. @@ -3759,7 +3753,6 @@ partial interface MLGraphBuilder { A scalar addition. -#### {{MLGraphBuilder/linear(input, options)}} #### {#api-mlgraphbuilder-linear-input-options}
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. @@ -4976,7 +4969,6 @@ partial interface MLGraphBuilder { }; -#### {{MLGraphBuilder/relu(input)}} #### {#api-mlgraphbuilder-relu-input}
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. @@ -5156,7 +5148,6 @@ partial interface MLGraphBuilder { }; -#### {{MLGraphBuilder/sigmoid(input)}} #### {#api-mlgraphbuilder-sigmoid-input}
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. @@ -5302,7 +5293,6 @@ partial interface MLGraphBuilder { }; -#### {{MLGraphBuilder/softplus(input)}} #### {#api-mlgraphbuilder-softplus-input}
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. @@ -5363,7 +5353,6 @@ partial interface MLGraphBuilder {
-#### {{MLGraphBuilder/softsign(input)}} #### {#api-mlgraphbuilder-softsign-input}
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. @@ -5484,7 +5473,6 @@ partial interface MLGraphBuilder { }; -#### {{MLGraphBuilder/tanh(input)}} #### {#api-mlgraphbuilder-tanh-input}
**Arguments:** - *input*: an {{MLOperand}}. The input tensor. From 1dec5841bae8873e8d08c6150c507bbcdb96a823 Mon Sep 17 00:00:00 2001 From: Austin Sullivan Date: Mon, 22 Jul 2024 08:51:46 -0700 Subject: [PATCH 4/4] fix nested "input" dfns --- index.bs | 22 +++++++++++----------- 1 file changed, 11 insertions(+), 11 deletions(-) diff --git a/index.bs b/index.bs index 1ba62622..1f70b6e0 100644 --- a/index.bs +++ b/index.bs @@ -2656,7 +2656,7 @@ partial interface MLGraphBuilder { :: A scalar multiplier. -
+
**Arguments:** - input: an {{MLOperand}}. The input tensor. - options: an optional {{MLEluOptions}}. The optional parameters of the operation. @@ -2875,7 +2875,7 @@ partial interface MLGraphBuilder { }; -
+
**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -3508,7 +3508,7 @@ partial interface MLGraphBuilder { A scalar addition. -
+
**Arguments:** - input: an {{MLOperand}}. The input tensor. - options: an optional {{MLHardSigmoidOptions}}. The optional parameters of the operation. @@ -3562,7 +3562,7 @@ partial interface MLGraphBuilder { }; -
+
**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -3839,7 +3839,7 @@ partial interface MLGraphBuilder { A scalar multiplier. -
+
**Arguments:** - input: an {{MLOperand}}. The input tensor. - options: an optional {{MLLeakyReluOptions}}. The optional parameters of the operation. @@ -3906,7 +3906,7 @@ partial interface MLGraphBuilder { A scalar addition. -
+
**Arguments:** - input: an {{MLOperand}}. The input tensor. - options: an optional {{MLLinearOptions}}. The optional parameters of the operation. @@ -5130,7 +5130,7 @@ partial interface MLGraphBuilder { }; -
+
**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -5312,7 +5312,7 @@ partial interface MLGraphBuilder { }; -
+
**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -5460,7 +5460,7 @@ partial interface MLGraphBuilder { }; -
+
**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -5521,7 +5521,7 @@ partial interface MLGraphBuilder {
-
+
**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -5643,7 +5643,7 @@ partial interface MLGraphBuilder { }; -
+
**Arguments:** - input: an {{MLOperand}}. The input tensor.