diff --git a/index.bs b/index.bs index b3878864..1f70b6e0 100644 --- a/index.bs +++ b/index.bs @@ -687,9 +687,9 @@ The {{MLGraph}} interface represents a compiled computational graph that is immu The {{MLGraphBuilder}} interface serves as a builder (factory) to construct a [=computational graph=] (its graph) that is then compiled to create an {{MLGraph}}. -In WebNN, a [=computational graph=] is composed of operators which act on data, and are the nodes of the graph. {{MLOperand}}s are a representation of data that flows within the computational graph, and are the edges of the graph. {{MLOperand}}s include a [=computational graph=]'s input values for inference, constants (including trained weights) used for inference, intermediate values (often referred to as activations) computed during inference, as well as the output values of inference. An [=operator=]'s input is one or more {{MLOperand}}s. An [=operator=]'s output is one or more {{MLOperand}}s. [=Operators=] have operator-specific parameters that control their behavior, which can include zero or more activation functions, which are {{MLActivation}}s. +In WebNN, a [=computational graph=] is composed of operators which act on data, and are the nodes of the graph. {{MLOperand}}s are a representation of data that flows within the computational graph, and are the edges of the graph. {{MLOperand}}s include a [=computational graph=]'s input values for inference, constants (including trained weights) used for inference, intermediate values (often referred to as activations) computed during inference, as well as the output values of inference. An [=operator=]'s input is one or more {{MLOperand}}s. An [=operator=]'s output is one or more {{MLOperand}}s. [=Operators=] have operator-specific parameters that control their behavior, which can include zero or more activation functions. 
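The node/edge relationship described in the paragraph above can be sketched with a toy builder in JavaScript. This is an illustrative sketch only; `ToyBuilder` is hypothetical and is not part of the WebNN API, but it mirrors how each builder method creates an operator node and returns a fresh operand wrapping it, leaving its inputs untouched.

```javascript
// Hypothetical sketch, not the WebNN API: operators are the nodes of the
// graph, and operands are the edges connecting them.
class ToyBuilder {
  input(name) {
    return { operator: { kind: "input", name, inputs: [] } };
  }
  gemm(a, b) {
    // Creates a new operator node; operands a and b become its input edges.
    return { operator: { kind: "gemm", inputs: [a, b] } };
  }
  relu(x) {
    return { operator: { kind: "relu", inputs: [x] } };
  }
}

const builder = new ToyBuilder();
const a = builder.input("a");
const w = builder.input("w");
// Each call returns a distinct new operand; a and w are unchanged.
const y = builder.relu(builder.gemm(a, w));
```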
-A key part of the {{MLGraphBuilder}} interface are methods such as {{MLGraphBuilder/gemm()}} and {{MLGraphBuilder/relu()}} which create an [=operator=] which represents the actual operation to perform on the input data when the computation is run, and return a new {{MLOperand}} or {{MLActivation}} holding the operator. Methods that create an {{MLOperand}} connect any [=operator/inputs=] and [=operator/activations=] to the operator. Each method invocation returns a distinct new value, without changing the value of any other {{MLOperand}}. +A key part of the {{MLGraphBuilder}} interface is its methods, such as {{MLGraphBuilder/gemm()}} and {{MLGraphBuilder/relu()}}, which create an [=operator=] representing the actual operation to perform on the input data when the computation is run, and return a new {{MLOperand}} holding the operator. Methods that create an {{MLOperand}} connect any [=operator/inputs=] and [=operator/activations=] to the operator. Each method invocation returns a distinct new value, without changing the value of any other {{MLOperand}}. At inference time, every {{MLOperand}} will be bound to a tensor (the actual data), which are essentially multidimensional arrays. The representation of the tensors is implementation dependent, but it typically includes the array data stored in some buffer (memory) and some metadata describing the array data (such as its shape). @@ -1245,58 +1245,6 @@ Returns the shape of the {{MLOperand}}. 1. Return [=this=]'s [=MLOperand/shape=]. -## {{MLActivation}} interface ## {#api-mlactivation} - -Objects implementing the {{MLActivation}} interface represent activation function types. - - - -
-To validate activation given {{MLGraphBuilder}} |builder| and {{MLActivation}} |activation|, return true if |activation|.{{MLOperand/[[builder]]}} is |builder|, and false otherwise. -
- ## {{MLGraphBuilder}} interface ## {#api-mlgraphbuilder} The {{MLGraphBuilder}} interface defines a set of operations as identified by the [[#usecases]] that can be composed into a computational graph. It also represents the intermediate state of a graph building session. @@ -2699,7 +2647,6 @@ dictionary MLEluOptions { partial interface MLGraphBuilder { MLOperand elu(MLOperand input, optional MLEluOptions options = {}); - MLActivation elu(optional MLEluOptions options = {}); }; @@ -2709,8 +2656,6 @@ partial interface MLGraphBuilder { :: A scalar multiplier. - -#### {{MLGraphBuilder/elu(input, options)}} #### {#api-mlgraphbuilder-elu-input-options}function gruCell( @@ -3585,7 +3495,6 @@ dictionary MLHardSigmoidOptions { partial interface MLGraphBuilder { MLOperand hardSigmoid(MLOperand input, optional MLHardSigmoidOptions options = {}); - MLActivation hardSigmoid(optional MLHardSigmoidOptions options = {}); }; @@ -3599,7 +3508,6 @@ partial interface MLGraphBuilder { A scalar addition. -#### {{MLGraphBuilder/hardSigmoid(input, options)}} #### {#api-mlgraphbuilder-hardsigmoid-input-options}**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -3646,38 +3554,14 @@ partial interface MLGraphBuilder {-#### {{MLGraphBuilder/hardSigmoid(options)}} #### {#api-mlgraphbuilder-hardsigmoid-options} -- **Arguments:** - - options: an optional {{MLHardSigmoidOptions}}. The optional parameters of the operation. - - **Returns:** - - an {{MLActivation}}. The activation function representing the hard sigmoid operation. -- --- ### hardSwish ### {#api-mlgraphbuilder-hard-swish} Computes the nonlinear function `y = x * max(0, min(6, (x + 3))) / 6` that is introduced by [[MobileNetV3]] on the input tensor element-wise. -#### {{MLGraphBuilder/hardSwish(input)}} #### {#api-mlgraphbuilder-hardswish-input}- The hardSigmoid(|options|) method steps are: -
- 1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}. - 1. Let |validationSteps| given {{MLOperandDescriptor}} |descriptor| be these steps: - 1. Set |options|.{{MLHardSigmoidOptions/alpha}} to the result of [=casting=] |options|.{{MLHardSigmoidOptions/alpha}} to |descriptor|.{{MLOperandDescriptor/dataType}}. - 1. Set |options|.{{MLHardSigmoidOptions/beta}} to the result of [=casting=] |options|.{{MLHardSigmoidOptions/beta}} to |descriptor|.{{MLOperandDescriptor/dataType}}. - 1. Return true. - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=], "hardSigmoid", |options|, and |validationSteps|. - 1. Return |op|. -**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -3723,25 +3607,6 @@ partial interface MLGraphBuilder {-#### {{MLGraphBuilder/hardSwish()}} #### {#api-mlgraphbuilder-hardswish} -- **Arguments:** - - None. - - **Returns:** - - an {{MLActivation}}. The activation function representing the hard-swish operation. -- --- - ### instanceNormalization ### {#api-mlgraphbuilder-instancenorm} Normalize the input using [[Instance-Normalization]]. Unlike {{MLGraphBuilder/batchNormalization()}} where the mean and variance values used in the normalization are computed across all the samples in the batch dimension while the model is trained, the mean and variance values used in the instance normalization are computed on the fly for each input feature of each individual sample in the batch. @@ -3964,7 +3829,6 @@ dictionary MLLeakyReluOptions { partial interface MLGraphBuilder { MLOperand leakyRelu(MLOperand input, optional MLLeakyReluOptions options = {}); - MLActivation leakyRelu(optional MLLeakyReluOptions options = {}); }; @@ -3975,7 +3839,6 @@ partial interface MLGraphBuilder { A scalar multiplier. -#### {{MLGraphBuilder/leakyRelu(input, options)}} #### {#api-mlgraphbuilder-leaky-relu-input-options}- The hardSwish() method steps are: -
- 1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}. - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "hardSwish". - 1. Return |op|. -**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -4019,27 +3882,6 @@ partial interface MLGraphBuilder {-#### {{MLGraphBuilder/leakyRelu(options)}} #### {#api-mlgraphbuilder-leaky-relu-options} -- **Arguments:** - - options: an optional {{MLLeakyReluOptions}}. The optional parameters of the operation. - - **Returns:** - - an {{MLActivation}}. The activation function representing the leaky relu operation. -- --- ### linear ### {#api-mlgraphbuilder-linear} Calculate a linear function `y = alpha * x + beta` on the input tensor. @@ -4051,7 +3893,6 @@ dictionary MLLinearOptions { partial interface MLGraphBuilder { MLOperand linear(MLOperand input, optional MLLinearOptions options = {}); - MLActivation linear(optional MLLinearOptions options = {}); }; @@ -4065,7 +3906,6 @@ partial interface MLGraphBuilder { A scalar addition. -#### {{MLGraphBuilder/linear(input, options)}} #### {#api-mlgraphbuilder-linear-input-options}- The leakyRelu(|options|) method steps are: -
- 1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}. - 1. Let |validationSteps| given {{MLOperandDescriptor}} |descriptor| be these steps: - 1. Set |options|.{{MLLeakyReluOptions/alpha}} to the result of [=casting=] |options|.{{MLLeakyReluOptions/alpha}} to |descriptor|.{{MLOperandDescriptor/dataType}}. - 1. Return true. - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=], "leakyRelu", |options|, and |validationSteps|. - 1. Return |op|. -**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -4108,28 +3948,6 @@ partial interface MLGraphBuilder {-#### {{MLGraphBuilder/linear(options)}} #### {#api-mlgraphbuilder-linear-options} -- **Arguments:** - - options: an optional {{MLLinearOptions}}. The optional parameters of the operation. - - **Returns:** - - an {{MLActivation}}. The activation function representing the linear operation. -- --- ### lstm ### {#api-mlgraphbuilder-lstm} Long Short-Term Memory [[LSTM]] recurrent network uses an input, output, forget, and cell gate to compute the output state that rolls into the output across the temporal sequence of the network. @@ -4148,7 +3966,7 @@ dictionary MLLstmOptions { boolean returnSequence = false; MLRecurrentNetworkDirection direction = "forward"; MLLstmWeightLayout layout = "iofg"; - sequence- The linear(|options|) method steps are: -
- 1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}. - 1. Let |validationSteps| given {{MLOperandDescriptor}} |descriptor| be these steps: - 1. Set |options|.{{MLLinearOptions/alpha}} to the result of [=casting=] |options|.{{MLLinearOptions/alpha}} to |descriptor|.{{MLOperandDescriptor/dataType}}. - 1. Set |options|.{{MLLinearOptions/beta}} to the result of [=casting=] |options|.{{MLLinearOptions/beta}} to |descriptor|.{{MLOperandDescriptor/dataType}}. - 1. Return true. - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=], "linear", |options|, and |validationSteps|. - 1. Return |op|. -activations; + sequence<MLRecurrentNetworkActivation> activations; }; partial interface MLGraphBuilder { @@ -4197,7 +4015,7 @@ partial interface MLGraphBuilder { : activations :: - A list of three activation functions, the first one is used for the `input (i)`, `forget (f)`, and `output (o)` gate, the second one is used for the `cell (g)` gate, and the last used for filtering the output cell state before combining it with the result of the output gate to form the output hidden state. When not specified, implementations SHOULD use the sequence of the sigmoid function ("sigmoid") followed by two hyperbolic tangent functions ("tanh") respectively. + A list of three [=operator/activation functions=]: the first is used for the `input (i)`, `forget (f)`, and `output (o)` gates, the second for the `cell (g)` gate, and the last for filtering the output cell state before combining it with the result of the output gate to form the output hidden state. When not specified, the list defaults to the {{MLRecurrentNetworkActivation/"sigmoid"}}, {{MLRecurrentNetworkActivation/"tanh"}}, and {{MLRecurrentNetworkActivation/"tanh"}} functions, respectively. @@ -4218,7 +4036,6 @@ partial interface MLGraphBuilder { 1. 
If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}. 1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |weight|, |recurrentWeight|, |options|.{{MLLstmOptions/bias}} (if it [=map/exists=]), |options|.{{MLLstmOptions/recurrentBias}} (if it [=map/exists=]), |options|.{{MLLstmOptions/peepholeWeight}} (if it [=map/exists=]), |options|.{{MLLstmOptions/initialHiddenState}} (if it [=map/exists=]), and |options|.{{MLLstmOptions/initialCellState}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLLstmOptions/activations}} [=map/exists=], and [=MLGraphBuilder/validating activation=] with [=this=] and any [=list/item=] in it returns false, then [=exception/throw=] a {{TypeError}}. 1. Let |numDirections| be 2 if |options|.{{MLLstmOptions/direction}} is {{MLRecurrentNetworkDirection/"both"}}, or 1 otherwise. 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}. 1. If |input|'s [=MLOperand/rank=] is not 3, then [=exception/throw=] a {{TypeError}}. @@ -4251,10 +4068,9 @@ partial interface MLGraphBuilder { 1. If its [=MLOperand/shape=] is not equal to « |numDirections|, |batchSize|, |hiddenSize| », then [=exception/throw=] a {{TypeError}}. 1. If |options|.{{MLLstmOptions/activations}} [=map/exists=]: 1. If its [=list/size=] is not 3, then [=exception/throw=] a {{TypeError}}. - 1. Let |gateDescriptor| be a new {{MLOperandDescriptor}}. - 1. Set |gateDescriptor|.{{MLOperandDescriptor/dimensions}} to the [=/list=] « |batchSize|, |hiddenSize| ». - 1. Set |gateDescriptor|.{{MLOperandDescriptor/dataType}} to |input|'s [=MLOperand/dataType=]. - 1. If running the [=MLActivation/validation steps=] of any [=list/item=] in |options|.{{MLLstmOptions/activations}} with |gateDescriptor| returns false, then [=exception/throw=] a {{TypeError}}. + 1. 
Let |activations| be a [=list/clone=] of |options|.{{MLLstmOptions/activations}}. + 1. Otherwise: + 1. Let |activations| be « {{MLRecurrentNetworkActivation/"sigmoid"}}, {{MLRecurrentNetworkActivation/"tanh"}}, {{MLRecurrentNetworkActivation/"tanh"}} ». 1. *Calculate the output shape:* 1. Let |desc| be a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the [=/list=] « |numDirections|, |batchSize|, |hiddenSize| ». 1. Set |desc|.{{MLOperandDescriptor/dataType}} to |input|'s [=MLOperand/dataType=]. @@ -4280,7 +4096,7 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLLstmOptions/peepholeWeight}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. 1. If |options|.{{MLLstmOptions/initialHiddenState}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. 1. If |options|.{{MLLstmOptions/initialCellState}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. - 1. If |options|.{{MLLstmOptions/activations}} [=map/exists=], then add its [=list/items=] to |operator|'s [=operator/activation functions=]. + 1. Set |operator|'s [=operator/activation functions=] to a [=list/clone=] of |activations|. 1. Set |operator|'s [=operator/output=] to |output|. 1. Return |output|. @@ -4426,7 +4242,7 @@ dictionary MLLstmCellOptions { MLOperand recurrentBias; MLOperand peepholeWeight; MLLstmWeightLayout layout = "iofg"; - sequence<MLActivation> activations; + sequence<MLRecurrentNetworkActivation> activations; }; partial interface MLGraphBuilder { @@ -4460,7 +4276,7 @@ partial interface MLGraphBuilder { : activations :: - A list of three activation functions, the first one is used for the `input (i)`, `forget (f)`, and `output (o)` gate, the second one is used for the `cell (g)` gate, and the last used for filtering the output cell state before combining it with the result of the output gate to form the output hidden state. When not specified, they are assumed to be of the sigmoid function ("sigmoid") followed by two hyperbolic tangent functions ("tanh") respectively. 
+ A list of three [=operator/activation functions=]: the first is used for the `input (i)`, `forget (f)`, and `output (o)` gates, the second for the `cell (g)` gate, and the last for filtering the output cell state before combining it with the result of the output gate to form the output hidden state. When not specified, the list defaults to the {{MLRecurrentNetworkActivation/"sigmoid"}}, {{MLRecurrentNetworkActivation/"tanh"}}, and {{MLRecurrentNetworkActivation/"tanh"}} functions, respectively. @@ -4482,7 +4298,6 @@ partial interface MLGraphBuilder { 1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}. 1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |weight|, |recurrentWeight|, |hiddenState|, |cellState|, |options|.{{MLLstmCellOptions/bias}} (if it [=map/exists=]), |options|.{{MLLstmCellOptions/recurrentBias}} (if it [=map/exists=]), and |options|.{{MLLstmCellOptions/peepholeWeight}} (if it [=map/exists=]) returns false, then [=exception/throw=] a {{TypeError}}. - 1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=], and [=MLGraphBuilder/validating activation=] with [=this=] and any [=list/item=] in it returns false, then [=exception/throw=] a {{TypeError}}. 1. If |input|'s [=MLOperand/dataType=] is not {{MLOperandDataType/"float32"}} or {{MLOperandDataType/"float16"}}, then [=exception/throw=] a {{TypeError}}. 1. If |input|'s [=MLOperand/rank=] is not equal to 2, then [=exception/throw=] a {{TypeError}}. 1. If the [=MLOperand/dataType=] of any of |weight|, |recurrentWeight|, |hiddenState| or |cellState| is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}. @@ -4508,10 +4323,12 @@ partial interface MLGraphBuilder { 1. If its [=MLOperand/shape=] is not equal to « 3 * |hiddenSize| », then [=exception/throw=] a {{TypeError}}. 1. 
If |options|.{{MLLstmCellOptions/activations}} [=map/exists=]: 1. If its [=list/size=] is not 3, then [=exception/throw=] a {{TypeError}}. + 1. Let |activations| be a [=list/clone=] of |options|.{{MLLstmCellOptions/activations}}. + 1. Otherwise: + 1. Let |activations| be « {{MLRecurrentNetworkActivation/"sigmoid"}}, {{MLRecurrentNetworkActivation/"tanh"}}, {{MLRecurrentNetworkActivation/"tanh"}} ». 1. Let |desc| be a new {{MLOperandDescriptor}}. 1. Set |desc|.{{MLOperandDescriptor/dimensions}} to the [=/list=] « |batchSize|, |hiddenSize| ». 1. Set |desc|.{{MLOperandDescriptor/dataType}} to |input|'s [=MLOperand/dataType=]. - 1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=], and running the [=MLActivation/validation steps=] of any [=list/item=] in it with |desc| returns false, then [=exception/throw=] a {{TypeError}}. 1. *Make graph connections:* 1. Let |output0| be the result of [=creating an MLOperand=] given [=this=] and |desc|. 1. Let |output1| be the result of [=creating an MLOperand=] given [=this=] and |desc|. @@ -4522,7 +4339,7 @@ partial interface MLGraphBuilder { 1. If |options|.{{MLLstmCellOptions/bias}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. 1. If |options|.{{MLLstmCellOptions/recurrentBias}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. 1. If |options|.{{MLLstmCellOptions/peepholeWeight}} [=map/exists=], then add it to |operator|'s [=operator/inputs=]. - 1. If |options|.{{MLLstmCellOptions/activations}} [=map/exists=], then add its [=list/items=] to |operator|'s [=operator/activation functions=]. + 1. Set |operator|'s [=operator/activation functions=] to a [=list/clone=] of |activations|. 1. Set |operator|'s [=operator/output=] to |output|. 1. Return |output|. 
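As a scalar illustration of the default « "sigmoid", "tanh", "tanh" » activation list above (a sketch under simplifying assumptions, not the spec's emulation path): sigmoid gates the input, forget, and output values, while tanh is applied to the cell gate and to the new cell state before it is combined with the output gate.

```javascript
// Scalar sketch of one LSTM cell step with the default activations:
// sigmoid for the i/f/o gates, tanh for the g gate and the output filter.
// Assumes pre-activation gate values i, f, o, g and previous cell state c;
// real WebNN operands are tensors, applied element-wise.
const sigmoid = (x) => 1 / (1 + Math.exp(-x));

function lstmCellStep(i, f, o, g, c) {
  const cell = sigmoid(f) * c + sigmoid(i) * Math.tanh(g); // new cell state
  const hidden = sigmoid(o) * Math.tanh(cell);             // new hidden state
  return { cell, hidden };
}
```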
@@ -4530,7 +4347,7 @@ partial interface MLGraphBuilder {-#### {{MLGraphBuilder/softsign(input)}} #### {#api-mlgraphbuilder-softsign-input}- The behavior of this operation when the weight layout is the default {{MLLstmWeightLayout/"iofg"}} layout, and the activation functions of the input/forget/output gate and the cell gate/the cell state's filter for the output hidden state are {{MLGraphBuilder/sigmoid()}} and {{MLGraphBuilder/tanh()}} respectively can be [EMULATED] + The behavior of this operation when the weight layout is the default {{MLLstmWeightLayout/"iofg"}} layout, and the [=operator/activation functions=] of the input/forget/output gate and the cell gate/the cell state's filter for the output hidden state are {{MLGraphBuilder/sigmoid()}} and {{MLGraphBuilder/tanh()}} respectively can be [EMULATED]
function lstmCell( @@ -5310,11 +5127,9 @@ Compute the -#### {{MLGraphBuilder/relu(input)}} #### {#api-mlgraphbuilder-relu-input}**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -5352,24 +5167,6 @@ partial interface MLGraphBuilder {-#### {{MLGraphBuilder/relu()}} #### {#api-mlgraphbuilder-relu} -- **Arguments:** - - None. - - **Returns:** - - an {{MLActivation}}. The activation function representing the relu operation. -- --- ### resample2d ### {#api-mlgraphbuilder-resample2d-method} Resample the tensor values from the source to the destination spatial dimensions according to the scaling factors. -#### {{MLGraphBuilder/sigmoid(input)}} #### {#api-mlgraphbuilder-sigmoid-input}- The relu() method steps are: -
- 1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}. - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "relu". - 1. Return |op|. -**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -5557,24 +5352,6 @@ partial interface MLGraphBuilder {-#### {{MLGraphBuilder/sigmoid()}} #### {#api-mlgraphbuilder-sigmoid} -- **Arguments:** - - None. - - **Returns:** - - an {{MLActivation}}. The activation function representing the sigmoid operation. -- --- ### slice ### {#api-mlgraphbuilder-slice} Produce a slice of the input tensor. -#### {{MLGraphBuilder/softplus(input)}} #### {#api-mlgraphbuilder-softplus-input}- The sigmoid() method steps are: -
- 1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}. - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "sigmoid". - 1. Return |op|. -**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -5723,30 +5498,11 @@ partial interface MLGraphBuilder {-#### {{MLGraphBuilder/softplus()}} #### {#api-mlgraphbuilder-softplus} -- **Arguments:** - - None. - - **Returns:** - - an {{MLActivation}}. The activation function representing the softplus operation. -- --- ### softsign ### {#api-mlgraphbuilder-softsign-method} Compute the softsign function of the input tensor. The calculation follows the expression `x / (1 + |x|)`. @@ -5765,7 +5521,6 @@ partial interface MLGraphBuilder {- The softplus() method steps are: -
- 1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}. - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "softplus". - 1. Return |op|. -**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -5790,24 +5545,6 @@ partial interface MLGraphBuilder { 1. Return |output|. -#### {{MLGraphBuilder/softsign()}} #### {#api-mlgraphbuilder-softsign} -- **Arguments:** - - None. - - **Returns:** - - an {{MLActivation}}. The activation function representing the softsign operation. -- --- ### split ### {#api-mlgraphbuilder-split} Split the input tensor into a number of sub tensors along the given axis. -#### {{MLGraphBuilder/tanh(input)}} #### {#api-mlgraphbuilder-tanh-input}- The softsign() method steps are: -
- 1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}. - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "softsign". - 1. Return |op|. -**Arguments:** - input: an {{MLOperand}}. The input tensor. @@ -5951,24 +5686,6 @@ partial interface MLGraphBuilder {-#### {{MLGraphBuilder/tanh()}} #### {#api-mlgraphbuilder-tanh} -- **Arguments:** - - None. - - **Returns:** - - an {{MLActivation}}. The activation function representing the tanh operation. -- --- ### transpose ### {#api-mlgraphbuilder-transpose} Permute the dimensions of the input tensor according to the *permutation* argument.- The tanh() method steps are: -
- 1. If [=this=].{{MLGraphBuilder/[[hasBuilt]]}} is true, then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}. - 1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "tanh". - 1. Return |op|. -
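The element-wise formulas quoted in the sections above (`y = x * max(0, min(6, (x + 3))) / 6` for hardSwish, `y = alpha * x + beta` for linear, `x / (1 + |x|)` for softsign) can be sketched as plain functions. Note that the hardSigmoid and leakyRelu expressions below are their commonly used definitions rather than text quoted from this diff, and alpha/beta are passed explicitly instead of assuming the spec's default values.

```javascript
// Illustrative sketches of the element-wise functions; not the WebNN
// implementation, which applies them across whole tensors.
const hardSigmoid = (x, alpha, beta) =>
  Math.max(0, Math.min(1, alpha * x + beta)); // clamped linear (common definition)
const hardSwish = (x) => x * Math.max(0, Math.min(6, x + 3)) / 6; // [MobileNetV3]
const linear = (x, alpha, beta) => alpha * x + beta;
const leakyRelu = (x, alpha) => (x >= 0 ? x : alpha * x); // common definition
const softsign = (x) => x / (1 + Math.abs(x));
```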