Checklist for serialization support of Axon models.

**Linear**

- [ ] `dense`
- [ ] `bilinear`
- [ ] `embedding`

**Convolutional**

- [ ] `conv`
- [ ] `conv_transpose`
- [ ] `depthwise_conv`
- [ ] `separable_conv2d`
- [ ] `separable_conv3d`

**Dropout**

- [ ] `dropout`
- [ ] `spatial_dropout`
- [ ] `alpha_dropout`
- [ ] `feature_alpha_dropout`

**Pooling**

- [ ] `adaptive_avg_pool`
- [ ] `adaptive_max_pool`
- [ ] `adaptive_lp_pool`
- [ ] `avg_pool`
- [ ] `max_pool`
- [ ] `lp_pool`
- [ ] `global_avg_pool`
- [ ] `global_max_pool`
- [ ] `global_lp_pool`

**Normalization**

- [ ] `batch_norm`
- [ ] `layer_norm`
- [ ] `instance_norm`
- [ ] `group_norm`

**Recurrent**

- [ ] `conv_lstm`
- [ ] `lstm`
- [ ] `gru`

**Combinators**

- [ ] `concatenate`
- [ ] `add`
- [ ] `subtract`
- [ ] `multiply`

**Activations**

- [ ] `celu`
- [ ] `elu`
- [ ] `exp`
- [ ] `gelu`
- [ ] `hard_sigmoid`
- [ ] `hard_silu`
- [ ] `hard_tanh`
- [ ] `leaky_relu`
- [ ] `linear`
- [ ] `log_sigmoid`
- [ ] `mish`
- [ ] `relu`
- [ ] `relu6`
- [ ] `selu`
- [ ] `sigmoid`
- [ ] `silu`
- [ ] `softmax`
- [ ] `softplus`
- [ ] `softsign`
- [ ] `tanh`

**Special**

- [ ] `constant`
- [ ] `nx`
- [ ] `layer`
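For context, the round trip this checklist covers can be sketched as below. This is a minimal, hedged sketch and not authoritative: it assumes a `serialize`/`deserialize` pair on the `Axon` module and a hypothetical `dense` model, and the exact function names, arities, and build API differ across Axon versions.

```elixir
# Minimal sketch of serializing an Axon model (assumed API; names and
# arities may differ across Axon versions).
model =
  Axon.input("input", shape: {nil, 8})
  |> Axon.dense(4, activation: :relu)

# Build the model and initialize its parameters from an input template.
{init_fn, _predict_fn} = Axon.build(model)
params = init_fn.(Nx.template({1, 8}, :f32), %{})

# Serialize the model structure together with its parameters to a binary,
# then restore both. Each layer type in the checklist above needs to
# survive this round trip.
binary = Axon.serialize(model, params)
{restored_model, restored_params} = Axon.deserialize(binary)
```

A layer counts as supported here when a model containing it can be serialized and deserialized with its structure and parameters intact.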