
Bidirectional LSTM, activation functions are not converted #2211

Closed
quentin-baduel-numalis opened this issue Jul 25, 2023 · 1 comment
Labels
bug An unexpected problem or unintended behavior

Comments

@quentin-baduel-numalis

Describe the bug
Hello, and thank you for developing this tool!
When converting a Keras model containing a Bidirectional LSTM layer, the activation function attribute ("activations") is not set on the corresponding LSTM ONNX operator.
The bug does not seem to occur outside the "Bidirectional" context.

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 20.04):
  • TensorFlow Version: 2.11
  • tf2onnx: 1.14.0
  • Python version: 3.10

To Reproduce

1: Create the Keras network with the activation functions set to "relu":

#! /usr/bin/env python3

from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, LSTM, Bidirectional

# Both activations are set to "relu", so the converted ONNX LSTM node
# should carry an "activations" attribute listing Relu.
inputs = Input(shape=(1, 1))
outputs = Bidirectional(LSTM(units=1, return_sequences=True,
                             activation="relu", recurrent_activation="relu"))(inputs)
model = Model(inputs, outputs)
model.save("lstm_issue_activation_function.h5")

2: Convert the model:
#! /usr/bin/env python3

import sys

import onnx
import tensorflow as tf
import tf2onnx

# Load the Keras model saved in step 1 and convert it to ONNX.
target_model = sys.argv[1]
model = tf.keras.models.load_model(target_model)
spec = (tf.TensorSpec((None, 1, 1), tf.float32, name="input"),)
output_path = model.name + ".onnx"
model_proto, _ = tf2onnx.convert.from_keras(model, input_signature=spec,
                                            opset=15, output_path=output_path)
# from_keras already writes output_path; the explicit save is redundant but harmless.
onnx.save(model_proto, output_path)

3: Check the converted model:
Inspecting the resulting ONNX model shows that the "activations" attribute of the LSTM node is not set, so it defaults to sigmoid/tanh rather than the expected relu functions.


@quentin-baduel-numalis quentin-baduel-numalis added the bug An unexpected problem or unintended behavior label Jul 25, 2023
MaximeChurin pushed a commit to MaximeChurin/tensorflow-onnx that referenced this issue Sep 11, 2023
@MaximeChurin
Contributor

Thanks for the lead; I had the same issue. I have a PR in progress that should fix it: #2234
