About Keras mask layers #143367
Model building:

```python
a = Input(shape=[5])
```

Error information:

```
Attempt to convert a value (None) with an unsupported type (<class 'NoneType'>) to a Tensor. Arguments received by BroadcastTo.call():
```
The error you’re seeing, `ValueError: Exception encountered when calling BroadcastTo.call()`, usually happens because there’s a mismatch in shapes or an issue with how Keras layers handle masking. The key issue is using `mask_zero=True` in the `Embedding` layer. This argument is supposed to mask out padding tokens, but the mask it produces sometimes isn’t handled well when passed through certain layers like `Concatenate` and `LSTM`. Here’s the solution and reasoning to address this problem clearly:

Solution

Remove `mask_zero=True` from the Embedding layers. Masking with `mask_zero=True` in `Embedding` layers can lead to issues when concatenating embeddings or passing them to layers that don't support mask propagation.
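For context, here's a minimal sketch of the failing pattern: the same architecture as the corrected code below, but with `mask_zero=True` left in (whether this exact snippet reproduces the error depends on the Keras version):

```python
from tensorflow.keras.layers import Input, Embedding, Concatenate, LSTM

a = Input(shape=(5,))
b = Input(shape=(10,))

# mask_zero=True attaches a mask to each embedding's output...
emb_a = Embedding(8, 5, mask_zero=True)(a)
emb_b = Embedding(20, 5, mask_zero=True)(b)

# ...and broadcasting those masks through Concatenate and into LSTM
# is where the BroadcastTo error can surface.
cat = Concatenate(axis=1)([emb_a, emb_b])
lstm = LSTM(16)(cat)
```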
Here’s how the corrected model code would look:

```python
from tensorflow.keras.layers import Input, Embedding, Concatenate, LSTM, Dense, Masking
from tensorflow.keras.models import Model
# Define inputs
a = Input(shape=(5,))
b = Input(shape=(10,))
# Embedding layers without mask_zero=True
emb_a = Embedding(8, 5)(a)
emb_b = Embedding(20, 5)(b)
# Concatenate embeddings along the time-step axis
cat = Concatenate(axis=1)([emb_a, emb_b])
# Optional: Add a Masking layer to handle masking after concatenation
masked_cat = Masking()(cat)
# LSTM and Dense layers
lstm = LSTM(16)(masked_cat)
dense = Dense(1)(lstm)
# Model
model = Model(inputs=[a, b], outputs=[dense])
```

Explanation

With `mask_zero=True` removed, the `Embedding` layers no longer generate masks, so nothing has to be broadcast through `Concatenate`, which is what triggered the `BroadcastTo` error. If padding should still be ignored, the optional `Masking` layer after the concatenation supplies the `LSTM` with a mask it can consume directly.
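As a quick sanity check, here's a sketch of how you might run the corrected model on dummy inputs (the batch size of 2 and the token IDs are arbitrary assumptions; 0 plays the role of padding):

```python
import numpy as np

# Token IDs within each embedding's vocabulary (8 and 20), with 0 as padding.
x_a = np.array([[1, 2, 3, 0, 0],
                [4, 5, 0, 0, 0]])                  # shape (2, 5)
x_b = np.array([[1, 2, 3, 4, 5, 6, 0, 0, 0, 0],
                [7, 8, 9, 0, 0, 0, 0, 0, 0, 0]])   # shape (2, 10)

preds = model.predict([x_a, x_b])
print(preds.shape)  # (2, 1): one Dense(1) output per sample
```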
Additional Tips

If your model requires masking, another option is to add a preprocessing step that identifies the padding tokens and builds the mask separately, then passes it to the layers that need it, as sketched below.
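Here's a minimal sketch of that idea, assuming 0 is the padding ID: compute a boolean mask from the raw token IDs and hand it to the LSTM's `mask` call argument instead of relying on `mask_zero=True`:

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Embedding, Concatenate, LSTM, Dense, Lambda
from tensorflow.keras.models import Model

a = Input(shape=(5,), dtype="int32")
b = Input(shape=(10,), dtype="int32")

emb_a = Embedding(8, 5)(a)     # no mask_zero=True
emb_b = Embedding(20, 5)(b)
cat = Concatenate(axis=1)([emb_a, emb_b])          # (batch, 15, 5)

# Preprocessing step: True where the token is real, False where it is
# the assumed padding ID 0.
ids = Concatenate(axis=1)([a, b])                  # (batch, 15)
pad_mask = Lambda(lambda t: tf.not_equal(t, 0))(ids)

# Pass the mask directly to the LSTM call instead of propagating it
# from the embedding layers through Concatenate.
lstm = LSTM(16)(cat, mask=pad_mask)
dense = Dense(1)(lstm)
model = Model(inputs=[a, b], outputs=[dense])
```

Because the mask never flows through `Concatenate`'s mask machinery, the broadcasting issue is sidestepped while the `LSTM` still ignores padded timesteps.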