Masking time steps in order to use TCN for variable length sequences #240
@fsbashiri thanks for reporting! I propose an explanation. I'm not 100% sure, so feel free to challenge me. My clue is that the TCN works a bit like an RNN, even though it has no states the way an LSTM does: the last outputs depend not only on the end of the sequence but also on its beginning.
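The point above can be made concrete with the receptive field of a stack of dilated causal convolutions: once the receptive field is at least as long as the sequence, the last output depends on every timestep, including padded ones at the start. A minimal sketch, assuming the commonly cited formula for a TCN built from residual blocks containing two dilated causal convolutions each (the function name and the formula are assumptions for illustration, not taken from this thread):

```python
def tcn_receptive_field(kernel_size, dilations, nb_stacks=1):
    """Approximate receptive field (in timesteps) of a TCN.

    Assumes each residual block holds two dilated causal convolutions,
    each widening the receptive field by (kernel_size - 1) * dilation.
    """
    return 1 + 2 * (kernel_size - 1) * nb_stacks * sum(dilations)

# With kernel_size=2 and dilations 1, 2, 4, 8, the last output
# already "sees" 31 timesteps back into the sequence.
print(tcn_receptive_field(2, [1, 2, 4, 8]))  # 31
```

So for any sequence shorter than the receptive field, the final prediction is influenced by the (padded) beginning of the input, which is consistent with the behavior reported here.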
For your second point: this problem is mentioned in issue #89, where the author states that Keras's Conv1D lacks support for the Masking layer.
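Since Conv1D does not consume a Keras mask, a common workaround is to exclude the padded timesteps from the loss manually (for example via `sample_weight` in Keras). A minimal pure-Python sketch of the idea, computing a mean squared error only over real timesteps (the function and data here are illustrative assumptions, not code from the thread):

```python
def masked_mse(y_true, y_pred, mask):
    """MSE over timesteps where mask is True; padded steps are ignored."""
    total, count = 0.0, 0
    for yt_seq, yp_seq, m_seq in zip(y_true, y_pred, mask):
        for yt, yp, m in zip(yt_seq, yp_seq, m_seq):
            if m:
                total += (yt - yp) ** 2
                count += 1
    return total / count

# The padded third step differs wildly, but it is masked out,
# so the loss is 0.0.
y_true = [[1.0, 2.0, 0.0]]
y_pred = [[1.0, 2.0, 5.0]]
mask = [[True, True, False]]
print(masked_mse(y_true, y_pred, mask))  # 0.0
```

In an actual Keras training loop, the same effect is usually achieved by passing the boolean mask (cast to floats) as per-timestep sample weights.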
Describe the bug
In my project, I am using TCN for sequence-to-sequence analysis of time-series data with variable lengths. I have defined a subclass of the Sequence class that pads each batch of data to its maximum sequence length (similar to what is suggested here). For the model, I use a Masking layer to compute and pass a mask to the TCN (as suggested in issue #234). Supposedly, layers that support masking automatically propagate the mask to the next layer. In its simplest form, my model has a Masking layer, followed by a TCN, and a Dense layer with 1 unit.
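The per-batch padding described above can be sketched in plain Python; this is an illustrative stand-in for the Sequence subclass, not the reporter's actual code (the function name and pad value are assumptions):

```python
def pad_batch(batch, pad_value=0.0):
    """Pad each sequence in a batch to the batch's maximum length.

    Returns the padded batch and a boolean mask marking real timesteps,
    which is what a Keras Masking layer would recover from the pad value.
    """
    max_len = max(len(seq) for seq in batch)
    padded, mask = [], []
    for seq in batch:
        n_pad = max_len - len(seq)
        padded.append(list(seq) + [pad_value] * n_pad)
        mask.append([True] * len(seq) + [False] * n_pad)
    return padded, mask

batch = [[1.0, 2.0, 3.0], [4.0, 5.0]]
padded, mask = pad_batch(batch)
# padded -> [[1.0, 2.0, 3.0], [4.0, 5.0, 0.0]]
# mask   -> [[True, True, True], [True, True, False]]
```

Padding per batch (rather than to a global maximum) keeps the wasted computation small, but the mask still has to be honored downstream for the padded zeros not to affect the output.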
Here are the two issues that I've got (the first concerns the `_keras_mask` attribute):

Paste a snippet
Please see the following simple code:
The output of the code:
Dependencies
I am using:
keras 2.4.3
keras-tcn 3.1.1
tensorflow-gpu 2.3.1