Seq2seq #989
Conversation
tensorlayer/models/seq2seq.py
Outdated
decoder_seq_length : int
    The length of your target sequence
cell_enc : str, tf.function
    The RNN function cell for your encoder stack, i.e. tf.keras.layers.GRUCell
'i.e.' means 'in other words'. Please use 'e.g.' (which means 'for example') here and in the other places accordingly.
tensorlayer/models/seq2seq.py
Outdated
n_layer : int
    The number of your RNN layers for both encoder and decoder block
embbedding_layer : tl.function
    The embedding layer function, i.e. tl.layers.Embedding(vocabulary_size=voc_size, embedding_size=emb_dim)
Typo here; please change this comment to:
embedding_layer : tl.Layer
    An embedding layer, e.g. tl.layers.Embedding(vocabulary_size=voc_size, embedding_size=emb_dim)
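For reference, a minimal sketch of constructing such an embedding layer together with an example encoder cell, using only the calls already named in this thread; voc_size, emb_dim and the GRU unit count are placeholder values:

import tensorflow as tf
import tensorlayer as tl

voc_size = 10000  # placeholder vocabulary size
emb_dim = 128     # placeholder embedding dimension

# the embedding layer suggested in the docstring above
embedding_layer = tl.layers.Embedding(vocabulary_size=voc_size, embedding_size=emb_dim)

# an example RNN cell for the encoder stack
cell_enc = tf.keras.layers.GRUCell(units=256)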
tensorlayer/models/seq2seq.py
Outdated
    The number of your RNN layers for both encoder and decoder block
embbedding_layer : tl.function
    The embedding layer function, i.e. tl.layers.Embedding(vocabulary_size=voc_size, embedding_size=emb_dim)
is_train : bool
is_train is never used in init, and it is not going to be supported when a model is constructed in future versions. Please remove it.
tensorlayer/models/seq2seq.py
Outdated
n_layer=3,
embedding_layer=None,
is_train=True,
name="seq2seq_"
name=None by default. There is an auto-naming mechanism in tl.models.Model().
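Putting this and the is_train comment above together, a minimal sketch of what the constructor header could look like; only the parameters visible in this diff are shown, and the rest of the body is omitted:

import tensorlayer as tl

class Seq2seq(tl.models.Model):

    def __init__(self, decoder_seq_length, cell_enc, n_layer=3, embedding_layer=None, name=None):
        # is_train is dropped; name=None lets the auto-naming in tl.models.Model pick a name
        super(Seq2seq, self).__init__(name=name)
        ...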
tensorlayer/models/seq2seq.py
Outdated
self.reshape_layer_individual_sequence = tl.layers.Reshape([-1, 1, self.vocabulary_size])

def inference(self, encoding, seq_length, start_token, top_n):
Some documentation is needed here.
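For example, a docstring along the following lines could be added; the parameter descriptions are only guesses from the argument names, not the author's definitions:

def inference(self, encoding, seq_length, start_token, top_n):
    """Decode a sequence from the given encoder input.

    Parameters
    ----------
    encoding
        The batch of inputs to be fed through the encoder (assumed).
    seq_length : int
        The maximum length of the decoded sequence (assumed).
    start_token : int
        The id of the start-of-sequence token (assumed).
    top_n : int
        Sample each predicted token from the top_n candidates (assumed).
    """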
Did you use beam search? Or just random sampling?
Random sampling
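For context, a hedged sketch of what top-n random sampling at a single decoding step can look like; logits stands for the decoder's 1-D score vector over the vocabulary at the current step, and the actual inference() in this PR may differ in details:

import numpy as np

def sample_top_n(logits, top_n):
    # keep the ids of the top_n highest-scoring tokens
    top_ids = np.argsort(logits)[-top_n:]
    # softmax over just those top_n scores
    scores = np.exp(logits[top_ids] - np.max(logits[top_ids]))
    probs = scores / scores.sum()
    # randomly pick one of the top_n ids according to these probabilities
    return int(np.random.choice(top_ids, p=probs))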
Please resolve the problems.
ADD Stacked Layer Seq2Seq