Multiple TS for the same target variable #7
Yes, you can do either of two options:
* Instantiate the three different RNNs in three separate variable scopes. Pythonic: create each cell within a new context manager (see the sketch after this list). Only do this when you have a very good reason for having three separate RNNs; in 99% of cases, you'll want the second option.
* Stack the three time series and run them through one RNN, so ([1,sl], [1,sl], [1,sl]) becomes ([3,sl]).
On 1 June 2017 at 04:37, Pat Cobhan wrote:
Hi there, thank you for this awesome tutorial, it is of great interest for my master's research work!
I was trying to adapt the code to my problem: for each user I want to classify, I have three time series of the same length. First I edited load_data and sample_batch so that they return
t1_train, t1_val, t1_test, t2_train, t2_val, t2_test, t3_train, t3_val, t3_test, y_train, y_val, y_test
and
t1_batch, t2_batch, t3_batch, y_batch
The Model itself is where I'm sure I'm doing something wrong; so far I've edited your code this way:
```python
self.input1 = tf.placeholder(tf.float32, [None, sl], name='input_1')
self.input2 = tf.placeholder(tf.float32, [None, sl], name='input_2')
self.input3 = tf.placeholder(tf.float32, [None, sl], name='input_3')
self.labels = tf.placeholder(tf.int64, [None], name='labels')
self.keep_prob = tf.placeholder("float", name='Drop_out_keep_prob')

with tf.name_scope("LSTM_setup") as scope:
    def single_cell1():
        return tf.contrib.rnn.DropoutWrapper(LSTMCell(hidden_size), output_keep_prob=self.keep_prob)
    cell1 = tf.contrib.rnn.MultiRNNCell([single_cell1() for _ in range(num_layers)])
    initial_state1 = cell1.zero_state(self.batch_size, tf.float32)

    def single_cell2():
        return tf.contrib.rnn.DropoutWrapper(LSTMCell(hidden_size), output_keep_prob=self.keep_prob)
    cell2 = tf.contrib.rnn.MultiRNNCell([single_cell2() for _ in range(num_layers)])
    initial_state2 = cell2.zero_state(self.batch_size, tf.float32)

    def single_cell3():
        return tf.contrib.rnn.DropoutWrapper(LSTMCell(hidden_size), output_keep_prob=self.keep_prob)
    cell3 = tf.contrib.rnn.MultiRNNCell([single_cell3() for _ in range(num_layers)])
    initial_state3 = cell3.zero_state(self.batch_size, tf.float32)

input_list1 = tf.unstack(tf.expand_dims(self.input1, axis=2), axis=1)
input_list2 = tf.unstack(tf.expand_dims(self.input2, axis=2), axis=1)
input_list3 = tf.unstack(tf.expand_dims(self.input3, axis=2), axis=1)

outputs1, _ = core_rnn.static_rnn(cell1, input_list1, dtype=tf.float32)
outputs2, _ = core_rnn.static_rnn(cell2, input_list2, dtype=tf.float32)
outputs3, _ = core_rnn.static_rnn(cell3, input_list3, dtype=tf.float32)

output = tf.concat([outputs1[-1], outputs2[-1], outputs3[-1]], 0)
```
When I try to train the network, this is the error I get:

```
ValueError: Attempt to have a second RNNCell use the weights of a variable
scope that already has weights: 'rnn/multi_rnn_cell/cell_0/lstm_cell';
and the cell was not constructed as LSTMCell(..., reuse=True). To share the
weights of an RNNCell, simply reuse it in your second calculation, or
create a new one with the argument reuse=True.
```
Would you have any suggestion or tip for me to get it working? Thank you
in advance!
Good evening Rob, and thank you for your clear answer. I was thinking the same: the three RNNs would know nothing about each other, and that seems wrong to me.
This should help you:

```python
import tensorflow as tf

sl = 63
x1 = tf.placeholder(tf.float32, [None, sl], name="x1")
x2 = tf.placeholder(tf.float32, [None, sl], name="x2")
x3 = tf.placeholder(tf.float32, [None, sl], name="x3")

x123 = tf.stack([x1, x2, x3])          # stacks along a new leading dimension
x123t = tf.transpose(x123, [1, 0, 2])  # transpose to whatever layout you need

print(x123)
print(x123t)
# >>> Tensor("stack_1:0", shape=(3, ?, 63), dtype=float32)
# >>> Tensor("transpose_1:0", shape=(?, 3, 63), dtype=float32)
```