This repository has been archived by the owner on Feb 12, 2022. It is now read-only.

RuntimeError: size mismatch when using a window size of 2 #19

Open
dpogosov opened this issue Aug 3, 2018 · 0 comments

dpogosov commented Aug 3, 2018

Hi,

I use QRNN, and it works well with a window size of 1 (the default), but when I tried a window size of 2, I got the following error.

Could you please help me with it?
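
For context, a minimal sketch of how I call it (the dimensions here are illustrative, not my exact model, and I'm assuming window is simply forwarded to the layers as a keyword argument, as the QRNN docs suggest):

import torch
from torch.autograd import Variable
from torchqrnn import QRNN

# Illustrative sizes, not my real model
seq_len, batch_size, input_size, hidden_size = 5, 4, 128, 256
X = Variable(torch.rand(seq_len, batch_size, input_size)).cuda()

# window=1 (the default) works for me; window=2 is what fails
qrnn = QRNN(input_size, hidden_size, num_layers=2, window=2)
qrnn.cuda()

output, hidden = qrnn(X)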

RuntimeError Traceback (most recent call last)

/deep_learning/rup/.../model_semi_parallel.py in forward(self, state)
--> 173 h_new, states = self.rnn(input_enc, states)

/opt/conda/lib/python3.5/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
475 result = self._slow_forward(*input, **kwargs)
476 else:
--> 477 result = self.forward(*input, **kwargs)
478 for hook in self._forward_hooks.values():
479 hook_result = hook(self, input, result)

/opt/conda/lib/python3.5/site-packages/torchqrnn/qrnn.py in forward(self, input, hidden)
162
163 for i, layer in enumerate(self.layers):
--> 164 input, hn = layer(input, None if hidden is None else hidden[i])
165 next_hidden.append(hn)
166

/opt/conda/lib/python3.5/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
475 result = self._slow_forward(*input, **kwargs)
476 else:
--> 477 result = self.forward(*input, **kwargs)
478 for hook in self._forward_hooks.values():
479 hook_result = hook(self, input, result)

/opt/conda/lib/python3.5/site-packages/torchqrnn/qrnn.py in forward(self, X, hidden)
68
69 # Matrix multiplication for the three outputs: Z, F, O
---> 70 Y = self.linear(source)
71 # Convert the tensor back to (batch, seq_len, len([Z, F, O]) * hidden_size)
72 if self.output_gate:

/opt/conda/lib/python3.5/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
475 result = self._slow_forward(*input, **kwargs)
476 else:
--> 477 result = self.forward(*input, **kwargs)
478 for hook in self._forward_hooks.values():
479 hook_result = hook(self, input, result)

/opt/conda/lib/python3.5/site-packages/torch/nn/modules/linear.py in forward(self, input)
53
54 def forward(self, input):
---> 55 return F.linear(input, self.weight, self.bias)
56
57 def extra_repr(self):

/opt/conda/lib/python3.5/site-packages/torch/nn/functional.py in linear(input, weight, bias)
1024 return torch.addmm(bias, input, weight.t())
1025
-> 1026 output = input.matmul(weight.t())
1027 if bias is not None:
1028 output += bias

RuntimeError: size mismatch, m1: [20 x 1920], m2: [640 x 1920] at /opt/conda/conda-bld/pytorch_1532576276790/work/aten/src/THC/generic/THCTensorMathBlas.cu:249
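
If I read the shapes right, m1 [20 x 1920] is the flattened (seq_len * batch, features) input reaching the Linear layer and m2 [640 x 1920] is the transposed weight, so the layer appears to expect 640 input features but receives 1920.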
