Problem
It seems like the problem is in the data-copy part shown below.
Code
import numpy as np
import torch
from sklearn.neighbors import KNeighborsClassifier

p_idxs, nn_human_x = [], []
for i in range(NUM_CLASSES):
    idxs = a_train == i                  # samples belonging to class i
    temp_x = X_train[idxs]
    mean = temp_x.mean(axis=0)           # class centroid in latent space
    knn = KNeighborsClassifier().fit(temp_x, list(range(len(temp_x))))
    idx = knn.kneighbors(X=mean.reshape(1, -1), n_neighbors=1, return_distance=False)
    p_idxs.append(idx.item())
    nn_human_x.append(temp_x[idx.item()].tolist())   # real sample closest to the centroid
nn_human_x = np.array(nn_human_x)

#### Training
model = PWNet().eval()
model.nn_human_x.data.copy_(torch.tensor(nn_human_x))
The copy goes from NUM_CLASSES x LATENT_SIZE into a parameter of shape NUM_PROTOTYPES x LATENT_SIZE. The InstanceNorm1d should be normalizing across the classes (I think?).
In the transformation loop, each nn_human_x[i] is reduced to a 1-D tensor of length LATENT_SIZE. Should nn_human_x instead have shape NUM_PROTOTYPES x NUM_CLASSES x LATENT_SIZE, so that each slice going through the transformation loop is NUM_CLASSES x LATENT_SIZE, and that 2-D shape is what reaches the InstanceNorm1d?
Sorry if I have misunderstood something; I really look forward to your help.
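For reference, the dimension check can be reproduced in isolation. A minimal sketch, with LATENT_SIZE = 1536 taken from the shape dump below and a bare InstanceNorm1d standing in for the real transformation t (an assumption, since t is a Sequential in the notebook):

import torch
import torch.nn as nn

LATENT_SIZE = 1536                       # from the torch.Size([6, 1536]) dump below
norm = nn.InstanceNorm1d(LATENT_SIZE)    # affine=False by default

x2d = torch.randn(1, LATENT_SIZE)        # what .view(1, -1) produces in forward()
try:
    norm(x2d)                            # 2D input raises the ValueError on this PyTorch version
except ValueError as e:
    print(e)

out = norm(x2d.view(1, LATENT_SIZE, 1))  # 3D (N, C, L) passes the check, but L == 1 means
print(out.abs().max())                   # zero variance, so everything normalizes to 0
print(norm(x2d.view(1, 1, -1)).shape)    # (N=1, C=1, L=1536) normalizes across the latent axis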
Size & shapes
self.nn_human_x.shape: torch.Size([6, 1536])
self.nn_human_x Parameter containing:
tensor([[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.]])
self.nn_human_x[i] tensor([0., 0., 0., ..., 0., 0., 0.])
Shape x [i]: torch.Size([1536])
trans_nn_human_x []
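As a side note, the UserWarning at the top of the Errors section below comes from wrapping a tensor in torch.tensor() inside forward(); the commented-out line 63 in the trace already has the warning-free form. A minimal sketch, with the (6, 1536) parameter shape taken from the dump above:

import torch

p = torch.nn.Parameter(torch.zeros(6, 1536))                # stand-in for self.nn_human_x

row = torch.tensor(p[0], dtype=torch.float32).view(1, -1)   # emits the UserWarning
row = p[0].detach().clone().to(torch.float32).view(1, -1)   # warning-free equivalent
print(row.shape)                                            # torch.Size([1, 1536])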
Errors
C:\Users\phama\AppData\Local\Temp\ipykernel_13640\996063355.py:62: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
trans_nn_human_x.append( t( torch.tensor(self.nn_human_x[i], dtype=torch.float32).view(1, -1)) )
ValueError Traceback (most recent call last)
~\AppData\Local\Temp\ipykernel_13640\1820000454.py in
77
78 model.eval()
---> 79 current_acc = evaluate_loader(model, train_loader, cce_loss)
80 model.train()
81
~\AppData\Local\Temp\ipykernel_13640\1992294143.py in evaluate_loader(model, loader, cce_loss)
8 imgs, labels = data
9 imgs, labels = imgs.to(DEVICE), labels.to(DEVICE)
---> 10 logits = model(imgs)
11 loss = cce_loss(logits, labels)
12 preds = torch.argmax(logits, dim=1)
~\anaconda3\lib\site-packages\torch\nn\modules\module.py in _call_impl(self, *input, **kwargs)
1100 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1101 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1102 return forward_call(*input, **kwargs)
1103 # Do not call functions when jit is used
1104 full_backward_hooks, non_full_backward_hooks = [], []
~\AppData\Local\Temp\ipykernel_13640\996063355.py in forward(self, x)
60 print("trans_nn_human_x", trans_nn_human_x)
61 # trans_nn_human_x.append(t(self.nn_human_x[i]).view(1, -1))
---> 62 trans_nn_human_x.append( t( torch.tensor(self.nn_human_x[i], dtype=torch.float32).view(1, -1)) )
63 # trans_nn_human_x.append(t(self.nn_human_x[i].clone().detach().to(torch.float32).view(1, -1)))
64
~\anaconda3\lib\site-packages\torch\nn\modules\module.py in _call_impl(self, *input, **kwargs)
1100 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1101 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1102 return forward_call(*input, **kwargs)
1103 # Do not call functions when jit is used
1104 full_backward_hooks, non_full_backward_hooks = [], []
~\anaconda3\lib\site-packages\torch\nn\modules\container.py in forward(self, input)
139 def forward(self, input):
140 for module in self:
--> 141 input = module(input)
142 return input
143
~\anaconda3\lib\site-packages\torch\nn\modules\module.py in _call_impl(self, *input, **kwargs)
1100 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1101 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1102 return forward_call(*input, **kwargs)
1103 # Do not call functions when jit is used
1104 full_backward_hooks, non_full_backward_hooks = [], []
~\anaconda3\lib\site-packages\torch\nn\modules\instancenorm.py in forward(self, input)
54
55 def forward(self, input: Tensor) -> Tensor:
---> 56 self._check_input_dim(input)
57 return F.instance_norm(
58 input, self.running_mean, self.running_var, self.weight, self.bias,
~\anaconda3\lib\site-packages\torch\nn\modules\instancenorm.py in _check_input_dim(self, input)
130 def _check_input_dim(self, input):
131 if input.dim() == 2:
--> 132 raise ValueError(
133 'InstanceNorm1d returns 0-filled tensor to 2D tensor.'
134 'This is because InstanceNorm1d reshapes inputs to'
ValueError: InstanceNorm1d returns 0-filled tensor to 2D tensor.This is because InstanceNorm1d reshapes inputs to(1, N * C, ...) from (N, C,...) and this makesvariances 0.
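In case it helps: one possible workaround, assuming each transformation t is meant to normalize across the latent axis of a single prototype and its InstanceNorm1d keeps the default affine=False, is to hand it a 3D view instead of the 2D one. A hedged sketch of the rewritten loop body (self.transforms is a placeholder name for whatever container holds the t modules in the notebook):

trans_nn_human_x = []
for i, t in enumerate(self.transforms):                           # hypothetical container name
    x_i = self.nn_human_x[i].detach().clone().to(torch.float32)   # avoids the torch.tensor() warning
    trans_nn_human_x.append(t(x_i.view(1, 1, -1)))                # (N=1, C=1, L) passes the dim check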