I was trying to convert a Keras model to PyTorch through ONNX but failed.

The information of my targeted model is as follows.

The script to reproduce is as follows. You may access the code here:
https://colab.research.google.com/drive/1EtuxhHjy3QdmCf4v6DSpN9jsde2SeNpW?usp=sharing

The crash information is as follows:

/usr/local/lib/python3.7/dist-packages/torch/nn/functional.py in linear(input, weight, bias)
   1846     if has_torch_function_variadic(input, weight, bias):
   1847         return handle_torch_function(linear, (input, weight, bias), input, weight, bias=bias)
-> 1848     return torch._C._nn.linear(input, weight, bias)
   1849
   1850

TypeError: linear(): argument 'input' (position 1) must be Tensor, not torch.return_types.max

Without very deep investigation, I assume this problem is caused by torch.max()'s output, which is torch.return_types.max (a named tuple of values and indices) rather than a plain tensor, while the linear layer expects its input to be a tensor. I guess the fix would be to change torch.max(**kwargs) to torch.max(**kwargs)[0] so that the output is a tensor. But I am new to this project and do not know how to write a proper fix. Can you check whether this is actually a bug, and how we can fix it?
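The distinction can be demonstrated directly. A minimal sketch of the behavior described above (the tensor shapes and layer sizes here are made up for illustration and are not from the reported model):

```python
import torch

x = torch.randn(4, 8)

# Without a dim argument, torch.max returns a plain Tensor (the global max).
global_max = torch.max(x)
assert isinstance(global_max, torch.Tensor)

# With a dim argument, torch.max returns torch.return_types.max, a named
# tuple of (values, indices) -- not a Tensor, which is why passing it
# straight into a linear layer raises the TypeError above.
result = torch.max(x, dim=1)
assert not isinstance(result, torch.Tensor)

# Indexing with [0] (or reading the .values field) recovers the value tensor.
values = result[0]
assert torch.equal(values, result.values)

# Hypothetical layer sizes, chosen only to match x above.
linear = torch.nn.Linear(8, 2)
out = linear(torch.max(x, dim=0)[0])  # works once the input is a Tensor
```

This is why the suggested `torch.max(**kwargs)[0]` change would make the downstream linear layer accept the value.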