dimension of torch tensor drops to zero when selecting a single element of a torch tensor #1164
Comments
Unlike R, torch has support for scalar values (0d tensors). This differentiation is required in some specific situations in torch. Torch scalar tensors don't have dimensions, by definition, thus it's consistent that selecting a single element yields a 0d tensor.
It's also consistent with the behavior above. Since R doesn't really have scalar values, IMO this is consistent behavior. Do you have a specific use case where returning a scalar is causing problems for you?
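The 0d behavior the comment describes can be sketched with NumPy as a Python analogy (this is not the R torch API; the names below are illustrative only):

```python
import numpy as np

# A scalar (0-d) array has no dimensions by definition:
scalar = np.array(5.0)
print(scalar.ndim)   # 0
print(scalar.shape)  # () -- empty shape

# Selecting a single element with scalar indices likewise
# yields a 0-d result, consistent with the above.
a = np.arange(12).reshape(3, 4)
elem = a[0, 0]
print(np.ndim(elem))  # 0
```

The design point is the same one the comment makes: once scalar (0-d) values exist in the tensor system, fully indexing a tensor down to one element naturally produces one.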
The use case is indexing a multi-dimensional tensor without `drop=FALSE`: the dimension of the resulting tensor will depend on whether the index selects one or more elements. I stumbled on this when implementing a seq2seq model with attention in R, inspired by a Python tutorial. In my implementation of the decoder (you can see the code here), I am using an index (
In the current implementation, that is what happens. Actually, I am not sure what the best behaviour is. One would expect dimension(s) to drop when one selects a single element of a multi-dimensional array. I guess the issue is whether to stop dropping dimension(s) at a 1d array, for consistency with R, or whether we should follow Python. What is weird is the resulting inconsistency.
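The pitfall described here — the rank of the result depending on how many elements the index selects — can be illustrated in Python with NumPy as a rough analogy (hedged: NumPy only drops a dimension for scalar indices, whereas R's default `[` also drops length-1 dimensions, which is exactly the surprise):

```python
import numpy as np

a = np.arange(12).reshape(3, 4)

# A scalar index drops the indexed dimension...
print(a[0].shape)    # (4,)
# ...while a list index keeps it, even with one element:
print(a[[0]].shape)  # (1, 4)

# R's default `[` behaves like the scalar case whenever the
# index has length one, so the rank of the result changes with
# the number of selected elements -- the decoder bug above.
idx_one = [0]
idx_two = [0, 1]
print(a[idx_one].shape, a[idx_two].shape)  # (1, 4) (2, 4)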
Selecting a single element of a tensor yields a tensor of size and dimension 0!
Let's define a torch tensor (size does not matter):

and let's select a single element:
The size is 0 and the dimension is 0:
It is not consistent with `length`. It is also not consistent with the dimension of the tensor obtained when creating a tensor with a single element (see above).
This always happens unless one uses `drop=FALSE`.
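The inconsistency being reported can be sketched with NumPy as a Python analogy (the actual R torch calls differ; the names below are illustrative):

```python
import numpy as np

# A tensor created from a single element is 1-d:
t1 = np.array([5.0])
print(t1.ndim)  # 1
print(len(t1))  # 1

a = np.arange(6).reshape(2, 3)

# Slicing (the analog of drop=FALSE) keeps the dimensions:
sel = a[0:1, 0:1]
print(sel.shape)  # (1, 1)

# But scalar indexing drops to 0-d, and len() then fails:
sel0 = np.asarray(a[0, 0])
print(sel0.ndim)  # 0
try:
    len(sel0)
except TypeError:
    print("len() is undefined for a 0-d result")
```

This is the asymmetry the issue points out: constructing a one-element tensor gives rank 1, while selecting one element gives rank 0, and `length`-style operations behave differently on the two.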
Should the dimension really drop to zero in this case?
What are the rules for 0d and 1d tensors in R torch?