Add support for permute #130
In a similar vein, I would also very much appreciate transpose! Specifically, I'd like to do the kind of transpose shown in the code further down in this thread.
@JCBrouwer - transpose should already be implemented and available.
Hmm, I've installed using both methods. But when I run:

```python
noises = [torch.randn((timesteps, 1, size, size)) for size in [4, 8, 8, 16, 16, 32, 32, 64, 64]]
print(nested_tensor(noises).shape)
print(nested_tensor(noises).transpose(0, 1).shape)  # or torch.transpose(..., 0, 1)
```

I get: [...]
Oh I see. The first dimension is a nested dimension. Perhaps you want [...], which will print [...].

Otherwise, could you describe the behavior you expect? It's possible to transpose a nested and a dense dimension, but it would require multiple nested dimensions, which we currently don't support.
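To make the distinction concrete, here is a minimal sketch using plain PyTorch tensors and Python lists rather than the nestedtensor API itself (the sizes and names are illustrative only): transposing two dense dimensions can be done per constituent tensor, whereas "transposing" the nested dimension with a dense one regroups the constituents and is only well-defined when every constituent shares the size of that dense dimension.

```python
import torch

# A stand-in for a NestedTensor: a Python list of tensors whose first
# (nested) "dimension" has variable-sized entries.
timesteps = 50
noises = [torch.randn(timesteps, 1, size, size) for size in [4, 8, 8, 16]]

# Transposing two *dense* dimensions is just a map over the constituents.
transposed_dense = [t.transpose(1, 2) for t in noises]
print([tuple(t.shape) for t in transposed_dense])

# "Transposing" the *nested* dimension with dense dimension 0 regroups the
# constituents: entry i of the result collects slice i of every constituent.
# This only works because all constituents share the same size (timesteps)
# along dimension 0; the per-entry shapes remain ragged.
regrouped = [[t[i] for t in noises] for i in range(timesteps)]
print(len(regrouped), len(regrouped[0]), tuple(regrouped[0][0].shape))
```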
I have multiple inputs to a network (latents, noises, and truncations) that are zipped along a shared sample dimension. This way I can easily distribute the inputs like:

```python
dataset = TensorDataset(latents, noises, truncations)
dataloader = FancyDistributedDataloader(dataset, **kwargs)
```

or

```python
with torch.multiprocessing.Pool() as pool:
    for result in pool.map_async(inference_fn, zip(latents, noises, truncations)).get():
        ...
```

While it might be possible to rewrite the above to use something like the following, that can be a lot of extra effort depending on what's consuming my no-longer-zipped inputs (i.e. for the TensorDataset example it would be pretty easy to override __getitem__, but for the map_async example I'd need an extra for-loop like the one below to prepare the zipped list).

```python
for idx in range(n):
    lat, noi, trunc = latents[idx], noises[:, idx], truncations[idx]
    ...
```
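For the TensorDataset route, the __getitem__ override mentioned above could look roughly like the sketch below. This is only an illustration: it assumes noises can be kept as a plain Python list with one entry per sample (each entry being that sample's group of differently-sized noise tensors), and the class name ZippedInputs is made up here.

```python
import torch
from torch.utils.data import Dataset

class ZippedInputs(Dataset):
    """Zips dense latents/truncations tensors with a ragged list of noise
    groups along a shared sample index (illustrative sketch only)."""

    def __init__(self, latents, noises, truncations):
        # `noises` stays a plain list because each sample's noise tensors
        # have different spatial sizes and cannot be stacked into one Tensor.
        assert len(latents) == len(noises) == len(truncations)
        self.latents = latents
        self.noises = noises
        self.truncations = truncations

    def __len__(self):
        return len(self.latents)

    def __getitem__(self, idx):
        return self.latents[idx], self.noises[idx], self.truncations[idx]

# Example with made-up sizes: 3 samples, noise at two resolutions per sample.
latents = torch.randn(3, 512)
noises = [[torch.randn(1, 4, 4), torch.randn(1, 8, 8)] for _ in range(3)]
truncations = torch.rand(3)
sample = ZippedInputs(latents, noises, truncations)[0]
```

Note that with a default DataLoader a custom collate_fn would still be needed to batch the ragged noise groups.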
I see. For now you'll unfortunately need to do this via unbind and zip. Each entry has its own nested_size.

You can now unbind each of these NestedTensors again and get a plain list of constituent Tensors. Then you can zip these lists together, stack the resulting tuples and construct a new NestedTensor. I don't think this will buy you much, however, in comparison to doing this in plain Python first and then constructing a NestedTensor.

EDIT: Actually, I think your case might not be representable without more nested dimensions altogether. You want a list of length 785 with entries each of length 9, where each of these entries contains Tensors of variable sizes. However, you can currently only construct NestedTensors from a list of Tensors, which means you'd need to concatenate or stack these Tensors, and they're all of different shapes. I think without introducing more than one nested dimension we can't help you here yet. There's nothing preventing you from maintaining a list of NestedTensors, though; you just have to loop over them, which is inefficient.
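A rough sketch of the unbind-and-zip recipe described above, written with plain tensors and Python lists standing in for the NestedTensors (the names, sizes, and the final list-of-groups representation are illustrative assumptions, not code from this thread):

```python
import torch

# Stand-ins: one tensor per resolution, each sharing a sample dimension
# of length n_samples.
n_samples = 5
per_resolution = [torch.randn(n_samples, 1, size, size) for size in [4, 8, 16]]

# "Unbind" each resolution into its per-sample constituents ...
unbound = [list(t.unbind(0)) for t in per_resolution]

# ... then zip across resolutions so that each entry groups all resolutions
# belonging to one sample. The shapes inside each group differ, so the result
# has to stay a plain Python list (one nested dimension is not enough).
per_sample = [list(group) for group in zip(*unbound)]

print(len(per_sample), [tuple(t.shape) for t in per_sample[0]])
# -> 5 [(1, 4, 4), (1, 8, 8), (1, 16, 16)]
```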
aten reference: permute.
Semantics here are limited if we're to maintain the view functionality of PyTorch.

A user may permute tensor dimensions xor nestedtensor dimensions, i.e. one or the other but not both at once. Permuting per-tensor dimensions is simply done with a map operation. Permuting nestedtensor dimensions requires us to implement a "rotation" within the tree. This is a useful operation in general and may live in the csrc/utils folder.
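A minimal sketch of the map-based case, using a plain list of Tensors as a stand-in for a NestedTensor (the helper name and the convention that dimension 0 is the nested dimension are assumptions for illustration, not the library's API):

```python
import torch

def permute_dense_dims(constituents, dims):
    """Permute the per-tensor (dense) dimensions of a "nested" list of tensors.

    `dims` is expressed in the nested tensor's coordinates, where dimension 0
    is the nested dimension and must stay in place; each constituent is then
    permuted by the remaining dimensions shifted down by one.
    """
    assert dims[0] == 0, "the nested dimension cannot be permuted by a map"
    per_tensor_dims = [d - 1 for d in dims[1:]]
    return [t.permute(*per_tensor_dims) for t in constituents]

# Ragged constituents: same rank, different sizes.
constituents = [torch.randn(1, 4, 6), torch.randn(1, 8, 2)]
out = permute_dense_dims(constituents, (0, 3, 1, 2))
print([tuple(t.shape) for t in out])  # -> [(6, 1, 4), (2, 1, 8)]
```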