This repository was archived by the owner on Nov 15, 2022. It is now read-only.
I have a model where I would love to use NestedTensor: there is a lot of padding going on, and nested tensors would save a lot of memory. The net where I would like to use them is composed of a linear layer followed by batch norm and ReLU; finally, a max operation is taken over the channels.

The forward looks like this:
```python
def forward(self, inputs):
    x = self.linear(inputs)
    # BatchNorm1d expects channels in dim 1, hence the permutes around it
    x = self.norm(x.permute(0, 2, 1).contiguous()).permute(0, 2, 1).contiguous()
    x = F.relu(x)
    # max over the (padded) dim 1 -> one feature vector per sample
    x_max = torch.max(x, dim=1, keepdim=True)[0]
    return x_max
```
Is it possible to use NestedTensor here? The project supports Python 3.6+ and PyTorch 0.4.1+.
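For reference, the forward above can be made into a minimal self-contained module; the class name, feature sizes, and input shapes below are hypothetical, chosen only so the snippet runs as written with padded dense inputs:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PointNetBlock(nn.Module):  # hypothetical name for the net described above
    def __init__(self, in_features=8, out_features=16):  # sizes are assumptions
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # BatchNorm1d normalizes over dim 1, hence the permutes in forward
        self.norm = nn.BatchNorm1d(out_features)

    def forward(self, inputs):
        # inputs: (batch, length, in_features), padded to a common length
        x = self.linear(inputs)
        x = self.norm(x.permute(0, 2, 1).contiguous()).permute(0, 2, 1).contiguous()
        x = F.relu(x)
        # max over the padded length dimension -> one feature vector per sample
        x_max = torch.max(x, dim=1, keepdim=True)[0]
        return x_max

net = PointNetBlock()
out = net(torch.randn(2, 5, 8))
print(out.shape)  # torch.Size([2, 1, 16])
```

Note that with dense padded inputs the max in dim 1 also sees the padding positions; after ReLU those contribute zeros, so they only affect the result when all real activations in a channel are negative.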