torch.gather behavior changed from 1.5.1 to master #41532

Closed
@zou3519

Description


🐛 Bug

The documentation states that for torch.gather(input, dim, index), the index tensor must have the same size as input in every dimension except dim. PyTorch 1.5.1 enforces this check, but master does not.

To Reproduce

On 1.5.1:

>>> t = torch.tensor([[1,2],[3,4]])
>>> index = torch.tensor([[0]])
>>> torch.gather(t, 1, index)
RuntimeError: Size does not match at dimension 0 get 2 vs 1

On master:

>>> t = torch.tensor([[1,2],[3,4]])
>>> index = torch.tensor([[0]])
>>> torch.gather(t, 1, index)
tensor([[1]])

Expected behavior

I'm not sure. Either we should update the documentation or add an error check.
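For reference, here is a minimal pure-Python sketch of the two size rules in question, assuming the relaxed master behavior is "index may be smaller than or equal to input in every dimension except dim" (the helper name and signature are hypothetical, not part of the PyTorch API):

```python
def check_gather_sizes(input_shape, index_shape, dim, strict=True):
    """Sketch of the size check torch.gather applies to `index`.

    strict=True  mimics PyTorch 1.5.1: index must match input exactly
                 in every dimension except `dim`.
    strict=False mimics the (assumed) relaxed master behavior: index may
                 be smaller or equal in every dimension except `dim`.
    Hypothetical helper for illustration only.
    """
    if len(input_shape) != len(index_shape):
        raise ValueError("input and index must have the same number of dimensions")
    for d, (s, i) in enumerate(zip(input_shape, index_shape)):
        if d == dim:
            continue
        if strict and i != s:
            raise ValueError(f"Size does not match at dimension {d} get {s} vs {i}")
        if not strict and i > s:
            raise ValueError(f"Size of index at dimension {d} exceeds input: {i} > {s}")
    return True
```

With the shapes from the reproduction, `check_gather_sizes((2, 2), (1, 1), 1, strict=True)` raises (the 1.5.1 behavior) while `check_gather_sizes((2, 2), (1, 1), 1, strict=False)` passes (the master behavior).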

Metadata

Assignees

No one assigned

    Labels

    triaged: This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
