
Conversation

@nicolas-chaulet (Member)

No description provided.

@tchaton (Collaborator) left a comment


Thanks for finding this bug!

@tchaton merged commit 90bae82 into master on Feb 5, 2020
@tchaton deleted the changetoken branch on February 5, 2020 at 08:24
@humanpose1 (Collaborator) left a comment


It's okay to have -1 instead of the size of the batch for the shadow neighbors, but I think it would be more flexible to give the caller the choice of token (see the sketch below the diff).

}
// Reserve the memory

const int token = -1;
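
A minimal sketch of that suggestion, assuming a ball-query-style entry point; the function name ball_query and its argument list are illustrative assumptions, not the PR's actual signature:

#include <torch/extension.h>

// Hypothetical wrapper: the shadow-neighbor token is a defaulted argument
// instead of a hard-coded constant, so callers can pick any sentinel value.
at::Tensor ball_query(at::Tensor x, at::Tensor y, int nsample,
                      int64_t token = -1)
{
    // Reserve the memory, pre-filled with the caller-chosen token; slots the
    // kernel never overwrites then mark shadow neighbors.
    at::Tensor idx = torch::full({y.size(0), nsample}, token,
                                 at::device(y.device()).dtype(at::ScalarType::Long));
    // ... launch the actual neighbor-search kernel that fills idx ...
    return idx;
}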
Collaborator


I think the token could be a parameter of the function so we can pass anything we want.

- at::Tensor idx = torch::full({y.size(0), nsample}, x.size(0),
-                              at::device(y.device()).dtype(at::ScalarType::Long));
+ at::Tensor idx = torch::full({y.size(0), nsample}, -1,
+                              at::device(y.device()).dtype(at::ScalarType::Long));
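
For illustration, call sites under the hypothetical parameterized signature sketched above would look like this (names are assumptions):

// Default token (-1), matching the behaviour merged in this PR:
at::Tensor idx = ball_query(x, y, nsample);
// Previous sentinel, for callers that prefer the old convention:
at::Tensor idx_old = ball_query(x, y, nsample, /*token=*/x.size(0));

With a default of -1 the PR's behaviour is preserved, while the old x.size(0) sentinel stays one argument away.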
Collaborator


Same remark.
