simplification of the sincos positional encoding in patchembedding.py #7605
Fixes #7564.
Description
As discussed, this is a small simplification of how the sincos positional encoding is created: we no longer need the `torch.no_grad()` context or the `copy_` call from torch, which does not preserve the `requires_grad` attribute here. The changes are simple and are linked to the corresponding comment in #7564. The output is already in float32, so the conversion previously applied does not seem necessary.
Types of changes
- `./runtests.sh -f -u --net --coverage`
- `./runtests.sh --quick --unittests --disttests`
- `make html` command in the `docs/` folder.