simplification of the sincos positional encoding in patchembedding.py (#7605)

Fixes #7564.

### Description

As discussed in #7564, this is a small simplification of the creation of the sincos positional encoding: there is no need to wrap the assignment in a `torch.no_grad()` context or to fill the tensor with `copy_`, which does not preserve the `requires_grad` attribute here. The output is already in float32, so the conversion previously applied does not appear necessary. A sketch of the resulting pattern follows the checklist below.

### Types of changes

- [x] Non-breaking change (fix or new feature that would not break existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing functionality to change).
- [ ] New tests added to cover the changes.
- [x] Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
- [x] Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
- [ ] In-line docstrings updated.
- [ ] Documentation updated, tested `make html` command in the `docs/` folder.

Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
Co-authored-by: YunLiu <55491388+KumoLiu@users.noreply.github.com>
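To illustrate the pattern described above, here is a minimal, hedged sketch of building a fixed sin-cos positional embedding; the function and variable names are illustrative, not MONAI's exact `patchembedding.py` source. The point is the last line: the float32 table is wrapped directly in `nn.Parameter(..., requires_grad=False)` instead of being copied into a zero parameter under `torch.no_grad()`.

```python
import torch
import torch.nn as nn


def build_sincos_pos_embed(n_tokens: int, embed_dim: int, temperature: float = 10000.0) -> nn.Parameter:
    """Build a fixed (non-trainable) 1D sin-cos positional embedding of shape (1, n_tokens, embed_dim)."""
    if embed_dim % 2 != 0:
        raise ValueError("embed_dim must be even for sin-cos embeddings")
    pos = torch.arange(n_tokens, dtype=torch.float32)[:, None]                # (n_tokens, 1)
    omega = torch.arange(embed_dim // 2, dtype=torch.float32) / (embed_dim // 2)
    omega = 1.0 / temperature**omega                                          # (embed_dim // 2,)
    angles = pos * omega[None, :]                                             # (n_tokens, embed_dim // 2)
    pos_emb = torch.cat([torch.sin(angles), torch.cos(angles)], dim=1)[None]  # (1, n_tokens, embed_dim)

    # The simplification: no torch.no_grad() block and no zero Parameter
    # filled via copy_(). pos_emb is already float32, so it is wrapped
    # directly, and requires_grad=False keeps it fixed during training.
    return nn.Parameter(pos_emb, requires_grad=False)


# Hypothetical usage: a fixed embedding for 196 tokens of width 768.
pos_embed = build_sincos_pos_embed(196, 768)
assert pos_embed.shape == (1, 196, 768) and not pos_embed.requires_grad
```

Because `nn.Parameter` accepts `requires_grad` at construction, the embedding stays registered in the module's state dict while remaining fixed during training, which is what the `copy_`-based version was achieving less directly.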