PyTorch implementation of the paper "Self-Attention with Relative Position Representations" (Shaw et al., 2018).
For the entire Seq2Seq framework, you can refer to this repo.
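The core idea of the paper is to add learned embeddings for clipped relative distances to the key and value terms of self-attention. Below is a minimal, single-head sketch of that mechanism; the class names, the `max_relative_position` default, and the single-head simplification are illustrative assumptions and may differ from the actual modules in this repo.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelativePosition(nn.Module):
    """Learned embeddings for clipped relative distances (Shaw et al., 2018)."""
    def __init__(self, head_dim, max_relative_position):
        super().__init__()
        self.max_relative_position = max_relative_position
        # one embedding per relative distance in [-k, k]
        self.embeddings = nn.Parameter(torch.empty(2 * max_relative_position + 1, head_dim))
        nn.init.xavier_uniform_(self.embeddings)

    def forward(self, length_q, length_k):
        device = self.embeddings.device
        # distance matrix d[i, j] = j - i, clipped to [-k, k]
        range_q = torch.arange(length_q, device=device)
        range_k = torch.arange(length_k, device=device)
        distance = range_k[None, :] - range_q[:, None]
        distance = distance.clamp(-self.max_relative_position, self.max_relative_position)
        # shift to [0, 2k] so the distances index the embedding table
        return self.embeddings[distance + self.max_relative_position]  # (len_q, len_k, head_dim)


class RelativeSelfAttention(nn.Module):
    """Single-head self-attention with relative position representations (sketch)."""
    def __init__(self, d_model, max_relative_position=4):
        super().__init__()
        self.scale = d_model ** 0.5
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.rel_k = RelativePosition(d_model, max_relative_position)
        self.rel_v = RelativePosition(d_model, max_relative_position)

    def forward(self, x):
        # x: (batch, seq_len, d_model)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        seq_len = x.size(1)

        # content-content term: (batch, len, len)
        scores = torch.matmul(q, k.transpose(-2, -1))
        # content-position term: q_i . a^K_{ij}
        a_k = self.rel_k(seq_len, seq_len)                       # (len, len, d)
        scores = scores + torch.einsum('bid,ijd->bij', q, a_k)
        attn = F.softmax(scores / self.scale, dim=-1)

        # weighted sum of values, plus the relative value embeddings a^V_{ij}
        a_v = self.rel_v(seq_len, seq_len)                       # (len, len, d)
        out = torch.matmul(attn, v) + torch.einsum('bij,ijd->bid', attn, a_v)
        return out


if __name__ == "__main__":
    layer = RelativeSelfAttention(d_model=64, max_relative_position=4)
    x = torch.randn(2, 10, 64)
    print(layer(x).shape)  # torch.Size([2, 10, 64])
```

In a full multi-head implementation the same relative embeddings are shared across heads and the projections are split per head; masking and dropout are omitted here for brevity.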