My main modification is in the global attention module (onmt/modules/global_attention.py): it combines the gated-attention idea from the paper 'Not All Attention Is Needed' with the Luong global attention methods from 'Effective Approaches to Attention-based Neural Machine Translation'. I also modified the interface functions between the OpenNMT source code (the encoder and decoder files) and the newly added code. Beyond that, I kept changes to the original source to a minimum, which makes it easier to compare against other models later.
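For reference, here is a minimal sketch of how a gated Luong-style global attention could look. The class and parameter names below are illustrative assumptions, not the exact ones used in onmt/modules/global_attention.py; it only shows the general idea of scaling the attention context by a learned gate before forming the attentional hidden state.

```python
# Sketch of gated Luong ("general" score) global attention.
# Names and shapes are illustrative, not the exact OpenNMT implementation.
import torch
import torch.nn as nn


class GatedGlobalAttention(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # Luong "general" score: score(h_t, h_s) = h_t^T W_a h_s
        self.linear_in = nn.Linear(dim, dim, bias=False)
        # Combines context and decoder state into the attentional hidden state.
        self.linear_out = nn.Linear(dim * 2, dim, bias=False)
        # Scalar gate per target step, following the gating idea
        # from "Not All Attention Is Needed" (assumed form).
        self.gate = nn.Linear(dim, 1)

    def forward(self, query, memory_bank):
        # query:       (batch, tgt_len, dim)  decoder hidden states
        # memory_bank: (batch, src_len, dim)  encoder outputs
        scores = torch.bmm(self.linear_in(query), memory_bank.transpose(1, 2))
        align = torch.softmax(scores, dim=-1)        # (batch, tgt_len, src_len)
        context = torch.bmm(align, memory_bank)      # (batch, tgt_len, dim)

        # Gate in [0, 1]: how much the attention context should contribute.
        g = torch.sigmoid(self.gate(query))          # (batch, tgt_len, 1)
        gated_context = g * context

        # Luong-style attentional vector: tanh(W_c [c_t; h_t])
        attn_h = torch.tanh(
            self.linear_out(torch.cat([gated_context, query], dim=-1))
        )
        return attn_h, align


# Tiny usage example with random tensors.
if __name__ == "__main__":
    attn = GatedGlobalAttention(dim=8)
    dec = torch.randn(2, 5, 8)   # batch=2, tgt_len=5
    enc = torch.randn(2, 7, 8)   # batch=2, src_len=7
    out, align = attn(dec, enc)
    print(out.shape, align.shape)  # (2, 5, 8) and (2, 5, 7)
```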