Code for "Context-Aware Recurrent Encoder for Neural Machine Translation" (TASLP 2017)


CAEncoder-NMT

Source code for A Context-Aware Recurrent Encoder for Neural Machine Translation. Our model is much faster than the standard encoder-attention-decoder model and achieves a BLEU score of 22.57 on the English-German translation task, compared with 20.87 for the dl4mt baseline.

If you use this code, please cite our paper:

@article{Zhang:2017:CRE:3180104.3180106,
 author = {Zhang, Biao and Xiong, Deyi and Su, Jinsong and Duan, Hong},
 title = {A Context-Aware Recurrent Encoder for Neural Machine Translation},
 journal = {IEEE/ACM Trans. Audio, Speech and Lang. Proc.},
 issue_date = {December 2017},
 volume = {25},
 number = {12},
 month = dec,
 year = {2017},
 issn = {2329-9290},
 pages = {2424--2432},
 numpages = {9},
 url = {https://doi.org/10.1109/TASLP.2017.2751420},
 doi = {10.1109/TASLP.2017.2751420},
 acmid = {3180106},
 publisher = {IEEE Press},
 address = {Piscataway, NJ, USA},
}

How to Run?

A demo case is provided in the work directory.

Training

You need to process your training data and set up a configuration file, as german.py does. The train.py script is used for training.
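As a rough guide, a dl4mt-style configuration file is typically just a Python module holding data paths and model hyperparameters. The sketch below only illustrates that general shape; every key name, file path, and value here is an assumption for illustration, not the actual contents of german.py (consult german.py for the real options):

```python
# Hypothetical configuration sketch in the dl4mt style.
# All names and values below are illustrative assumptions,
# not the actual keys used by german.py.
config = {
    "train_source": "train.en",      # tokenized source-side training text
    "train_target": "train.de",      # tokenized target-side training text
    "source_vocab": "vocab.en.pkl",  # pickled source vocabulary
    "target_vocab": "vocab.de.pkl",  # pickled target vocabulary
    "dim_word": 620,                 # word embedding dimension
    "dim": 1000,                     # recurrent hidden state dimension
    "batch_size": 80,                # sentences per minibatch
    "max_epochs": 10,                # training epochs before stopping
}
```

train.py would then read such a module to locate the data and build the model.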

Testing

All you need is the sample.py script. Of course, the directories for the vocabularies and model files are also required.

For any comments or questions, please email Biao Zhang.
