
Code-Switching for Enhancing NMT with Pre-Specified Translation

2019-04-19
Kai Song, Yue Zhang, Heng Yu, Weihua Luo, Kun Wang, Min Zhang

Abstract

Leveraging user-provided translation to constrain NMT has practical significance. Existing methods can be classified into two main categories, namely the use of placeholder tags for lexicon words and the use of hard constraints during decoding. Both methods can hurt translation fidelity for various reasons. We investigate a data augmentation method, making code-switched training data by replacing source phrases with their target translations. Our method does not change the NMT model or decoding algorithm, allowing the model to learn lexicon translations by copying source-side target words. Extensive experiments show that our method achieves consistent improvements over existing approaches, improving translation of constrained words without hurting unconstrained words.
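As a rough illustration of the augmentation idea, the sketch below builds code-switched training pairs by splicing pre-specified target translations into the source sentence. This is a minimal sketch, not the paper's exact procedure: the function name, the phrase-matching scheme, and the `replace_prob` sampling rate are all assumptions.

```python
# Hypothetical sketch of code-switching data augmentation: replace
# constrained source phrases with their target translations so the
# model can learn to copy pre-specified translations from the source
# side. The sampling rate and matching scheme are illustrative
# assumptions, not the paper's exact recipe.
import random


def code_switch(src_tokens, constraints, replace_prob=0.5):
    """Replace constrained source phrases with their target translations.

    src_tokens:  list of source-language tokens
    constraints: dict mapping a source phrase (tuple of tokens) to its
                 target-language translation (list of tokens)
    """
    out = []
    i = 0
    while i < len(src_tokens):
        replaced = False
        for src_phrase, tgt_phrase in constraints.items():
            n = len(src_phrase)
            if (tuple(src_tokens[i:i + n]) == src_phrase
                    and random.random() < replace_prob):
                out.extend(tgt_phrase)  # splice in the target translation
                i += n
                replaced = True
                break
        if not replaced:
            out.append(src_tokens[i])
            i += 1
    return out


# Example: a German source sentence with one pre-specified translation
src = "das Haus ist klein".split()
constraints = {("Haus",): ["house"]}
print(code_switch(src, constraints, replace_prob=1.0))
# -> ['das', 'house', 'ist', 'klein']
```

Training on such mixed source sentences (paired with the unchanged target) lets a standard NMT model learn to copy source-side target words through, without any change to the decoder.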

URL

http://arxiv.org/abs/1904.09107

PDF

http://arxiv.org/pdf/1904.09107

