
Exploiting Cross-Sentence Context for Neural Machine Translation

2017-07-23
Longyue Wang, Zhaopeng Tu, Andy Way, Qun Liu

Abstract

In translation, considering the document as a whole can help to resolve ambiguities and inconsistencies. In this paper, we propose a cross-sentence context-aware approach and investigate the influence of historical contextual information on the performance of neural machine translation (NMT). First, this history is summarized in a hierarchical way. We then integrate the historical representation into NMT via two strategies: 1) a warm-start of encoder and decoder states, and 2) an auxiliary context source for updating decoder states. Experimental results on a large Chinese-English translation task show that our approach significantly improves upon a strong attention-based NMT system by up to +2.1 BLEU points.
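
Below is a minimal PyTorch sketch of how the two integration strategies described in the abstract could look: a hierarchical encoder summarizes the previous source sentences into a single history vector, which is then used to warm-start the decoder state and fed as an auxiliary context at every decoding step. The layer sizes, GRU cells, and gating details here are my own assumptions for illustration, not the authors' exact architecture.

```python
# Hedged sketch of hierarchical cross-sentence context for NMT.
# All class names and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn


class HierarchicalHistoryEncoder(nn.Module):
    """Summarize the K previous source sentences hierarchically:
    a word-level RNN encodes each sentence, then a sentence-level RNN
    encodes the sequence of sentence summaries into one history vector."""

    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.word_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.sent_rnn = nn.GRU(hid_dim, hid_dim, batch_first=True)

    def forward(self, history):  # history: (batch, K, max_len) token ids
        b, k, t = history.shape
        words = self.embed(history.view(b * k, t))      # (b*k, t, emb)
        _, sent_vecs = self.word_rnn(words)             # (1, b*k, hid)
        sent_vecs = sent_vecs.view(b, k, -1)            # (b, K, hid)
        _, hist = self.sent_rnn(sent_vecs)              # (1, b, hid)
        return hist.squeeze(0)                          # (b, hid)


class ContextAwareDecoderCell(nn.Module):
    """Strategy 1: warm-start the decoder state from the history vector.
    Strategy 2: feed the history vector as an auxiliary context source
    at every decoder step (here simply concatenated to the input)."""

    def __init__(self, emb_dim=256, hid_dim=512):
        super().__init__()
        self.init_state = nn.Linear(hid_dim, hid_dim)        # warm-start
        self.cell = nn.GRUCell(emb_dim + hid_dim, hid_dim)   # aux. context

    def initial_state(self, history_vec):
        return torch.tanh(self.init_state(history_vec))

    def step(self, prev_emb, state, history_vec):
        inp = torch.cat([prev_emb, history_vec], dim=-1)
        return self.cell(inp, state)


if __name__ == "__main__":
    hist_enc = HierarchicalHistoryEncoder(vocab_size=1000)
    decoder = ContextAwareDecoderCell()
    history = torch.randint(0, 1000, (2, 3, 7))   # batch of 2, 3-sentence history
    h = hist_enc(history)
    s = decoder.initial_state(h)                  # warm-start the decoder
    s = decoder.step(torch.zeros(2, 256), s, h)   # one decoding step with aux. context
    print(s.shape)                                # torch.Size([2, 512])
```

In the full model the per-step context would of course also include the attention over the current source sentence; this sketch only isolates where the cross-sentence history vector plugs in.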

URL

https://arxiv.org/abs/1704.04347

PDF

https://arxiv.org/pdf/1704.04347
