
Multi-Domain Neural Machine Translation

2018-05-06
Sander Tars, Mark Fishel

Abstract

We present an approach to neural machine translation (NMT) that supports multiple domains in a single model and allows switching between the domains when translating. The core idea is to treat text domains as distinct languages and use multilingual NMT methods to create multi-domain translation systems; we show that this approach results in significant translation quality gains over fine-tuning. We also explore whether knowledge of pre-specified text domains is necessary; it turns out that it is, but also that quite high translation quality can be reached even when the domain is not known in advance.
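A minimal sketch of the domain-as-language idea, assuming the common multilingual-NMT convention of prepending a pseudo-token to each source sentence that selects the output "language" (here, the text domain). The token format and domain names below are illustrative, not taken from the paper:

```python
# Sketch: treat each text domain like a target language by prepending a
# domain pseudo-token to the source sentence, as in token-based
# multilingual NMT. Token format "<2domain>" is an assumed convention.

def tag_with_domain(sentence: str, domain: str) -> str:
    """Prepend a pseudo-token marking the desired domain, e.g. '<2medical>'."""
    return f"<2{domain}> {sentence}"

# Hypothetical parallel-corpus lines paired with their domain labels.
corpus = [
    ("the patient was given 5 mg of the drug", "medical"),
    ("the court dismissed the appeal", "legal"),
]

for sentence, domain in corpus:
    print(tag_with_domain(sentence, domain))
```

At training time the tagged sentences from all domains are mixed into one corpus, so a single model learns to condition its output style on the token; at inference, switching domains is just a matter of changing the tag.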

URL

https://arxiv.org/abs/1805.02282

PDF

https://arxiv.org/pdf/1805.02282

