
Massively Multilingual Neural Machine Translation

2019-02-28
Roee Aharoni, Melvin Johnson, Orhan Firat

Abstract

Multilingual neural machine translation (NMT) enables training a single model that supports translation from multiple source languages into multiple target languages. In this paper, we push the limits of multilingual NMT in terms of the number of languages used. We perform extensive experiments in training massively multilingual NMT models, translating up to 102 languages to and from English within a single model. We explore different setups for training such models and analyze the trade-offs between translation quality and various modeling decisions. We report results on the publicly available TED Talks multilingual corpus, where we show that massively multilingual many-to-many models are effective in low-resource settings, outperforming the previous state-of-the-art while supporting up to 59 languages. Our experiments on a large-scale dataset with 102 languages to and from English and up to one million examples per direction also show promising results, surpassing strong bilingual baselines and encouraging future work on massively multilingual NMT.
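The usual mechanism behind such single-model multilingual systems, which this line of work builds on (Johnson et al., 2017), is to train one shared encoder-decoder on a mix of all language pairs and steer it by prepending a target-language token to each source sentence. Below is a minimal, hypothetical Python sketch of that data preparation step; the function name, token format, and toy corpus are illustrative assumptions, not code from the paper.

```python
# Hypothetical sketch: preparing mixed-language training data for a single
# many-to-many NMT model by prefixing each source sentence with a token that
# names the desired target language (the approach of Johnson et al., 2017).

def prepare_example(src_sentence: str, tgt_lang: str) -> str:
    """Prepend a target-language token so the model knows which language to emit."""
    return f"<2{tgt_lang}> {src_sentence}"

# Toy mixed corpus: many source languages, many target languages, one model.
corpus = [
    ("Hallo Welt", "de", "Hello world", "en"),
    ("Hello world", "en", "Bonjour le monde", "fr"),
    ("Bonjour le monde", "fr", "Hallo Welt", "de"),
]

training_pairs = [
    (prepare_example(src, tgt_lang), tgt)
    for src, _src_lang, tgt, tgt_lang in corpus
]

for model_input, model_target in training_pairs:
    print(model_input, "->", model_target)
# <2en> Hallo Welt -> Hello world
# <2fr> Hello world -> Bonjour le monde
# <2de> Bonjour le monde -> Hallo Welt
```

At inference time the same token steers the shared model toward any supported target language, which is what lets one model cover all translation directions.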

URL

http://arxiv.org/abs/1903.00089

PDF

http://arxiv.org/pdf/1903.00089

