
A Teacher-Student Framework for Zero-Resource Neural Machine Translation

2017-05-02
Yun Chen, Yang Liu, Yong Cheng, Victor O.K. Li

Abstract

While end-to-end neural machine translation (NMT) has made remarkable progress recently, it still suffers from the data scarcity problem for low-resource language pairs and domains. In this paper, we propose a method for zero-resource NMT by assuming that parallel sentences have close probabilities of generating a sentence in a third language. Based on this assumption, our method is able to train a source-to-target NMT model (“student”) without parallel corpora available, guided by an existing pivot-to-target NMT model (“teacher”) on a source-pivot parallel corpus. Experimental results show that the proposed method significantly improves over a baseline pivot-based model by +3.0 BLEU points across various language pairs.
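The core idea can be sketched as word-level knowledge distillation: the teacher produces a next-token distribution over the target vocabulary from the pivot sentence, and the student is trained on the source sentence to match it via cross-entropy. Below is a minimal NumPy sketch of that loss; it is illustrative only, not the paper's implementation, and all function and variable names are hypothetical.

```python
import numpy as np

def softmax(logits):
    # numerically stable softmax over the vocabulary axis
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_probs):
    """Cross-entropy between the teacher's per-position next-token
    distribution (computed from the pivot sentence) and the student's
    distribution (computed from the source sentence), averaged over
    target positions. Minimizing this drives the student toward the
    teacher, in line with the close-probability assumption."""
    log_p_student = np.log(softmax(student_logits) + 1e-12)
    # sum over the vocabulary axis, then average over positions
    return -(teacher_probs * log_p_student).sum(axis=-1).mean()

# Toy example: 2 target positions, a vocabulary of 4 tokens.
teacher = softmax(np.array([[2.0, 0.1, 0.1, 0.1],
                            [0.1, 2.0, 0.1, 0.1]]))
student_logits = np.array([[1.5, 0.2, 0.2, 0.2],
                           [0.2, 1.5, 0.2, 0.2]])
loss = distillation_loss(student_logits, teacher)
```

The loss reaches its minimum (the teacher's entropy) exactly when the student's distribution equals the teacher's, which is why gradient descent on it transfers the teacher's behavior to the student.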

URL

https://arxiv.org/abs/1705.00753

PDF

https://arxiv.org/pdf/1705.00753
