
Unsupervised Neural Machine Translation with Weight Sharing

2018-04-24
Zhen Yang, Wei Chen, Feng Wang, Bo Xu

Abstract

Unsupervised neural machine translation (NMT) is a recently proposed approach to machine translation that aims to train a model without any labeled data. Existing models for unsupervised NMT typically use a single shared encoder to map sentence pairs from different languages into a shared latent space, which makes it difficult to preserve the unique, internal characteristics of each language, such as style, terminology, and sentence structure. To address this issue, we introduce an extension that uses two independent encoders sharing only the partial weights responsible for extracting high-level representations of the input sentences. In addition, two different generative adversarial networks (GANs), a local GAN and a global GAN, are proposed to enhance cross-language translation. With this new approach, we achieve significant improvements on English-German, English-French, and Chinese-to-English translation tasks.
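
The weight-sharing idea can be illustrated with a minimal PyTorch sketch: each language keeps its own lower encoder layers, while the top layer responsible for high-level representations is shared between the two languages. This is an assumption-laden illustration, not the authors' implementation; the use of GRUs, the layer counts, the dimensions, and the names (`WeightSharedEncoder`, `private_layers`, `shared_layers`) are all hypothetical.

```python
# Minimal sketch of partial weight sharing between two language encoders.
# Assumptions (not from the paper): GRU encoders, one private layer per
# language, one shared top layer, and the dimensions used below.
import torch
import torch.nn as nn

class WeightSharedEncoder(nn.Module):
    def __init__(self, vocab_l1, vocab_l2, emb_dim=256, hid_dim=512,
                 private_layers=1, shared_layers=1):
        super().__init__()
        # Language-specific ("private") components keep each language's
        # internal characteristics (style, terminology, structure).
        self.embed_l1 = nn.Embedding(vocab_l1, emb_dim)
        self.embed_l2 = nn.Embedding(vocab_l2, emb_dim)
        self.private_l1 = nn.GRU(emb_dim, hid_dim, private_layers, batch_first=True)
        self.private_l2 = nn.GRU(emb_dim, hid_dim, private_layers, batch_first=True)
        # Shared top layer maps both languages into a shared latent space.
        self.shared = nn.GRU(hid_dim, hid_dim, shared_layers, batch_first=True)

    def forward(self, tokens, lang):
        if lang == "l1":
            h, _ = self.private_l1(self.embed_l1(tokens))
        else:
            h, _ = self.private_l2(self.embed_l2(tokens))
        latent, _ = self.shared(h)  # shared-latent representation
        return latent
```

In the paper's setup, the language-specific decoders and the local/global discriminators would then operate on this shared latent output; they are omitted here for brevity.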

URL

https://arxiv.org/abs/1804.09057

PDF

https://arxiv.org/pdf/1804.09057

