
Boosting Neural Machine Translation

2017-10-03
Dakun Zhang, Jungi Kim, Josep Crego, Jean Senellart

Abstract

Training efficiency is one of the main problems for Neural Machine Translation (NMT). Deep networks require very large amounts of data as well as many training iterations to achieve state-of-the-art performance. This results in very high computation cost, slowing down research and industrialisation. In this paper, we propose to alleviate this problem with several training methods based on data boosting and bootstrap, with no modifications to the neural network. Our methods imitate the learning process of humans, who typically spend more time on “difficult” concepts than on easier ones. We experiment on an English-French translation task, showing accuracy improvements of up to 1.63 BLEU while saving 20% of training time.
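The abstract does not spell out the algorithm, but the general idea can be illustrated with a minimal sketch of loss-weighted bootstrap sampling of the training corpus. The function name, the weighting scheme, and the `keep_ratio` parameter below are assumptions made for illustration, not the authors' exact method:

```python
import random

def boosted_sample(examples, losses, keep_ratio=0.8):
    """Build the next epoch's training set, biased toward "difficult" pairs.

    examples   : list of (source, target) sentence pairs
    losses     : per-example losses from the previous epoch (same order)
    keep_ratio : fraction of the original corpus size to sample, so each
                 boosted epoch also trains on less data
    """
    k = int(len(examples) * keep_ratio)
    # Bootstrap: sample with replacement, weighting each sentence pair by
    # its previous loss so that hard examples are revisited more often --
    # loosely mirroring how humans spend more time on difficult concepts.
    return random.choices(examples, weights=losses, k=k)


# Hypothetical usage inside a training loop (the model API is assumed):
# for epoch in range(num_epochs):
#     losses = [model.sentence_loss(src, tgt) for src, tgt in corpus]
#     epoch_data = boosted_sample(corpus, losses, keep_ratio=0.8)
#     train_one_epoch(model, epoch_data)
```

Sampling only a fraction of the corpus each epoch is one plausible source of the reported 20% training-time saving; the paper itself should be consulted for the actual boosting schedule.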

URL

https://arxiv.org/abs/1612.06138

PDF

https://arxiv.org/pdf/1612.06138
